Science.gov

Sample records for camera narrow angle

  1. Reconditioning of Cassini Narrow-Angle Camera

    NASA Technical Reports Server (NTRS)

    2002-01-01

    These five images of single stars, taken at different times with the narrow-angle camera on NASA's Cassini spacecraft, show the effects of haze collecting on the camera's optics, then successful removal of the haze by warming treatments.

    The image on the left was taken on May 25, 2001, before the haze problem occurred. It shows a star named HD339457.

    The second image from left, taken May 30, 2001, shows the effect of haze that collected on the optics when the camera cooled back down after a routine-maintenance heating to 30 degrees Celsius (86 degrees Fahrenheit). The star is Maia, one of the Pleiades.

    The third image was taken on October 26, 2001, after a weeklong decontamination treatment at minus 7 C (19 F). The star is Spica.

    The fourth image was taken of Spica January 30, 2002, after a weeklong decontamination treatment at 4 C (39 F).

    The final image, also of Spica, was taken July 9, 2002, following three additional decontamination treatments at 4 C (39 F) for two months, one month, then another month.

    Cassini, on its way toward arrival at Saturn in 2004, is a cooperative project of NASA, the European Space Agency and the Italian Space Agency. The Jet Propulsion Laboratory, a division of the California Institute of Technology in Pasadena, manages the Cassini mission for NASA's Office of Space Science, Washington, D.C.

  2. Flight Calibration of the LROC Narrow Angle Camera

    NASA Astrophysics Data System (ADS)

    Humm, D. C.; Tschimmel, M.; Brylow, S. M.; Mahanti, P.; Tran, T. N.; Braden, S. E.; Wiseman, S.; Danton, J.; Eliason, E. M.; Robinson, M. S.

    2016-04-01

    Characterization and calibration are vital for instrument commanding and image interpretation in remote sensing. The Lunar Reconnaissance Orbiter Camera Narrow Angle Camera (LROC NAC) takes 500 Mpixel greyscale images of lunar scenes at 0.5 meters/pixel. It uses two nominally identical line scan cameras for a larger crosstrack field of view. Stray light, spatial crosstalk, and nonlinearity were characterized using flight images of the Earth and the lunar limb. These are important for imaging shadowed craters, studying ~1 meter size objects, and photometry, respectively. Background, nonlinearity, and flatfield corrections have been implemented in the calibration pipeline. An eight-column pattern in the background is corrected. The detector is linear for DN = 600–2000, but a signal-dependent additive correction is required and applied for DN < 600. A predictive model of detector temperature and dark level was developed to command the dark level offset. This avoids images with a cutoff at DN = 0 and minimizes quantization error in companding. Absolute radiometric calibration is derived from comparison of NAC images with ground-based images taken with the Robotic Lunar Observatory (ROLO) at much lower spatial resolution but with the same photometric angles.
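
The correction chain described above (background subtraction, a signal-dependent additive correction below the 600 DN knee, then flatfielding) can be sketched as follows. The 600 DN knee comes from the abstract; the linear form and coefficient of the low-DN correction are illustrative assumptions, not the mission's actual calibration.

```python
import numpy as np

def calibrate_nac_like(raw_dn, dark_level, flatfield, knee=600.0, coeff=0.05):
    """Sketch of an NAC-style calibration chain: dark/background subtraction,
    a signal-dependent additive correction below a DN knee, then flatfield
    division. Coefficient value and functional form are hypothetical."""
    img = raw_dn.astype(float) - dark_level       # background correction
    low = img < knee
    img[low] += coeff * (knee - img[low])         # additive low-DN correction
    return img / flatfield                        # relative (flatfield) response
```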

  3. Extracting Accurate and Precise Topography from Lroc Narrow Angle Camera Stereo Observations

    NASA Astrophysics Data System (ADS)

    Henriksen, M. R.; Manheim, M. R.; Speyerer, E. J.; Robinson, M. S.; LROC Team

    2016-06-01

    The Lunar Reconnaissance Orbiter Camera (LROC) includes two identical Narrow Angle Cameras (NAC) that acquire meter-scale images. Stereo observations are acquired by imaging from two or more orbits, including at least one off-nadir slew. Digital terrain models (DTMs) generated from the stereo observations are controlled to Lunar Orbiter Laser Altimeter (LOLA) elevation profiles. With current processing methods, DTMs have absolute accuracies commensurate with the uncertainties of the LOLA profiles (~10 m horizontally and ~1 m vertically) and relative horizontal and vertical precisions better than the pixel scale of the DTMs (2 to 5 m). The NAC stereo pairs and derived DTMs represent an invaluable tool for science and exploration purposes. We computed slope statistics from 81 highland and 31 mare DTMs across a range of baselines. Overlapping DTMs of single stereo sets were also combined to form larger-area DTM mosaics, enabling detailed characterization of large geomorphic features and providing a key resource for future exploration planning. Currently, two percent of the lunar surface is imaged in NAC stereo, and continued acquisition of stereo observations will serve to strengthen our knowledge of the Moon and the geologic processes that occur on all the terrestrial planets.
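
Baseline-dependent slope statistics of the kind described above can be computed from a gridded DTM by differencing elevations at a fixed pixel separation. This is a minimal sketch under assumed conventions (simple axis-aligned differencing), not the LROC team's actual pipeline.

```python
import numpy as np

def slope_deg(dem, baseline_px, pixel_scale_m):
    """Slopes (degrees) over a fixed baseline from a gridded DTM.
    dem: 2D elevation array (m); baseline_px: separation in pixels;
    pixel_scale_m: DTM pixel scale in meters."""
    run = baseline_px * pixel_scale_m                     # horizontal distance
    dz_x = dem[:, baseline_px:] - dem[:, :-baseline_px]   # east-west rises
    dz_y = dem[baseline_px:, :] - dem[:-baseline_px, :]   # north-south rises
    rise = np.abs(np.concatenate([dz_x.ravel(), dz_y.ravel()]))
    return np.degrees(np.arctan2(rise, run))
```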

  4. Two Years of Digital Terrain Model Production Using the Lunar Reconnaissance Orbiter Narrow Angle Camera

    NASA Astrophysics Data System (ADS)

    Burns, K.; Robinson, M. S.; Speyerer, E.; LROC Science Team

    2011-12-01

    One of the primary objectives of the Lunar Reconnaissance Orbiter Camera (LROC) is to gather stereo observations with the Narrow Angle Camera (NAC). These stereo observations are used to generate digital terrain models (DTMs). The NAC has a pixel scale of 0.5 to 2.0 meters but was not designed for stereo observations and thus requires the spacecraft to roll off-nadir to acquire these images. Slews interfere with the data collection of the other instruments, so opportunities are currently limited to four per day. Arizona State University has produced DTMs from 95 stereo pairs for 11 Constellation Project (CxP) sites (Aristarchus, Copernicus crater, Gruithuisen domes, Hortensius domes, Ina D-caldera, Lichtenberg crater, Mare Ingenii, Marius hills, Reiner Gamma, South Pole-Aitken Rim, Sulpicius Gallus) as well as 30 other regions of scientific interest (including: Bhabha crater, highest and lowest elevation points, Highland Ponds, Kugler Anuchin, Linne Crater, Planck Crater, Slipher crater, Sears Crater, Mandel'shtam Crater, Virtanen Graben, Compton/Belkovich, Rumker Domes, King Crater, Luna 16/20/23/24 landing sites, Ranger 6 landing site, Wiener F Crater, Apollo 11/14/15/17, fresh craters, impact melt flows, Larmor Q crater, Mare Tranquillitatis pit, Hansteen Alpha, Moore F Crater, and Lassell Massif). To generate DTMs, the USGS ISIS software and SOCET SET® from BAE Systems are used. To increase the absolute accuracy of the DTMs, data obtained from the Lunar Orbiter Laser Altimeter (LOLA) are used to coregister the NAC images and define the geodetic reference frame. NAC DTMs have been used in examination of several sites, e.g. Compton-Belkovich, Marius Hills and Ina D-caldera [1-3]. LROC will continue to acquire high-resolution stereo images throughout the science phase of the mission and any extended mission opportunities, thus providing a vital dataset for scientific research as well as future human and robotic exploration. [1] B.L. Jolliff (2011) Nature

  5. Narrow Angle movie

    NASA Technical Reports Server (NTRS)

    1999-01-01

    This brief three-frame movie of the Moon was made from three Cassini narrow-angle images as the spacecraft passed by the Moon on the way to its closest approach to Earth on August 17, 1999. The purpose of this particular set of images was to calibrate the spectral response of the narrow-angle camera and to test its 'on-chip summing mode' data compression technique in flight. From left to right, they show the Moon in the green, blue and ultraviolet regions of the spectrum in 40, 60 and 80 millisecond exposures, respectively. All three images have been scaled so that the brightness of Crisium basin, the dark circular region in the upper right, is the same in each image. The spatial scale in the blue and ultraviolet images is 1.4 miles per pixel (2.3 kilometers). The original scale in the green image (which was captured in the usual manner and then reduced in size by 2x2 pixel summing within the camera system) was 2.8 miles per pixel (4.6 kilometers). It has been enlarged for display to the same scale as the other two. The imaging data were processed and released by the Cassini Imaging Central Laboratory for Operations (CICLOPS) at the University of Arizona's Lunar and Planetary Laboratory, Tucson, AZ.

    Photo Credit: NASA/JPL/Cassini Imaging Team/University of Arizona

    Cassini, launched in 1997, is a joint mission of NASA, the European Space Agency and the Italian Space Agency. The mission is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Space Science, Washington, D.C. JPL is a division of the California Institute of Technology, Pasadena, CA.

  6. NFC - Narrow Field Camera

    NASA Astrophysics Data System (ADS)

    Koukal, J.; Srba, J.; Gorková, S.

    2015-01-01

    We introduce a low-cost CCTV video system for faint meteor monitoring and describe the first results from 5 months of two-station operations. Our system, called NFC (Narrow Field Camera), has a meteor limiting magnitude around +6.5 mag and allows research on the trajectories of less massive meteoroids within individual parent meteor showers and the sporadic background. At present, 4 stations (2 pairs with coordinated fields of view) of the NFC system are operated in the frame of CEMeNt (Central European Meteor Network). The heart of each NFC station is a sensitive CCTV camera, a Watec 902 H2, and a fast cinematographic lens, a Meopta Meostigmat 1/50 - 52.5 mm (50 mm focal length and fixed aperture f/1.0). In this paper we present the first results based on 1595 individual meteors, 368 of which were recorded from two stations simultaneously. This data set allows the first empirical verification of theoretical assumptions about the NFC system's capabilities (stellar and meteor magnitude limit, meteor apparent brightness distribution, and accuracy of single-station measurements) and the first low-mass meteoroid trajectory calculations. Our experimental data clearly show the capabilities of the proposed system for low-mass meteor registration and demonstrate that calculations based on NFC data can lead to a significant refinement of the orbital elements of low-mass meteoroids.

  7. High-resolution topomapping of candidate MER landing sites with Mars Orbiter Camera narrow-angle images

    USGS Publications Warehouse

    Kirk, R.L.; Howington-Kraus, E.; Redding, B.; Galuszka, D.; Hare, T.M.; Archinal, B.A.; Soderblom, L.A.; Barrett, J.M.

    2003-01-01

    We analyzed narrow-angle Mars Orbiter Camera (MOC-NA) images to produce high-resolution digital elevation models (DEMs) in order to provide topographic and slope information needed to assess the safety of candidate landing sites for the Mars Exploration Rovers (MER) and to assess the accuracy of our results by a variety of tests. The mapping techniques developed also support geoscientific studies and can be used with all present and planned Mars-orbiting scanner cameras. Photogrammetric analysis of MOC stereopairs yields DEMs with 3-pixel (typically 10 m) horizontal resolution, vertical precision consistent with ±0.22 pixel matching errors (typically a few meters), and slope errors of 1–3°. These DEMs are controlled to the Mars Orbiter Laser Altimeter (MOLA) global data set and consistent with it at the limits of resolution. Photoclinometry yields DEMs with single-pixel (typically ~3 m) horizontal resolution and submeter vertical precision. Where the surface albedo is uniform, the dominant error is 10-20% relative uncertainty in the amplitude of topography and slopes after "calibrating" photoclinometry against a stereo DEM to account for the influence of atmospheric haze. We mapped portions of seven candidate MER sites and the Mars Pathfinder site. Safety of the final four sites (Elysium, Gusev, Isidis, and Meridiani) was assessed by mission engineers by simulating landings on our DEMs of "hazard units" mapped in the sites, with results weighted by the probability of landing on those units; summary slope statistics show that most hazard units are smooth, with only small areas of etched terrain in Gusev crater posing a slope hazard.
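
The quoted vertical precision follows from standard stereo error propagation, which can be written as a one-line sketch. The 0.22-pixel matching error is the value cited in the abstract; the base-to-height ratio in the usage example is a hypothetical scene geometry, not one from the paper.

```python
def stereo_vertical_precision(gsd_m, base_to_height, match_err_px=0.22):
    """Textbook expected vertical precision of a stereo DEM:
    matching error (pixels) * ground sample distance / (base-to-height ratio)."""
    return match_err_px * gsd_m / base_to_height
```

For example, with a 3 m ground sample distance and an assumed base-to-height ratio of 0.33, the expected vertical precision is about 2 m, consistent with the "few meters" quoted above.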

  8. Methane Band and Continuum Band Imaging of Titan's Atmosphere Using Cassini ISS Narrow Angle Camera Pictures from the CURE/Cassini Imaging Project

    NASA Astrophysics Data System (ADS)

    Shitanishi, Jennifer; Gillam, S. D.

    2009-05-01

    The study of Titan's atmosphere, which bears resemblance to early Earth's, may help us understand more of our own. Constructing a Monte Carlo model of Titan's atmosphere is helpful to achieve this goal. Methane (MT) and continuum band (CB) images of Titan taken by the CURE/Cassini Imaging Project, using the Cassini Narrow Angle Camera (NAC), were analyzed. They were scheduled by Cassini Optical Navigation. Images were obtained at phase angles 53°, 112°, 161°, and 165°. They include 22 total MT1 (center wavelength 619 nm), MT2 (727 nm), MT3 (889 nm), CB1 (635 nm), CB2 (751 nm), and CB3 (938 nm) images. They were reduced with previously written scripts using the National Optical Astronomy Observatory Image Reduction and Analysis Facility scientific analysis suite. Corrections for horizontal and vertical banding and cosmic-ray hits were made. The MT images were registered with corresponding CB images to ensure that subsequently measured flux ratios came from the same parts of the atmosphere. Preliminary DN limb-to-limb scans and loci of the haze layers will be presented. Accurate estimates of the sub-spacecraft points on each picture will be presented. Flux ratios (FMT/FCB=Q0) along the scans and total absorption coefficients along the lines of sight from the spacecraft through the pixels (and into Titan) will also be presented.
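
The flux-ratio measurement along a limb-to-limb scan amounts to a pixelwise division of the registered frames. A minimal sketch, assuming the MT frame has already been registered to the CB frame as described above, with zero-signal CB pixels masked rather than divided:

```python
import numpy as np

def flux_ratio_scan(mt_img, cb_img, row):
    """Q0 = F_MT / F_CB along one scan line of two co-registered images.
    Pixels with no continuum signal are set to NaN."""
    mt = mt_img[row].astype(float)
    cb = cb_img[row].astype(float)
    q0 = np.full(mt.shape, np.nan)    # NaN where the ratio is undefined
    valid = cb > 0
    q0[valid] = mt[valid] / cb[valid]
    return q0
```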

  9. Photogrammetric measurement of 3D freeform millimetre-sized objects with micro features: an experimental validation of the close-range camera calibration model for narrow angles of view

    NASA Astrophysics Data System (ADS)

    Percoco, Gianluca; Sánchez Salmerón, Antonio J.

    2015-09-01

    The measurement of millimetre- and micro-scale features is performed by high-cost systems based on technologies with narrow working ranges to accurately control the position of the sensors. Photogrammetry would lower the cost of 3D inspection of micro-features and would also be applicable to the inspection of non-removable micro parts of large objects. Unfortunately, the behaviour of photogrammetry at the micro-scale is not well known. In this paper, the authors address these issues with a view to applying digital close-range photogrammetry (DCRP) at the micro-scale, taking into account research papers in the literature stating that an angle of view (AOV) around 10° is the lower limit for applying the traditional pinhole close-range calibration model (CRCM), which is the basis of DCRP. First, a general calibration procedure is introduced, with the aid of an open-source software library, to calibrate narrow-AOV cameras with the CRCM. Subsequently, the procedure is validated using a reflex camera with a 60 mm macro lens equipped with extension tubes (20 and 32 mm), achieving magnification of up to approximately 2×, to verify the literature findings with experimental photogrammetric 3D measurements of millimetre-sized objects with micro-features. The limitations of the laser printing technology used to produce the two-dimensional pattern on common paper were overcome using an accurate pattern manufactured with a photolithographic process. The results of the experimental activity prove that the CRCM is valid for AOVs down to 3.4° and that DCRP results are comparable with those of existing, more expensive commercial techniques.

  10. Fluorescence nanoscopy by polarization modulation and polarization angle narrowing.

    PubMed

    Hafi, Nour; Grunwald, Matthias; van den Heuvel, Laura S; Aspelmeier, Timo; Chen, Jian-Hua; Zagrebelsky, Marta; Schütte, Ole M; Steinem, Claudia; Korte, Martin; Munk, Axel; Walla, Peter J

    2014-05-01

    When excited with rotating linear polarized light, differently oriented fluorescent dyes emit periodic signals peaking at different times. We show that measurement of the average orientation of fluorescent dyes attached to rigid sample structures, mapped to regularly defined (50 nm)² image nanoareas, can provide subdiffraction resolution (super resolution by polarization demodulation, SPoD). Because the polarization angle range for effective excitation of an oriented molecule is rather broad and unspecific, we narrowed this range by simultaneous irradiation with a second de-excitation beam possessing a polarization perpendicular to the excitation beam (excitation polarization angle narrowing, ExPAN). This shortened the periodic emission flashes, allowing better discrimination between molecules or nanoareas. Our method requires neither the generation of nanometric interference structures nor the use of switchable or blinking fluorescent probes. We applied the method to standard wide-field microscopy with camera detection and to two-photon scanning microscopy, imaging the fine structural details of neuronal spines. PMID:24705472

  11. A camera for a narrow and deep welding groove

    NASA Astrophysics Data System (ADS)

    Vehmanen, Miika S.; Korhonen, Mika; Mäkynen, Anssi J.

    2008-06-01

    In this paper, welding seam imaging in a very narrow and deep groove is presented. Standard camera optics cannot be used, as they do not reach the bottom of the groove; selecting suitable imaging optics and components was therefore the main challenge of the study. The implementation is based on image transmission via a borescope. The borescope has a long, narrow tube with graded-index relay optics inside. To avoid excessive heating, the borescope tube is enclosed in a cooling pipe. The performance of the imaging system was tested by measuring its modulation transfer function (MTF) and by visually evaluating its distortion. The results show that a borescope providing VGA resolution is adequate for the application. The spectrum of the welding processes was studied to determine the optimum window for observing the welding seam and electrode. The optimal bandwidth was found in the region of 700–1000 nm.

  12. Inflight Calibration of the Lunar Reconnaissance Orbiter Camera Wide Angle Camera

    NASA Astrophysics Data System (ADS)

    Mahanti, P.; Humm, D. C.; Robinson, M. S.; Boyd, A. K.; Stelling, R.; Sato, H.; Denevi, B. W.; Braden, S. E.; Bowman-Cisneros, E.; Brylow, S. M.; Tschimmel, M.

    2016-04-01

    The Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) has acquired more than 250,000 images of the illuminated lunar surface and over 190,000 observations of space and the non-illuminated Moon since 1 January 2010. These images, along with images from the Narrow Angle Camera (NAC) and other Lunar Reconnaissance Orbiter instrument datasets, are enabling new discoveries about the morphology, composition, and geologic/geochemical evolution of the Moon. Characterizing the inflight WAC system performance is crucial to scientific and exploration results. Pre-launch calibration of the WAC provided a baseline characterization that was critical for early targeting and analysis. Here we present an analysis of WAC performance from the inflight data. In the course of our analysis we compare and contrast with the pre-launch performance wherever possible and quantify the uncertainty related to various components of the calibration process. We document the absolute and relative radiometric calibration, point spread function, and scattered light sources and provide estimates of sources of uncertainty for spectral reflectance measurements of the Moon across a range of imaging conditions.

  13. A wide-angle camera module for disposable endoscopy

    NASA Astrophysics Data System (ADS)

    Shim, Dongha; Yeon, Jesun; Yi, Jason; Park, Jongwon; Park, Soo Nam; Lee, Nanhee

    2016-08-01

    A wide-angle miniaturized camera module for a disposable endoscope is demonstrated in this paper. A lens module with 150° angle of view (AOV) is designed and manufactured. All plastic injection-molded lenses and a commercial CMOS image sensor are employed to reduce the manufacturing cost. The image sensor and LED illumination unit are assembled with a lens module. The camera module does not include a camera processor to further reduce its size and cost. The size of the camera module is 5.5 × 5.5 × 22.3 mm³. The diagonal field of view (FOV) of the camera module is measured to be 110°. A prototype of a disposable endoscope is implemented to perform pre-clinical animal testing. The esophagus of an adult beagle dog is observed. These results demonstrate the feasibility of a cost-effective and high-performance camera module for disposable endoscopy.

  15. WIDE-ANGLE, NARROW-ANGLE, AND IMAGING BASELINES OF OPTICAL LONG-BASELINE INTERFEROMETERS

    SciTech Connect

    Woillez, J.; Lacour, S. E-mail: sylvestre.lacour@obspm.fr

    2013-02-10

    For optical interferometers, the baseline is typically defined as the vector joining two perfectly identical telescopes. However, when the telescopes are naturally different or when the requirements on the baseline vector challenge the telescope perfection, the baseline definition depends on how the interferometer is used. This is where the notions of wide-angle, narrow-angle, and imaging baselines come into play. This article explores this variety of baselines, with the purpose of presenting a coherent set of definitions, describing how they relate to each other, and suggesting baseline metrology requirements. Ultimately, this work aims at supporting upcoming long-baseline optical interferometers with narrow-angle astrometry and phase-referenced imaging capabilities at the microarcsecond level.

  16. Associations between Narrow Angle and Adult Anthropometry: The Liwan Eye Study

    PubMed Central

    Jiang, Yuzhen; He, Mingguang; Friedman, David S.; Khawaja, Anthony P.; Lee, Pak Sang; Nolan, Winifred P.; Yin, Qiuxia; Foster, Paul J.

    2015-01-01

    Purpose To assess the associations between narrow angle and adult anthropometry. Methods Chinese adults aged 50 years and older were recruited from a population-based survey in the Liwan District of Guangzhou, China. Narrow angle was defined as the posterior trabecular meshwork not visible under static gonioscopy in at least three quadrants (i.e. a circumference of at least 270°). Logistic regression models were used to examine the associations between narrow angle and anthropometric measures (height, weight and body mass index, BMI). Results Among the 912 participants, lower weight, shorter height, and lower BMI were significantly associated with narrower angle width (tests for trend: mean angle width in degrees vs weight p<0.001; vs height p<0.001; vs BMI p = 0.012). In univariate analyses, shorter height, lower weight and lower BMI were all significantly associated with greater odds of narrow angle. The crude association between height and narrow angle was largely attributable to a stronger association with age and sex. Lower BMI and weight remained significantly associated with narrow angle after adjustment for height, age, sex, axial ocular biometric measures and education. In analyses stratified by sex, the association between BMI and narrow angle was only observed in women. Conclusion Lower BMI and weight were associated with significantly greater odds of narrow angle after adjusting for age, education, axial ocular biometric measures and height. The odds of narrow angle increased 7% per 1 unit decrease in BMI. This association was most evident in women. PMID:24707840
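
The "7% per 1 unit decrease in BMI" statement maps onto the logistic-regression coefficient in the usual way. A small sketch of that arithmetic; the implied coefficient of about -0.068 is back-calculated here for illustration and is not reported in the abstract.

```python
import math

def pct_odds_change(beta, delta=1.0):
    """Percent change in odds for a `delta`-unit covariate change in a
    logistic regression: 100 * (exp(beta * delta) - 1)."""
    return 100.0 * (math.exp(beta * delta) - 1.0)
```

For instance, a coefficient of -ln(1.07) per unit BMI yields a 7% increase in odds for each one-unit *decrease* in BMI.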

  17. Association between choroidal thickness and anterior chamber segment in eyes with narrow or open-angle

    PubMed Central

    Li, Song-Feng; Wu, Ge-Wei; Chen, Chang-Xi; Shen, Ling; Zhang, Zhi-Bao; Gao, Fei; Wang, Ning-Li

    2016-01-01

    AIM To investigate the relationship between choroidal thickness and the anterior chamber segment in subjects with narrow-angle or open-angle eyes. METHODS The subfoveal choroidal thickness was measured with enhanced depth-imaging optical coherence tomography and anterior chamber parameters were measured with ultrasound biomicroscopy in one eye of 23 subjects with open-angle eyes and 38 subjects with narrow-angle eyes. The mean age was 59.52±7.04y for narrow-angle subjects and 60.76±7.23y for open-angle subjects (P=0.514). Multivariate linear regression analysis was performed to assess the association between choroidal thickness and narrow-angle parameters. RESULTS There were no differences in subfoveal choroidal thickness between open- and narrow-angle subjects (P=0.231). Anterior chamber parameters, including central anterior chamber depth, trabecular iris angle, iris thickness 500 µm from the scleral spur (IT500), and ciliary body thickness at 1 mm and 2 mm from the scleral spur (CBT1, CBT2) showed significant differences between the two groups (P<0.05). Subfoveal choroidal thickness showed negative correlation (β=-0.496, P=0.016) only with anterior chamber depth in the open-angle group and with age (β=-0.442, P=0.003) and IT500 (β=-0.399, P=0.008) in the narrow-angle group. However, subfoveal choroidal thickness was not correlated with trabecular iris angle, anterior chamber depth, ciliary body thickness, or central corneal thickness in the narrow-angle group. CONCLUSION Choroidal thickness does not differ between the two groups and is not correlated with anterior chamber parameters in narrow-angle subjects, suggesting a lack of relationship between choroidal thickness and primary angle-closure glaucoma. PMID:27588269

  18. The Effect of Vertical Camera Angle on the Perceived Credibility of a Televised Speaker.

    ERIC Educational Resources Information Center

    Avery, Robert K.; Long, John F.

    In order to test the effects of high camera angle on credibility, high- and low-camera-angle versions of a televised speaker were recorded on videotape. Semantic differential ratings of the speaker were obtained from 176 college students randomly assigned to high- or low-camera-angle treatment groups. Factors identified as dynamism, competence,…

  19. Improved wide-angle, fisheye and omnidirectional camera calibration

    NASA Astrophysics Data System (ADS)

    Urban, Steffen; Leitloff, Jens; Hinz, Stefan

    2015-10-01

    In this paper an improved method for calibrating wide-angle, fisheye and omnidirectional imaging systems is presented. We extend the calibration procedure proposed by Scaramuzza et al. by replacing the residual function and jointly refining all parameters. In doing so, we achieve a more stable, robust and accurate calibration (by up to a factor of 7) and can reduce the number of necessary calibration steps from five to three. After introducing the camera model and highlighting the differences from the current calibration procedure, we perform a comprehensive performance evaluation using several data sets and show the impact of the proposed calibration procedure on the calibration results.
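
The Scaramuzza-style model family that this procedure extends maps a pixel to a 3D viewing ray whose axial component is a polynomial in the pixel's radial distance. A minimal back-projection sketch under that assumption; the polynomial coefficients below are illustrative, not calibrated values.

```python
import numpy as np

def cam2world(u, v, poly):
    """Back-project a (centered) pixel to a unit viewing ray in a
    Scaramuzza-style omnidirectional model: ray = (u, v, f(rho)),
    where f is a polynomial in rho = sqrt(u^2 + v^2).
    `poly` lists coefficients low degree first."""
    rho = np.hypot(u, v)
    z = sum(a * rho**i for i, a in enumerate(poly))
    ray = np.array([u, v, z], dtype=float)
    return ray / np.linalg.norm(ray)
```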

  20. Narrow Field-of-View Visual Odometry Based on a Focused Plenoptic Camera

    NASA Astrophysics Data System (ADS)

    Zeller, N.; Quint, F.; Stilla, U.

    2015-03-01

    In this article we present a new method for visual odometry based on a focused plenoptic camera. This method fuses the depth data gained by a monocular Simultaneous Localization and Mapping (SLAM) algorithm with the depth data received from a focused plenoptic camera. Our algorithm uses the depth data and the totally focused images supplied by the plenoptic camera to run a real-time semi-dense direct SLAM algorithm. Based on this combined approach, the scale ambiguity of a monocular SLAM system can be overcome. Furthermore, the additional light-field information greatly improves the tracking capabilities of the algorithm. Thus, visual odometry is possible even for narrow field of view (FOV) cameras. We show that not only tracking profits from the additional light-field information: by accumulating the depth information over multiple tracked images, the depth accuracy of the focused plenoptic camera can also be greatly improved. This novel approach reduces the depth error by one order of magnitude compared to that of a single light-field image.
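
Fusing two depth hypotheses with associated uncertainties, as in the accumulation idea above, is commonly done by inverse-variance weighting. This is a generic sketch of that idea; the authors' actual weighting scheme may differ.

```python
def fuse_depth(d_slam, var_slam, d_plen, var_plen):
    """Inverse-variance fusion of two depth estimates (e.g. SLAM depth and
    plenoptic depth). Returns the fused depth and its reduced variance."""
    w1, w2 = 1.0 / var_slam, 1.0 / var_plen
    fused = (w1 * d_slam + w2 * d_plen) / (w1 + w2)
    fused_var = 1.0 / (w1 + w2)     # always smaller than either input variance
    return fused, fused_var
```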

  1. Improved iris localization by using wide and narrow field of view cameras for iris recognition

    NASA Astrophysics Data System (ADS)

    Kim, Yeong Gon; Shin, Kwang Yong; Park, Kang Ryoung

    2013-10-01

    Biometrics is a method of identifying individuals by their physiological or behavioral characteristics. Among other biometric identifiers, iris recognition has been widely used for various applications that require a high level of security. When a conventional iris recognition camera is used, the size and position of the iris region in a captured image vary according to the X, Y positions of a user's eye and the Z distance between a user and the camera. Therefore, the searching area of the iris detection algorithm is increased, which inevitably decreases both the detection speed and accuracy. To solve these problems, we propose a new method of iris localization that uses wide field of view (WFOV) and narrow field of view (NFOV) cameras. Our study is novel compared to previous studies in the following four ways. First, the device used in our research acquires three images, one each of the face and both irises, using one WFOV and two NFOV cameras simultaneously. The relation between the WFOV and NFOV cameras is determined by simple geometric transformation without complex calibration. Second, the Z distance (between a user's eye and the iris camera) is estimated based on the iris size in the WFOV image and anthropometric data of the size of the human iris. Third, the accuracy of the geometric transformation between the WFOV and NFOV cameras is enhanced by using multiple matrices of the transformation according to the Z distance. Fourth, the searching region for iris localization in the NFOV image is significantly reduced based on the detected iris region in the WFOV image and the matrix of geometric transformation corresponding to the estimated Z distance. Experimental results showed that the performance of the proposed iris localization method is better than that of conventional methods in terms of accuracy and processing time.
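
The Z-distance estimate in the second contribution follows from the pinhole model: the apparent iris size shrinks in proportion to distance. A minimal sketch; the 11.7 mm mean human iris diameter is a commonly cited anthropometric figure, and the focal length in the example is hypothetical rather than from the paper's device.

```python
def estimate_z_mm(iris_diameter_px, focal_length_px, iris_diameter_mm=11.7):
    """Pinhole-model distance estimate: Z = f * D_real / d_image,
    with focal length and apparent iris diameter both in pixels."""
    return focal_length_px * iris_diameter_mm / iris_diameter_px
```

For example, with an assumed 1000 px focal length, an iris imaged at 117 px across implies a Z distance of about 100 mm.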

  2. Spectral data of specular reflectance, narrow-angle transmittance and angle-resolved surface scattering of materials for solar concentrators

    PubMed Central

    Good, Philipp; Cooper, Thomas; Querci, Marco; Wiik, Nicolay; Ambrosetti, Gianluca; Steinfeld, Aldo

    2015-01-01

    The spectral specular reflectance of conventional and novel reflective materials for solar concentrators is measured with an acceptance angle of 17.5 mrad over the wavelength range 300−2500 nm at incidence angles 15–60° using a spectroscopic goniometry system. The same experimental setup is used to determine the spectral narrow-angle transmittance of semi-transparent materials for solar collector covers at incidence angles 0–60°. In addition, the angle-resolved surface scattering of reflective materials is recorded by an area-scan CCD detector over the spectral range 350–1050 nm. A comprehensive summary, discussion, and interpretation of the results are included in the associated research article “Spectral reflectance, transmittance, and angular scattering of materials for solar concentrators” in Solar Energy Materials and Solar Cells. PMID:26862556

  3. Characterizing Geometric Distortion of the Lunar Reconnaissance Orbiter Wide Angle Camera

    NASA Astrophysics Data System (ADS)

    Speyerer, E.; Wagner, R.; Robinson, M. S.; Becker, K. J.; Anderson, J.; Thomas, P. C.

    2011-12-01

    Each month the Lunar Reconnaissance Orbiter (LRO) Wide Angle Camera (WAC) provides 100 m scale images of nearly the entire Moon, with a different range of lighting conditions each month [1]. Pre-flight calibration efforts provided a baseline for correcting the geometric distortion present in the WAC. However, residual errors of 1-2 pixels existed with this original model. In-flight calibration enables the derivation of a precise correction for geometric distortion to provide sub-pixel map projection accuracy. For the in-flight calibration, we compared WAC images to high-resolution (0.5 - 2.0 meter scale) images provided by the Narrow Angle Camera (NAC). Since the NAC has a very narrow field of view (2.86°), its geometric accuracy is well characterized. The addition of the WAC-derived 100 m/pixel digital terrain model (GLD100) [2] and the refined ephemeris provided by LOLA [3] has improved our efforts to remove small distortion artifacts in the WAC camera model. Since the NAC field of view is always in the same cross-track location in the WAC frame, NAC and WAC images of the same regions, under similar lighting conditions, were map projected. Hundreds of NAC (truth image) and WAC images were then co-registered using an automatic registration algorithm in ISIS [4]. This output was fed into a second ISIS program (fplanemap) that converted the registration offsets to focal plane coordinates for the distorted (original) and undistorted (corrected, derived from the truth image) pixels [4]. With this dataset, offsets in the WAC distortion model were identified and accounted for with a new 2D Taylor series function added to the existing radial model. This technique improves the placement accuracy of each pixel across the sensor in target space. We have applied this correction to the 643 nm band and will derive the coefficients for the remaining bands. Once this study is complete, a new camera model, instrument kernel (IK), and frames kernel (FK) will be
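
    The correction scheme described above, an existing radial distortion model with an additive 2D Taylor-series term fitted from NAC registrations, can be sketched as follows; the function name, parameters, and coefficients are illustrative assumptions, not the LROC team's actual camera model:

```python
# Illustrative sketch: a classic radial distortion model with an
# additive 2D Taylor-series (polynomial) residual term layered on top,
# following the structure described in the abstract. All names and
# coefficients here are assumptions.

def undistort(x, y, k1, k2, taylor_cx, taylor_cy):
    """Map distorted focal-plane coordinates (x, y) to corrected ones.

    k1, k2     -- radial distortion coefficients
    taylor_cx  -- {(i, j): c} terms of the residual x-correction,
                  each contributing c * x**i * y**j
    taylor_cy  -- same for the y-correction
    """
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2   # radial model
    xu, yu = x * scale, y * scale
    # additive 2D Taylor-series residual fitted from NAC co-registrations
    dx = sum(c * xu**i * yu**j for (i, j), c in taylor_cx.items())
    dy = sum(c * xu**i * yu**j for (i, j), c in taylor_cy.items())
    return xu + dx, yu + dy

# With all coefficients zero the mapping is the identity:
print(undistort(1.0, 2.0, 0.0, 0.0, {}, {}))  # → (1.0, 2.0)
```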

  4. 11. 22'X34' original blueprint, Variable-Angle Launcher, 'SIDE VIEW CAMERA WOOD ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    11. 22'X34' original blueprint, Variable-Angle Launcher, 'SIDE VIEW CAMERA WOOD FRAME SUPERSTRUCTURE' drawn at 1/2'=1'-0'. (BUORD Sketch # 209125). - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  5. 12. 22'X34' original blueprint, Variable-Angle Launcher, 'GENERAL SIDE VIEW CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    12. 22'X34' original blueprint, Variable-Angle Launcher, 'GENERAL SIDE VIEW CAMERA STATIONS' drawn at 1/2'=1'-0'. (BUORD Sketch # 209111). - Variable Angle Launcher Complex, Camera Stations, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  6. 13. 22'X34' original vellum, Variable-Angle Launcher, 'SIDEVIEW CAMERA CAR TRACK ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    13. 22'X34' original vellum, Variable-Angle Launcher, 'SIDEVIEW CAMERA CAR TRACK DETAILS' drawn at 1/4'=1'-0' (BUORD Sketch # 208078, PAPW 908). - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  7. 10. 22'X34' original blueprint, Variable-Angle Launcher, 'SIDE VIEW CAMERA CAR-STEEL ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    10. 22'X34' original blueprint, Variable-Angle Launcher, 'SIDE VIEW CAMERA CAR-STEEL FRAME AND AXLES' drawn at 1/2'=1'-0'. (BUORD Sketch # 209124). - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  8. 12. 22'X34' original vellum, Variable-Angle Launcher, 'SIDE VIEW CAMERA TRACK ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    12. 22'X34' original vellum, Variable-Angle Launcher, 'SIDE VIEW CAMERA TRACK H-20 BRIDGE MODIFICATIONS' drawn at 3/16'=1'-0' and 1/2'=1'-0'. (BUORD Sketch # 208784, PAPW 907). - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  9. High-Precision Narrow Angle Astrometry with a Space-Borne Interferometer

    NASA Technical Reports Server (NTRS)

    Milman, Mark H.; Murphy, Dave

    2008-01-01

    This paper develops an observing and processing scheme for narrow angle astrometry using a single baseline interferometer without the aid of "grid" stars to characterize the interferometer baseline vector in inertial space. The basic concept derives from the recognition that over a narrow field the set of fundamental unknown instrument parameters that arise because the interferometer baseline vector has large uncertainties (since there are no grid star measurements) is indistinguishable from a particular set of unobservable errors in the determination of star positions within the field. Reference stars within the narrow field of regard are used to circumvent the unobservable modes. Feasibility of the approach is demonstrated through analysis and example simulations.

  10. The Wide Angle Camera for the Rosetta Mission

    NASA Astrophysics Data System (ADS)

    Debei, S.; Angrilli, F.; Barbieri, C.; Bianchini, G.; da Deppo, V.; de Cecco, M.; Fornasier, S.; Guizzo, G.; Naletto, G.; Ragazzoni, R.; Saggin, B.; Tondello, G.; Zaccariotto, M.; Brunello, F.; Peron, F.

    1999-12-01

    The Wide Angle Camera (WAC) for the Rosetta mission had to fulfil many scientific requirements: a field of view of 12x12 sq deg and a focal length of 140 mm, excellent optical throughput in the range 240 - 900 nm after 10 years in space, an Encircled Energy of 80%, a contrast ratio of 10E+4 in order to detect faint gaseous emission features around a bright nucleus, minimum exposure times of 10 msec with photometric accuracy better than 5x10^-2, scattered light rejection for sources out of the FoV (e.g. Sun) and in the FoV (e.g. cometary nucleus), a cover to close off the cometary dust, and an optical bench capable of passively maintaining the optical alignment and of supporting the shutter, the baffle, a double filter wheel and the Focal Plane Assembly. To these initial requirements several other constraints were added in the course of the design, in particular a very complex thermal profile, a massive shielding of the front FoV in order to protect the CCD from cosmic radiation, and a very strict total mass envelope. These requirements called for an unconventional optical design, with 2 aspherical mirrors in an off-axis configuration (the primary mirror being convex) and a carefully studied 3-stage baffle. Both the shutter and the front cover posed extremely challenging technological goals, in both their mechanical and electronic aspects. The paper describes all the main elements of the WAC. At present, the Structural Thermal Model (STM) has been delivered after successful completion of vibration and vacuum tests. The STM optical bench configuration has been slightly revised for the Flight Model in order to provide more attenuation in the internal baffle.

  11. Multi-frequency properties of a narrow angle tail radio galaxy J 0037+18

    NASA Astrophysics Data System (ADS)

    Patra, Dusmanta; Chakrabarti, Sandip Kumar; Pal, Sabyasachi; Konar, Chiranjib

    2016-07-01

    We will present multi-frequency properties of the narrow angle tailed radio galaxy J 0037+18 using data from the Giant Metrewave Radio Telescope (GMRT) and the Jansky Very Large Array (JVLA). The angle between the two lobes is only 38 degrees. We will discuss the magnetic field and the particle lifetime of the jet, as well as the spectral properties of the source. We also used optical and X-ray data to investigate the host environment.

  12. High-precision early mission narrow angle science with the Space Interferometry Mission

    NASA Technical Reports Server (NTRS)

    Shaklan, S.; Milman, M. H.; Pan, X.

    2002-01-01

    We have developed a technique that allows SIM to measure relative stellar positions with an accuracy of 1 micro-arcsecond at any time during its 5-yr mission. Unlike SIM's standard narrow-angle approach, Gridless Narrow Angle Astrometry (GNAA) does not rely on the global reference frame of grid stars that reaches full accuracy only after 5 years. GNAA is simply the application of traditional single-telescope narrow angle techniques to SIM's narrow angle optical path delay measurements. In GNAA, a set of reference stars and a target star are observed at several baseline orientations. A linearized model uses the delay measurements to solve for star positions and baseline orientations. A conformal transformation maps observations at different epochs to a common reference frame. The technique works on short-period signals (P = days to months), allowing it to be applied to many of the known extra-solar planets, intriguing radio/X-ray binaries, and other periodic sources. The technique's accuracy is limited in the long term by false acceleration due to a combination of reference star and target star proper motion. The science capability (1 micro-arcsecond astrometric precision) is unique to SIM.
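
    The linearized model at the heart of GNAA can be illustrated as an ordinary least-squares problem: each delay measurement is modeled as the projection of a star's small position offset onto the baseline, and the offsets are recovered from observations at several baseline orientations. The toy one-dimensional sketch below is an assumed formulation, not SIM flight software; for clarity it solves only the star offsets for known baselines, whereas the actual method solves star positions and baseline orientations jointly.

```python
# Toy least-squares illustration of a linearized narrow-angle model:
# delay residual d[k, i] = B_k * offset_i + noise, solved for the
# per-star offsets from several baseline orientations.
import numpy as np

rng = np.random.default_rng(0)
n_stars, n_orient = 5, 8

true_offsets = rng.normal(scale=1e-6, size=n_stars)   # radians (toy)
baselines = rng.normal(size=(n_orient, 1))            # 1-D toy baselines

# Design matrix: block k is B_k times the identity over stars.
A = np.kron(baselines, np.eye(n_stars))               # (n_orient*n_stars, n_stars)
d = A @ true_offsets + rng.normal(scale=1e-12, size=n_orient * n_stars)

est, *_ = np.linalg.lstsq(A, d, rcond=None)
print(np.allclose(est, true_offsets, atol=1e-9))      # → True
```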

  13. Ultra-narrow bandpass filters for infrared applications with improved angle of incidence performance

    NASA Astrophysics Data System (ADS)

    Rahmlow, Thomas D.; Fredell, Markus; Chanda, Sheetal; Johnson, Robert

    2016-05-01

    Narrow band-pass optical interference filters are used in a variety of applications to improve signal quality in laser-based systems. Applications include LIDAR, sensor processing, and free-space communications. A narrow-bandwidth optical filter allows passage of the laser signal while rejecting ambient light. The narrower the bandwidth, the better the signal-to-noise ratio. However, the bandwidth of a design for a particular application is typically limited by a number of factors, including spectral shift over the operational angles of incidence, thermal shift over the range of operating temperatures and, in the case of laser communication, rejection of adjacent laser channels. The trade-off of these parameters can significantly impact system design and performance. This paper presents design and material approaches to maximize the performance of narrow bandpass filters in the infrared.

  14. Sheath effects on current collection by particle detectors with narrow acceptance angles

    NASA Technical Reports Server (NTRS)

    Singh, N.; Baugher, C. R.

    1981-01-01

    Restriction of the aperture acceptance angle of an ion or electron trap on an attracting spacecraft significantly alters the volt-ampere characteristics of the instrument in a low-Mach-number plasma. It is shown that when the angular acceptance of the aperture is restricted, the current to the collector tends to be independent of the Debye length. Expressions for the RPA characteristics for both a thin sheath and a thick sheath are derived, and it is shown that as the aperture is narrowed the curves tend toward equivalence.

  15. On-Orbit Geometric Calibration of the Lunar Reconnaissance Orbiter Wide Angle Camera

    NASA Astrophysics Data System (ADS)

    Speyerer, E. J.; Wagner, R.; Robinson, M. S.

    2013-12-01

    Lunar Reconnaissance Orbiter (LRO) is equipped with a single Wide Angle Camera (WAC) [1] designed to collect monochromatic and multispectral observations of the lunar surface. Cartographically accurate image mosaics and stereo-image-based terrain models require that the position of each pixel in a given image be known relative to a corresponding point on the lunar surface with a high degree of accuracy and precision. The Lunar Reconnaissance Orbiter Camera (LROC) team initially characterized the WAC geometry prior to launch at the Malin Space Science Systems calibration facility. After lunar orbit insertion, the LROC team recognized spatially varying geometric offsets between color bands. These misregistrations made analysis of the color data problematic and showed that refinements to the pre-launch geometric analysis were necessary. The geometric parameters that define the WAC optical system were characterized from statistics gathered by co-registering over 84,000 image pairs. For each pair, we registered all five visible WAC bands to a precisely rectified Narrow Angle Camera (NAC) image (accuracy <15 m) [2] to compute key geometric parameters. In total, we registered 2,896 monochrome and 1,079 color WAC observations to nearly 34,000 NAC observations and collected over 13.7 million data points across the visible portion of the WAC CCD. Using the collected statistics, we refined the relative pointing (yaw, pitch, and roll), effective focal length, principal point coordinates, and radial distortion coefficients. This large dataset also revealed spatial offsets between bands after orthorectification due to chromatic aberrations in the optical system. As white light enters the optical system, the light bends by different amounts as a function of wavelength, causing a single incident ray to disperse into a spectral spread of color [3,4]. This lateral chromatic aberration effect, also known as 'chromatic difference in magnification' [5], introduces variation to the effective focal

  16. Simulating a dual beam combiner at SUSI for narrow-angle astrometry

    NASA Astrophysics Data System (ADS)

    Kok, Yitping; Maestro, Vicente; Ireland, Michael J.; Tuthill, Peter G.; Robertson, J. Gordon

    2013-08-01

    The Sydney University Stellar Interferometer (SUSI) has two beam combiners: the Precision Astronomical Visible Observations (PAVO) and the Microarcsecond University of Sydney Companion Astrometry (MUSCA). The primary beam combiner, PAVO, can be operated independently and is typically used to measure the properties of binary stars of less than 50 milliarcsec (mas) separation and the angular diameters of single stars. MUSCA, on the other hand, was recently installed and must be used in tandem with PAVO. It is dedicated to microarcsecond-precision narrow-angle astrometry of close binary stars. The performance evaluation and the development of the data reduction pipeline for the new setup were assisted by an in-house computer simulation tool developed for this and related purposes. This paper describes the framework of the simulation tool, simulations carried out to evaluate the performance of each beam combiner, and the expected astrometric precision of the dual beam combiner setup, both at SUSI and at possible future sites.

  17. Numerical Simulations of Narrow Angle Tailed Radio Sources : The Jones and Owen Model

    NASA Astrophysics Data System (ADS)

    Norman, Michael L.; Balsara, Dinshaw; O'Dea, Chris

    1994-05-01

    In previous work (Balsara and Norman 1992), we focused on simulating narrow angle tailed radio sources as fluid beams that are bent by the ram pressure of a cross-flowing ICM. This allowed a comparison with several observable attributes of NAT sources; in particular, we were able to make estimates of the bending rate, beam profile, radial evolution, and the formation and spacing of knots. The previous work suffered from the fact that the ISM of the host galaxy was not represented. In this work we remove that restriction by representing the host galaxy's ISM self-consistently using the prescription given in Balsara, Livio and O'Dea (1994). We then run jets through it and compare with observations.

  18. Narrow-angle tail radio sources and evidence for radial orbits in Abell clusters

    NASA Technical Reports Server (NTRS)

    O'Dea, Christopher P.; Owen, Frazer N.; Sarazin, Craig L.

    1986-01-01

    Published observational data on the tail orientations (TOs) of 60 narrow-angle-tail (NAT) radio sources in Abell clusters of galaxies are analyzed statistically using a maximum-likelihood approach. The results are presented in a table, and it is found that the observed TO distributions in the whole sample and in subsamples of morphologically regular NATs and NATs with pericentric distances d greater than 500 kpc are consistent with isotropic orbits, whereas the TOs for NATs with d less than 500 kpc are consistent with highly radial orbits. If radial orbits were observed near the centers of other types of cluster galaxies as well, it could be inferred that violent relaxation during cluster formation was incomplete, and that clusters form by spherical collapse and secondary infall, as proposed by Gunn (1977).

  19. Narrow Angle Wide Spectral Range Radiometer Design FEANICS/REEFS Radiometer Design Report

    NASA Technical Reports Server (NTRS)

    Camperchioli, William

    2005-01-01

    A critical measurement for the Radiative Enhancement Effects on Flame Spread (REEFS) microgravity combustion experiment is the net radiative flux emitted from the gases and from the solid fuel bed. These quantities are measured using a set of narrow angle, wide spectral range radiometers. The radiometers are required to have an angular field of view of 1.2 degrees and measure over the spectral range of 0.6 to 30 microns, which presents a challenging design effort. This report details the design of this radiometer system including field of view, radiometer response, radiometric calculations, temperature effects, error sources, baffling and amplifiers. This report presents some radiometer specific data but does not present any REEFS experiment data.

  20. In situ measurements of particle friction angles in steep, narrow channels

    NASA Astrophysics Data System (ADS)

    Prancevic, J.; Lamb, M. P.

    2013-12-01

    vary in a consistent manner with bed slope (φ = 51°, 67°, and 65°, respectively). At an individual site the degree of interlocking is the primary control on particle friction angle. However, the degree of interlocking was not higher in the steep (θ = 9.0°), narrow (W/D50 = 12.5) channel. This indicates that increased grain stability may not play a crucial role in increasing the threshold shear stresses required for sediment motion on very steep slopes.

  1. A two camera video imaging system with application to parafoil angle of attack measurements

    NASA Astrophysics Data System (ADS)

    Meyn, Larry A.; Bennett, Mark S.

    1991-01-01

    This paper describes the development of a two-camera, video imaging system for the determination of three-dimensional spatial coordinates from stereo images. This system successfully measured angle of attack at several span-wise locations for large-scale parafoils tested in the NASA Ames 80- by 120-Foot Wind Tunnel. Measurement uncertainty for angle of attack was less than 0.6 deg. The stereo ranging system was the primary source for angle of attack measurements since inclinometers sewn into the fabric ribs of the parafoils had unknown angle offsets acquired during installation. This paper includes discussions of the basic theory and operation of the stereo ranging system, system measurement uncertainty, experimental set-up, calibration results, and test results. Planned improvements and enhancements to the system are also discussed.
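
    The stereo ranging underlying the system above can be sketched under idealized assumptions (rectified, parallel pinhole cameras); the actual wind-tunnel system relied on the full calibration described in the paper, and all names and numbers below are illustrative:

```python
# Idealized sketch of two-camera (stereo) ranging and a derived
# angle-of-attack measurement. Assumes rectified, parallel pinhole
# cameras; all names and values are illustrative assumptions.
import math

def triangulate_depth(disparity_px, focal_px, baseline_m):
    """Depth from disparity for a rectified stereo pair: Z = f * B / d."""
    return focal_px * baseline_m / disparity_px

def angle_of_attack_deg(leading, trailing):
    """Angle between the chord line (leading -> trailing edge, in the
    x-z plane) and the x (flow) axis, in degrees."""
    dx = trailing[0] - leading[0]
    dz = trailing[2] - leading[2]
    return math.degrees(math.atan2(dz, dx))

# A 10 px disparity seen by 1000 px focal-length cameras 0.5 m apart
# corresponds to a point 50 m away:
print(triangulate_depth(10.0, 1000.0, 0.5))  # → 50.0
```

    With the 3-D coordinates of two points on a fabric rib recovered this way, the local angle of attack follows directly from the chord direction.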

  2. The first demonstration of the concept of "narrow-FOV Si/CdTe semiconductor Compton camera"

    NASA Astrophysics Data System (ADS)

    Ichinohe, Yuto; Uchida, Yuusuke; Watanabe, Shin; Edahiro, Ikumi; Hayashi, Katsuhiro; Kawano, Takafumi; Ohno, Masanori; Ohta, Masayuki; Takeda, Shin'ichiro; Fukazawa, Yasushi; Katsuragawa, Miho; Nakazawa, Kazuhiro; Odaka, Hirokazu; Tajima, Hiroyasu; Takahashi, Hiromitsu; Takahashi, Tadayuki; Yuasa, Takayuki

    2016-01-01

    The Soft Gamma-ray Detector (SGD), to be deployed on board the ASTRO-H satellite, has been developed to provide the highest sensitivity observations of celestial sources in the energy band of 60-600 keV by employing a detector concept which uses a Compton camera whose field-of-view is restricted by a BGO shield to a few degrees (narrow-FOV Compton camera). In this concept, the background from outside the FOV can be heavily suppressed by constraining the incident direction of the gamma ray reconstructed by the Compton camera to be consistent with the narrow FOV. We, for the first time, demonstrate the validity of the concept using background data taken on the ground during the thermal vacuum test and the low-temperature environment test of the flight model of SGD. We show that the measured background level is suppressed to less than 10% by combining event rejection using the anti-coincidence trigger of the active BGO shield with Compton event reconstruction techniques. More than 75% of the signals from the field-of-view are retained after background rejection, which clearly demonstrates the improvement of the signal-to-noise ratio. The estimated effective area of 22.8 cm^2 meets the mission requirement even though not all of the operational parameters of the instrument have been fully optimized yet.

  3. Narrow Angle Diversity using ACTS Ka-band Signal with Two USAT Ground Stations

    NASA Technical Reports Server (NTRS)

    Kalu, A.; Emrich, C.; Ventre, J.; Wilson, W.; Acosta, R.

    1998-01-01

    Two ultra small aperture terminal (USAT) ground stations, separated by 1.2 km in a narrow angle diversity configuration, received a continuous Ka-band tone sent from Cleveland Link Evaluation Terminal (LET). The signal was transmitted to the USAT ground stations via NASA's Advanced Communications Technology Satellite (ACTS) steerable beam. Received signal power at the two sites was measured and analyzed. A dedicated datalogger at each site recorded time-of-tip data from tipping bucket rain gauges, providing rain amount and instantaneous rain rate. WSR-88D data was also obtained for the collection period. Eleven events with ground-to-satellite slant-path precipitation and resultant signal attenuation were observed during the data collection period. Fade magnitude and duration were compared at the two sites and diversity gain was calculated. These results exceeded standard diversity gain model predictions by several decibels. Rain statistics from tipping bucket data and from radar data were also compared to signal attenuation. The nature of Florida's subtropical rainfall, specifically its impact on signal attenuation at the sites, was addressed.
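
    Diversity gain in site-diversity experiments of this kind quantifies the reduction in fade depth obtained by always using the less-attenuated of the two sites. A minimal sketch of one common per-sample formulation (an assumed definition for illustration, not necessarily the authors' exact method):

```python
# Sketch of a per-sample diversity-gain computation (assumed
# formulation): the gain is the single-site attenuation minus the
# attenuation of the joint (best-of-two) signal.

def diversity_gain(atten_site1_db, atten_site2_db):
    """Per-sample diversity gain in dB, relative to site 1."""
    joint = [min(a, b) for a, b in zip(atten_site1_db, atten_site2_db)]
    return [s - j for s, j in zip(atten_site1_db, joint)]

# During a fade affecting site 1 only, the gain equals the excess
# attenuation at site 1:
print(diversity_gain([0.0, 8.0, 12.0], [0.0, 1.0, 2.0]))  # → [0.0, 7.0, 10.0]
```

    In practice the gain is usually read off at a fixed exceedance probability from the single-site and joint attenuation statistics rather than per sample.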

  4. Synthesizing wide-angle and arbitrary view-point images from a circular camera array

    NASA Astrophysics Data System (ADS)

    Fukushima, Norishige; Yendo, Tomohiro; Fujii, Toshiaki; Tanimoto, Masayuki

    2006-02-01

    We propose a technique of Image-Based Rendering (IBR) using a circular camera array. By recording the scene from cameras that surround it, we can synthesize more dynamic arbitrary-viewpoint images as well as wide-angle, panorama-like images. The method is based on Ray-Space, an image-based rendering representation similar to Light Field. Ray-Space describes each ray by the position (x, y) and direction (θ, φ) with which it crosses a reference plane. When the cameras are arranged on a circle, the trajectory of a scene point, which forms a straight line in an Epipolar Plane Image (EPI) for a linear camera arrangement, becomes a sinusoidal curve. Although this is a very clear description, determining which pixel of which camera to use during rendering becomes complicated. We therefore redescribe the space by camera position (s, t) and pixel position (u, v), as in Light Field, expressing the camera position in polar coordinates (r, θ) to keep the description close to Ray-Space. The trajectory of a point then becomes a complicated periodic function with period 2π, but rendering becomes easy to handle. From such a space, just as with a linear arrangement, arbitrary-viewpoint images are synthesized purely from the geometric relationship between the cameras. Moreover, taking advantage of the property that the rays converge on a single point of the circle, we propose a technique for generating wide-angle, panorama-like images: since rays of all directions at the same position are recorded and overlapped, such synthesis becomes possible. The above applies when the cameras are densely arranged and the plenoptic sampling condition is satisfied; we then describe the discrete case, in which the sampling condition is not met. When arranging cameras in a straight line and compounding a picture, in spite of

  5. Optical design of the wide angle camera for the Rosetta mission.

    PubMed

    Naletto, Giampiero; Da Deppo, Vania; Pelizzo, Maria Guglielmina; Ragazzoni, Roberto; Marchetti, Enrico

    2002-03-01

    The final optical design of the Wide Angle Camera for the Rosetta mission to the P/Wirtanen comet is described. This camera is an F/5.6 telescope with a rather large 12 degrees x 12 degrees field of view. To satisfy the scientific requirements for spatial resolution, contrast capability, and spectral coverage, a two-mirror, off-axis, and unobstructed optical design, believed to be novel, has been adopted. This configuration has been simulated with a ray-tracing code, showing that theoretically more than 80% of the collimated beam energy falls within a single pixel (20" x 20") over the whole camera field of view and that the possible contrast ratio is smaller than 1/1000. Moreover, this novel optical design is rather simple from a mechanical point of view and is compact and relatively easy to align. All these characteristics make this type of camera rather flexible and also suitable for other space missions with similar performance requirements. PMID:11900025

  6. Visible–infrared achromatic imaging by wavefront coding with wide-angle automobile camera

    NASA Astrophysics Data System (ADS)

    Ohta, Mitsuhiko; Sakita, Koichi; Shimano, Takeshi; Sugiyama, Takashi; Shibasaki, Susumu

    2016-09-01

    We perform an achromatic imaging experiment with wavefront coding (WFC) using a wide-angle automobile lens. Our original annular phase mask for WFC was inserted into the lens, for which the difference between the focal positions at 400 nm and at 950 nm is 0.10 mm. We acquired images of objects using a WFC camera with this lens under visible and infrared light. As a result, the removal of chromatic aberration by the WFC system was successfully confirmed. Moreover, we fabricated a demonstration set assuming the use of a night vision camera in an automobile and showed the effect of the WFC system.

  7. Mesosphere light scattering depolarization during the Perseids activity epoch by wide-angle polarization camera measurements

    NASA Astrophysics Data System (ADS)

    Ugolnikov, Oleg S.; Maslov, Igor A.

    2014-03-01

    The paper describes a study of the scattered radiation field in the mesosphere based on wide-angle polarization camera (WAPC) measurements of the twilight sky background and a single-scattering separation procedure. Mid-August observations in 2012 and 2013 show a decrease in the single-scattering polarization value, probably related to moderation by Perseid meteor dust in the upper mesosphere. The effect correlates with the activity of the tiny fraction of the Perseids shower. Polarization and temperature analysis allows estimating the altitude of the dust layer and the characteristic polarization of dust scattering.

  8. Possible association of glaucoma with pectinate ligament dysplasia and narrowing of the iridocorneal angle in Shiba Inu dogs in Japan.

    PubMed

    Kato, Kumiko; Sasaki, Nobuo; Matsunaga, Satoru; Mochizuki, Manabu; Nishimura, Ryohei; Ogawa, Hiroyuki

    2006-01-01

    The purpose of the study was to describe the shape of the pectinate ligament (PL) and to estimate the width of the iridocorneal angle (ICA) in glaucomatous and non-glaucomatous eyes of Shiba Inu dogs in Japan. One hundred fourteen Shiba Inu dogs were presented to the Veterinary Medical Center at the University of Tokyo between June 1998 and June 2003. Among these, 46 dogs had glaucoma; the remaining 68 dogs were presented for routine vaccinations or heartworm testing and were used as a control population. Complete ophthalmic examination and gonioscopy were performed in all the dogs. The PL and ICA were evaluated by gonioscopy and goniophotographs. Of the 46 dogs with glaucoma, 17 (37%) were affected bilaterally and 29 (63%) were affected unilaterally. Of the 29 dogs with unilateral glaucoma, the ICA of the normotensive eye was slightly narrow in 2 eyes (7%), narrow in 12 eyes (41%), and closed in 15 eyes (52%). Among the normotensive control group, the ICA was open in 13 dogs (19%), slightly narrow in 29 (43%), narrow in 22 (32%), and closed in 4 (6%). In dogs with narrow and slightly narrow ICAs, the PLs were thickened and, in some cases, formed a solid sheet. The majority of dogs with glaucoma had changes in both the ICA and PL. A narrow or slightly narrow ICA was also detected in the majority of normotensive control dogs. This study suggests that ICA narrowing and PL thickening are common abnormalities in Shiba Inu dogs in Japan and may predispose those dogs to glaucoma. PMID:16497230

  9. A New Approach to Micro-arcsecond Astrometry with SIM Allowing Early Mission Narrow Angle Measurements of Compelling Astronomical Targets

    NASA Technical Reports Server (NTRS)

    Shaklan, Stuart; Pan, Xiaopei

    2004-01-01

    The Space Interferometry Mission (SIM) is capable of detecting and measuring the mass of terrestrial planets around stars other than our own. It can measure the mass of black holes and the visual orbits of radio and x-ray binary sources. SIM makes possible a new level of understanding of complex astrophysical processes. SIM achieves its high precision in the so-called narrow-angle regime. This is defined by a 1 degree diameter field in which the position of a target star is measured with respect to a set of reference stars. The observation is performed in two parts: first, SIM observes a grid of stars that spans the full sky. After a few years, repeated observations of the grid allow one to determine the orientation of the interferometer baseline. Second, throughout the mission, SIM periodically observes in the narrow-angle mode. Every narrow-angle observation is linked to the grid to determine the precise attitude and length of the baseline. The narrow angle process demands patience. It is not until five years after launch that SIM achieves its ultimate accuracy of 1 microarcsecond. The accuracy is degraded by a factor of approx. 2 at mid-mission. Our work proposes a technique for narrow angle astrometry that does not rely on the measurement of grid stars. This technique, called Gridless Narrow Angle Astrometry (GNAA) can obtain microarcsecond accuracy and can detect extra-solar planets and other exciting objects with a few days of observation. It can be applied as early as during the first six months of in-orbit calibration (IOC). The motivations for doing this are strong. First, and obviously, it is an insurance policy against a catastrophic mid-mission failure. Second, at the start of the mission, with several space-based interferometers in the planning or implementation phase, NASA will be eager to capture the public's imagination with interferometric science. Third, early results and a technique that can duplicate those results throughout the mission will

  10. The faster the narrower: characteristic bulk velocities and jet opening angles of gamma-ray bursts

    NASA Astrophysics Data System (ADS)

    Ghirlanda, G.; Ghisellini, G.; Salvaterra, R.; Nava, L.; Burlon, D.; Tagliaferri, G.; Campana, S.; D'Avanzo, P.; Melandri, A.

    2013-01-01

The jet opening angle θjet and the bulk Lorentz factor Γ0 are crucial parameters for the computation of the energetics of gamma-ray bursts (GRBs). From the ˜30 GRBs with measured θjet or Γ0 it is known that (i) the true energy Eγ, obtained by correcting the isotropic-equivalent energy Eiso for the collimation factor ˜θjet², is clustered around 10^50-10^51 erg and is correlated with the peak energy Ep of the prompt emission and (ii) the comoving-frame E'p and E'γ are clustered around typical values. Current estimates of Γ0 and θjet are based on incomplete data samples and their observed distributions could be subject to biases. Through a population synthesis code we investigate whether different assumed intrinsic distributions of Γ0 and θjet can reproduce a set of observational constraints. Assuming that all bursts have the same E'p and E'γ in the comoving frame, we find that Γ0 and θjet cannot be distributed as single power laws. The best agreement between our simulation and the available data is obtained assuming (a) log-normal distributions for θjet and Γ0 and (b) an intrinsic relation between the peak values of their distributions, i.e. θjet^2.5 Γ0 = const. On average, larger values of Γ0 (i.e. the `faster' bursts) correspond to smaller values of θjet (i.e. the `narrower'). We predict that ˜6 per cent of the bursts that point to us should not show any jet break in their afterglow light curve since they have sin θjet < 1/Γ0. Finally, we estimate that the local rate of GRBs is ˜0.3 per cent of all local Type Ib/c supernovae (SNIb/c) and ˜4.3 per cent of local hypernovae, i.e. SNIb/c with broad lines.
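The collimation correction mentioned above is a one-line calculation. A minimal sketch (the beaming factor (1 − cos θjet) is standard; the example burst values are illustrative):

```python
import math

def collimation_corrected_energy(e_iso_erg, theta_jet_deg):
    """True jet energy from the isotropic-equivalent energy.

    E_gamma = (1 - cos(theta_jet)) * E_iso, which reduces to
    (theta_jet**2 / 2) * E_iso for small opening angles.
    """
    theta = math.radians(theta_jet_deg)
    return (1.0 - math.cos(theta)) * e_iso_erg

# An illustrative burst: E_iso ~ 1e53 erg collimated into a 5-degree jet
# lands in the 1e50-1e51 erg cluster quoted in the abstract.
e_gamma = collimation_corrected_energy(1e53, 5.0)
```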

  11. Evaluation of the Quality of Action Cameras with Wide-Angle Lenses in Uav Photogrammetry

    NASA Astrophysics Data System (ADS)

    Hastedt, H.; Ekkel, T.; Luhmann, T.

    2016-06-01

Light-weight cameras are required in UAV photogrammetry due to payload restrictions. In general, consumer cameras with a normal lens type are applied to UAV systems. The availability of action cameras like the GoPro Hero4 Black, which include a wide-angle (fish-eye) lens, offers new perspectives in UAV projects. In these investigations, different calibration procedures for fish-eye lenses are evaluated in order to quantify their accuracy potential in UAV photogrammetry. The GoPro Hero4 is evaluated using different acquisition modes, and it is investigated to what extent the standard calibration approaches in OpenCV or Agisoft PhotoScan/Lens can be applied to evaluation processes in UAV photogrammetry. Therefore, different calibration setups and processing procedures are assessed and discussed. Additionally, a pre-correction of the initial distortion by GoPro Studio and its application for photogrammetric purposes is evaluated. An experimental setup with a set of control points and a prospective flight scenario is chosen to evaluate the processing results using Agisoft PhotoScan. It is then analysed to what extent a pre-calibration and pre-correction of a GoPro Hero4 reinforces the reliability and accuracy of a flight scenario.
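The OpenCV calibration approach referenced above is built on the equidistant fisheye model, in which the distorted angle is a polynomial in the incidence angle: theta_d = theta·(1 + k1·theta² + k2·theta⁴ + k3·theta⁶ + k4·theta⁸). A minimal forward-projection sketch of that model (focal length, principal point, and coefficient values below are placeholders, not GoPro calibration results):

```python
import math

def fisheye_project(X, Y, Z, f, cx, cy, k=(0.0, 0.0, 0.0, 0.0)):
    """Project a camera-frame 3D point with the equidistant fisheye model
    used by OpenCV's cv::fisheye module:
        theta_d = theta * (1 + k1*t^2 + k2*t^4 + k3*t^6 + k4*t^8)
    where theta is the angle between the ray and the optical axis."""
    r = math.hypot(X, Y)
    theta = math.atan2(r, Z)
    t2 = theta * theta
    theta_d = theta * (1 + k[0]*t2 + k[1]*t2**2 + k[2]*t2**3 + k[3]*t2**4)
    scale = theta_d / r if r > 0 else 0.0
    return cx + f * scale * X, cy + f * scale * Y

# With zero distortion coefficients a 45-degree ray lands f*(pi/4) pixels
# from the (hypothetical) principal point.
u, v = fisheye_project(1.0, 0.0, 1.0, f=100.0, cx=320.0, cy=240.0)
```

Calibration then amounts to estimating f, (cx, cy), and the k coefficients from control-point observations, which is what OpenCV's fisheye calibration routine does internally.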

  12. Detection of microcalcification clusters by 2D-mammography and narrow and wide angle digital breast tomosynthesis

    NASA Astrophysics Data System (ADS)

    Hadjipanteli, Andria; Elangovan, Premkumar; Looney, Padraig T.; Mackenzie, Alistair; Wells, Kevin; Dance, David R.; Young, Kenneth C.

    2016-03-01

    The aim of this study was to compare the detection of microcalcification clusters by human observers in breast images using 2D-mammography and narrow (15°/15 projections) and wide (50°/25 projections) angle digital breast tomosynthesis (DBT). Simulated microcalcification clusters with a range of microcalcification diameters (125 μm-275 μm) were inserted into 6 cm thick simulated compressed breasts. Breast images were produced with and without inserted microcalcification clusters using a set of image modelling tools, which were developed to represent clinical imaging by mammography and tomosynthesis. Commercially available software was used for image processing and image reconstruction. The images were then used in a series of 4-alternative forced choice (4AFC) human observer experiments conducted for signal detection with the microcalcification clusters as targets. The minimum detectable calcification diameter was found for each imaging modality: (i) 2D-mammography: 164+/-5 μm (ii) narrow angle DBT: 210+/-5 μm, (iii) wide angle DBT: 255+/-4 μm. A statistically significant difference was found between the minimum detectable calcification diameters that can be detected by the three imaging modalities. Furthermore, it was found that there was not a statistically significant difference between the results of the five observers that participated in this study. In conclusion, this study presents a method that quantifies the threshold diameter required for microcalcification detection, using high resolution, realistic images with observers, for the comparison of DBT geometries with 2D-mammography. 2Dmammography can visualise smaller detail diameter than both DBT imaging modalities and narrow-angle DBT can visualise a smaller detail diameter than wide-angle DBT.

  13. Empirical Photometric Normalization for the Seven Band UV-VIS Lunar Reconnaissance Orbiter Wide Angle Camera

    NASA Astrophysics Data System (ADS)

    Boyd, A. K.; Robinson, M. S.; Nuno, R. G.; Sato, H.

    2014-12-01

We present results on a near-global (80°S to 80°N) seven color Wide Angle Camera (WAC) photometric normalization and color analysis. Over 100,000 WAC color observations were calibrated to reflectance (radiance factor: IoF), and photometric angles (i,e,g), latitude, and longitude were calculated and stored for each WAC pixel. Photometric angles were calculated using the WAC GLD100 [1], and a six-dimensional data set (3 spatial and 3 photometric) was reduced to three by photometrically normalizing the IoF with a global wavelength-dependent, 3rd-order multivariate polynomial. The multispectral mosaic was normalized to a standard viewing geometry (incidence angle=30°, emission angle=0°, phase angle=30°). The WAC has a 60° cross-track field-of-view in color mode, which allows the acquisition of a near-global data set each month; however, the phase angle can change by as much as 60° across each image. These large changes in viewing geometry present challenges to the required photometric normalization. In the ratio of the 321 nm and 689 nm wavelengths, the Moon has a standard deviation less than 3% in the highlands and 7% globally; thus, to allow confident identification of true color differences, the photometric normalization must be precise. Pyroclastic deposits in Marius Hills, Sinus Aestuum, and Mare Serenitatis are among the least reflective materials, with 643 nm normalized reflectance values less than 0.036. Low-reflectance deposits are generally concentrated close to the equator on the nearside, whereas high-reflectance materials are dispersed globally. The highest reflectance materials occur at Giordano Bruno and Virtanen craters and are attributed to exposure of immature materials. Immature ejecta has a shallower spectral slope compared to the mean highlands spectrum (321 nm to 689 nm), and UV weathering characteristics can be seen when comparing Copernican ejecta of different ages [2]. Copernican ejecta is found to have 643 nm reflectance values greater than 0
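The normalization step above ratios each observation to a fitted photometric model evaluated at the standard geometry. As a much-simplified sketch (the abstract's model is a 3rd-order multivariate polynomial in all three photometric angles; here, as an assumption for brevity, only a 1-D phase-angle dependence is fitted, and the observation values are synthetic):

```python
import numpy as np

# Hypothetical calibrated observations: phase angle (deg) and radiance factor
g = np.array([10.0, 20.0, 30.0, 45.0, 60.0, 75.0, 90.0])
iof = 0.10 * np.exp(-0.008 * g)  # synthetic phase darkening curve

# Fit a 3rd-order polynomial phase curve, then ratio each observation to
# the model evaluated at the standard geometry (phase angle = 30 deg).
coeffs = np.polyfit(g, iof, 3)
iof_norm = iof * np.polyval(coeffs, 30.0) / np.polyval(coeffs, g)
```

After normalization all observations of an unchanging surface collapse onto the reflectance they would have shown at the standard geometry, which is what makes month-to-month color mosaics comparable.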

  14. Small-angle approximation to the transfer of narrow laser beams in anisotropic scattering media

    NASA Technical Reports Server (NTRS)

    Box, M. A.; Deepak, A.

    1981-01-01

The broadening of, and the signal power detected from, a laser beam traversing an anisotropic scattering medium were examined using the small-angle approximation to the radiative transfer equation, in which photons suffering large-angle deflections are neglected. To obtain tractable answers, simple Gaussian and non-Gaussian forms for the scattering phase function are assumed. Two other approximate approaches employed in the field to further simplify the small-angle approximation solutions are described, and the results obtained by one of them are compared with those obtained using the small-angle approximation. An exact method for obtaining the contribution of each higher order of scattering to the radiance field is examined, but no results are presented.
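The key property of the small-angle regime can be illustrated with a toy Monte Carlo (this is an illustrative sketch of the physics, not the paper's analytical method; the Gaussian per-event deflection width theta0 is an assumed parameter): when every scattering event is a small forward deflection and large-angle deflections never occur, the angular variance of the beam grows linearly with the number of scatterings.

```python
import math
import random

def small_angle_spread(n_scatter, theta0, n_photons=20000, seed=1):
    """Toy Monte Carlo of small-angle multiple scattering.

    Each event adds an independent Gaussian deflection (rms theta0 per
    transverse axis); returns the per-axis angular variance after
    n_scatter events, averaged over n_photons.  Expect ~ n_scatter*theta0**2.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_photons):
        tx = ty = 0.0
        for _ in range(n_scatter):
            tx += rng.gauss(0.0, theta0)
            ty += rng.gauss(0.0, theta0)
        total += tx * tx + ty * ty
    return total / (2 * n_photons)

# After 10 scatterings of rms 10 mrad, the per-axis variance is ~1e-3 rad^2.
var_10 = small_angle_spread(10, 0.01)
```

This linear growth of angular variance with optical depth is exactly the beam-broadening behavior the Gaussian phase-function solutions of the small-angle approximation capture in closed form.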

  15. Narrowing of the emission angle in high-intensity Compton scattering

    NASA Astrophysics Data System (ADS)

    Harvey, C. N.; Gonoskov, A.; Marklund, M.; Wallin, E.

    2016-02-01

    We consider the emission spectrum of high-energy electrons in an intense laser field. At high intensities (a0˜200 ) we find that the QED theory predicts a narrower angular spread of emissions than the classical theory. This is due to the classical theory overestimating the energy loss of the particles, resulting in them becoming more susceptible to reflection in the laser pulse.

  16. Automatic screening of narrow anterior chamber angle and angle-closure glaucoma based on slit-lamp image analysis by using support vector machine.

    PubMed

    Theeraworn, C; Kongprawechnon, W; Kondo, T; Bunnun, P; Nishihara, A; Manassakorn, A

    2013-01-01

At present, Van Herick's method is a standard technique used to screen a Narrow Anterior Chamber Angle (NACA) and Angle-Closure Glaucoma (ACG). It can identify a patient who suffers from NACA and ACG by considering the width of the peripheral anterior chamber depth (PACD) and corneal thickness. However, the screening result of this method often varies among ophthalmologists. Therefore, an automatic screening of NACA and ACG based on slit-lamp image analysis using a Support Vector Machine (SVM) is proposed. The SVM automatically generates a classification model, which is used to classify each case as angle-closure likely or angle-closure unlikely, and improves the consistency of the screening result. To develop the classification model, the width of the PACD and the corneal thickness at many positions are measured and selected as features. A statistical analysis is also used in the PACD and corneal thickness estimation in order to reduce the error from reflection on the cornea. The generated models, evaluated by 5-fold cross-validation, give better results than classification by Van Herick's method. PMID:24111078
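The classification stage described above can be sketched with a minimal linear SVM. This is a stand-in, not the paper's pipeline: the solver below is a Pegasos-style sub-gradient method for the hinge loss, and the PACD-width/corneal-thickness feature values and labels are entirely hypothetical.

```python
import numpy as np

def train_linear_svm(X, y, lam=0.001, epochs=500, seed=0):
    """Minimal linear SVM (hinge loss + L2 penalty) trained with
    Pegasos-style sub-gradient steps.  X: (n, d) features, y in {-1, +1}."""
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    b = 0.0
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            t += 1
            eta = 1.0 / (lam * t)            # decaying step size
            if y[i] * (X[i] @ w + b) < 1.0:  # margin violation
                w = (1.0 - eta * lam) * w + eta * y[i] * X[i]
                b += eta * y[i]
            else:
                w = (1.0 - eta * lam) * w
    return w, b

# Hypothetical screening features: [PACD width (mm), corneal thickness (mm)]
X = np.array([[0.10, 0.52], [0.15, 0.55], [0.12, 0.50],
              [0.40, 0.54], [0.45, 0.53], [0.50, 0.56]])
y = np.array([1, 1, 1, -1, -1, -1])  # +1 = angle-closure likely
w, b = train_linear_svm(X, y)
pred = np.sign(X @ w + b)
```

In practice one would use a mature SVM library with kernel support and wrap this in the 5-fold cross-validation the abstract describes; the sketch only shows the decision function being learned from the two feature families.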

  17. Development of a large-angle pinhole gamma camera with depth-of-interaction capability for small animal imaging

    NASA Astrophysics Data System (ADS)

    Baek, C.-H.; An, S. J.; Kim, H.-I.; Choi, Y.; Chung, Y. H.

    2012-01-01

A large-angle gamma camera was developed for imaging small animal models used in medical and biological research. A simulation study shows that a large field of view (FOV) system provides higher sensitivity than a typical pinhole gamma camera by reducing the distance between the pinhole and the object. However, such a gamma camera suffers from degradation of the spatial resolution in the periphery due to parallax error from obliquely incident photons. We propose a new method to measure the depth of interaction (DOI) using three layers of monolithic scintillators to reduce the parallax error. The detector module consists of three layers of monolithic CsI(Tl) crystals with dimensions of 50.0 × 50.0 × 2.0 mm3, a Hamamatsu H8500 PSPMT and a large-angle pinhole collimator with an acceptance angle of 120°. The 3-dimensional event positions were determined by the maximum-likelihood position-estimation (MLPE) algorithm and a pre-generated look-up table (LUT). The spatial resolution (FWHM) for a Co-57 point-like source was measured at different source positions with the conventional method (Anger logic) and with DOI information. We showed that high sensitivity can be achieved without degradation of spatial resolution using a large-angle pinhole gamma camera: this system can be used as a small animal imaging tool.
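The MLPE-plus-LUT step above can be sketched compactly. This is an illustrative reduction, not the paper's implementation: the LUT here has only three hypothetical position/depth bins and three detector channels, and a Poisson log-likelihood is assumed for the measured signals.

```python
import numpy as np

def mlpe_lookup(measured, lut):
    """Maximum-likelihood position estimation against a pre-generated LUT.

    Each LUT row holds the expected light distribution over the detector
    channels for one (x, y, depth) bin; the event is assigned to the bin
    maximizing the Poisson log-likelihood of the measured signals."""
    expected = np.clip(lut, 1e-9, None)
    # Poisson log-likelihood up to a term constant across bins
    loglike = measured @ np.log(expected).T - expected.sum(axis=1)
    return int(np.argmax(loglike))

# Tiny hypothetical LUT: three bins, each lighting up one channel strongly.
lut = np.array([[10.0, 1.0, 1.0],
                [1.0, 10.0, 1.0],
                [1.0, 1.0, 10.0]])
measured = np.array([1.0, 9.0, 2.0])  # an event near bin 1
best_bin = mlpe_lookup(measured, lut)
```

In the real system the LUT spans a fine 3-D grid over the crystal stack, and the recovered depth index is what removes the parallax error for oblique rays.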

  18. A triple axis double crystal multiple reflection camera for ultra small angle X-ray scattering

    NASA Astrophysics Data System (ADS)

    Lambard, Jacques; Lesieur, Pierre; Zemb, Thomas

    1992-06-01

To extend the domain of small angle X-ray scattering requires multiple reflection crystals to collimate the beam. A double crystal, triple axis X-ray camera using multiple reflection channel-cut crystals is described. Procedures for measuring the desmeared scattering cross-section on an absolute scale are described, as well as measurements from several typical samples: fibrils of collagen, 0.3 μm diameter silica spheres, 0.16 μm diameter interacting latex spheres, porous lignite coal, liquid crystals in a surfactant-water system, and a colloidal crystal of 0.32 μm diameter silica spheres.

  19. Wide angle and narrow-band asymmetric absorption in visible and near-infrared regime through lossy Bragg stacks.

    PubMed

    Shu, Shiwei; Zhan, Yawen; Lee, Chris; Lu, Jian; Li, Yang Yang

    2016-01-01

Absorbers are important components in various optical devices. Here we report a novel type of asymmetric absorber in the visible and near-infrared spectrum which is based on lossy Bragg stacks. The lossy Bragg stacks can achieve near-perfect absorption at one side and high reflection at the other within the narrow bands (several nm) of resonance wavelengths, while displaying almost identical absorption/reflection responses for the rest of the spectrum. Meanwhile, this interesting wavelength-selective asymmetric absorption behavior persists for wide angles, does not depend on polarization, and can be ascribed to the lossy characteristics of the Bragg stacks. Moreover, interesting Fano resonance with easily tailorable peak profiles can be realized using the lossy Bragg stacks. PMID:27251768

  20. Wide angle and narrow-band asymmetric absorption in visible and near-infrared regime through lossy Bragg stacks

    NASA Astrophysics Data System (ADS)

    Shu, Shiwei; Zhan, Yawen; Lee, Chris; Lu, Jian; Li, Yang Yang

    2016-06-01

Absorbers are important components in various optical devices. Here we report a novel type of asymmetric absorber in the visible and near-infrared spectrum which is based on lossy Bragg stacks. The lossy Bragg stacks can achieve near-perfect absorption at one side and high reflection at the other within the narrow bands (several nm) of resonance wavelengths, while displaying almost identical absorption/reflection responses for the rest of the spectrum. Meanwhile, this interesting wavelength-selective asymmetric absorption behavior persists for wide angles, does not depend on polarization, and can be ascribed to the lossy characteristics of the Bragg stacks. Moreover, interesting Fano resonance with easily tailorable peak profiles can be realized using the lossy Bragg stacks.

  1. Wide angle and narrow-band asymmetric absorption in visible and near-infrared regime through lossy Bragg stacks

    PubMed Central

    Shu, Shiwei; Zhan, Yawen; Lee, Chris; Lu, Jian; Li, Yang Yang

    2016-01-01

Absorbers are important components in various optical devices. Here we report a novel type of asymmetric absorber in the visible and near-infrared spectrum which is based on lossy Bragg stacks. The lossy Bragg stacks can achieve near-perfect absorption at one side and high reflection at the other within the narrow bands (several nm) of resonance wavelengths, while displaying almost identical absorption/reflection responses for the rest of the spectrum. Meanwhile, this interesting wavelength-selective asymmetric absorption behavior persists for wide angles, does not depend on polarization, and can be ascribed to the lossy characteristics of the Bragg stacks. Moreover, interesting Fano resonance with easily tailorable peak profiles can be realized using the lossy Bragg stacks. PMID:27251768

  2. A New Lunar Atlas: Mapping the Moon with the Wide Angle Camera

    NASA Astrophysics Data System (ADS)

    Speyerer, E.; Robinson, M. S.; Boyd, A.; Sato, H.

    2012-12-01

The Lunar Reconnaissance Orbiter (LRO) spacecraft launched in June 2009 and began systematically mapping the lunar surface, providing a priceless dataset for the planetary science community and future mission planners. From 20 September 2009 to 11 December 2011, the spacecraft was in a nominal 50 km polar orbit, except for two month-long periods when a series of spacecraft maneuvers enabled low-altitude flyovers (as low as 22 km) of key exploration and scientifically interesting targets. One of the instruments, the Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) [1], captured nearly continuous synoptic views of the illuminated lunar surface. The WAC is a 7-band (321, 360, 415, 566, 604, 643, 689 nm) push-frame imager with a field of view of 60° in color mode and 90° in monochrome mode. This broad field of view enables the WAC to reimage nearly 50% (at the equator, where the orbit tracks are spaced furthest apart) of the terrain it imaged in the previous orbit. The visible bands of map-projected WAC images have a pixel scale of 100 m, while UV bands have a pixel scale of 400 m due to 4x4 pixel on-chip binning that increases signal-to-noise. The nearly circular polar orbit and short (two hour) orbital periods enable seamless mosaics of broad areas of the surface with uniform lighting and resolution. In March of 2011, the LROC team released the first version of the global monochrome (643nm) morphologic map [2], which was comprised of 15,000 WAC images collected over three periods. With the over 130,000 WAC images collected while the spacecraft was in the 50 km orbit, a new set of mosaics is being produced by the LROC Team and will be released to the Planetary Data System. These new maps include an updated morphologic map with an improved set of images (limiting illumination variations and gores due to off-nadir observation by other instruments) and a new photometric correction derived from the LROC WAC dataset.
In addition, a higher sun (lower
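The "nearly 50% overlap at the equator" figure above follows from simple swath geometry. A back-of-the-envelope sketch (the orbital period and lunar rotation values below are rough assumed numbers for illustration): the color-mode swath width is 2h·tan(FOV/2), while successive ground tracks at the equator are separated by the Moon's rotation during one orbit.

```python
import math

# Rough assumed numbers: 50 km orbit, 60-deg color-mode FOV, ~1.9 h
# orbital period, 27.32-day sidereal lunar rotation, 1737.4 km lunar radius.
altitude_km  = 50.0
fov_deg      = 60.0
period_hr    = 1.9
lunar_day_hr = 27.32 * 24.0
moon_circ_km = 2.0 * math.pi * 1737.4

swath_km   = 2.0 * altitude_km * math.tan(math.radians(fov_deg / 2.0))
spacing_km = moon_circ_km * period_hr / lunar_day_hr  # equatorial track spacing
overlap    = 1.0 - spacing_km / swath_km              # ~0.45, i.e. nearly 50%
```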

  3. Early direct-injection, low-temperature combustion of diesel fuel in an optical engine utilizing a 15-hole, dual-row, narrow-included-angle nozzle.

    SciTech Connect

    Gehrke, Christopher R.; Radovanovic, Michael S.; Milam, David M.; Martin, Glen C.; Mueller, Charles J.

    2008-04-01

Low-temperature combustion of diesel fuel was studied in a heavy-duty, single-cylinder optical engine employing a 15-hole, dual-row, narrow-included-angle nozzle (10 holes × 70° and 5 holes × 35°) with 103-μm-diameter orifices. This nozzle configuration provided the spray targeting necessary to contain the direct-injected diesel fuel within the piston bowl for injection timings as early as 70° before top dead center. Spray-visualization movies, acquired using a high-speed camera, show that impingement of liquid fuel on the piston surface can result when the in-cylinder temperature and density at the time of injection are sufficiently low. Seven single- and two-parameter sweeps around a 4.82-bar gross indicated mean effective pressure load point were performed to map the sensitivity of the combustion and emissions to variations in injection timing, injection pressure, equivalence ratio, simulated exhaust-gas recirculation, intake temperature, intake boost pressure, and load. High-speed movies of natural luminosity were acquired by viewing through a window in the cylinder wall and through a window in the piston to provide quasi-3D information about the combustion process. These movies revealed that advanced combustion phasing resulted in intense pool fires within the piston bowl, after the end of significant heat release. These pool fires are a result of fuel films created when the injected fuel impinged on the piston surface. The emissions results showed a strong correlation with pool-fire activity. Smoke and NOx emissions rose steadily as pool-fire intensity increased, whereas HC and CO showed a dramatic increase with near-zero pool-fire activity.

  4. A Wide-Angle Camera for the Mobile Asteroid Surface Scout (MASCOT) on Hayabusa-2

    NASA Astrophysics Data System (ADS)

    Schmitz, N.; Koncz, A.; Jaumann, R.; Hoffmann, H.; Jobs, D.; Kachlicki, J.; Michaelis, H.; Mottola, S.; Pforte, B.; Schroeder, S.; Terzer, R.; Trauthan, F.; Tschentscher, M.; Weisse, S.; Ho, T.-M.; Biele, J.; Ulamec, S.; Broll, B.; Kruselburger, A.; Perez-Prieto, L.

    2014-04-01

JAXA's Hayabusa-2 mission, an asteroid sample return mission, is scheduled for launch in December 2014, for a rendezvous with the C-type asteroid 1999 JU3 in 2018. MASCOT, the Mobile Asteroid Surface Scout [1], is a small lander, designed to deliver ground truth for the orbiter remote measurements, support the selection of sampling sites, and provide context for the returned samples. MASCOT's main objective is to investigate the landing site's geomorphology, the internal structure, texture and composition of the regolith (dust, soil and rocks), and the thermal, mechanical, and magnetic properties of the surface. MASCOT comprises a payload of four scientific instruments: camera, radiometer, magnetometer and hyper-spectral microscope. The camera (MASCOT CAM) was designed and built by DLR's Institute of Planetary Research, together with Airbus DS Germany.

  5. Miniature Wide-Angle Lens for Small-Pixel Electronic Camera

    NASA Technical Reports Server (NTRS)

    Mouroulils, Pantazis; Blazejewski, Edward

    2009-01-01

A proposed wide-angle lens would be especially well suited for an electronic camera in which the focal plane is occupied by an image sensor that has small pixels. The design of the lens is intended to satisfy requirements for compactness, high image quality, and reasonably low cost, while addressing issues peculiar to the operation of small-pixel image sensors. Hence, this design is expected to enable the development of a new generation of compact, high-performance electronic cameras. The lens example shown has a 60 degree field of view and a relative aperture (f-number) of 3.2. The main issues affecting the design are also discussed.

  6. Fovea-stereographic: a projection function for ultra-wide-angle cameras

    NASA Astrophysics Data System (ADS)

    Samy, Ahmed Mahmoud; Gao, Zhishan

    2015-04-01

A new ultra-wide-angle projection function called fovea-stereographic is described and characterized by the relative relationship between the radial distortion level and the object field-of-view (FOV) angle, creating a high-resolution wide foveal image and adequate peripheral information to be processed within a limited computational time. The paper also provides the design results of an innovative fast fovea-stereographic fisheye lens system with a 170-deg FOV that devotes more than 58.8% of the image to a high-resolution central foveal region (100 deg) and retains at least 15% more peripheral information than any other light projection. Our lens distortion curve, in addition to its modulation transfer function, produces a high-resolution projection for real-time tracking and image transmission applications.

  7. On an assessment of surface roughness estimates from lunar laser altimetry pulse-widths for the Moon from LOLA using LROC narrow-angle stereo DTMs.

    NASA Astrophysics Data System (ADS)

    Muller, Jan-Peter; Poole, William

    2013-04-01

Neumann et al. [1] proposed that laser altimetry pulse-widths could be employed to derive "within-footprint" surface roughness, as opposed to surface roughness estimated between laser altimetry pierce-points, such as the example for Mars [2] and more recently from the 4-pointed star-shaped LOLA (Lunar Reconnaissance Orbiter Laser Altimeter) onboard the NASA-LRO [3]. Since 2009, LOLA has been collecting extensive global laser altimetry data with a 5 m footprint and ~25 m between the 5 points in a star shape. In order to assess how accurately surface roughness (defined as simple RMS after slope correction) derived from LROC matches surface roughness derived from LOLA footprints, publicly released LROC-NA (LRO Camera Narrow Angle) 1 m Digital Terrain Models (DTMs) were employed to measure the surface roughness directly within each 5 m footprint. A set of 20 LROC-NA DTMs was examined. Initially, the match-up between the LOLA and LROC-NA orthorectified images (ORIs) is assessed visually to ensure that the co-registration is better than the LOLA footprint resolution. For each LOLA footprint, the pulse-width geolocation is then retrieved and used to "cookie-cut" the surface roughness and slopes derived from the LROC-NA DTMs. The investigation, which includes data from a variety of different landforms, shows little, if any, correlation between surface roughness estimated from DTMs and LOLA pulse-widths at sub-footprint scale. In fact, a perceptible correlation between LOLA and LROC-DTM estimates appears only at baselines of 40-60 m for surface roughness and 20 m for slopes. [1] Neumann et al. Mars Orbiter Laser Altimeter pulse width measurements and footprint-scale roughness. Geophysical Research Letters (2003) vol. 30 (11), paper 1561. DOI: 10.1029/2003GL017048 [2] Kreslavsky and Head. Kilometer-scale roughness of Mars: results from MOLA data analysis. J Geophys Res (2000) vol. 105 (E11) pp. 26695-26711. [3] Rosenburg et al.
Global surface slopes and roughness of the
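The "simple RMS after slope correction" used above amounts to fitting a plane to the DTM posts inside each footprint and taking the RMS of the residuals. A minimal sketch (the 3×3 grid and the 0.3 m bump are synthetic illustration data, not DTM values):

```python
import numpy as np

def footprint_roughness(x, y, z):
    """RMS roughness after slope correction: least-squares fit a plane
    z = a*x + b*y + c to the elevation posts within one footprint, then
    return the RMS of the residuals."""
    A = np.column_stack([x, y, np.ones_like(x)])
    coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
    residuals = z - A @ coeffs
    return float(np.sqrt(np.mean(residuals ** 2)))

# Synthetic footprint: a tilted plane has ~zero roughness; adding a
# 0.3 m bump at the center produces a nonzero residual RMS.
xx, yy = np.meshgrid(np.arange(3.0), np.arange(3.0))
x, y = xx.ravel(), yy.ravel()
z = 0.2 * x + 0.1 * y + 3.0
z_bumpy = z.copy()
z_bumpy[4] += 0.3
```

The slope correction is what keeps a smooth but tilted footprint from registering as rough, so only sub-footprint relief contributes to the statistic.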

  8. Matching the Best Viewing Angle in Depth Cameras for Biomass Estimation Based on Poplar Seedling Geometry

    PubMed Central

    Andújar, Dionisio; Fernández-Quintanilla, César; Dorado, José

    2015-01-01

In energy crops for biomass production, a proper plant structure is important to optimize wood yields. A precise crop characterization in early stages may contribute to the choice of proper cropping techniques. This study assesses the potential of the Microsoft Kinect for Windows v.1 sensor to determine the best viewing angle of the sensor to estimate the plant biomass based on poplar seedling geometry. Kinect Fusion algorithms were used to generate a 3D point cloud from the depth video stream. The sensor was mounted in different positions facing the tree in order to obtain depth (RGB-D) images from different angles. Individuals of two different ages, e.g., one month and one year old, were scanned. Four different viewing angles were compared: top view (0°), 45° downwards view, front view (90°) and ground upwards view (−45°). The ground-truth used to validate the sensor readings consisted of a destructive sampling in which the height, leaf area and biomass (dry weight basis) were measured in each individual plant. The depth image models agreed well with 45°, 90° and −45° measurements in one-year poplar trees. Good correlations (0.88 to 0.92) between dry biomass and the area measured with the Kinect were found. In addition, plant height was accurately estimated with a few centimeters error. The comparison between different viewing angles revealed that top views showed poorer results because the top leaves occluded the rest of the tree. However, the other views led to good results. Conversely, small poplars showed better correlations with actual parameters from the top view (0°). Therefore, although the Microsoft Kinect for Windows v.1 sensor provides good opportunities for biomass estimation, the viewing angle must be chosen taking into account the developmental stage of the crop and the desired parameters. The results of this study indicate that Kinect is a promising tool for a rapid canopy characterization, i.e., for estimating crop biomass

  9. Matching the best viewing angle in depth cameras for biomass estimation based on poplar seedling geometry.

    PubMed

    Andújar, Dionisio; Fernández-Quintanilla, César; Dorado, José

    2015-01-01

In energy crops for biomass production, a proper plant structure is important to optimize wood yields. A precise crop characterization in early stages may contribute to the choice of proper cropping techniques. This study assesses the potential of the Microsoft Kinect for Windows v.1 sensor to determine the best viewing angle of the sensor to estimate the plant biomass based on poplar seedling geometry. Kinect Fusion algorithms were used to generate a 3D point cloud from the depth video stream. The sensor was mounted in different positions facing the tree in order to obtain depth (RGB-D) images from different angles. Individuals of two different ages, e.g., one month and one year old, were scanned. Four different viewing angles were compared: top view (0°), 45° downwards view, front view (90°) and ground upwards view (-45°). The ground-truth used to validate the sensor readings consisted of a destructive sampling in which the height, leaf area and biomass (dry weight basis) were measured in each individual plant. The depth image models agreed well with 45°, 90° and -45° measurements in one-year poplar trees. Good correlations (0.88 to 0.92) between dry biomass and the area measured with the Kinect were found. In addition, plant height was accurately estimated with a few centimeters error. The comparison between different viewing angles revealed that top views showed poorer results because the top leaves occluded the rest of the tree. However, the other views led to good results. Conversely, small poplars showed better correlations with actual parameters from the top view (0°). Therefore, although the Microsoft Kinect for Windows v.1 sensor provides good opportunities for biomass estimation, the viewing angle must be chosen taking into account the developmental stage of the crop and the desired parameters. The results of this study indicate that Kinect is a promising tool for a rapid canopy characterization, i.e., for estimating crop biomass
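The validation step behind the 0.88-0.92 correlations above is a straightforward regression of destructively sampled biomass against the sensor-derived area. A minimal sketch (the area/biomass pairs below are made-up illustration values, not the study's data):

```python
import numpy as np

# Hypothetical validation table: Kinect-derived canopy area (m^2) vs
# destructively sampled dry biomass (g) for a handful of poplars.
area    = np.array([0.10, 0.18, 0.25, 0.33, 0.41, 0.52])
biomass = np.array([14.0, 25.0, 33.0, 47.0, 55.0, 71.0])

r = np.corrcoef(area, biomass)[0, 1]             # Pearson correlation
slope, intercept = np.polyfit(area, biomass, 1)  # calibration line
```

The fitted line then serves as a non-destructive biomass estimator for new scans, with r quantifying how much of the biomass variation the point-cloud area explains at a given viewing angle.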

  10. Lunar Reconnaissance Orbiter Camera (LROC) instrument overview

    USGS Publications Warehouse

    Robinson, M.S.; Brylow, S.M.; Tschimmel, M.; Humm, D.; Lawrence, S.J.; Thomas, P.C.; Denevi, B.W.; Bowman-Cisneros, E.; Zerr, J.; Ravine, M.A.; Caplinger, M.A.; Ghaemi, F.T.; Schaffner, J.A.; Malin, M.C.; Mahanti, P.; Bartels, A.; Anderson, J.; Tran, T.N.; Eliason, E.M.; McEwen, A.S.; Turtle, E.; Jolliff, B.L.; Hiesinger, H.

    2010-01-01

    The Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) and Narrow Angle Cameras (NACs) are on the NASA Lunar Reconnaissance Orbiter (LRO). The WAC is a 7-color push-frame camera (100 and 400 m/pixel visible and UV, respectively), while the two NACs are monochrome narrow-angle linescan imagers (0.5 m/pixel). The primary mission of LRO is to obtain measurements of the Moon that will enable future lunar human exploration. The overarching goals of the LROC investigation include landing site identification and certification, mapping of permanently polar shadowed and sunlit regions, meter-scale mapping of polar regions, global multispectral imaging, a global morphology base map, characterization of regolith properties, and determination of current impact hazards.

  11. HAWC+: A Detector, Polarimetry, and Narrow-Band Imaging Upgrade to SOFIA's Far-Infrared Facility Camera

    NASA Astrophysics Data System (ADS)

    Dowell, C. D.; Staguhn, J.; Harper, D. A.; Ames, T. J.; Benford, D. J.; Berthoud, M.; Chapman, N. L.; Chuss, D. T.; Dotson, J. L.; Irwin, K. D.; Jhabvala, C. A.; Kovacs, A.; Looney, L.; Novak, G.; Stacey, G. J.; Vaillancourt, J. E.; HAWC+ Science Collaboration

    2013-01-01

    HAWC, the High-resolution Airborne Widebandwidth Camera, is the facility far-infrared camera for SOFIA, providing continuum imaging from 50 to 250 microns wavelength. As a result of NASA selection as a SOFIA Second Generation Instruments upgrade investigation, HAWC will be upgraded with enhanced capability for addressing current problems in star formation and interstellar medium physics prior to commissioning in early 2015. We describe the capabilities of the upgraded HAWC+, as well as our initial science program. The mapping speed of HAWC is increased by a factor of 9, accomplished by using NASA/Goddard's Backshort-Under-Grid bolometer detectors in a 64x40 format. Two arrays are used in a dual-beam polarimeter format, and the full complement of 5120 transition-edge detectors is read using NIST SQUID multiplexers and U.B.C. Multi-Channel Electronics. A multi-band polarimeter is added to the HAWC opto-mechanical system, at the cryogenic pupil image, employing rotating quartz half-wave plates. Six new filters are added to HAWC+, bringing the full set to 53, 63, 89, 155, and 216 microns at R = 5 resolution and 52, 63, 88, 158, and 205 microns at R = 300 resolution. The latter filters are fixed-tuned to key fine-structure emission lines from [OIII], [OI], [CII], and [NII]. Polarimetry can be performed in any of the filter bands. The first-light science program with HAWC+ emphasizes polarimetry for the purpose of mapping magnetic fields in Galactic clouds. The strength and character of magnetic fields in molecular clouds before, during, and after the star formation phase are largely unknown, despite pioneering efforts on the KAO and ground-based telescopes. SOFIA and HAWC+ provide significant new capability: sensitivity to extended dust emission (to A_V ~ 1) which is unmatched, ~10 arcsec angular resolution combined with wide-field mapping which allows statistical estimates of magnetic field strength, and wavelength coverage spanning the peak of the far

  12. Study of Flow and Heat Transfer Characteristics of non-periodical attack angle in Narrow Rectangular Channel with Longitudinal Vortex generators

    NASA Astrophysics Data System (ADS)

    Wang, L.; Huang, J.

    2010-03-01

    Heat transfer enhancement by Longitudinal Vortices (LVs) is a technology of good efficiency and low flow resistance. LVs are produced by Longitudinal Vortex Generators (LVGs) mounted on the heated surface. With their relatively long influence distance and simple structure, LVGs can be used in narrow channels with flat surfaces. The narrow rectangular channel measures 600 mm (length) × 40 mm (width) × 3 mm (gap width), with single rectangular-block LVGs laid out on one heated plate. The LVG dimensions are as follows: height 1.8 mm, width 2.2 mm, length 14 mm, transverse distance 4 mm, and longitudinal distance 150 mm. The attack angle of the LVGs is very important for extending this technology to narrow rectangular channels with water as the working medium. In a previous study, the attack angle of periodically mounted LVGs was discussed and the optimal value was 44°. In this paper, the attack angles of the first and second LVGs are varied while the others are kept at 44°, and the flow and heat transfer characteristics of this non-periodic arrangement are studied. The results show that with the change of attack angle of the first and second LVGs, heat transfer in the water medium is still enhanced effectively. This conclusion should be extended to vapor-liquid two-phase working media. The results of this calculation method are compared with experimental results from a thermal infrared imager and a phase Doppler particle analyzer, and they agree reasonably well. FLUENT 6.2 is used for the simulations, and the three velocity components of the water flow are used to define the residual intensity ratio of the LVs.

  13. 2. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH CAMERA STATION ABOVE LOOKING WEST TAKEN FROM RESERVOIR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  14. 7. VAL CAMERA CAR, DETAIL OF 'FLARE' OR TRAJECTORY CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. VAL CAMERA CAR, DETAIL OF 'FLARE' OR TRAJECTORY CAMERA INSIDE CAMERA CAR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  15. 6. VAL CAMERA CAR, DETAIL OF COMMUNICATION EQUIPMENT INSIDE CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    6. VAL CAMERA CAR, DETAIL OF COMMUNICATION EQUIPMENT INSIDE CAMERA CAR WITH CAMERA MOUNT IN FOREGROUND. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  16. The rate and causes of lunar space weathering: Insights from Lunar Reconnaissance Orbiter Wide Angle Camera ultraviolet observations

    NASA Astrophysics Data System (ADS)

    Denevi, B. W.; Robinson, M. S.; Sato, H.; Hapke, B. W.; McEwen, A. S.; Hawke, B. R.

    2011-12-01

    Lunar Reconnaissance Orbiter Wide Angle Camera global ultraviolet and visible imaging provides a unique opportunity to examine the rate and causes of space weathering on the Moon. Silicates typically have a strong decrease in reflectance toward UV wavelengths (<~450 nm) due to strong bands at 250 nm and in the far UV. Metallic iron is relatively spectrally neutral, and laboratory spectra suggest that its addition to mature soils in the form of submicroscopic iron (also known as nanophase iron) flattens silicate spectra, significantly reducing spectral slope in the ultraviolet. Reflectance at ultraviolet wavelengths may be especially sensitive to the surface coatings that form due to exposure to space weathering because scattering from the surfaces of grains contributes a larger fraction to the reflectance spectrum at short wavelengths. We find that the UV slope (as measured by the 320/415 nm ratio) is a more sensitive measure of maturity than indexes based on visible and near-infrared wavelengths. Only the youngest features (less than ~100 Ma) retain a UV slope that is distinct from mature soils of the same composition. No craters >20 km have UV slopes that approach those observed in laboratory spectra of fresh lunar materials (powdered lunar rocks). While the 320/415 nm ratio increases by ~18% from powdered rocks to mature soils in laboratory samples, Giordano Bruno, the freshest large crater, only shows a 3% difference between fresh and mature materials. At the resolution of our UV data (400 m/pixel), we observe some small (<5 km) craters that show a ~14% difference in 320/415 nm ratio from their mature surroundings. UV observations show that Reiner Gamma has had significantly lower levels of space weathering than any of the Copernican craters we examined, and was the only region we found with a UV slope that approached laboratory values for fresh powdered rock samples. 
This is consistent with the hypothesis that its high albedo is due to magnetic shielding from
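
The maturity index described above is simply a band ratio; a minimal sketch with made-up reflectance values (not LROC WAC data) illustrates how flattening of the UV slope by space weathering raises the 320/415 nm ratio.

```python
# Sketch of the WAC-style maturity index: the ratio of reflectance at
# 320 nm to 415 nm, which increases (the UV slope flattens) as soils
# mature. All reflectance values below are invented for illustration.
fresh = {"r320": 0.040, "r415": 0.062}   # powdered-rock-like spectrum
mature = {"r320": 0.047, "r415": 0.062}  # flatter, space-weathered spectrum

ratio_fresh = fresh["r320"] / fresh["r415"]
ratio_mature = mature["r320"] / mature["r415"]

# Percent change of the UV index between fresh and mature material,
# analogous to the ~18% laboratory difference quoted in the abstract.
percent_change = 100.0 * (ratio_mature - ratio_fresh) / ratio_fresh
```

Applied pixel-by-pixel to co-registered 320 nm and 415 nm mosaics, the same ratio yields the maturity maps discussed above.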

  17. Use of Multiple-Angle Snow Camera (MASC) Observations as a Constraint on Radar-Based Retrievals of Snowfall Rate

    NASA Astrophysics Data System (ADS)

    Cooper, S.; Garrett, T. J.; Wood, N.; L'Ecuyer, T. S.

    2015-12-01

    We use a combination of Ka-band Zenith Radar (KaZR) and Multiple-Angle Snow Camera (MASC) observations at the ARM North Slope Alaska Climate Facility Site at Barrow to quantify snowfall. The optimal-estimation framework is used to combine information from the KaZR and MASC into a common retrieval scheme, where retrieved estimates of snowfall are compared to observations at a nearby NWS measurement site for evaluation. Modified from the operational CloudSat algorithm, the retrieval scheme returns estimates of the vertical profile of exponential PSD slope parameter with a constant number density. These values, in turn, can be used to calculate surface snowrate (liquid equivalent) given knowledge of snowflake microphysical properties and fallspeeds. We exploit scattering models for a variety of ice crystal shapes including aggregates developed specifically from observations of snowfall properties at high-latitudes, as well as more pristine crystal shapes involving sector plates, bullet rosettes, and hexagonal columns. As expected, initial retrievals suggest large differences (300% for some events) in estimated snowfall accumulations given the use of the different ice crystal assumptions. The complex problem of how we can more quantitatively link MASC snowflake images to specific radar scattering properties is an ongoing line of research. Here, however, we do quantify the use of MASC observations of fallspeed and PSD parameters as constraint on our optimal-estimation retrieval approach. In terms of fallspeed, we find differences in estimated snowfall of nearly 50% arising from the use of MASC observed fallspeeds relative to those derived from traditional fallspeed parameterizations. In terms of snowflake PSD, we find differences of nearly 25% arising from the use of MASC observed slope parameters relative to those derived from field campaign observations of high-altitude snow events. 
Of course, these different sources of error conspire to make the estimate of snowfall
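
The optimal-estimation framework mentioned above weights a prior state against an observation by their respective uncertainties. A minimal scalar sketch with a linear forward model makes the mechanics explicit; all numbers are illustrative, not the actual KaZR/MASC configuration.

```python
# Scalar optimal estimation with a linear forward model y = k*x + b.
x_a, s_a = 2.0, 0.5**2      # prior state (e.g., PSD slope) and its variance
k, b = 3.0, 1.0             # assumed forward-model Jacobian and offset
y, s_e = 8.2, 0.3**2        # observation (e.g., reflectivity) and its variance

# Posterior mean and variance for the linear Gaussian problem.
gain = s_a * k / (k**2 * s_a + s_e)          # Kalman-style gain
x_hat = x_a + gain * (y - (k * x_a + b))     # updated state estimate
s_hat = (1.0 - gain * k) * s_a               # reduced posterior variance
```

Tightening the MASC-derived prior (smaller s_a) pulls the retrieval toward the observed fallspeed and PSD constraints, which is how the camera observations enter the scheme.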

  18. MUSIC - Multifunctional stereo imaging camera system for wide angle and high resolution stereo and color observations on the Mars-94 mission

    NASA Astrophysics Data System (ADS)

    Oertel, D.; Jahn, H.; Sandau, R.; Walter, I.; Driescher, H.

    1990-10-01

    Objectives of the multifunctional stereo imaging camera (MUSIC) system to be deployed on the Soviet Mars-94 mission are outlined. A high-resolution stereo camera (HRSC) and wide-angle opto-electronic stereo scanner (WAOSS) are combined in terms of hardware, software, technology aspects, and solutions. Both HRSC and WAOSS are pushbroom instruments containing a single optical system and focal planes with several parallel CCD line sensors. Emphasis is placed on the MUSIC system's stereo capability, its design, mass memory, and data compression. A 1-Gbit memory is divided into two parts: 80 percent for HRSC and 20 percent for WAOSS, while the selected on-line compression strategy is based on macropixel coding and real-time transform coding.

  19. Why do I sometimes see bright speckles in an image of the Terrain product, particularly at the oblique camera angles?

    Atmospheric Science Data Center

    2014-12-08

    MISR Level 1B2 data products use various high data values to signify fill, and one of the fill values (16377) in the 14 MSBs of the scaled radiances signifies that this location on the SOM grid was obscured from the camera's view by...

  20. 3. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH THE VAL TO THE RIGHT, LOOKING NORTHEAST. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  1. 7. VAL CAMERA STATION, INTERIOR VIEW OF CAMERA MOUNT, COMMUNICATION ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. VAL CAMERA STATION, INTERIOR VIEW OF CAMERA MOUNT, COMMUNICATION EQUIPMENT AND STORAGE CABINET. - Variable Angle Launcher Complex, Camera Stations, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  2. 1. VARIABLEANGLE LAUNCHER CAMERA CAR, VIEW OF CAMERA CAR AND ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    1. VARIABLE-ANGLE LAUNCHER CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH CAMERA STATION ABOVE LOOKING NORTH TAKEN FROM RESERVOIR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  3. Large-Grazing-Angle, Multi-Image Kirkpatrick-Baez Microscope as the Front End to a High-Resolution Streak Camera for OMEGA

    SciTech Connect

    Gotchev, O.V.; Hayes, L.J.; Jaanimagi, P.A.; Knauer, J.P.; Marshall, F.J.; Meyerhofer, D. D.

    2003-11-25

    A new, high-resolution x-ray microscope with a large grazing angle has been developed, characterized, and fielded at the Laboratory for Laser Energetics. It increases the sensitivity and spatial resolution in planar direct-drive hydrodynamic stability experiments relevant to inertial confinement fusion (ICF) research. It has been designed to work as the optical front end of the PJX, a high-current, high-dynamic-range x-ray streak camera. Optical design optimization, results from numerical ray tracing, mirror-coating choice, and characterization have been described previously [O. V. Gotchev, et al., Rev. Sci. Instrum. 74, 2178 (2003)]. This work highlights the optics' unique mechanical design and flexibility and considers certain applications that benefit from it. Characterization of the microscope's resolution in terms of its modulation transfer function (MTF) over the field of view is shown. Recent results from hydrodynamic stability experiments, diagnosed with the optic and the PJX, are provided to confirm the microscope's advantages as a high-resolution, high-throughput x-ray optical front end for streaked imaging.

  4. 9. VIEW OF CAMERA STATIONS UNDER CONSTRUCTION INCLUDING CAMERA CAR ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    9. VIEW OF CAMERA STATIONS UNDER CONSTRUCTION INCLUDING CAMERA CAR ON RAILROAD TRACK AND FIXED CAMERA STATION 1400 (BUILDING NO. 42021) ABOVE, ADJACENT TO STATE HIGHWAY 39, LOOKING WEST, March 23, 1948. (Original photograph in possession of Dave Willis, San Diego, California.) - Variable Angle Launcher Complex, Camera Stations, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  5. The limits of narrow and wide-angle AVA inversions for high Vp/Vs ratios: An application to elastic seabed characterization

    NASA Astrophysics Data System (ADS)

    Aleardi, Mattia; Tognarelli, Andrea

    2016-08-01

    Since its introduction in the oil and gas industry, amplitude versus angle (AVA) inversion has become a standard tool in deep hydrocarbon exploration. However, with the intensification of offshore construction activity, applications of this method have been extended to evaluate the elastic properties of seabed sediments and of the shallowest part of the subsurface. These regions are often characterized by undercompacted sediments with very low S-wave velocities (Vs) and high P-wave velocity to S-wave velocity (Vp/Vs) ratios. However, the importance of the Vp/Vs ratio is usually underrated in AVA inversion. In this study, we analyse the limits of the AVA method in cases of high Vp/Vs ratios and the benefits introduced by wide-angle reflections in constraining the inversion results. A simplified seabed model that is characterized by a high Vp/Vs ratio is used to study the influence of the elastic and viscoelastic parameters on the P-wave reflection coefficients and to compute the error function of the AVA inversion. In addition, a synthetic AVA inversion is performed on this simplified model, which enables us to apply the sensitivity analysis tools to the inversion kernel. These theoretical analyses show that in the case of high Vp/Vs ratios, the Vs contrast at the reflecting interface plays a very minor role in determining the P-wave reflection coefficients and that the viscoelastic effects can be neglected when only pre-critical angles are considered in the inversion. In addition, wide-angle reflections are essential to reducing both the cross-talk between the inverted elastic parameters and the uncertainties in the Vp and density estimations, but they are not sufficient to better constrain the Vs estimation. As an application to field data, we derive the elastic properties of the seabed interface by applying AVA inversion to a 2D seismic dataset from a well-site survey acquisition. 
The limited water depth, the maximum available source-to-receiver offset, and the

  6. Wide Angle Movie

    NASA Technical Reports Server (NTRS)

    1999-01-01

    This brief movie illustrates the passage of the Moon through the Saturn-bound Cassini spacecraft's wide-angle camera field of view as the spacecraft passed by the Moon on the way to its closest approach with Earth on August 17, 1999. From beginning to end of the sequence, 25 wide-angle images (with a spatial image scale of about 14 miles (about 23 kilometers) per pixel) were taken over the course of 7 1/2 minutes through a series of narrow and broadband spectral filters and polarizers, ranging from the violet to the near-infrared regions of the spectrum, to calibrate the spectral response of the wide-angle camera. The exposure times range from 5 milliseconds to 1.5 seconds. Two of the exposures were smeared and have been discarded and replaced with nearby images to make a smooth movie sequence. All images were scaled so that the brightness of Crisium basin, the dark circular region in the upper right, is approximately the same in every image. The imaging data were processed and released by the Cassini Imaging Central Laboratory for Operations (CICLOPS) at the University of Arizona's Lunar and Planetary Laboratory, Tucson, AZ.

    Photo Credit: NASA/JPL/Cassini Imaging Team/University of Arizona

    Cassini, launched in 1997, is a joint mission of NASA, the European Space Agency and Italian Space Agency. The mission is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Space Science, Washington DC. JPL is a division of the California Institute of Technology, Pasadena, CA.

  7. RECON 6: A real-time, wide-angle, solid-state reconnaissance camera system for high-speed, low-altitude aircraft

    NASA Technical Reports Server (NTRS)

    Labinger, R. L.

    1976-01-01

    The maturity of self-scanned, solid-state, multielement photosensors makes the realization of "real time" reconnaissance photography viable and practical. A system built around these sensors which can be constructed to satisfy the requirements of the tactical reconnaissance scenario is described. The concept chosen is the push broom strip camera system -- RECON 6 -- which represents the least complex and most economical approach for an electronic camera capable of providing a high level of performance over a 140 deg wide, continuous swath at altitudes from 200 to 3,000 feet and at minimum loss in resolution at higher altitudes.

  8. Pre-flight and On-orbit Geometric Calibration of the Lunar Reconnaissance Orbiter Camera

    NASA Astrophysics Data System (ADS)

    Speyerer, E. J.; Wagner, R. V.; Robinson, M. S.; Licht, A.; Thomas, P. C.; Becker, K.; Anderson, J.; Brylow, S. M.; Humm, D. C.; Tschimmel, M.

    2016-04-01

    The Lunar Reconnaissance Orbiter Camera (LROC) consists of two imaging systems that provide multispectral and high resolution imaging of the lunar surface. The Wide Angle Camera (WAC) is a seven color push-frame imager with a 90° field of view in monochrome mode and 60° field of view in color mode. From the nominal 50 km polar orbit, the WAC acquires images with a nadir ground sampling distance of 75 m for each of the five visible bands and 384 m for the two ultraviolet bands. The Narrow Angle Camera (NAC) consists of two identical cameras capable of acquiring images with a ground sampling distance of 0.5 m from an altitude of 50 km. The LROC team geometrically calibrated each camera before launch at Malin Space Science Systems in San Diego, California and the resulting measurements enabled the generation of a detailed camera model for all three cameras. The cameras were mounted and subsequently launched on the Lunar Reconnaissance Orbiter (LRO) on 18 June 2009. Using a subset of the over 793,000 NAC and 207,000 WAC images of illuminated terrain collected between 30 June 2009 and 15 December 2013, we improved the interior and exterior orientation parameters for each camera, including the addition of a wavelength dependent radial distortion model for the multispectral WAC. These geometric refinements, along with refined ephemeris, enable seamless projections of NAC image pairs with a geodetic accuracy better than 20 meters and sub-pixel precision and accuracy when orthorectifying WAC images.
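
A wavelength-dependent radial distortion model of the kind mentioned for the WAC is typically a low-order polynomial in radial distance, with a separate coefficient set fit for each band. A minimal sketch follows; the coefficients are hypothetical, not the published LROC values.

```python
def apply_radial_distortion(x, y, k1, k2):
    """Map ideal (undistorted) normalized image coordinates to distorted
    ones with a two-term polynomial radial model. Per-band k1, k2 values
    are what makes the model "wavelength dependent"; the numbers used
    below are illustrative only."""
    r2 = x**2 + y**2
    factor = 1.0 + k1 * r2 + k2 * r2**2
    return x * factor, y * factor

# A redder band might carry slightly different coefficients than a bluer
# one; calibration fits each band's set from star or landmark residuals.
xd, yd = apply_radial_distortion(0.1, 0.2, k1=-0.05, k2=0.001)
```

Inverting the same polynomial (numerically, since it has no closed form) is the step needed when orthorectifying images onto a map grid.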

  9. Critical Heat Flux in Inclined Rectangular Narrow Gaps

    SciTech Connect

    Jeong J. Kim; Yong H. Kim; Seong J. Kim; Sang W. Noh; Kune Y. Suh; Joy L. Rempe; Fan-Bill Cheung; Sang B. Kim

    2004-06-01

    In light of the TMI-2 accident, in which the reactor vessel lower head survived the attack by molten core material, the in-vessel retention strategy was suggested to benefit from cooling the debris through a gap between the lower head and the core material. The GAMMA 1D (Gap Apparatus Mitigating Melt Attack One Dimensional) tests were conducted to investigate the critical heat flux (CHF) in narrow gaps with varying surface orientations. The CHF in an inclined gap, especially in the case of a downward-facing narrow gap, is dictated by bubble behavior because the departing bubbles are squeezed. The orientation angle affects the bubble layer and the escape of bubbles from the narrow gap. The test parameters include gap sizes of 1, 2, 5 and 10 mm with an open periphery, and orientation angles ranging from fully downward-facing (180°) to vertical (90°). The 15 × 35 mm copper test section was electrically heated by a thin film resistor on the back. The heater assembly was installed at the tip of a rotating arm in a heated water pool at atmospheric pressure. The bubble behavior was photographed using a high-speed camera through the Pyrex glass spacer. It was observed that the CHF decreased as the surface inclination angle increased and as the gap size decreased in most cases. However, opposite results were obtained at certain surface orientations and gap sizes. Transition angles, at which the CHF changed with a rapid slope, were also detected, consistent with the existing literature. A semi-empirical CHF correlation was developed for inclined narrow rectangular channels through dimensional analysis. The correlation provides best-estimate CHF values for realistically assessing the thermal margin to failure of the lower head during a severe accident involving relocation of the core material.

  10. HIGH SPEED CAMERA

    DOEpatents

    Rogers, B.T. Jr.; Davis, W.C.

    1957-12-17

    This patent relates to high speed cameras having resolution times of less than one-tenth of a microsecond, suitable for filming distinct sequences of a very fast event such as an explosion. The camera consists of a rotating mirror with reflecting surfaces on both sides, a narrow mirror acting as a slit in a focal plane shutter, various other mirror and lens systems, and an image recording surface. The combination of the rotating mirror and the slit mirror causes discrete, narrow, separate pictures to fall upon the film plane, thereby forming a moving image increment of the photographed event. Placing a reflecting surface on each side of the rotating mirror cancels the image velocity that one side of the rotating mirror would impart, making a camera with such a short resolution time possible.

  11. The DSLR Camera

    NASA Astrophysics Data System (ADS)

    Berkó, Ernő; Argyle, R. W.

    Cameras have developed significantly in the past decade; in particular, digital Single-Lens Reflex cameras (DSLRs) have appeared. As a consequence we can buy cameras of higher and higher pixel counts, and mass production has greatly reduced prices. The CMOS sensors used for imaging are increasingly sensitive, and the electronics in the cameras allow images to be taken with much less noise. The software background is developing in a similar way: intelligent programs are created for post-processing and other supplementary work. Nowadays we can find a digital camera in almost every household, and most of these cameras are DSLRs. These can be used very well for astronomical imaging, which is nicely demonstrated by the amount and quality of the spectacular astrophotos appearing in different publications. These examples also show how much post-processing software contributes to the standard of the pictures. To sum up, the DSLR camera serves as a cheap alternative to the CCD camera, with somewhat weaker technical characteristics. In the following, I will introduce how we can measure the main parameters (position angle and separation) of double stars, based on the methods, software and equipment I use. Others can easily apply these to their own circumstances.
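
Once the companion's pixel offset from the primary is measured on a calibrated frame, the two parameters named above reduce to polar coordinates. This helper is a hypothetical illustration of that conversion, not the author's actual software.

```python
import math

def pa_and_separation(dx, dy, plate_scale):
    """Compute position angle (degrees, measured from north through east)
    and separation (arcseconds) from pixel offsets of the companion
    relative to the primary. Assumes the frame has been rotated so north
    is +y and east is +x; plate_scale is in arcsec/pixel."""
    sep = math.hypot(dx, dy) * plate_scale
    # atan2(east, north) gives the angle from north toward east
    pa = math.degrees(math.atan2(dx, dy)) % 360.0
    return pa, sep

# Example: companion 30 px east and 40 px north at 0.4 arcsec/pixel.
pa, sep = pa_and_separation(dx=30.0, dy=40.0, plate_scale=0.4)
```

The plate scale and detector orientation themselves come from calibration frames of pairs with well-known parameters, which is the usual practice with DSLR double-star work.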

  12. 6. VAL CAMERA STATION, VIEW FROM INTERIOR OUT OF WINDOW ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    6. VAL CAMERA STATION, VIEW FROM INTERIOR OUT OF WINDOW OPENING TOWARD VAL FIRING RANGE LOOKING SOUTHEAST WITH CAMERA MOUNT IN FOREGROUND. - Variable Angle Launcher Complex, Camera Stations, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  13. 8. VAL CAMERA CAR, CLOSEUP VIEW OF 'FLARE' OR TRAJECTORY ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    8. VAL CAMERA CAR, CLOSE-UP VIEW OF 'FLARE' OR TRAJECTORY CAMERA ON SLIDING MOUNT. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  14. 84. VIEW FROM CAMERA TOWER LOOKING SOUTHWEST SHOWING VAL FIRING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    84. VIEW FROM CAMERA TOWER LOOKING SOUTHWEST SHOWING VAL FIRING RANGE WITH OVERHEAD CAMERA AND CABLES, Date unknown, circa 1949. - Variable Angle Launcher Complex, Variable Angle Launcher, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  15. LROC - Lunar Reconnaissance Orbiter Camera

    NASA Astrophysics Data System (ADS)

    Robinson, M. S.; Eliason, E.; Hiesinger, H.; Jolliff, B. L.; McEwen, A.; Malin, M. C.; Ravine, M. A.; Thomas, P. C.; Turtle, E. P.

    2009-12-01

    The Lunar Reconnaissance Orbiter (LRO) went into lunar orbit on 23 June 2009. The LRO Camera (LROC) acquired its first lunar images on June 30 and commenced full scale testing and commissioning on July 10. The LROC consists of two narrow-angle cameras (NACs) that provide 0.5 m scale panchromatic images over a combined 5 km swath, and a wide-angle camera (WAC) to provide images at a scale of 100 m per pixel in five visible wavelength bands (415, 566, 604, 643, and 689 nm) and 400 m per pixel in two ultraviolet bands (321 nm and 360 nm) from the nominal 50 km orbit. Early operations were designed to test the performance of the cameras under all nominal operating conditions and provided a baseline for future calibrations. Test sequences included off-nadir slews to image stars and the Earth, 90° yaw sequences to collect flat field calibration data, night imaging for background characterization, and systematic mapping to test performance. LRO initially was placed into a terminator orbit resulting in images acquired under low signal conditions. Over the next three months the incidence angle at the spacecraft’s equator crossing gradually decreased towards high noon, providing a range of illumination conditions. Several hundred south polar images were collected in support of impact site selection for the LCROSS mission; details can be seen in many of the shadows. Commissioning phase images not only proved the instruments’ overall performance was nominal, but also that many geologic features of the lunar surface are well preserved at the meter-scale. Of particular note is the variety of impact-induced morphologies preserved in a near pristine state in and around kilometer-scale and larger young Copernican age impact craters that include: abundant evidence of impact melt of a variety of rheological properties, including coherent flows with surface textures and planimetric properties reflecting supersolidus (e.g., liquid melt) emplacement, blocks delicately perched on

  16. 1. VARIABLEANGLE LAUNCHER (VAL) CONCRETE 'A' FRAME STRUCTURE SHOWING CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    1. VARIABLE-ANGLE LAUNCHER (VAL) CONCRETE 'A' FRAME STRUCTURE SHOWING CAMERA TOWER STRUCTURE LOOKING SOUTH AND ARCHED OPENING FOR ROADWAY. - Variable Angle Launcher Complex, Variable Angle Launcher, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  17. Space Camera

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Nikon's F3 35mm camera was specially modified for use by Space Shuttle astronauts. The modification work produced a spinoff lubricant. Because lubricants in space have a tendency to migrate within the camera, Nikon conducted extensive development to produce nonmigratory lubricants; variations of these lubricants are used in the commercial F3, giving it better performance than conventional lubricants. Another spinoff is the coreless motor which allows the F3 to shoot 140 rolls of film on one set of batteries.

  18. Camera Calibration using the Damped Bundle Adjustment Toolbox

    NASA Astrophysics Data System (ADS)

    Börlin, N.; Grussenmeyer, P.

    2014-05-01

    Camera calibration is one of the fundamental photogrammetric tasks. The standard procedure is to apply an iterative adjustment to measurements of known control points; the iterative adjustment needs initial values of the internal and external parameters. In this paper we investigate a procedure where only one parameter, the focal length, is given a specific initial value. The procedure is validated using the freely available Damped Bundle Adjustment Toolbox on five calibration data sets acquired with varying narrow- and wide-angle lenses. The results show that the Gauss-Newton-Armijo and Levenberg-Marquardt-Powell bundle adjustment methods implemented in the toolbox converge even if the initial values of the focal length are between 1/2 and 32 times the true focal length, and even if the parameters are highly correlated. Standard statistical analysis methods in the toolbox enable manual selection of the lens distortion parameters to estimate, something not available in other camera calibration toolboxes. Based on the convergence results, a standardised camera calibration procedure that does not require any information about the camera sensor or focal length is suggested. The toolbox source and the data sets used in this paper are available from the authors.
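
The damping that gives the toolbox its robustness to poor starting values can be illustrated on a toy one-parameter problem: estimating a focal-length-like scale f in the pinhole relation y = f tan(theta), starting 32x away from the truth. This sketch shows only the Levenberg-Marquardt damping logic, not the toolbox's actual bundle adjustment.

```python
import numpy as np

# Synthetic observations from a "true" focal length of 1000 (arbitrary units).
theta = np.array([0.05, 0.10, 0.15, 0.20])  # field angles (rad)
y_obs = 1000.0 * np.tan(theta)               # image heights

f = 32.0 * 1000.0   # deliberately bad initial value, as in the experiment
lam = 1e-3          # damping parameter
for _ in range(50):
    r = y_obs - f * np.tan(theta)            # residuals
    J = np.tan(theta)                        # Jacobian d(model)/df
    # Damped normal-equation step: (J^T J + lam) * step = J^T r
    step = (J @ r) / (J @ J + lam)
    f_new = f + step
    if np.sum((y_obs - f_new * np.tan(theta))**2) < np.sum(r**2):
        f, lam = f_new, lam * 0.5            # accept step, relax damping
    else:
        lam *= 10.0                          # reject step, increase damping
```

Increasing the damping shortens the step toward gradient descent when a trial step fails, which is what lets the iteration survive wildly wrong initial focal lengths.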

  19. 1. VARIABLEANGLE LAUNCHER CAMERA STATION 1400 (BUILDING NO. 42021), VIEW ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    1. VARIABLE-ANGLE LAUNCHER CAMERA STATION 1400 (BUILDING NO. 42021), VIEW OF EXTERIOR LOOKING NORTHEAST WITH CAMERA STATION IN 1100 (BUILDING NO. 42020) BACKGROUND. - Variable Angle Launcher Complex, Camera Stations, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  20. Infrared Camera

    NASA Technical Reports Server (NTRS)

    1997-01-01

    A sensitive infrared camera that observes the blazing plumes from Space Shuttle or expendable rocket lift-offs is capable of scanning for fires, monitoring the environment and providing medical imaging. The hand-held camera uses highly sensitive arrays of infrared photodetectors known as quantum-well infrared photodetectors (QWIPs). QWIPs were developed by the Jet Propulsion Laboratory's Center for Space Microelectronics Technology in partnership with Amber, a Raytheon company. In October 1996, QWIP detectors pointed out hot spots of the destructive fires speeding through Malibu, California. Night vision, early warning systems, navigation, flight control systems, weather monitoring, security and surveillance are among the duties for which the camera is suited. Medical applications are also expected.

  1. Faint Object Camera imaging and spectroscopy of NGC 4151

    NASA Technical Reports Server (NTRS)

    Boksenberg, A.; Catchpole, R. M.; Macchetto, F.; Albrecht, R.; Barbieri, C.; Blades, J. C.; Crane, P.; Deharveng, J. M.; Disney, M. J.; Jakobsen, P.

    1995-01-01

    We describe ultraviolet and optical imaging and spectroscopy within the central few arcseconds of the Seyfert galaxy NGC 4151, obtained with the Faint Object Camera on the Hubble Space Telescope. A narrowband image including [O III] lambda 5007 shows a bright nucleus centered on a complex biconical structure having an apparent opening angle of approximately 65 deg and axis at a position angle along 65 deg-245 deg; images in bands including Lyman-alpha and C IV lambda 1550 and in the optical continuum near 5500 A show only the bright nucleus. In an off-nuclear optical long-slit spectrum we find a high and a low radial velocity component within the narrow emission lines. We identify the low-velocity component with the bright, extended, knotty structure within the cones, and the high-velocity component with more confined diffuse emission. Also present are strong continuum emission and broad Balmer emission line components, which we attribute to the extended point spread function arising from the intense nuclear emission. Adopting the geometry proposed by Pedlar et al. (1993) to explain the observed misalignment of the radio jets and the main optical structure, we model an ionizing radiation bicone, originating within a galactic disk, with apex at the active nucleus and axis centered on the extended radio jets. We confirm that through density bounding the gross spatial structure of the emission line region can be reproduced with a wide opening angle that includes the line of sight, consistent with the presence of a simple opaque torus allowing a direct view of the nucleus. In particular, our modelling reproduces the observed decrease in position angle with distance from the nucleus, progressing initially from the direction of the extended radio jet, through our optical structure, and on to the extended narrow-line region. We explore the kinematics of the narrow-line low- and high-velocity components on the basis of our spectroscopy and adopted model structure.

  2. Development of broad-view camera unit for laparoscopic surgery.

    PubMed

    Kawahara, Tomohiro; Takaki, Takeshi; Ishii, Idaku; Okajima, Masazumi

    2009-01-01

    A disadvantage of laparoscopic surgery is the narrow operative field provided by the endoscope camera. This paper describes a newly developed broad-view camera unit for use with the Broad-View Camera System, which is capable of providing a wider view of the internal organs during laparoscopic surgery. The developed camera unit is composed of a miniature color CMOS camera, an indwelling needle, and an extra-thin connector. The specific design of the camera unit and the method for positioning it are shown. The performance of the camera unit has been confirmed through basic and animal experiments. PMID:19963983

  3. Electronic Still Camera

    NASA Technical Reports Server (NTRS)

    Holland, S. Douglas (Inventor)

    1992-01-01

    A handheld, programmable, digital camera is disclosed that supports a variety of sensors and has program control over the system components to provide versatility. The camera uses a high performance design which produces near film quality images from an electronic system. The optical system of the camera incorporates a conventional camera body that was slightly modified, thus permitting the use of conventional camera accessories, such as telephoto lenses, wide-angle lenses, auto-focusing circuitry, auto-exposure circuitry, flash units, and the like. An image sensor, such as a charge-coupled device (CCD), collects the photons that pass through the camera aperture when the shutter is opened, and produces an analog electrical signal indicative of the image. The analog image signal is read out of the CCD and is processed by preamplifier circuitry, a correlated double sampler, and a sample and hold circuit before it is converted to a digital signal. The analog-to-digital converter has an accuracy of eight bits to ensure accuracy during the conversion. Two types of data ports are included for two different data transfer needs. One data port comprises a general purpose industrial standard port and the other a high speed/high performance application specific port. The system uses removable hard disks as its permanent storage media. The hard disk receives the digital image signal from the memory buffer and correlates the image signal with other sensed parameters, such as longitudinal or other information. When the storage capacity of the hard disk has been filled, the disk can be replaced with a new disk.

  4. 3. VAL CAMERA STATION, DETAIL OF ROOF OVERHANG AND EXPOSED ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. VAL CAMERA STATION, DETAIL OF ROOF OVERHANG AND EXPOSED CONCRETE SURFACES. - Variable Angle Launcher Complex, Camera Stations, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  5. 5. VAL CAMERA CAR, DETAIL OF HOIST AT SIDE OF ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    5. VAL CAMERA CAR, DETAIL OF HOIST AT SIDE OF BRIDGE AND ENGINE CAR ON TRACKS, LOOKING NORTHEAST. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  6. Nikon Camera

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The Nikon FM compact has a simplification feature derived from cameras designed for easy yet accurate use in a weightless environment. The innovation is a plastic-cushioned advance lever that advances the film and simultaneously switches on a built-in light meter. With a turn of the lens aperture ring, a glowing signal in the viewfinder confirms correct exposure.

  7. CCD Camera

    DOEpatents

    Roth, Roger R.

    1983-01-01

    A CCD camera capable of observing a moving object which has varying intensities of radiation emanating therefrom and which may move at varying speeds is shown wherein there is substantially no overlapping of successive images and wherein the exposure times and scan times may be varied independently of each other.

  8. CCD Camera

    DOEpatents

    Roth, R.R.

    1983-08-02

    A CCD camera capable of observing a moving object which has varying intensities of radiation emanating therefrom and which may move at varying speeds is shown wherein there is substantially no overlapping of successive images and wherein the exposure times and scan times may be varied independently of each other. 7 figs.

  9. Changes in local energy spectra with SPECT rotation for two Anger cameras

    SciTech Connect

    Koral, K.F.; Luo, J.Q.; Ahmad, W.; Buchbinder, S.; Ficaro, E.

    1995-08-01

    The authors investigated the shift of local energy spectra with SPECT rotation for the GE 400 AT and the Picker Prism 3000 tomographs. A Co-57 flood source was taped to the parallel-beam collimator of the GE 400 AT; a Tc-99m line source was placed at the focus of the fan-beam collimator of one head of the Picker Prism. The count-based method, which employs a narrow window (about 4 keV) on the maximum slope of the photopeak, was used with both systems. Nonlinear polynomial spectral fitting was applied to x-y-E data acquisitions with the GE camera. The fitting yielded either shifts, or shifts and width changes. Results show (1) the shifts are pseudo-sinusoidal with angle and similar for different spatial locations, (2) the average of their absolute value is 0.71 keV and 0.13 keV for the GE and Picker cameras, respectively, (3) width changes for the GE camera are small and appear random, and (4) the calculated shifts from the count-based method for the central part of the GE camera are correlated with those from the spectral fitting method, and are 12% smaller. The conclusion is that energy shifts with angle may be present in many rotating cameras, although they may be smaller in newer models. It might be necessary to account for them in schemes designed for high-accuracy compensation of Compton-scattered gamma rays, although they could possibly be ignored for newer cameras.
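
The count-based method described above can be illustrated with a short sketch: with a narrow window placed on the steep slope of the photopeak, a small energy shift changes the window counts nearly monotonically, so the shift can be inferred from the counts. The Gaussian peak (a 122 keV Co-57 line with 10% energy resolution) and the window placement below are illustrative assumptions, not the paper's actual parameters.

```python
import numpy as np

def window_counts(shift_kev, center=122.0, fwhm=12.2, window=(126.0, 130.0)):
    """Counts collected in a narrow window on the falling slope of a Gaussian photopeak."""
    sigma = fwhm / 2.355                       # FWHM -> standard deviation
    e = np.linspace(80.0, 160.0, 8001)         # energy axis, 0.01 keV bins
    peak = np.exp(-0.5 * ((e - center - shift_kev) / sigma) ** 2)
    mask = (e >= window[0]) & (e <= window[1])
    return peak[mask].sum()

# Counts in a window above the peak rise when the spectrum shifts upward
counts = [window_counts(s) for s in (-0.7, 0.0, 0.7)]
```

Because the window sits on the slope rather than the peak, the count rate responds sensitively to sub-keV shifts of the kind (0.13-0.71 keV) reported in the abstract.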

  10. High-power, narrow-band, high-repetition-rate, 5.9 eV coherent light source using passive optical cavity for laser-based angle-resolved photoelectron spectroscopy.

    PubMed

    Omachi, J; Yoshioka, K; Kuwata-Gonokami, M

    2012-10-01

    We demonstrate a scheme for efficient generation of a 5.9 eV coherent light source with an average power of 23 mW, 0.34 meV linewidth, and 73 MHz repetition rate from a Ti:sapphire picosecond mode-locked laser with an output power of 1 W. Second-harmonic light is generated in a passive optical cavity by a BiB(3)O(6) crystal with a conversion efficiency as high as 67%. By focusing the second-harmonic light transmitted from the cavity into a β-BaB(2)O(4) crystal, we obtain fourth-harmonic light at 5.9 eV. This light source offers stable operation for at least a week. We discuss the suitability of the laser light source for high-resolution angle-resolved photoelectron spectroscopy by comparing it with other sources (synchrotron radiation facilities and gas discharge lamps). PMID:23188317

  11. Cassini Camera Contamination Anomaly: Experiences and Lessons Learned

    NASA Technical Reports Server (NTRS)

    Haemmerle, Vance R.; Gerhard, James H.

    2006-01-01

    We discuss the contamination 'Haze' anomaly for the Cassini Narrow Angle Camera (NAC), one of two optical telescopes that comprise the Imaging Science Subsystem (ISS). Cassini is a Saturn orbiter with a 4-year nominal mission. The incident occurred in 2001, five months after the Jupiter encounter during the Cruise phase and, ironically, at the resumption of planned maintenance decontamination cycles. The degraded optical performance was first identified by the Instrument Operations Team with the first ISS Saturn imaging six weeks later. A distinct haze of varying size from image to image marred the images of Saturn. A photometric star calibration of the Pleiades, 4 days after the incident, showed stars with halos. Analysis showed that while the halo's intensity was only 1 - 2% of the intensity of the central peak of a star, the halo contained 30 - 70% of its integrated flux. This condition would impact science return. In a review of our experiences, we examine the contamination control plan, discuss the analysis of the limited data available and describe the one-year campaign to remove the haze from the camera. After several long, conservative heating activities and interim analysis of their results, the contamination problem as measured by the camera's point spread function was essentially back to pre-anomaly size and at a point where continuing would have carried more risk. We stress the importance of flexibility in operations and instrument design, and the need for early in-flight instrument calibration and continual monitoring of instrument performance.

  12. Initial Results of 3D Topographic Mapping Using Lunar Reconnaissance Orbiter Camera (LROC) Stereo Imagery

    NASA Astrophysics Data System (ADS)

    Li, R.; Oberst, J.; McEwen, A. S.; Archinal, B. A.; Beyer, R. A.; Thomas, P. C.; Chen, Y.; Hwangbo, J.; Lawver, J. D.; Scholten, F.; Mattson, S. S.; Howington-Kraus, A. E.; Robinson, M. S.

    2009-12-01

    The Lunar Reconnaissance Orbiter (LRO), launched June 18, 2009, carries the Lunar Reconnaissance Orbiter Camera (LROC) as one of seven remote sensing instruments on board. The camera system is equipped with a Wide Angle Camera (WAC) and two Narrow Angle Cameras (NAC) for systematic lunar surface mapping and detailed site characterization for potential landing site selection and resource identification. The LROC WAC is a pushframe camera with five 14-line by 704-sample framelets for visible light bands and two 16-line by 512-sample (summed 4x to 4 by 128) UV bands. The WAC can also acquire monochrome images in a 14-line by 1024-sample format. At the nominal 50-km orbit the ground scale is 75 m/pixel for the visible bands and 383 m/pixel for the UV bands. Overlapping WAC images from adjacent orbits can be used to map topography at a scale of a few hundred meters. The two panchromatic NACs are pushbroom imaging sensors, each with a Cassegrain telescope of 700-mm focal length. The two NACs are aligned with a small overlap in the cross-track direction so that they cover a 5-km swath with a combined field-of-view (FOV) of 5.6°. At an altitude of 50 km, each NAC provides panchromatic images from its 5,000-pixel linear CCD at a ground scale of 0.5 m/pixel. Calibration of the cameras was performed by using precision collimator measurements to determine the camera principal points and radial lens distortion. The orientation of the two NACs is estimated by a boresight calibration using double and triple overlapping NAC images of the lunar surface. The calibration results are incorporated into a photogrammetric bundle adjustment (BA), which models the LROC camera imaging geometry, in order to refine the exterior orientation (EO) parameters initially retrieved from the SPICE kernels. Consequently, the improved EO parameters can significantly enhance the quality of topographic products derived from LROC NAC imagery. In addition, an analysis of the spacecraft
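
The NAC geometry quoted above can be checked with a few lines of arithmetic: a combined 5.6° field of view from a 50-km orbit should span roughly the stated 5-km swath, and a 0.5 m/pixel ground scale with a ~700-mm focal length implies a ~7 micron detector pixel pitch (the pitch is inferred here, not stated in the text).

```python
import math

altitude_m = 50_000.0      # nominal orbit altitude
fov_deg = 5.6              # combined NAC field of view
focal_length_m = 0.700     # ~700 mm Cassegrain focal length
ground_scale_m = 0.5       # stated ground scale per pixel

# Swath on the ground subtended by the full FOV
swath_km = 2 * altitude_m * math.tan(math.radians(fov_deg / 2)) / 1000.0

# Pixel pitch implied by similar triangles: pitch / f = ground_scale / altitude
pixel_pitch_um = ground_scale_m * focal_length_m / altitude_m * 1e6
```

The swath works out to about 4.9 km, consistent with the quoted 5-km combined swath from the 5,000-pixel CCDs at 0.5 m/pixel.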

  13. Calibration and Epipolar Geometry of Generic Heterogenous Camera Systems

    NASA Astrophysics Data System (ADS)

    Luber, A.; Rueß, D.; Manthey, K.; Reulke, R.

    2012-07-01

    The application of perspective camera systems in photogrammetry and computer vision is state of the art. In recent years, non-perspective and especially omnidirectional camera systems have been increasingly used in close-range photogrammetry tasks. In general, the perspective camera model, i.e., the pinhole model, cannot be applied to non-perspective camera systems. However, several camera models for different omnidirectional camera systems are proposed in the literature. Using different types of cameras in a heterogeneous camera system may lead to an advantageous combination. The advantages of different camera systems, e.g., field of view and resolution, result in a new enhanced camera system. If these different kinds of cameras can be modeled using a unified camera model, the total calibration process can be simplified. Sometimes it is not possible to give the specific camera model in advance; in these cases a generic approach is helpful. Furthermore, a simple stereo reconstruction becomes possible using, for example, a fisheye and a perspective camera. In this paper, camera models for perspective, wide-angle and omnidirectional camera systems are evaluated. The crucial initialization of the model's parameters is conducted using a generic method that is independent of the particular camera system. The accuracy of this generic camera calibration approach is evaluated by calibration of a dozen real camera systems. It will be shown that a unified method of modeling, parameter approximation and calibration of interior and exterior orientation can be applied to derive 3D object data.
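
The pinhole (perspective) model referred to above, in its minimal form, maps a 3D point (X, Y, Z) in camera coordinates to image coordinates (f*X/Z, f*Y/Z) by central projection. A tiny sketch, with an illustrative focal length and points (not from the paper):

```python
import numpy as np

def pinhole_project(points_xyz, f):
    """Central projection of camera-frame 3D points onto the image plane."""
    p = np.asarray(points_xyz, dtype=float)
    return f * p[:, :2] / p[:, 2:3]   # (f*X/Z, f*Y/Z) per point

uv = pinhole_project([[1.0, 2.0, 10.0], [0.0, 0.0, 5.0]], f=0.035)
```

Omnidirectional and fisheye systems break this model precisely because their mapping from ray angle to image radius is no longer the linear f*tan relation implicit here, which is why the paper evaluates more general models.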

  14. Calibration of the Lunar Reconnaissance Orbiter Camera

    NASA Astrophysics Data System (ADS)

    Tschimmel, M.; Robinson, M. S.; Humm, D. C.; Denevi, B. W.; Lawrence, S. J.; Brylow, S.; Ravine, M.; Ghaemi, T.

    2008-12-01

    The Lunar Reconnaissance Orbiter Camera (LROC) onboard the NASA Lunar Reconnaissance Orbiter (LRO) spacecraft consists of three cameras: the Wide-Angle Camera (WAC) and two identical Narrow Angle Cameras (NAC-L, NAC-R). The WAC is a push-frame imager with 5 visible-wavelength filters (415 to 680 nm) at a spatial resolution of 100 m/pixel and 2 UV filters (315 and 360 nm) with a resolution of 400 m/pixel. In addition to multicolor imaging, the WAC can operate in monochrome mode to provide a global large-incidence-angle basemap and a time-lapse movie of the illumination conditions at both poles. The WAC has a highly linear response, a read noise of 72 e- and a full well capacity of 47,200 e-. The signal-to-noise ratio in each band is 140 in the worst case. There are no out-of-band leaks and the spectral response of each filter is well characterized. Each NAC is a monochrome pushbroom scanner, providing images with a resolution of 50 cm/pixel from a 50-km orbit. A single NAC image has a swath width of 2.5 km and a length of up to 26 km. The NACs are mounted to acquire side-by-side imaging for a combined swath width of 5 km. The NAC is designed to fully characterize future human and robotic landing sites in terms of topography and hazard risks. The North and South poles will be mapped on a 1-meter scale poleward of 85.5° latitude. Stereo coverage can be provided by pointing the NACs off-nadir. The NACs are also highly linear. Read noise is 71 e- for NAC-L and 74 e- for NAC-R, and the full well capacity is 248,500 e- for NAC-L and 262,500 e- for NAC-R. The focal lengths are 699.6 mm for NAC-L and 701.6 mm for NAC-R; the system MTF is 28% for NAC-L and 26% for NAC-R. The signal-to-noise ratio is at least 46 (terminator scene) and can be higher than 200 (high-sun scene). Both NACs exhibit a straylight feature, which is caused by out-of-field sources and is of a magnitude of 1-3%. However, as this feature is well understood it can be greatly reduced during ground
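
A simple CCD signal-to-noise sketch using the NAC-L numbers quoted above (read noise 71 e-, full well 248,500 e-), assuming only Poisson shot noise plus read noise; dark current and quantization noise are neglected, and the faint signal level is an illustrative assumption rather than a measured terminator value.

```python
import math

def snr(signal_e, read_noise_e=71.0):
    """Shot-noise + read-noise SNR for a CCD signal, in electrons."""
    return signal_e / math.sqrt(signal_e + read_noise_e ** 2)

snr_faint = snr(2_500.0)     # faint, terminator-like signal level (illustrative)
snr_full = snr(248_500.0)    # at the NAC-L full well
```

At the full well the detector is shot-noise limited (SNR of several hundred), while at low signal the read-noise term dominates, which is consistent with the quoted range of ~46 (terminator) to >200 (high sun).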

  15. Pinhole Cameras: For Science, Art, and Fun!

    ERIC Educational Resources Information Center

    Button, Clare

    2007-01-01

    A pinhole camera is a camera without a lens. A tiny hole replaces the lens, and light is allowed to come in for a short amount of time by means of a hand-operated shutter. The pinhole allows only a very narrow beam of light to enter, which reduces confusion due to scattered light on the film. This results in an image that is focused, reversed, and…

  16. Anisotropic de Gennes Narrowing in Confined Fluids

    NASA Astrophysics Data System (ADS)

    Nygård, Kim; Buitenhuis, Johan; Kagias, Matias; Jefimovs, Konstantins; Zontone, Federico; Chushkin, Yuriy

    2016-04-01

    The collective diffusion of dense fluids in spatial confinement is studied by combining high-energy (21 keV) x-ray photon correlation spectroscopy and small-angle x-ray scattering from colloid-filled microfluidic channels. We find the structural relaxation in confinement to be slower compared to the bulk. The collective dynamics is wave vector dependent, akin to the de Gennes narrowing typically observed in bulk fluids. However, in stark contrast to the bulk, the structure factor and de Gennes narrowing in confinement are anisotropic. These experimental observations are essential in order to develop a microscopic theoretical description of collective diffusion of dense fluids in confined geometries.
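
The de Gennes narrowing invoked above is, in its simplest bulk form (neglecting hydrodynamic corrections), the statement that the collective relaxation rate is suppressed where the static structure factor S(q) peaks:

```latex
\Gamma(q) \simeq \frac{D_0\, q^2}{S(q)}, \qquad \tau(q) = \Gamma(q)^{-1} \propto \frac{S(q)}{q^2}
```

Relaxation is slowest at the wave vectors where S(q) is largest; the anisotropy reported in the abstract corresponds to S(q), and hence τ(q), depending on the direction of q relative to the confining walls.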

  17. 11. COMPLETED FIXED CAMERA STATION 1700 (BUILDING NO. 42022) LOOKING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    11. COMPLETED FIXED CAMERA STATION 1700 (BUILDING NO. 42022) LOOKING WEST SHOWING WINDOW OPENING FOR CAMERA, March 31, 1948. (Original photograph in possession of Dave Willis, San Diego, California.) - Variable Angle Launcher Complex, Camera Stations, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  18. Omnidirectional underwater camera design and calibration.

    PubMed

    Bosch, Josep; Gracias, Nuno; Ridao, Pere; Ribas, David

    2015-01-01

    This paper presents the development of an underwater omnidirectional multi-camera system (OMS) based on a commercially available six-camera system originally designed for land applications. A full calibration method is presented for the estimation of both the intrinsic and extrinsic parameters, which is able to cope with wide-angle lenses and non-overlapping cameras simultaneously. This method is valid for any OMS, in both land and water applications. For underwater use, a customized housing is required, which often leads to strong image distortion due to refraction among the different media. This phenomenon makes the basic pinhole camera model invalid for underwater cameras, especially when using wide-angle lenses, and requires the explicit modeling of the individual optical rays. To address this problem, a ray tracing approach has been adopted to create a field-of-view (FOV) simulator for underwater cameras. The simulator allows for the testing of different housing geometries and optics for the cameras to ensure complete hemisphere coverage in underwater operation. This paper describes the design and testing of a compact custom housing for a commercial off-the-shelf OMS camera (Ladybug 3) and presents the first results of its use. A proposed three-stage calibration process allows for the estimation of all of the relevant camera parameters. Experimental results are presented, which illustrate the performance of the calibration method and validate the approach. PMID:25774707

  19. Omnidirectional Underwater Camera Design and Calibration

    PubMed Central

    Bosch, Josep; Gracias, Nuno; Ridao, Pere; Ribas, David

    2015-01-01

    This paper presents the development of an underwater omnidirectional multi-camera system (OMS) based on a commercially available six-camera system originally designed for land applications. A full calibration method is presented for the estimation of both the intrinsic and extrinsic parameters, which is able to cope with wide-angle lenses and non-overlapping cameras simultaneously. This method is valid for any OMS, in both land and water applications. For underwater use, a customized housing is required, which often leads to strong image distortion due to refraction among the different media. This phenomenon makes the basic pinhole camera model invalid for underwater cameras, especially when using wide-angle lenses, and requires the explicit modeling of the individual optical rays. To address this problem, a ray tracing approach has been adopted to create a field-of-view (FOV) simulator for underwater cameras. The simulator allows for the testing of different housing geometries and optics for the cameras to ensure complete hemisphere coverage in underwater operation. This paper describes the design and testing of a compact custom housing for a commercial off-the-shelf OMS camera (Ladybug 3) and presents the first results of its use. A proposed three-stage calibration process allows for the estimation of all of the relevant camera parameters. Experimental results are presented, which illustrate the performance of the calibration method and validate the approach. PMID:25774707

  20. 5. VAL CAMERA STATION, VIEW FROM INTERIOR OUT OF WINDOW ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    5. VAL CAMERA STATION, VIEW FROM INTERIOR OUT OF WINDOW OPENING TOWARD VAL FIRING RANGE LOOKING EAST WITH VARIABLE ANGLE LAUNCHER IN BACKGROUND. - Variable Angle Launcher Complex, Camera Stations, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  1. New two-dimensional photon camera

    NASA Technical Reports Server (NTRS)

    Papaliolios, C.; Mertz, L.

    1982-01-01

    A photon-sensitive camera, applicable to speckle imaging of astronomical sources, high-resolution spectroscopy of faint galaxies in a crossed-dispersion spectrograph, or narrow-band direct imaging of galaxies, is presented. The camera is shown to supply 8-bit by 8-bit photon positions (256 x 256 pixels) for as many as 10^6 photons/sec with a maximum linear resolution of approximately 10 microns. The sequence of photon positions is recorded digitally with a VHS-format video tape recorder or formed into an immediate image via a microcomputer. The four basic elements of the camera are described in detail: a high-gain image intensifier with fast-decay output phosphor, a glass-prism optical-beam splitter, a set of Gray-coded masks, and a photomultiplier tube for each mask. The characteristics of the camera are compared to those of other photon cameras.
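
The camera above encodes each photon's position with Gray-coded masks, one bit per mask-and-photomultiplier pair. A minimal sketch of the binary/Gray conversions such a scheme relies on (8 bits per axis, matching the 256 x 256 pixel grid); the functions are generic Gray-code routines, not the instrument's actual readout logic:

```python
def binary_to_gray(n: int) -> int:
    """Standard reflected binary Gray code of n."""
    return n ^ (n >> 1)

def gray_to_binary(g: int) -> int:
    """Invert the Gray code by cumulative XOR of shifted copies."""
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

# Round-trip every 8-bit position code
round_trip_ok = all(gray_to_binary(binary_to_gray(i)) == i for i in range(256))
# Adjacent positions differ in exactly one mask bit -- the point of Gray coding
single_bit_ok = all(bin(binary_to_gray(i) ^ binary_to_gray(i + 1)).count("1") == 1
                    for i in range(255))
```

The single-bit property is why Gray masks are attractive for position encoding: a photon landing on a mask edge can corrupt at most one bit, so the decoded position is off by at most one pixel.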

  2. Caught on Camera.

    ERIC Educational Resources Information Center

    Milshtein, Amy

    2002-01-01

    Describes the benefits of and rules to be followed when using surveillance cameras for school security. Discusses various camera models, including indoor and outdoor fixed position cameras, pan-tilt zoom cameras, and pinhole-lens cameras for covert surveillance. (EV)

  3. Narrowness and Liberality

    ERIC Educational Resources Information Center

    Agresto, John

    2003-01-01

    John Agresto, whose task has been to rebuild the war-ravaged infrastructure of a Middle-Eastern university system, is discouraged to see that narrow expertise is the only goal of education there, to the utter exclusion of intellectual breadth. He comments that, although it is not that bad in the U.S., he feels that doctoral programs as currently…

  4. Lobate Scarp Modeling with Lunar Reconnaissance Orbiter Camera Digital Terrain Models

    NASA Astrophysics Data System (ADS)

    Williams, N. R.; Watters, T. R.; Pritchard, M. E.; Banks, M. E.; Bell, J. F.; Robinson, M. S.; Tran, T.

    2011-12-01

    Lobate scarps are a type of contractional tectonic landform expressed on the Moon's surface in both highlands and maria. Typically only tens of meters in relief, these linear or curvilinear topographic rises are interpreted to be low-angle thrust fault scarps resulting from global radial contraction. Radial contraction of the Moon can be inferred from shortening across the population of lobate scarps and is estimated at ~100 m. However, the geometry and depth of the underlying faults and mechanical properties of the near-surface lunar crustal materials are not well constrained. The Lunar Reconnaissance Orbiter Camera (LROC) Narrow Angle Cameras (NACs) acquire 0.5 to 2.0 m/pixel panchromatic images and digital terrain models (DTMs) with spatial resolutions of 2 m are derived from NAC stereo pairs. Topographic data are being used to constrain models of the lobate scarp thrust faults. DTMs are analyzed for relief and morphology of the Slipher (48.3°N, 160.6°E), Racah X-1 (10°S, 178°E), and Simpelius-1 (73.5°S, 13°E) scarps. Profiles are extracted, detrended, and compared along strike. LROC Wide Angle Camera (WAC) 100 m/pixel image mosaics and topography provide regional contexts. Using elastic dislocation modeling, the fault dip angles, depths, slip, and taper are each varied until the predicted surface displacement best fits the DTM profiles for each lobate scarp. Preliminary best-fit dip angles vary from 30-40°, maximum fault depths extend to several hundred meters, and the amount of slip varies from 10 to 30 meters for the three scarps. The modeled maximum depths suggest that the thrust faults are not deeply rooted.

  5. System Synchronizes Recordings from Separated Video Cameras

    NASA Technical Reports Server (NTRS)

    Nail, William; Nail, William L.; Nail, Jasper M.; Le, Doung T.

    2009-01-01

    A system of electronic hardware and software for synchronizing recordings from multiple, physically separated video cameras is being developed, primarily for use in multiple-look-angle video production. The system, the time code used in the system, and the underlying method of synchronization upon which the design of the system is based are denoted generally by the term "Geo-TimeCode(TradeMark)." The system is embodied mostly in compact, lightweight, portable units (see figure) denoted video time-code units (VTUs) - one VTU for each video camera. The system is scalable in that any number of camera recordings can be synchronized. The estimated retail price per unit would be about $350 (in 2006 dollars). The need for this or another synchronization system external to video cameras arises because most video cameras do not include internal means for maintaining synchronization with other video cameras. Unlike prior video-camera-synchronization systems, this system does not depend on continuous cable or radio links between cameras (however, it does depend on occasional cable links lasting a few seconds). Also, whereas the time codes used in prior video-camera-synchronization systems typically repeat after 24 hours, the time code used in this system does not repeat for slightly more than 136 years; hence, this system is much better suited for long-term deployment of multiple cameras.
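
The abstract notes the time code does not repeat for "slightly more than 136 years." That figure is consistent with a 32-bit counter of seconds; the 32-bit format is an assumption made here for illustration, as the actual Geo-TimeCode layout is not specified in the text.

```python
SECONDS_PER_JULIAN_YEAR = 365.25 * 24 * 3600   # 31,557,600 s

# Years until a 32-bit seconds counter wraps around
wrap_years = 2 ** 32 / SECONDS_PER_JULIAN_YEAR
```

This evaluates to roughly 136.1 years, matching the quoted non-repeat interval, versus the 24-hour wrap of conventional SMPTE-style time codes.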

  6. What's Your Angle on Angles?

    ERIC Educational Resources Information Center

    Browning, Christine A.; Garza-Kling, Gina; Sundling, Elizabeth Hill

    2007-01-01

    Although the nature of the research varies, as do concepts of angle, research in general supports the supposition that angle is a complex idea, best understood from a variety of perspectives. In fact, the concept of angle tends to be threefold, consisting of: (1) the traditional, static notion of two rays meeting at a common vertex; (2) the idea…

  7. Proactive PTZ Camera Control

    NASA Astrophysics Data System (ADS)

    Qureshi, Faisal Z.; Terzopoulos, Demetri

    We present a visual sensor network—comprising wide field-of-view (FOV) passive cameras and pan/tilt/zoom (PTZ) active cameras—capable of automatically capturing closeup video of selected pedestrians in a designated area. The passive cameras can track multiple pedestrians simultaneously and any PTZ camera can observe a single pedestrian at a time. We propose a strategy for proactive PTZ camera control where cameras plan ahead to select optimal camera assignment and handoff with respect to predefined observational goals. The passive cameras supply tracking information that is used to control the PTZ cameras.

  8. Characterization of previously unidentified lunar pyroclastic deposits using Lunar Reconnaissance Orbiter Camera (LROC) data

    USGS Publications Warehouse

    Gustafson, J. Olaf; Bell, James F.; Gaddis, Lisa R.; Hawke, B. Ray; Giguere, Thomas A.

    2012-01-01

    We used a Lunar Reconnaissance Orbiter Camera (LROC) global monochrome Wide-angle Camera (WAC) mosaic to conduct a survey of the Moon to search for previously unidentified pyroclastic deposits. Promising locations were examined in detail using LROC multispectral WAC mosaics, high-resolution LROC Narrow Angle Camera (NAC) images, and Clementine multispectral (ultraviolet-visible or UVVIS) data. Out of 47 potential deposits chosen for closer examination, 12 were selected as probable newly identified pyroclastic deposits. Potential pyroclastic deposits were generally found in settings similar to previously identified deposits, including areas within or near mare deposits adjacent to highlands, within floor-fractured craters, and along fissures in mare deposits. However, a significant new finding is the discovery of localized pyroclastic deposits within floor-fractured craters Anderson E and F on the lunar farside, isolated from other known similar deposits. Our search confirms that most major regional and localized low-albedo pyroclastic deposits have been identified on the Moon down to ~100 m/pix resolution, and that additional newly identified deposits are likely to be either isolated small deposits or additional portions of discontinuous, patchy deposits.

  9. Narrow Width Pentaquarks

    NASA Astrophysics Data System (ADS)

    Buccella, F.; Sorba, P.

A general study of pentaquarks built with four quarks in an L=1 state and an antiquark in S-wave shows that several such states are forbidden by a selection rule, which holds in the limit of flavor symmetry, to decay into a baryon and a meson final state. We identify the most promising \overline{10} multiplet for the classification of the recently discovered Θ+ and Ξ-- particles, with the prediction of a narrow width for both of them.

  10. Pre-hibernation performances of the OSIRIS cameras onboard the Rosetta spacecraft

    NASA Astrophysics Data System (ADS)

    Magrin, S.; La Forgia, F.; Da Deppo, V.; Lazzarin, M.; Bertini, I.; Ferri, F.; Pajola, M.; Barbieri, M.; Naletto, G.; Barbieri, C.; Tubiana, C.; Küppers, M.; Fornasier, S.; Jorda, L.; Sierks, H.

    2015-02-01

Context. The ESA cometary mission Rosetta was launched in 2004. In the past years and until the spacecraft hibernation in June 2011, the two cameras of the OSIRIS imaging system (Narrow Angle and Wide Angle Camera, NAC and WAC) observed many different sources. On 20 January 2014 the spacecraft successfully exited hibernation to start observing the primary scientific target of the mission, comet 67P/Churyumov-Gerasimenko. Aims: A study of the past performance of the cameras is now mandatory to determine whether the system has been stable over time and to derive, if necessary, additional analysis methods for the future precise calibration of the cometary data. Methods: The instrumental responses and filter passbands were used to estimate the efficiency of the system. A comparison with acquired images of specific calibration stars was made, and a refined photometric calibration was computed, both for the absolute flux and for the reflectivity of small bodies of the solar system. Results: We found the instrumental performance to be stable within ±1.5% from 2007 to 2010, with no evidence of an aging effect on the optics or detectors. The efficiency of the instrumentation is as expected in the visible range, but lower than expected in the UV and IR ranges. A photometric calibration implementation is discussed for the two cameras. Conclusions: The calibration derived from the pre-hibernation phases of the mission will be checked as soon as possible after the awakening of OSIRIS and will be continuously monitored until the end of the mission in December 2015. A list of additional calibration sources to be observed during the forthcoming phases of the mission has been compiled to ensure better coverage across the wavelength range of the cameras and to study possible dust contamination of the optics.

  11. Multispectral calibration to enhance the metrology performance of C-mount camera systems

    NASA Astrophysics Data System (ADS)

    Robson, S.; MacDonald, L.; Kyle, S. A.; Shortis, M. R.

    2014-06-01

    Low cost monochrome camera systems based on CMOS sensors and C-mount lenses have been successfully applied to a wide variety of metrology tasks. For high accuracy work such cameras are typically equipped with ring lights to image retro-reflective targets as high contrast image features. Whilst algorithms for target image measurement and lens modelling are highly advanced, including separate RGB channel lens distortion correction, target image circularity compensation and a wide variety of detection and centroiding approaches, less effort has been directed towards optimising physical target image quality by considering optical performance in narrow wavelength bands. This paper describes an initial investigation to assess the effect of wavelength on camera calibration parameters for two different camera bodies and the same `C-mount' wide angle lens. Results demonstrate the expected strong influence on principal distance, radial and tangential distortion, and also highlight possible trends in principal point, orthogonality and affinity parameters which are close to the parameter estimation noise level from the strong convergent self-calibrating image networks.

  12. Angle performance on optima MDxt

    SciTech Connect

    David, Jonathan; Kamenitsa, Dennis

    2012-11-06

Angle control on medium current implanters is important due to the high angle-sensitivity of typical medium current implants, such as halo implants. On the Optima MDxt, beam-to-wafer angles are controlled in both the horizontal and vertical directions. In the horizontal direction, the beam angle is measured through six narrow slits, and any angle adjustment is made by electrostatically steering the beam, while cross-wafer beam parallelism is adjusted by changing the focus of the electrostatic parallelizing lens (P-lens). In the vertical direction, the beam angle is measured through a high aspect ratio mask, and any angle adjustment is made by slightly tilting the wafer platen prior to implant. A variety of tests were run to measure the accuracy and repeatability of Optima MDxt's angle control. SIMS profiles of a high energy, channeling sensitive condition show both the cross-wafer angle uniformity, along with the small-angle resolution of the system. Angle repeatability was quantified by running a channeling sensitive implant as a regular monitor over a seven month period and measuring the sheet resistance-to-angle sensitivity. Even though crystal cut error was not controlled for in this case, when attributing all Rs variation to angle changes, the overall angle repeatability was measured as 0.16° (1σ). A separate angle repeatability test involved running a series of V-curve tests over a four month period using low crystal cut wafers selected from the same boule. The results of this test showed the angle repeatability to be <0.1° (1σ).
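The repeatability estimate described above attributes all sheet-resistance variation to angle changes via the measured sensitivity. A back-of-envelope sketch with illustrative numbers (chosen here to reproduce the quoted 0.16° figure, not taken from the actual monitor data):

```python
# Illustrative sketch: convert sheet-resistance (Rs) variation into an
# implant-angle repeatability using the Rs-to-angle sensitivity.
# The numbers below are assumptions, not the paper's measured data.

def angle_sigma(rs_sigma_pct, sensitivity_pct_per_deg):
    """1-sigma implant-angle repeatability in degrees."""
    return rs_sigma_pct / sensitivity_pct_per_deg

# e.g. 0.8% Rs variation with an assumed 5%/degree channeling sensitivity
sigma = angle_sigma(rs_sigma_pct=0.8, sensitivity_pct_per_deg=5.0)
```

Note this upper-bounds the angle error, since crystal cut and process variation also contribute to the Rs spread.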

  13. Multi-cameras calibration from spherical targets

    NASA Astrophysics Data System (ADS)

    Zhao, Chengyun; Zhang, Jin; Deng, Huaxia; Yu, Liandong

    2016-01-01

Multi-camera calibration using spheres is more convenient than using a planar target because spheres image well from different angles, and the internal and external parameters of multiple cameras can be obtained in a single calibration. In this paper, a novel multi-camera calibration method based on multiple spheres is proposed. A calibration target with multiple fixed balls is used, and the geometric properties of the sphere projection model are analyzed. During the experiment, the spherical target is placed in the common field of view of the multi-camera system, and the corresponding data are stored when the cameras are triggered by a signal generator. The contours of the balls are detected by the Hough transform and the center coordinates are determined with sub-pixel accuracy. The center coordinates are then used as input for calibration, and the internal and external parameters are calculated by Zhang's method. When multiple cameras are calibrated simultaneously from different angles using multiple spheres, the center coordinates of each sphere can be determined accurately even when the target images are taken out of focus, so the method can improve calibration precision. Zhang's plane-template method is included as a comparison experiment, and the error sources of the experiment are analyzed. The results indicate that the method proposed in this paper is suitable for multi-camera calibration.
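A minimal sketch of the sub-pixel center determination step, assuming an algebraic (Kasa) least-squares circle fit to the detected contour points rather than the authors' exact procedure:

```python
import numpy as np

# Illustrative sketch (not the paper's implementation): algebraic
# least-squares circle fit (Kasa fit) to edge points, giving the
# projected sphere center at sub-pixel accuracy.

def fit_circle(pts):
    """pts: (N, 2) array of contour points. Returns (cx, cy, r)."""
    x, y = pts[:, 0], pts[:, 1]
    # Solve x^2 + y^2 = 2*a*x + 2*b*y + c in least squares for (a, b, c);
    # the center is (a, b) and the radius is sqrt(c + a^2 + b^2).
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    rhs = x**2 + y**2
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return a, b, np.sqrt(c + a**2 + b**2)

# Synthetic test: noisy edge points on a circle of radius 40 px
# centered at (120.3, 85.7).
rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, 200, endpoint=False)
pts = np.column_stack([120.3 + 40 * np.cos(t), 85.7 + 40 * np.sin(t)])
pts += rng.normal(scale=0.3, size=pts.shape)
cx, cy, r = fit_circle(pts)
```

Averaging over many contour points is what makes the recovered center robust to defocus blur, consistent with the paper's claim.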

  14. Determining Camera Gain in Room Temperature Cameras

    SciTech Connect

    Joshua Cogliati

    2010-12-01

James R. Janesick provides a method for determining the amplification of a CCD or CMOS camera when only access to the raw images is provided. However, the equation that is provided ignores the contribution of dark current. For CCD or CMOS cameras that are cooled well below room temperature this is not a problem; for room temperature cameras, however, the technique needs adjustment. This article describes the adjustment made to the equation, and a test of this method.
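A sketch of a dark-corrected photon-transfer gain estimate under stated assumptions (the article's exact corrected equation is not reproduced here): differencing two flat fields removes fixed-pattern noise before computing the shot-noise variance, and dark frames remove the dark-current contribution to both the mean signal and the variance.

```python
import numpy as np

# Assumed sketch of a dark-corrected photon-transfer gain measurement,
# not Janesick's exact procedure: two flat-field frames and two dark
# frames at the same exposure.

def estimate_gain(flat1, flat2, dark1, dark2):
    """Camera gain in electrons/DN from flat and dark frame pairs."""
    # Dark-subtracted mean signal (DN) due to photons only.
    signal = 0.5 * (flat1 + flat2).mean() - 0.5 * (dark1 + dark2).mean()
    # Var of a frame difference = 2x the per-frame temporal variance
    # (fixed-pattern noise cancels in the difference).
    var_flat = np.var(flat1 - flat2) / 2.0   # photon + dark + read noise
    var_dark = np.var(dark1 - dark2) / 2.0   # dark + read noise only
    return signal / (var_flat - var_dark)    # e-/DN

# Synthetic sensor: true gain 2 e-/DN, 100 e- dark level,
# 10000 photo-electrons of signal.
rng = np.random.default_rng(1)
true_gain = 2.0
def frame(mean_e):
    return rng.poisson(mean_e, size=(512, 512)) / true_gain  # counts in DN
flats = frame(10100), frame(10100)   # photons + dark current
darks = frame(100), frame(100)       # dark current only
g = estimate_gain(*flats, *darks)
```

On this synthetic data the estimate recovers the true gain to well under a percent; with dark subtraction omitted, both the mean and the variance would be biased by the dark current.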

  15. Two-Camera Acquisition and Tracking of a Flying Target

    NASA Technical Reports Server (NTRS)

    Biswas, Abhijit; Assad, Christopher; Kovalik, Joseph M.; Pain, Bedabrata; Wrigley, Chris J.; Twiss, Peter

    2008-01-01

A method and apparatus have been developed to solve the problem of automated acquisition and tracking, from a location on the ground, of a luminous moving target in the sky. The method involves the use of two electronic cameras: (1) a stationary camera having a wide field of view, positioned and oriented to image the entire sky; and (2) a camera that has a much narrower field of view (a few degrees wide) and is mounted on a two-axis gimbal. The wide-field-of-view stationary camera is used to initially identify the target against the background sky. So that the approximate position of the target can be determined, pixel locations on the image-detector plane in the stationary camera are calibrated with respect to azimuth and elevation. The approximate target position is used to initially aim the gimballed narrow-field-of-view camera in the approximate direction of the target. Next, the narrow-field-of-view camera locks onto the target image, and thereafter the gimbals are actuated as needed to maintain lock and thereby track the target with precision greater than that attainable by use of the stationary camera.
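The pixel-to-azimuth/elevation calibration of the all-sky camera might be sketched with a simple equidistant fisheye model; the model and its parameters below are assumptions for illustration, not the actual calibration described.

```python
import numpy as np

# Assumed all-sky camera model (equidistant fisheye), purely
# illustrative: maps a pixel on the wide-field detector to azimuth and
# elevation so the gimballed narrow-field camera can be pointed there.

def pixel_to_azel(px, py, cx, cy, pix_per_deg):
    """cx, cy: zenith pixel; pix_per_deg: radial scale of the fisheye."""
    dx, dy = px - cx, py - cy
    zenith_angle = np.hypot(dx, dy) / pix_per_deg      # degrees from zenith
    azimuth = np.degrees(np.arctan2(dx, -dy)) % 360.0  # 0 deg = up in image
    return azimuth, 90.0 - zenith_angle

# A target detected 200 px above the zenith pixel with an 8 px/deg scale.
az, el = pixel_to_azel(px=960, py=340, cx=960, cy=540, pix_per_deg=8.0)
```

In practice the mapping would be fit to star sightings rather than assumed, but the interface is the same: pixel in, gimbal pointing angles out.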

  16. Full Stokes polarization imaging camera

    NASA Astrophysics Data System (ADS)

    Vedel, M.; Breugnot, S.; Lechocinski, N.

    2011-10-01

Objective and background: We present a new version of Bossa Nova Technologies' passive polarization imaging camera. The previous version performed live measurement of the linear Stokes parameters (S0, S1, S2) and their derivatives. The new version presented in this paper performs live measurement of the full Stokes parameters, i.e. including the fourth parameter S3, related to the amount of circular polarization. Dedicated software was developed to provide live images of any Stokes-related parameter, such as the Degree Of Linear Polarization (DOLP), the Degree Of Circular Polarization (DOCP) and the Angle Of Polarization (AOP). Results: We first give a brief description of the camera and its technology. It is a division-of-time polarimeter using a custom ferroelectric liquid crystal cell. A description of the method used to calculate the Data Reduction Matrix (DRM) linking intensity measurements and the Stokes parameters is given. The calibration was developed in order to optimize the condition number of the DRM. It also allows very efficient post-processing of the acquired images. A complete evaluation of the precision of standard polarization parameters is described. We further present the standard features of the dedicated software that was developed to operate the camera. It provides live images of the Stokes vector components and the usual associated parameters. Finally, some tests already conducted are presented, including indoor laboratory and outdoor measurements. This new camera will be a useful tool for many applications such as biomedical, remote sensing, metrology, material studies, and others.
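The DRM idea can be illustrated as follows, assuming four generic textbook analyzer states rather than the camera's actual ferroelectric liquid crystal states:

```python
import numpy as np

# Illustrative Data Reduction Matrix (DRM) sketch for a division-of-time
# polarimeter. The four analyzer states below are generic textbook
# choices (H, V, +45 deg, right circular), not the camera's real states.

A = 0.5 * np.array([
    [1.0,  1.0, 0.0, 0.0],   # horizontal linear analyzer
    [1.0, -1.0, 0.0, 0.0],   # vertical linear analyzer
    [1.0,  0.0, 1.0, 0.0],   # +45 degree linear analyzer
    [1.0,  0.0, 0.0, 1.0],   # right circular analyzer
])

drm = np.linalg.pinv(A)       # DRM: Stokes = DRM @ intensities
cond = np.linalg.cond(A)      # calibration aims to keep this small

# Recover a partially polarized input state from its four intensities.
s_in = np.array([1.0, 0.3, 0.2, 0.5])
intensities = A @ s_in
s_out = drm @ intensities
dolp = np.hypot(s_out[1], s_out[2]) / s_out[0]   # degree of linear pol.
docp = abs(s_out[3]) / s_out[0]                  # degree of circular pol.
```

A well-conditioned A keeps intensity noise from being amplified into the recovered Stokes vector, which is why the calibration optimizes the condition number.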

  17. Anisotropic de Gennes Narrowing in Confined Fluids.

    PubMed

    Nygård, Kim; Buitenhuis, Johan; Kagias, Matias; Jefimovs, Konstantins; Zontone, Federico; Chushkin, Yuriy

    2016-04-22

    The collective diffusion of dense fluids in spatial confinement is studied by combining high-energy (21 keV) x-ray photon correlation spectroscopy and small-angle x-ray scattering from colloid-filled microfluidic channels. We find the structural relaxation in confinement to be slower compared to the bulk. The collective dynamics is wave vector dependent, akin to the de Gennes narrowing typically observed in bulk fluids. However, in stark contrast to the bulk, the structure factor and de Gennes narrowing in confinement are anisotropic. These experimental observations are essential in order to develop a microscopic theoretical description of collective diffusion of dense fluids in confined geometries. PMID:27152823

  18. Evaluating intensified camera systems

    SciTech Connect

    S. A. Baker

    2000-06-30

This paper describes image evaluation techniques used to standardize camera system characterizations. The author's group is involved with building and fielding several types of camera systems. Camera types include gated intensified cameras, multi-frame cameras, and streak cameras. Applications range from X-ray radiography to visible and infrared imaging. Key areas of performance include sensitivity, noise, and resolution. This team has developed an analysis tool, in the form of image processing software, to aid an experimenter in measuring a set of performance metrics for their camera system. These performance parameters are used to identify a camera system's capabilities and limitations while establishing a means for camera system comparisons. The analysis tool is used to evaluate digital images normally recorded with CCD cameras. Electro-optical components provide fast shuttering and/or optical gain to camera systems. Camera systems incorporate a variety of electro-optical components such as microchannel plate (MCP) or proximity focused diode (PFD) image intensifiers; electro-static image tubes; or electron-bombarded (EB) CCDs. It is often valuable to evaluate the performance of an intensified camera in order to determine if a particular system meets experimental requirements.

  19. Study on airflow characteristics in the semi-closed irregular narrow flow channel

    NASA Astrophysics Data System (ADS)

    Jin, Yuzhen; Hu, Xiaodong; Zhu, Linhang; Hu, Xudong; Jin, Yingzi

    2016-04-01

The air-jet loom is widely used in the textile industry, but the interaction mechanism of airflow and yarn in its narrow flow channel is not well understood: gas consumption is relatively large, yarn motion is unstable, and weft insertion is often interrupted during operation. In order to study the characteristics of the semi-closed flow field in profiled dents, the momentum conservation equation is modified and the model parameters and boundary conditions are set. Comparing different values of r, the ratio of the profiled dent's thickness to its gap, the results show that the smaller r is, the smaller the velocity fluctuations of the airflow are. When the angle of the profiled dents α is close to zero, the airflow diffuses less. An experiment with a high-speed camera and a pressure sensor in the profiled dents was also conducted to verify the simulation results. The airflow characteristics of the semi-closed irregular narrow flow channel presented in this paper provide a theoretical basis for optimizing the weft insertion process of the air-jet loom.

  20. Vacuum Camera Cooler

    NASA Technical Reports Server (NTRS)

    Laugen, Geoffrey A.

    2011-01-01

Acquiring inexpensive video of moving hardware in a vacuum environment was impossible due to camera overheating, which is brought on by the lack of cooling media in vacuum. A water-jacketed camera cooler enclosure machined and assembled from copper plate and tube has been developed. The camera cooler (see figure) is cup-shaped and cooled by circulating water or nitrogen gas through copper tubing. The camera, a store-bought "spy type," is not designed to work in a vacuum. With some modifications the unit can be thermally connected when mounted in the cup portion of the camera cooler. The thermal conductivity is provided by copper tape between parts of the camera and the cooled enclosure. During initial testing of the demonstration unit, the camera cooler kept the CPU (central processing unit) of this video camera at operating temperature. This development allowed video recording of an in-progress test, within a vacuum environment.

  1. Novel fundus camera design

    NASA Astrophysics Data System (ADS)

    Dehoog, Edward A.

A fundus camera is a complex optical system that makes use of the principle of reflex-free indirect ophthalmoscopy to image the retina. Despite existing since the early 1900s, little has changed in the design of the fundus camera, and there is minimal information about the design principles utilized. Parameters and specifications involved in the design of a fundus camera are determined and their effect on system performance is discussed. Fundus cameras incorporating different design methods are modeled, and a performance evaluation based on design parameters is used to determine the effectiveness of each design strategy. By determining the design principles involved in the fundus camera, new cameras can be designed to include specific imaging modalities such as optical coherence tomography, imaging spectroscopy and imaging polarimetry to gather additional information about the properties and structure of the retina. Design principles utilized to incorporate such modalities into fundus camera systems are discussed. Design, implementation and testing of a snapshot polarimeter fundus camera are demonstrated.

  2. Advanced camera for surveys

    NASA Astrophysics Data System (ADS)

    Clampin, Mark; Ford, Holland C.; Bartko, Frank; Bely, Pierre Y.; Broadhurst, Tom; Burrows, Christopher J.; Cheng, Edward S.; Crocker, James H.; Franx, Marijn; Feldman, Paul D.; Golimowski, David A.; Hartig, George F.; Illingworth, Garth; Kimble, Randy A.; Lesser, Michael P.; Miley, George H.; Postman, Marc; Rafal, Marc D.; Rosati, Piero; Sparks, William B.; Tsvetanov, Zlatan; White, Richard L.; Sullivan, Pamela; Volmer, Paul; LaJeunesse, Tom

    2000-07-01

    The Advanced Camera for Surveys (ACS) is a third generation instrument for the Hubble Space Telescope (HST). It is currently planned for installation in HST during the fourth servicing mission in Summer 2001. The ACS will have three cameras.

  3. Constrained space camera assembly

    DOEpatents

    Heckendorn, Frank M.; Anderson, Erin K.; Robinson, Casandra W.; Haynes, Harriet B.

    1999-01-01

    A constrained space camera assembly which is intended to be lowered through a hole into a tank, a borehole or another cavity. The assembly includes a generally cylindrical chamber comprising a head and a body and a wiring-carrying conduit extending from the chamber. Means are included in the chamber for rotating the body about the head without breaking an airtight seal formed therebetween. The assembly may be pressurized and accompanied with a pressure sensing means for sensing if a breach has occurred in the assembly. In one embodiment, two cameras, separated from their respective lenses, are installed on a mounting apparatus disposed in the chamber. The mounting apparatus includes means allowing both longitudinal and lateral movement of the cameras. Moving the cameras longitudinally focuses the cameras, and moving the cameras laterally away from one another effectively converges the cameras so that close objects can be viewed. The assembly further includes means for moving lenses of different magnification forward of the cameras.

  4. Making Ceramic Cameras

    ERIC Educational Resources Information Center

    Squibb, Matt

    2009-01-01

    This article describes how to make a clay camera. This idea of creating functional cameras from clay allows students to experience ceramics, photography, and painting all in one unit. (Contains 1 resource and 3 online resources.)

  5. Nanosecond frame cameras

    SciTech Connect

    Frank, A M; Wilkins, P R

    2001-01-05

    The advent of CCD cameras and computerized data recording has spurred the development of several new cameras and techniques for recording nanosecond images. We have made a side by side comparison of three nanosecond frame cameras, examining them for both performance and operational characteristics. The cameras include; Micro-Channel Plate/CCD, Image Diode/CCD and Image Diode/Film; combinations of gating/data recording. The advantages and disadvantages of each device will be discussed.

  6. Digital Pinhole Camera

    ERIC Educational Resources Information Center

    Lancor, Rachael; Lancor, Brian

    2014-01-01

    In this article we describe how the classic pinhole camera demonstration can be adapted for use with digital cameras. Students can easily explore the effects of the size of the pinhole and its distance from the sensor on exposure time, magnification, and image quality. Instructions for constructing a digital pinhole camera and our method for…

  7. Evaluating intensified camera systems

    SciTech Connect

    S. A. Baker

    2000-07-01

    This paper describes image evaluation techniques used to standardize camera system characterizations. Key areas of performance include resolution, noise, and sensitivity. This team has developed a set of analysis tools, in the form of image processing software used to evaluate camera calibration data, to aid an experimenter in measuring a set of camera performance metrics. These performance metrics identify capabilities and limitations of the camera system, while establishing a means for comparing camera systems. Analysis software is used to evaluate digital camera images recorded with charge-coupled device (CCD) cameras. Several types of intensified camera systems are used in the high-speed imaging field. Electro-optical components are used to provide precise shuttering or optical gain for a camera system. These components including microchannel plate or proximity focused diode image intensifiers, electro-static image tubes, or electron-bombarded CCDs affect system performance. It is important to quantify camera system performance in order to qualify a system as meeting experimental requirements. The camera evaluation tool is designed to provide side-by-side camera comparison and system modeling information.

  8. Harpicon camera for HDTV

    NASA Astrophysics Data System (ADS)

    Tanada, Jun

    1992-08-01

Ikegami has been involved in broadcast equipment ever since it was established as a company. In conjunction with NHK it has brought forth countless television cameras, from black-and-white cameras to color cameras, HDTV cameras, and special-purpose cameras. In the early days of HDTV (high-definition television, also known as "High Vision") cameras, the specifications were different from those for the cameras of the present-day system, and cameras using all kinds of components, having different arrangements of components, and having different appearances were developed into products, with time spent on experimentation, design, fabrication, adjustment, and inspection. But recently the know-how built up thus far in components, printed circuit boards, and wiring methods has been incorporated in camera fabrication, making it possible to make HDTV cameras by methods similar to the present system. In addition, more-efficient production, lower costs, and better after-sales service are being achieved by using the same circuits, components, mechanism parts, and software for both HDTV cameras and cameras that operate by the present system.

  9. Prediction of Viking lander camera image quality

    NASA Technical Reports Server (NTRS)

    Huck, F. O.; Burcher, E. E.; Jobson, D. J.; Wall, S. D.

    1976-01-01

    Formulations are presented that permit prediction of image quality as a function of camera performance, surface radiance properties, and lighting and viewing geometry. Predictions made for a wide range of surface radiance properties reveal that image quality depends strongly on proper camera dynamic range command and on favorable lighting and viewing geometry. Proper camera dynamic range commands depend mostly on the surface albedo that will be encountered. Favorable lighting and viewing geometries depend mostly on lander orientation with respect to the diurnal sun path over the landing site, and tend to be independent of surface albedo and illumination scattering function. Side lighting with low sun elevation angles (10 to 30 deg) is generally favorable for imaging spatial details and slopes, whereas high sun elevation angles are favorable for measuring spectral reflectances.

  10. Ultraviolet Spectroscopy of Narrow Coronal Mass Ejections

    NASA Astrophysics Data System (ADS)

    Dobrzycka, D.; Raymond, J. C.; Biesecker, D. A.; Li, J.; Ciaravella, A.

    2003-05-01

    We present Ultraviolet Coronagraph Spectrometer (UVCS) observations of five narrow coronal mass ejections (CMEs) that were among 15 narrow CMEs originally selected by Gilbert and coworkers. Two events (1999 March 27, April 15) were ``structured,'' i.e., in white-light data they exhibited well-defined interior features, and three (1999 May 9, May 21, June 3) were ``unstructured,'' i.e., appeared featureless. In UVCS data the events were seen as 4°-13° wide enhancements of the strongest coronal lines H I Lyα and O VI λλ1032, 1037. We derived electron densities for several of the events from the Large Angle and Spectrometric Coronagraph Experiment (LASCO) C2 white-light observations. They are comparable to or smaller than densities inferred for other CMEs. We modeled the observable properties of examples of the structured (1999 April 15) and unstructured (1999 May 9) narrow CMEs at different heights in the corona between 1.5 and 2 Rsolar. The derived electron temperatures, densities, and outflow speeds are similar for those two types of ejections. They were compared with properties of polar coronal jets and other CMEs. We discuss different scenarios of narrow CME formation as either a jet formed by reconnection onto open field lines or a CME ejected by expansion of closed field structures. Overall, we conclude that the existing observations do not definitively place the narrow CMEs into the jet or the CME picture, but the acceleration of the 1999 April 15 event resembles acceleration seen in many CMEs, rather than constant speeds or deceleration observed in jets.

  11. Contrail study with ground-based cameras

    NASA Astrophysics Data System (ADS)

    Schumann, U.; Hempel, R.; Flentje, H.; Garhammer, M.; Graf, K.; Kox, S.; Lösslein, H.; Mayer, B.

    2013-08-01

    Photogrammetric methods and analysis results for contrails observed with wide-angle cameras are described. Four cameras of two different types (view angle < 90° or whole-sky imager) at the ground at various positions are used to track contrails and to derive their altitude, width, and horizontal speed. Camera models for both types are described to derive the observation angles for given image coordinates and their inverse. The models are calibrated with sightings of the Sun, the Moon and a few bright stars. The methods are applied and tested in a case study. Four persistent contrails crossing each other together with a short-lived one are observed with the cameras. Vertical and horizontal positions of the contrails are determined from the camera images to an accuracy of better than 200 m and horizontal speed to 0.2 m s-1. With this information, the aircraft causing the contrails are identified by comparison to traffic waypoint data. The observations are compared with synthetic camera pictures of contrails simulated with the contrail prediction model CoCiP, a Lagrangian model using air traffic movement data and numerical weather prediction (NWP) data as input. The results provide tests for the NWP and contrail models. The cameras show spreading and thickening contrails suggesting ice-supersaturation in the ambient air. The ice-supersaturated layer is found thicker and more humid in this case than predicted by the NWP model used. The simulated and observed contrail positions agree up to differences caused by uncertain wind data. The contrail widths, which depend on wake vortex spreading, ambient shear and turbulence, were partly wider than simulated.
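The core photogrammetric step, recovering a contrail point's 3-D position and hence altitude from two cameras' observation directions, can be sketched as a two-ray triangulation (an illustrative simplification of the paper's full camera models):

```python
import numpy as np

# Hedged sketch (not the paper's code): two ground cameras each provide
# an observation direction to the same contrail point; the midpoint of
# the closest approach of the two rays gives its 3-D position.

def triangulate(p1, d1, p2, d2):
    """Closest point between rays p1 + t1*d1 and p2 + t2*d2."""
    # Least-squares solve t1*d1 - t2*d2 = p2 - p1 for (t1, t2).
    A = np.column_stack([d1, -d2])
    t1, t2 = np.linalg.lstsq(A, p2 - p1, rcond=None)[0]
    return 0.5 * ((p1 + t1 * d1) + (p2 + t2 * d2))

# Cameras 20 km apart (x east, y north, z up, meters); a contrail point
# at 10 km altitude. In practice the unit direction vectors d1, d2 come
# from the calibrated camera models; here they are built directly.
p1 = np.array([0.0, 0.0, 0.0])
p2 = np.array([20e3, 0.0, 0.0])
target = np.array([10e3, 5e3, 10e3])
d1 = (target - p1) / np.linalg.norm(target - p1)
d2 = (target - p2) / np.linalg.norm(target - p2)
point = triangulate(p1, d1, p2, d2)
altitude = point[2]
```

The quoted ~200 m position accuracy then follows from how precisely the observation angles (and camera positions) are calibrated.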

  12. Contrail study with ground-based cameras

    NASA Astrophysics Data System (ADS)

    Schumann, U.; Hempel, R.; Flentje, H.; Garhammer, M.; Graf, K.; Kox, S.; Lösslein, H.; Mayer, B.

    2013-12-01

    Photogrammetric methods and analysis results for contrails observed with wide-angle cameras are described. Four cameras of two different types (view angle < 90° or whole-sky imager) at the ground at various positions are used to track contrails and to derive their altitude, width, and horizontal speed. Camera models for both types are described to derive the observation angles for given image coordinates and their inverse. The models are calibrated with sightings of the Sun, the Moon and a few bright stars. The methods are applied and tested in a case study. Four persistent contrails crossing each other, together with a short-lived one, are observed with the cameras. Vertical and horizontal positions of the contrails are determined from the camera images to an accuracy of better than 230 m and horizontal speed to 0.2 m s-1. With this information, the aircraft causing the contrails are identified by comparison to traffic waypoint data. The observations are compared with synthetic camera pictures of contrails simulated with the contrail prediction model CoCiP, a Lagrangian model using air traffic movement data and numerical weather prediction (NWP) data as input. The results provide tests for the NWP and contrail models. The cameras show spreading and thickening contrails, suggesting ice-supersaturation in the ambient air. The ice-supersaturated layer is found thicker and more humid in this case than predicted by the NWP model used. The simulated and observed contrail positions agree up to differences caused by uncertain wind data. The contrail widths, which depend on wake vortex spreading, ambient shear and turbulence, were partly wider than simulated.

  13. 10. CONSTRUCTION OF FIXED CAMERA STATION 1100 (BUILDING NO. 42020) ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

10. CONSTRUCTION OF FIXED CAMERA STATION 1100 (BUILDING NO. 42020) LOOKING NORTHEAST SHOWING CONCRETE FOUNDATION, WOOD FORMWORK AND STEEL REINFORCING, March 26, 1948. (Original photograph in possession of Dave Willis, San Diego, California.) - Variable Angle Launcher Complex, Camera Stations, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  14. 9. COMPLETED ROLLING CAMERA CAR ON RAILROAD TRACK AND BRIDGE ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

9. COMPLETED ROLLING CAMERA CAR ON RAILROAD TRACK AND BRIDGE LOOKING WEST, APRIL 26, 1948. (ORIGINAL PHOTOGRAPH IN POSSESSION OF DAVE WILLIS, SAN DIEGO, CALIFORNIA.) - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  15. Tower Camera Handbook

    SciTech Connect

    Moudry, D

    2005-01-01

The tower camera in Barrow provides hourly images of the ground surrounding the tower. These images may be used to determine fractional snow cover as winter arrives, for comparison with the albedo that can be calculated from downward-looking radiometers, and give some indication of present weather. Similarly, during springtime, the camera images show the changes in the ground albedo as the snow melts. The tower images are saved at hourly intervals. In addition, two other cameras, the skydeck camera in Barrow and the piling camera in Atqasuk, show the current conditions at those sites.

  16. Compact stereo endoscopic camera using microprism arrays.

    PubMed

    Yang, Sung-Pyo; Kim, Jae-Jun; Jang, Kyung-Won; Song, Weon-Kook; Jeong, Ki-Hun

    2016-03-15

    This work reports a microprism array (MPA) based compact stereo endoscopic camera with a single image sensor. The MPAs were monolithically fabricated by using two-step photolithography and geometry-guided resist reflow to form an appropriate prism angle for stereo image pair formation. The fabricated MPAs were transferred onto a glass substrate with a UV curable resin replica by using polydimethylsiloxane (PDMS) replica molding and then successfully integrated in front of a single camera module. The stereo endoscopic camera with MPA splits an image into two stereo images and successfully demonstrates the binocular disparities between the stereo image pairs for objects with different distances. This stereo endoscopic camera can serve as a compact and 3D imaging platform for medical, industrial, or military uses. PMID:26977690
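The distance estimation that the stereo image pair enables follows the classic pinhole stereo relation Z = fB/d; the focal length and baseline below are illustrative guesses, not the device's measured values.

```python
# Minimal sketch of the disparity-to-distance relation a stereo
# endoscope exploits; all numbers are illustrative assumptions.

def depth_from_disparity(focal_px, baseline_mm, disparity_px):
    """Classic pinhole stereo relation: Z = f * B / d."""
    return focal_px * baseline_mm / disparity_px

# An assumed 2 mm stereo baseline (set by the microprism pair) and a
# 500 px focal length: a 20 px disparity maps to 50 mm object distance.
z = depth_from_disparity(focal_px=500, baseline_mm=2.0, disparity_px=20)
```

The small baseline achievable with a single sensor limits depth resolution at large distances, which is acceptable for the short working distances of endoscopy.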

  17. Design Versatility Of The Prism Panoramic Camera: The KS-116 And KA-95 Cameras

    NASA Astrophysics Data System (ADS)

    Ruck, Richard

    1981-12-01

    The increased need for smaller and more versatile aerial reconnaissance equipment has led to the use of the KS-116 camera in the RF-4B aircraft, and the KA-95 in the RF-5E. Both cameras use a 12-inch focal length, f/4.0 lens and a 5-inch film transport. The main difference between these two cameras is their configuration. This paper describes the features of the two cameras, including: selectable scan angle, forward motion compensation (FMC), roll stabilization, exposure control, unique packaging differences and focus correction. The inherent packaging flexibility of the prism pan camera and the availability of key modules have led to multiple configurations of which the KS-116 and KA-95 are two examples.

  18. Classroom multispectral imaging using inexpensive digital cameras.

    NASA Astrophysics Data System (ADS)

    Fortes, A. D.

    2007-12-01

    The proliferation of increasingly cheap digital cameras in recent years means that it has become easier to exploit the broad wavelength sensitivity of their CCDs (360-1100 nm) for classroom-based teaching. With the right tools, it is possible to open children's eyes to the invisible world of UVA and near-IR radiation either side of our narrow visual band. The camera-filter combinations I describe can be used to explore the world of animal vision, looking for invisible markings on flowers, or in bird plumage, for example. In combination with a basic spectroscope (such as the Project-STAR handheld plastic spectrometer, $25), it is possible to investigate the range of human vision and camera sensitivity, and to explore the atomic and molecular absorption lines from the solar and terrestrial atmospheres. My principal use of the cameras has been to teach multispectral imaging of the kind used to determine remotely the composition of planetary surfaces. A range of camera options, from $50 circuit-board-mounted CCDs up to $900 semi-pro infrared camera kits (including mobile phones along the way), and various UV-vis-IR filter options will be presented. Examples of multispectral images taken with these systems are used to illustrate the range of classroom topics that can be covered. Particular attention is given to learning about spectral reflectance curves and comparing images from Earth and Mars taken using the same filter combination that is used on the Mars Rovers.

  19. The nucleus of comet 67P through the eyes of the OSIRIS cameras

    NASA Astrophysics Data System (ADS)

    Guettler, Carsten; Sierks, Holger; Barbieri, Cesare; Lamy, Philippe; Rodrigo, Rafael; Koschny, Detlef; Rickman, Hans; OSIRIS Team; Capaccioni, Fabrizio; Filacchione, Gianrico; Ciarniello, Mauro; Erard, Stephane; Rinaldi, Giovanna; Tosi, Federico

    2015-11-01

    The Rosetta spacecraft has been studying comet 67P/Churyumov-Gerasimenko from a close distance since August 2014. Onboard the spacecraft, the two scientific cameras, the OSIRIS narrow- and wide-angle cameras, are observing the cometary nucleus, its activity, and the dust and gas environment. This overview paper will cover OSIRIS science from the early arrival and mapping phase, the PHILAE landing, and the escort phase including the two close fly-bys. Along with a first characterization of the global physical parameters of the nucleus, the OSIRIS cameras also provided the data to reconstruct a 3D shape model of the comet and a division into morphologic sub-units. From observations of near-surface activity, jet-like features can be projected onto the surface, and active sources can be correlated with surface features like cliffs, pits, or flat planes. The increase of activity during and after perihelion in August 2015 included several outbursts, seen as strong, collimated jets originating from the southern hemisphere. A comparison of results between different Rosetta instruments will give further insight into the physics of the comet's nucleus and its coma. The OSIRIS and VIRTIS instruments are particularly well suited to support and complement each other: with an overlap in spectral range, one instrument can provide the best spatial resolution while the other is strong in spectral resolution. A summary of collaborative efforts will be given.

  20. Automated Camera Calibration

    NASA Technical Reports Server (NTRS)

    Chen, Siqi; Cheng, Yang; Willson, Reg

    2006-01-01

    Automated Camera Calibration (ACAL) is a computer program that automates the generation of calibration data for camera models used in machine vision systems. Machine vision camera models describe the mapping between points in three-dimensional (3D) space in front of the camera and the corresponding points in two-dimensional (2D) space in the camera's image. Calibrating a camera model requires a set of calibration data containing known 3D-to-2D point correspondences for the given camera system. Generating calibration data typically involves taking images of a calibration target where the 3D locations of the target's fiducial marks are known, and then measuring the 2D locations of the fiducial marks in the images. ACAL automates the analysis of calibration target images and greatly speeds the overall calibration process.
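
    The 3D-to-2D mapping that such a camera model describes can be sketched with a simple pinhole projection (a minimal illustration, not ACAL's actual model; the focal lengths and principal point below are hypothetical values):

```python
def project(fx, fy, cx, cy, X, Y, Z):
    """Pinhole projection: map a 3D point (X, Y, Z) in the camera frame
    to 2D pixel coordinates, given focal lengths and principal point."""
    u = fx * X / Z + cx
    v = fy * Y / Z + cy
    return u, v

# A known 3D fiducial mark plus its measured image location form one
# calibration correspondence; many such pairs constrain the model.
u, v = project(1000.0, 1000.0, 320.0, 240.0, 0.1, 0.2, 1.0)
```

    Calibration inverts this relationship: given many (3D, 2D) pairs, solve for the parameters (fx, fy, cx, cy, plus distortion terms) that best reproduce the measured pixel locations.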

  1. Automatic camera tracking for remote manipulators

    SciTech Connect

    Stoughton, R.S.; Martin, H.L.; Bentz, R.R.

    1984-04-01

    The problem of automatic camera tracking of mobile objects is addressed with specific reference to remote manipulators and using either fixed or mobile cameras. The technique uses a kinematic approach employing 4 x 4 coordinate transformation matrices to solve for the needed camera PAN and TILT angles. No vision feedback systems are used, as the required input data are obtained entirely from position sensors from the manipulator and the camera-positioning system. All hardware requirements are generally satisfied by currently available remote manipulator systems with a supervisory computer. The system discussed here implements linear plus on/off (bang-bang) closed-loop control with a ±2° deadband. The deadband area is desirable to avoid operator seasickness caused by continuous camera movement. Programming considerations for camera control, including operator interface options, are discussed. The example problem presented is based on an actual implementation using a PDP 11/34 computer, a TeleOperator Systems SM-229 manipulator, and an Oak Ridge National Laboratory (ORNL) camera-positioning system. 3 references, 6 figures, 2 tables.
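
    The kinematic core of such a tracker reduces to solving pan and tilt from the target's position expressed in the camera-base frame, then issuing bang-bang commands only when the error leaves the deadband. A hedged sketch (the 2° threshold follows the abstract; the frame convention and slew rate are illustrative assumptions):

```python
import math

DEADBAND_DEG = 2.0  # per the abstract: a +/-2 deg deadband avoids continuous motion

def pan_tilt(x, y, z):
    """Pan/tilt angles (degrees) that point the camera at (x, y, z),
    expressed in an assumed camera-base frame (x forward, y left, z up)."""
    pan = math.degrees(math.atan2(y, x))
    tilt = math.degrees(math.atan2(z, math.hypot(x, y)))
    return pan, tilt

def bang_bang(error_deg, rate_deg_s=10.0):
    """On/off drive command: slew at a fixed rate only when the pointing
    error exceeds the deadband, otherwise hold still."""
    if abs(error_deg) <= DEADBAND_DEG:
        return 0.0
    return math.copysign(rate_deg_s, error_deg)
```

    In the actual system the target position would come from the manipulator's joint sensors via the 4 x 4 transformation chain rather than from vision.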

  2. Automatic camera tracking for remote manipulators

    SciTech Connect

    Stoughton, R.S.; Martin, H.L.; Bentz, R.R.

    1984-07-01

    The problem of automatic camera tracking of mobile objects is addressed with specific reference to remote manipulators and using either fixed or mobile cameras. The technique uses a kinematic approach employing 4 x 4 coordinate transformation matrices to solve for the needed camera PAN and TILT angles. No vision feedback systems are used, as the required input data are obtained entirely from position sensors from the manipulator and the camera-positioning system. All hardware requirements are generally satisfied by currently available remote manipulator systems with a supervisory computer. The system discussed here implements linear plus on/off (bang-bang) closed-loop control with a ±2° deadband. The deadband area is desirable to avoid operator seasickness caused by continuous camera movement. Programming considerations for camera control, including operator interface options, are discussed. The example problem presented is based on an actual implementation using a PDP 11/34 computer, a TeleOperator Systems SM-229 manipulator, and an Oak Ridge National Laboratory (ORNL) camera-positioning system. 3 references, 6 figures, 2 tables.

  3. Ultra-fast framing camera tube

    DOEpatents

    Kalibjian, Ralph

    1981-01-01

    An electronic framing camera tube features focal plane image dissection and synchronized restoration of the dissected electron line images to form two-dimensional framed images. Ultra-fast framing is performed by first streaking a two-dimensional electron image across a narrow slit, thereby dissecting the two-dimensional electron image into sequential electron line images. The dissected electron line images are then restored into a framed image by a restorer deflector operated synchronously with the dissector deflector. The number of framed images on the tube's viewing screen is equal to the number of dissecting slits in the tube. The distinguishing features of this ultra-fast framing camera tube are the focal plane dissecting slits, and the synchronously-operated restorer deflector which restores the dissected electron line images into a two-dimensional framed image. The framing camera tube can produce image frames having high spatial resolution of optical events in the sub-100 picosecond range.

  4. Angle detector

    NASA Technical Reports Server (NTRS)

    Parra, G. T. (Inventor)

    1978-01-01

    An angle detector for determining a transducer's angular disposition to a capacitive pickup element is described. The transducer comprises a pendulum mounted inductive element moving past the capacitive pickup element. The capacitive pickup element divides the inductive element into two parts L sub 1 and L sub 2 which form the arms of one side of an a-c bridge. Two networks R sub 1 and R sub 2 having a plurality of binary weighted resistors and an equal number of digitally controlled switches for removing resistors from the networks form the arms of the other side of the a-c bridge. A binary counter, controlled by a phase detector, balances the bridge by adjusting the resistance of R sub 1 and R sub 2. The binary output of the counter is representative of the angle.
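
    The counter-driven bridge balance amounts to a successive-approximation search: each binary-weighted resistor is switched in from the most significant bit down, and kept only if the bridge has not overshot. A purely illustrative software analogue (the ratio convention and bit width are assumptions, not values from the patent):

```python
def balance_bridge(target_ratio, bits=8):
    """Successive-approximation balance: find the counter value n such
    that n / 2**bits best approximates target_ratio (standing in for the
    L1/L2 division set by the pendulum) without exceeding it, mimicking
    the phase-detector-driven binary counter."""
    n = 0
    for b in reversed(range(bits)):
        trial = n | (1 << b)              # tentatively switch in this resistor
        if trial / (1 << bits) <= target_ratio:
            n = trial                      # keep it: bridge not yet overshot
    return n                               # binary output ~ angle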

  5. Microchannel plate streak camera

    DOEpatents

    Wang, Ching L.

    1989-01-01

    An improved streak camera in which a microchannel plate electron multiplier is used in place of or in combination with the photocathode used in prior streak cameras. The improved streak camera is far more sensitive to photons (UV to gamma-rays) than the conventional x-ray streak camera which uses a photocathode. The improved streak camera offers gamma-ray detection with high temporal resolution. It also offers low-energy x-ray detection without attenuation inside the cathode. Using the microchannel plate in the improved camera has resulted in a time resolution of about 150 ps, and has provided a sensitivity sufficient for 1000 keV x-rays.

  6. GRACE star camera noise

    NASA Astrophysics Data System (ADS)

    Harvey, Nate

    2016-08-01

    Extending results from previous work by Bandikova et al. (2012) and Inacio et al. (2015), this paper analyzes Gravity Recovery and Climate Experiment (GRACE) star camera attitude measurement noise by processing inter-camera quaternions from 2003 to 2015. We describe a correction to star camera data, which will eliminate a several-arcsec twice-per-rev error with daily modulation, currently visible in the auto-covariance function of the inter-camera quaternion, from future GRACE Level-1B product releases. We also present evidence supporting the argument that thermal conditions/settings affect long-term inter-camera attitude biases by at least tens-of-arcsecs, and that several-to-tens-of-arcsecs per-rev star camera errors depend largely on field-of-view.

  7. Microchannel plate streak camera

    DOEpatents

    Wang, C.L.

    1989-03-21

    An improved streak camera in which a microchannel plate electron multiplier is used in place of or in combination with the photocathode used in prior streak cameras is disclosed. The improved streak camera is far more sensitive to photons (UV to gamma-rays) than the conventional x-ray streak camera which uses a photocathode. The improved streak camera offers gamma-ray detection with high temporal resolution. It also offers low-energy x-ray detection without attenuation inside the cathode. Using the microchannel plate in the improved camera has resulted in a time resolution of about 150 ps, and has provided a sensitivity sufficient for 1,000 keV x-rays. 3 figs.

  8. Microchannel plate streak camera

    DOEpatents

    Wang, C.L.

    1984-09-28

    An improved streak camera in which a microchannel plate electron multiplier is used in place of or in combination with the photocathode used in prior streak cameras. The improved streak camera is far more sensitive to photons (uv to gamma-rays) than the conventional x-ray streak camera which uses a photocathode. The improved streak camera offers gamma-ray detection with high temporal resolution. It also offers low-energy x-ray detection without attenuation inside the cathode. Using the microchannel plate in the improved camera has resulted in a time resolution of about 150 ps, and has provided a sensitivity sufficient for 1000 keV x-rays.

  9. Analytical multicollimator camera calibration

    USGS Publications Warehouse

    Tayman, W.P.

    1978-01-01

    Calibration with the U.S. Geological Survey multicollimator determines the calibrated focal length, the point of symmetry, the radial distortion referred to the point of symmetry, and the asymmetric characteristics of the camera lens. For this project, two cameras were calibrated, a Zeiss RMK A 15/23 and a Wild RC 8. Four test exposures were made with each camera. Results are tabulated for each exposure and averaged for each set. Copies of the standard USGS calibration reports are included. © 1978.
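
    Radial distortion referred to the point of symmetry is commonly modeled as a polynomial in the squared radial distance; a minimal sketch of the standard two-term model (the coefficients below are hypothetical, not USGS calibration values):

```python
def radial_distortion(x, y, k1, k2):
    """Apply a two-term radial distortion model about the point of
    symmetry, with (x, y) in image coordinates centered on that point."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale
```

    A calibration report tabulates, in effect, the deviation between the ideal and distorted positions as a function of radial distance.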

  10. LSST Camera Optics Design

    SciTech Connect

    Riot, V J; Olivier, S; Bauman, B; Pratuch, S; Seppala, L; Gilmore, D; Ku, J; Nordby, M; Foss, M; Antilogus, P; Morgado, N

    2012-05-24

    The Large Synoptic Survey Telescope (LSST) uses a novel, three-mirror telescope design feeding a camera system that includes a set of broad-band filters and three refractive corrector lenses to produce a flat field at the focal plane with a wide field of view. Optical design of the camera lenses and filters is integrated with the optical design of the telescope mirrors to optimize performance. We discuss the rationale for the LSST camera optics design, describe the methodology for fabricating, coating, mounting and testing the lenses and filters, and present the results of detailed analyses demonstrating that the camera optics will meet their performance goals.

  11. Polarization encoded color camera.

    PubMed

    Schonbrun, Ethan; Möller, Guðfríður; Di Caprio, Giuseppe

    2014-03-15

    Digital cameras would be colorblind if they did not have pixelated color filters integrated into their image sensors. Integration of conventional fixed filters, however, comes at the expense of an inability to modify the camera's spectral properties. Instead, we demonstrate a micropolarizer-based camera that can reconfigure its spectral response. Color is encoded into a linear polarization state by a chiral dispersive element and then read out in a single exposure. The polarization encoded color camera is capable of capturing three-color images at wavelengths spanning the visible to the near infrared. PMID:24690806

  12. Ringfield lithographic camera

    DOEpatents

    Sweatt, William C.

    1998-01-01

    A projection lithography camera is presented with a wide ringfield optimized so as to make efficient use of extreme ultraviolet radiation from a large area radiation source (e.g., D_source ≈ 0.5 mm). The camera comprises four aspheric mirrors optically arranged on a common axis of symmetry with an increased etendue for the camera system. The camera includes an aperture stop that is accessible through a plurality of partial aperture stops to synthesize the theoretical aperture stop. Radiation from a mask is focused to form a reduced image on a wafer, relative to the mask, by reflection from the four aspheric mirrors.

  13. Ultraviolet Spectroscopy of Narrow CMEs

    NASA Astrophysics Data System (ADS)

    Dobrzycka, D.; Raymond, J. C.; Biesecker, D. A.; Li, J.; Ciaravella, A.

    2002-12-01

    Coronal mass ejections (CMEs) are commonly described as new, discrete, bright features appearing in the field of view of a white light coronagraph and moving outward over a period of minutes to hours. Apparent angular widths of the CMEs cover a wide range, from a few degrees to 360°. The very narrow structures (narrower than ~15-20°) form only a small subset of all the observed CMEs and are usually referred to as rays, spikes, fans, etc. Recently, Gilbert et al. (2001, ApJ, 550, 1093) reported LASCO white light observations of 15 selected narrow CMEs. We extended the study and analyzed ultraviolet spectroscopy of narrow ejections, including several events listed by Gilbert et al. The data were obtained by the Ultraviolet Coronagraph Spectrometer (UVCS/SOHO). We present a comparison of narrow and large CMEs and discuss the relation of the narrow CMEs to coronal jets and/or other narrow transient events. This work is supported by NASA under Grant NAG5-11420 to the Smithsonian Astrophysical Observatory, by the Italian Space Agency and by PRODEX (Swiss contribution).

  14. New developments to improve SO2 cameras

    NASA Astrophysics Data System (ADS)

    Luebcke, P.; Bobrowski, N.; Hoermann, C.; Kern, C.; Klein, A.; Kuhn, J.; Vogel, L.; Platt, U.

    2012-12-01

    The SO2 camera is a remote sensing instrument that measures the two-dimensional distribution of SO2 column densities in volcanic plumes using scattered solar radiation as a light source. From these data SO2 fluxes can be derived. The high time resolution, of the order of 1 Hz, allows correlating SO2 flux measurements with other traditional volcanological measurement techniques, e.g., seismology. In recent years the application of SO2 cameras has increased; however, there is still potential to improve the instrumentation. First, the influence of aerosols and ash in the volcanic plume can lead to large errors in the calculated SO2 flux if not accounted for. We present two different concepts to deal with the influence of ash and aerosols. The first approach uses a co-axial DOAS system that was added to a two-filter SO2 camera. The camera uses Filter A (peak transmission centred around 315 nm) to measure the optical density of SO2 and Filter B (centred around 330 nm) to correct for the influence of ash and aerosol. The DOAS system simultaneously performs spectroscopic measurements in a small area of the camera's field of view and gives additional information to correct for these effects. Comparing the optical densities for the two filters with the SO2 column density from the DOAS allows not only a much more precise calibration, but also conclusions to be drawn about the influence of ash and aerosol scattering. Measurement examples from Popocatépetl, Mexico in 2011 are shown and interpreted. Another approach combines the SO2 camera measurement principle with the extremely narrow and periodic transmission of a Fabry-Pérot interferometer. The narrow transmission window makes it possible to select individual SO2 absorption bands (or series of bands) as a substitute for Filter A. Measurements are therefore more selective to SO2. Instead of Filter B, as in classical SO2 cameras, the correction for aerosol can be performed by shifting the transmission window of the Fabry-Pérot interferometer.
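
    The two-filter principle reduces to Beer-Lambert optical densities: Filter A sees SO2 plus aerosol extinction, Filter B essentially only aerosol, and the difference is calibrated to a column density. A schematic sketch (the calibration factor is hypothetical; in practice it would come from the co-axial DOAS):

```python
import math

def optical_density(measured, background):
    """Beer-Lambert optical density of the plume: tau = -ln(I / I0)."""
    return -math.log(measured / background)

def so2_column(i_a, i0_a, i_b, i0_b, calib):
    """SO2 column density: Filter A optical density minus the Filter B
    (aerosol/ash) optical density, scaled by a DOAS-derived factor."""
    tau_a = optical_density(i_a, i0_a)   # 315 nm band: SO2 + aerosol
    tau_b = optical_density(i_b, i0_b)   # 330 nm band: aerosol only
    return calib * (tau_a - tau_b)
```

    Summing the column densities along a plume cross-section and multiplying by the plume speed then yields the SO2 flux.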

  15. Advantage of diverging radial type for mobile stereo camera

    NASA Astrophysics Data System (ADS)

    Lee, Dong-Su; Mun, Sungchul; Park, Min-Chul; Son, Jung-Young

    2013-05-01

    Distortions in the perceived image for the three camera arrangements of parallel, converging, and diverging differ according to focal length, focus distance, field-of-view angle, color, magnification, and camera aligning direction. The distortions in the perceived image for the parallel and converging arrangements have been studied with commercially available stereoscopic TVs based on high-speed LCDs and shutter glasses, and with mobile devices. However, the distortion in the perceived image for the diverging arrangement is not well known. This paper discusses the distortion in perceived image characteristics of the diverging-type stereo camera according to the magnification determining the enlargement and reduction of a camera image, compared with those of the parallel and converging arrangements. The distortion induces the image to appear closer to the viewers for the diverging type, and farther away for the converging type. The inducement becomes more prominent as the distance between the two component cameras of the stereo camera increases for the diverging type. Furthermore, the effect of the diverging angle on disparity is considered so that the inter-camera distance can be made as small as possible.

  16. Improved Tracking of Targets by Cameras on a Mars Rover

    NASA Technical Reports Server (NTRS)

    Kim, Won; Ansar, Adnan; Steele, Robert

    2007-01-01

    A paper describes a method devised to increase the robustness and accuracy of tracking of targets by means of three stereoscopic pairs of video cameras on a Mars-rover-type exploratory robotic vehicle. Two of the camera pairs are mounted on a mast that can be adjusted in pan and tilt; the third camera pair is mounted on the main vehicle body. Elements of the method include a mast calibration, a camera-pointing algorithm, and a purely geometric technique for handing off tracking between different camera pairs at critical distances as the rover approaches a target of interest. The mast calibration is an extension of camera calibration in which the camera images of calibration targets at known positions are collected at various pan and tilt angles. In the camera-pointing algorithm, pan and tilt angles are computed by a closed-form, non-iterative solution of inverse kinematics of the mast combined with mathematical models of the cameras. The purely geometric camera-handoff technique involves the use of stereoscopic views of a target of interest in conjunction with the mast calibration.
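
    The geometric handoff depends on knowing the stereo range to the target; for a rectified stereo pair this is the standard disparity-to-depth relation (a generic sketch with illustrative numbers, not actual rover camera parameters):

```python
def stereo_depth(focal_px, baseline_m, disparity_px):
    """Depth of a target from a rectified stereo pair: Z = f * B / d,
    where f is the focal length in pixels, B the baseline in meters,
    and d the disparity in pixels."""
    if disparity_px <= 0:
        raise ValueError("target must have positive disparity")
    return focal_px * baseline_m / disparity_px
```

    Comparing the range against the critical handoff distances tells the tracker when to pass the target from one camera pair to the next.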

  17. The Camera Cook Book.

    ERIC Educational Resources Information Center

    Education Development Center, Inc., Newton, MA.

    Intended for use with the photographic materials available from the Workshop for Learning Things, Inc., this "camera cookbook" describes procedures that have been tried in classrooms and workshops and proven to be the most functional and inexpensive. Explicit starting off instructions--directions for exploring and loading the camera and for taking…

  18. Constrained space camera assembly

    DOEpatents

    Heckendorn, F.M.; Anderson, E.K.; Robinson, C.W.; Haynes, H.B.

    1999-05-11

    A constrained space camera assembly which is intended to be lowered through a hole into a tank, a borehole or another cavity is disclosed. The assembly includes a generally cylindrical chamber comprising a head and a body and a wiring-carrying conduit extending from the chamber. Means are included in the chamber for rotating the body about the head without breaking an airtight seal formed therebetween. The assembly may be pressurized and accompanied with a pressure sensing means for sensing if a breach has occurred in the assembly. In one embodiment, two cameras, separated from their respective lenses, are installed on a mounting apparatus disposed in the chamber. The mounting apparatus includes means allowing both longitudinal and lateral movement of the cameras. Moving the cameras longitudinally focuses the cameras, and moving the cameras laterally away from one another effectively converges the cameras so that close objects can be viewed. The assembly further includes means for moving lenses of different magnification forward of the cameras. 17 figs.

  19. CCD Luminescence Camera

    NASA Technical Reports Server (NTRS)

    Janesick, James R.; Elliott, Tom

    1987-01-01

    New diagnostic tool used to understand performance and failures of microelectronic devices. Microscope integrated with low-noise charge-coupled-device (CCD) camera to produce new instrument for analyzing performance and failures of microelectronic devices that emit infrared light during operation. CCD camera also used to identify very clearly failed parts where luminescence is typically found.

  20. Camera Operator and Videographer

    ERIC Educational Resources Information Center

    Moore, Pam

    2007-01-01

    Television, video, and motion picture camera operators produce images that tell a story, inform or entertain an audience, or record an event. They use various cameras to shoot a wide range of material, including television series, news and sporting events, music videos, motion pictures, documentaries, and training sessions. Those who film or…

  1. Dry imaging cameras

    PubMed Central

    Indrajit, IK; Alam, Aftab; Sahni, Hirdesh; Bhatia, Mukul; Sahu, Samaresh

    2011-01-01

    Dry imaging cameras are important hard-copy devices in radiology. Using a dry imaging camera, multiformat images from digital modalities in radiology are created from a sealed unit of unexposed films. The functioning of a modern dry camera involves a blend of concurrent processes drawn from diverse fields: computing, mechanics, thermal engineering, optics, electricity, and radiography. Broadly, hard-copy devices are classified as laser-based or non-laser-based technology. Compared with the working knowledge and technical awareness of other modalities in radiology, the understanding of a dry imaging camera is often superficial and neglected. To fill this void, this article outlines the key features of a modern dry camera and the important issues that impact radiology workflow. PMID:21799589

  2. Detection of pointing errors with CMOS-based camera in intersatellite optical communications

    NASA Astrophysics Data System (ADS)

    Yu, Si-yuan; Ma, Jing; Tan, Li-ying

    2005-01-01

    For very high data rates, intersatellite optical communications hold a potential performance edge over microwave communications. The acquisition and tracking problem is critical because of the narrow transmit beam. In some systems a single array detector performs both spatial acquisition and tracking to detect pointing errors, so both a wide field of view and a high update rate are required. Past systems tended to employ CCD-based cameras with complex readout arrangements, but the additional complexity reduces the applicability of the array-based tracking concept. With the development of CMOS arrays, CMOS-based cameras can employ the single-array-detector concept. The area-of-interest feature of a CMOS-based camera allows a PAT system to read out only a portion of the array; the maximum allowed frame rate increases as the size of the area of interest decreases, under certain conditions. A commercially available CMOS camera with 105 fps @ 640×480 is employed in our PAT simulation system, in which only part of the pixels are in fact used. Beam angles varying within the field of view can be detected after passing through a Cassegrain telescope and an optical focus system. Spot pixel values (8 bits per pixel) read out from the CMOS sensor are transmitted to a DSP subsystem via the IEEE 1394 bus, and pointing errors are computed with the centroid equation. Tests showed that: (1) 500 fps @ 100×100 is available in acquisition when the field of view is 1 mrad; (2) 3k fps @ 10×10 is available in tracking when the field of view is 0.1 mrad.
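
    The pointing error is obtained from the intensity-weighted centroid of the spot over the area of interest. A minimal sketch of the centroid equation (the boresight pixel and ROI values are hypothetical):

```python
def spot_centroid(roi):
    """Intensity-weighted centroid of a spot image given as a 2D list of
    pixel values; returns (row, col) in pixel coordinates within the ROI."""
    total = sum(sum(row) for row in roi)
    r = sum(i * sum(row) for i, row in enumerate(roi)) / total
    c = sum(j * v for row in roi for j, v in enumerate(row)) / total
    return r, c

def pointing_error(roi, boresight):
    """Offset of the spot centroid from the nominal boresight pixel."""
    r, c = spot_centroid(roi)
    return r - boresight[0], c - boresight[1]
```

    Because only the pixels in the small area of interest are summed, shrinking the ROI both speeds readout and reduces the centroid computation.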

  3. What are the benefits of having multiple camera angles?

    Atmospheric Science Data Center

    2014-12-08

    ... can be interpreted (with appropriate models) to document the properties of the target, just as the more familiar spectral differences are exploited to document its chemical composition. Most imaging space-borne instruments acquire ...

  4. Replacing 16 mm film cameras with high definition digital cameras

    SciTech Connect

    Balch, K.S.

    1995-12-31

    For many years 16 mm film cameras have been used in severe environments. These film cameras are used on Hy-G automotive sleds, airborne gun cameras, range tracking and other hazardous environments. The companies and government agencies using these cameras are in need of replacing them with a more cost effective solution. Film-based cameras still produce the best resolving capability, however, film development time, chemical disposal, recurring media cost, and faster digital analysis are factors influencing the desire for a 16 mm film camera replacement. This paper will describe a new camera from Kodak that has been designed to replace 16 mm high speed film cameras.

  5. Night Vision Camera

    NASA Technical Reports Server (NTRS)

    1996-01-01

    PixelVision, Inc. developed the Night Video NV652 Back-illuminated CCD Camera, based on the expertise of a former Jet Propulsion Laboratory employee and a former employee of Scientific Imaging Technologies, Inc. The camera operates without an image intensifier, using back-illuminated and thinned CCD technology to achieve extremely low light level imaging performance. The advantages of PixelVision's system over conventional cameras include greater resolution and better target identification under low light conditions, lower cost and a longer lifetime. It is used commercially for research and aviation.

  6. Kitt Peak speckle camera

    NASA Technical Reports Server (NTRS)

    Breckinridge, J. B.; Mcalister, H. A.; Robinson, W. G.

    1979-01-01

    The speckle camera in regular use at Kitt Peak National Observatory since 1974 is described in detail. The design of the atmospheric dispersion compensation prisms, the use of film as a recording medium, the accuracy of double star measurements, and the next generation speckle camera are discussed. Photographs of double star speckle patterns with separations from 1.4 sec of arc to 4.7 sec of arc are shown to illustrate the quality of image formation with this camera, the effects of seeing on the patterns, and to illustrate the isoplanatic patch of the atmosphere.

  7. Structured light camera calibration

    NASA Astrophysics Data System (ADS)

    Garbat, P.; Skarbek, W.; Tomaszewski, M.

    2013-03-01

    The structured light camera, which is being designed with the joint effort of the Institute of Radioelectronics and the Institute of Optoelectronics (both large units of the Warsaw University of Technology within the Faculty of Electronics and Information Technology), combines various contemporary hardware and software technologies. In hardware, it integrates a high-speed stripe projector and a stripe camera together with a standard high-definition video camera. In software, it is supported by sophisticated calibration techniques which enable the development of advanced applications such as a real-time 3D viewer of moving objects with a free viewpoint, or a 3D modeller for still objects.

  8. Ringfield lithographic camera

    DOEpatents

    Sweatt, W.C.

    1998-09-08

    A projection lithography camera is presented with a wide ringfield optimized so as to make efficient use of extreme ultraviolet radiation from a large area radiation source (e.g., D_source ≈ 0.5 mm). The camera comprises four aspheric mirrors optically arranged on a common axis of symmetry. The camera includes an aperture stop that is accessible through a plurality of partial aperture stops to synthesize the theoretical aperture stop. Radiation from a mask is focused to form a reduced image on a wafer, relative to the mask, by reflection from the four aspheric mirrors. 11 figs.

  9. Status of the Los Alamos Anger camera

    SciTech Connect

    Seeger, P.A.; Nutter, M.J.

    1985-01-01

    Results of preliminary tests of the neutron Anger camera being developed at Los Alamos are presented. This detector uses a unique encoding scheme involving parallel processing of multiple receptive fields. Design goals have not yet been met, but the results are very encouraging and improvements in the test procedures are expected to show that the detector will be ready for use on a small-angle scattering instrument next year. 3 refs., 4 figs.

  10. Early Experience & Multisensory Perceptual Narrowing

    PubMed Central

    Lewkowicz, David J.

    2014-01-01

    Perceptual narrowing is a reflection of early experience and contributes in key ways to perceptual and cognitive development. In general, findings have shown that unisensory perceptual sensitivity in early infancy is broadly tuned such that young infants respond to, and discriminate, native as well as non-native sensory inputs, whereas older infants only respond to native inputs. Recently, my colleagues and I discovered that perceptual narrowing occurs at the multisensory processing level as well. The present article reviews this new evidence and puts it in the larger context of multisensory perceptual development and the role that perceptual experience plays in it. Together, the evidence on unisensory and multisensory narrowing shows that early experience shapes the emergence of perceptual specialization and expertise. PMID:24435505

  11. Targetless Camera Calibration

    NASA Astrophysics Data System (ADS)

    Barazzetti, L.; Mussio, L.; Remondino, F.; Scaioni, M.

    2011-09-01

    In photogrammetry a camera is considered calibrated if its interior orientation parameters are known. These encompass the principal distance, the principal point position, and some Additional Parameters used to model possible systematic errors. The current state of the art for automated camera calibration relies on the use of coded targets to accurately determine the image correspondences. This paper presents a new methodology for the efficient and rigorous photogrammetric calibration of digital cameras which no longer requires the use of targets. A set of images depicting a scene with good texture is sufficient for the extraction of natural corresponding image points. These are automatically matched with feature-based approaches and robust estimation techniques. The subsequent photogrammetric bundle adjustment retrieves the unknown camera parameters and their theoretical accuracies. Examples, considerations and comparisons with real data and different case studies are illustrated to show the potential of the proposed methodology.
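As a concrete illustration of the interior orientation parameters listed above, the sketch below turns a measured image point into a camera-frame ray using the principal distance, the principal point, and a two-coefficient Brown-style radial model standing in for the "Additional Parameters". This is a minimal sketch under those assumptions, not the paper's implementation; the function name and parameter names are illustrative.

```python
import numpy as np

def pixel_to_ray(xy, c, xp, yp, k1, k2):
    """Convert image measurements to unit rays in the camera frame.

    c      -- principal distance
    xp, yp -- principal point position
    k1, k2 -- radial distortion coefficients (one common choice of
              'Additional Parameters'; a Brown-style model is assumed)
    """
    x = xy[:, 0] - xp
    y = xy[:, 1] - yp
    r2 = x * x + y * y
    corr = 1.0 + k1 * r2 + k2 * r2 ** 2            # radial correction factor
    rays = np.column_stack([x * corr, y * corr, -c * np.ones(len(xy))])
    return rays / np.linalg.norm(rays, axis=1, keepdims=True)
```

A bundle adjustment of the kind described above would estimate c, xp, yp, k1 and k2 jointly with the exterior orientations by minimizing reprojection error over the matched natural image points.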

  12. Miniature TV Camera

    NASA Technical Reports Server (NTRS)

    2004-01-01

    Originally devised to observe Saturn stage separation during Apollo flights, Marshall Space Flight Center's Miniature Television Camera, measuring only 4 x 3 x 1 1/2 inches, quickly made its way to the commercial telecommunications market.

  13. The MKID Camera

    NASA Astrophysics Data System (ADS)

    Maloney, P. R.; Czakon, N. G.; Day, P. K.; Duan, R.; Gao, J.; Glenn, J.; Golwala, S.; Hollister, M.; LeDuc, H. G.; Mazin, B.; Noroozian, O.; Nguyen, H. T.; Sayers, J.; Schlaerth, J.; Vaillancourt, J. E.; Vayonakis, A.; Wilson, P.; Zmuidzinas, J.

    2009-12-01

    The MKID Camera project is a collaborative effort of Caltech, JPL, the University of Colorado, and UC Santa Barbara to develop a large-format, multi-color millimeter and submillimeter-wavelength camera for astronomy using microwave kinetic inductance detectors (MKIDs). These are superconducting micro-resonators fabricated from thin aluminum and niobium films. We couple the MKIDs to multi-slot antennas and measure the change in surface impedance produced by photon-induced breaking of Cooper pairs. The readout is almost entirely at room temperature and can be highly multiplexed; in principle hundreds or even thousands of resonators could be read out on a single feedline. The camera will have 576 spatial pixels that image simultaneously in four bands at 750, 850, 1100 and 1300 microns. It is scheduled for deployment at the Caltech Submillimeter Observatory in the summer of 2010. We present an overview of the camera design and readout and describe the current status of testing and fabrication.

  14. Advanced CCD camera developments

    SciTech Connect

    Condor, A.

    1994-11-15

    Two charge-coupled device (CCD) camera systems are introduced and discussed, briefly describing the hardware involved and the data obtained in their various applications. The Advanced Development Group of the Defense Sciences Engineering Division has been actively designing, manufacturing, and fielding state-of-the-art CCD camera systems for over a decade. These systems were originally developed for the nuclear test program to record data from underground nuclear tests. Today, new and interesting applications for these systems have surfaced, and development is continuing in the area of advanced CCD camera systems, with a new CCD camera that will allow experimenters to replace film for x-ray imaging at the JANUS, USP, and NOVA laser facilities.

  15. Traffic Sign Recognition with Invariance to Lighting in Dual-Focal Active Camera System

    NASA Astrophysics Data System (ADS)

    Gu, Yanlei; Panahpour Tehrani, Mehrdad; Yendo, Tomohiro; Fujii, Toshiaki; Tanimoto, Masayuki

    In this paper, we present an automatic vision-based traffic sign recognition system that can detect and classify traffic signs at long distance under different lighting conditions. To realize this purpose, the traffic sign recognition is developed in an originally proposed dual-focal active camera system, in which a telephoto camera is equipped as an assistant of a wide angle camera. The telephoto camera can capture a high-accuracy image of an object of interest in the view field of the wide angle camera, providing enough information for recognition when the resolution of the traffic sign in the wide angle image is too low. In the proposed system, traffic sign detection and classification are processed separately on the different images from the wide angle and telephoto cameras. In addition, in order to detect traffic signs against complex backgrounds under different lighting conditions, we propose a type of color transformation that is invariant to lighting changes. This transformation highlights the pattern of traffic signs by reducing the complexity of the background. Based on the color transformation, a multi-resolution detector with cascade mode is trained and used to locate traffic signs at low resolution in the image from the wide angle camera. After detection, the system actively captures a high-accuracy image of each detected traffic sign by controlling the direction and exposure time of the telephoto camera based on the information from the wide angle camera. In classification, a hierarchical classifier is constructed and used to recognize the detected traffic signs in the high-accuracy image from the telephoto camera. Finally, a set of experiments in the domain of traffic sign recognition is presented. The experimental results demonstrate that the proposed system can effectively recognize traffic signs at low resolution under different lighting conditions.
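The abstract does not specify the exact color transformation used. One common lighting-invariant choice, shown here purely as an illustrative stand-in for whatever the authors used, is normalized-rgb chromaticity, which cancels a uniform scaling of scene illumination:

```python
import numpy as np

def rgb_chromaticity(img):
    """Normalized-rgb transform: each pixel (R, G, B) maps to (r, g) with
    r = R/(R+G+B) and g = G/(R+G+B). Scaling the illumination by a
    constant factor leaves (r, g) unchanged, suppressing brightness
    variation while keeping color structure for sign detection."""
    img = img.astype(np.float64)
    s = img.sum(axis=-1, keepdims=True)
    s[s == 0] = 1.0  # avoid division by zero on black pixels
    return img[..., :2] / s
```

Because brightness cancels out, a detector trained on such chromaticity images sees a more stable sign pattern across lighting conditions, which is the effect the paragraph above describes.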

  16. Gamma ray camera

    SciTech Connect

    Robbins, C.D.; Wang, S.

    1980-09-09

    An Anger gamma ray camera is improved by the substitution of a gamma-ray-sensitive, proximity-type image intensifier tube for the scintillator screen in the Anger camera, the image intensifier tube having a negatively charged flat scintillator screen, a flat photocathode layer, and a grounded flat output phosphor display screen, all of the same dimension (unity image magnification), all within a grounded metallic tube envelope, and having a metallic, inwardly concaved input window between the scintillator screen and the collimator.

  17. Calibration of Action Cameras for Photogrammetric Purposes

    PubMed Central

    Balletti, Caterina; Guerra, Francesco; Tsioukas, Vassilios; Vernier, Paolo

    2014-01-01

    The use of action cameras for photogrammetry purposes is not widespread, due to the fact that until recently the images provided by the sensors, using either still or video capture mode, were not big enough to perform and provide the appropriate analysis with the necessary photogrammetric accuracy. However, several manufacturers have recently produced and released new lightweight devices which are: (a) easy to handle, (b) capable of performing under extreme conditions and, more importantly, (c) able to provide both still images and video sequences of high resolution. In order to be able to use the sensor of an action camera, a careful and reliable self-calibration must be applied prior to any photogrammetric procedure, a relatively difficult scenario because of the camera's short focal length and the wide angle lens used to obtain the maximum possible resolution of images. Special software, using functions of the OpenCV library, has been created to perform both the calibration and the production of undistorted scenes for each of the still and video image capturing modes of a novel action camera, the GoPro Hero 3, which can provide still images up to 12 Mp and video up to 8 Mp resolution. PMID:25237898

  19. Camera Edge Response

    NASA Astrophysics Data System (ADS)

    Zisk, Stanley H.; Wittels, Norman

    1988-02-01

    Edge location is an important machine vision task. Machine vision systems perform mathematical operations on rectangular arrays of numbers that are intended to faithfully represent the spatial distribution of scene luminance. The numbers are produced by periodic sampling and quantization of the camera's video output. This sequence can cause artifacts to appear in the data with a noise spectrum that is high in power at high spatial frequencies. This is a problem because most edge detection algorithms are preferentially sensitive to the high-frequency content in an image. Solid state cameras can introduce errors because of the spatial periodicity of their sensor elements. This can result in problems when image edges are aligned with camera pixel boundaries: (a) some cameras introduce transients into the video signal while switching between sensor elements; (b) most cameras use analog low-pass filters to minimize sampling artifacts and these introduce video phase delays that shift the locations of edges. The problems compound when the vision system samples asynchronously with the camera's pixel rate. Moiré patterns (analogous to beat frequencies) can result. In this paper, we examine and model quantization effects in a machine vision system with particular emphasis on edge detection performance. We also compare our models with experimental measurements.
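The phase-delay effect described above can be reproduced numerically. In the sketch below, a causal first-order low-pass filter (a crude stand-in for the camera's analog anti-aliasing filter; the coefficient and signal lengths are illustrative, not from the paper) visibly shifts the measured location of a step edge:

```python
import numpy as np

def half_crossing(signal):
    """Sub-sample edge location: first crossing of the 50% level,
    refined by linear interpolation between neighboring samples."""
    lvl = 0.5 * (signal.min() + signal.max())
    i = int(np.argmax(signal >= lvl))
    return (i - 1) + (lvl - signal[i - 1]) / (signal[i] - signal[i - 1])

x = np.zeros(64)
x[32:] = 1.0            # ideal sampled step edge
a = 0.5                 # illustrative causal first-order low-pass coefficient
y = np.zeros_like(x)
for n in range(1, len(x)):
    y[n] = a * y[n - 1] + (1 - a) * x[n]
# the filtered edge crosses its 50% level later than the ideal edge:
# the filter's phase delay has shifted the detected edge location
```

The ideal edge is located at sample 31.5; after filtering, the half-level crossing moves later, which is exactly the edge-location shift the abstract attributes to analog low-pass filtering.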

  20. Spacecraft camera image registration

    NASA Technical Reports Server (NTRS)

    Kamel, Ahmed A. (Inventor); Graul, Donald W. (Inventor); Chan, Fred N. T. (Inventor); Gamble, Donald W. (Inventor)

    1987-01-01

    A system for achieving spacecraft camera (1, 2) image registration comprises a portion external to the spacecraft and an image motion compensation system (IMCS) portion onboard the spacecraft. Within the IMCS, a computer (38) calculates an image registration compensation signal (60) which is sent to the scan control loops (84, 88, 94, 98) of the onboard cameras (1, 2). At the location external to the spacecraft, the long-term orbital and attitude perturbations on the spacecraft are modeled. Coefficients (K, A) from this model are periodically sent to the onboard computer (38) by means of a command unit (39). The coefficients (K, A) take into account observations of stars and landmarks made by the spacecraft cameras (1, 2) themselves. The computer (38) takes as inputs the updated coefficients (K, A) plus synchronization information indicating the mirror position (AZ, EL) of each of the spacecraft cameras (1, 2), operating mode, and starting and stopping status of the scan lines generated by these cameras (1, 2), and generates in response thereto the image registration compensation signal (60). The sources of periodic thermal errors on the spacecraft are discussed. The system is checked by calculating measurement residuals, the difference between the landmark and star locations predicted at the external location and the landmark and star locations as measured by the spacecraft cameras (1, 2).

  1. Copernican craters: Early results from the Lunar Reconnaissance Orbiter Camera

    NASA Astrophysics Data System (ADS)

    McEwen, A. S.; Hiesinger, H.; Thomas, P. C.; Robinson, M. S.; van der Bogert, C.; Ostrach, L.; Plescia, J. B.; Bray, V. J.; Tornabene, L. L.

    2009-12-01

    The youngest (Copernican) craters on the Moon provide the best examples of original crater morphology and a record of the impact flux over the last ~1 Ga in the Earth-Moon system. The LRO Narrow Angle Cameras (NAC) provide 50 cm pixels from an altitude of 50 km. With changing incidence angle, global access, and very high data rates, these cameras provide unprecedented data on lunar craters. Stereo image pairs are being acquired for detailed topographic mapping. These data allow comparisons of relative ages of the larger young craters, some of which are tied to absolute radiometric ages from Apollo-returned samples. These relative ages, the crater populations at small diameters, and details of crater morphology including ejecta and melt morphologies, allow better delineation of recent lunar history and the formation and modification of impact craters. Crater counts may also reveal differences in the formation and preservation of small diameter craters as a function of target material (e.g., unconsolidated regolith versus solid impact melt). One key question: Is the current cratering rate constant or does it fluctuate? We will constrain the very recent cratering rate (at 10-100 m diameter) by comparing LROC images with those taken by Apollo nearly 40 years ago to determine the number of new impact craters. The current cratering rate and an assumption of constant cratering rate over time may or may not correctly predict the number of craters superimposed over radiometrically-dated surfaces such as South Ray, Cone, and North Ray craters, which range from 2-50 Ma and are not saturated by 10-100 m craters. If the prediction fails with realistic consideration of errors, then the present-day cratering rate must be atypical. Secondary craters complicate this analysis, but the resolution and coverage of LROC enables improved recognition of secondary craters. Of particular interest for the youngest Copernican craters is the possibility of self-cratering.

  2. Calibration Procedures on Oblique Camera Setups

    NASA Astrophysics Data System (ADS)

    Kemper, G.; Melykuti, B.; Yu, C.

    2016-06-01

    Using the nadir camera and the GPS/IMU data, an initial orientation correction and radial correction were calculated. With this approach, the whole project was calculated and calibrated in one step. During the iteration process the radial and tangential parameters were switched on individually for the camera heads, and after that the camera constants and principal point positions were checked and finally calibrated. Besides that, the boresight calibration can be performed either on the basis of the nadir camera and its offsets, or independently for each camera without correlation to the others. This must in any case be performed on a complete mission to obtain stability between the single camera heads. Determining the lever arms from the nodal points to the IMU centre needs more caution than for a single camera, especially due to the strong tilt angle. With all these preparatory steps completed, you get a highly accurate sensor that enables fully automated data extraction with a rapid update of your existing data. Frequent monitoring of urban dynamics is then possible in a fully 3D environment.

  3. Seasonal and vertical changes in leaf angle distribution for selected deciduous broadleaf tree species common to Europe

    NASA Astrophysics Data System (ADS)

    Raabe, Kairi; Pisek, Jan; Sonnentag, Oliver; Annuk, Kalju

    2014-05-01

    Leaf inclination angle distribution is a key parameter in determining the transmission and reflection of radiation by vegetation canopies. It has been previously observed that leaf inclination angle might change gradually from more vertical in the upper canopy and in high-light habitats to more horizontal in the lower canopy and in low-light habitats [1]. Despite its importance, relatively few measurements of actual leaf angle distributions have been reported for different tree species. An even smaller number of studies has dealt with possible seasonal changes in leaf angle distribution [2]. In this study the variation of leaf inclination angle distributions was examined both temporally throughout the growing season and vertically at different heights of trees. We report leaf inclination angle distributions for five deciduous broadleaf species found commonly in several parts of Europe: grey alder (Alnus incana), silver birch (Betula pendula Roth), chestnut (Castanea), Norway maple (Acer platanoides), and aspen (Populus tremula). The angles were measured using the leveled camera method [3], with the data collected at several separate heights and four times during the period of May-September 2013. The results generally indicate the greatest change in leaf inclination angles in spring, with the changes usually most pronounced at the top of the canopy. It should also be noted, however, that whereas the temporal variation proved rather consistent across species, the vertical variation differed more between species. The leveled camera method was additionally tested in terms of sensitivity to different users. Ten people were asked to measure the leaf angles for four different species. The results indicate the method is quite robust, providing coinciding distributions irrespective of the user and the level of previous experience with the method. However, certain caution must be exercised when measuring long narrow leaves.

  4. Deployable Wireless Camera Penetrators

    NASA Technical Reports Server (NTRS)

    Badescu, Mircea; Jones, Jack; Sherrit, Stewart; Wu, Jiunn Jeng

    2008-01-01

    A lightweight, low-power camera dart has been designed and tested for context imaging of sampling sites and ground surveys from an aerobot or an orbiting spacecraft in a microgravity environment. The camera penetrators also can be used to image any line-of-sight surface, such as cliff walls, that is difficult to access. Tethered cameras to inspect the surfaces of planetary bodies use both power and signal transmission lines to operate. A tether adds the possibility of inadvertently anchoring the aerobot, and requires some form of station-keeping capability of the aerobot if extended examination time is required. The new camera penetrators are deployed without a tether, weigh less than 30 grams, and are disposable. They are designed to drop from any altitude, with the boost in transmitting power currently demonstrated at approximately 100-m line-of-sight. The penetrators also can be deployed to monitor lander or rover operations from a distance, and can be used for surface surveys or for context information gathering from a touch-and-go sampling site. Thanks to wireless operation, the complexity of the sampling or survey mechanisms may be reduced. The penetrators may be battery powered for short-duration missions, or have solar panels for longer or intermittent duration missions. The imaging device is embedded in the penetrator, which is dropped or projected at the surface of a study site at 90° to the surface. Mirrors can be used in the design to image the ground or the horizon. Some of the camera features were tested using commercial "nanny" or "spy" camera components with the charge-coupled device (CCD) looking in a direction parallel to the ground. Figure 1 shows components of one camera that weighs less than 8 g and occupies a volume of 11 cm³. This camera could transmit a standard television signal, including sound, up to 100 m. Figure 2 shows the CAD models of a version of the penetrator.

  5. Calibration Procedures in Mid Format Camera Setups

    NASA Astrophysics Data System (ADS)

    Pivnicka, F.; Kemper, G.; Geissler, S.

    2012-07-01

    A growing number of mid-format cameras are used for aerial surveying projects. To achieve a reliable and geometrically precise result in the photogrammetric workflow, awareness of the sensitive parts is important. The use of direct referencing systems (GPS/IMU), the mounting on a stabilizing camera platform, and the specific values of the mid-format camera make a professional setup with various calibration and misalignment operations necessary. An important part is a proper camera calibration. Using aerial images over a well-designed test field with 3D structures and/or different flight altitudes enables the determination of calibration values in the Bingo software; it will be demonstrated how such a calibration can be performed. The direct referencing device must be mounted to the camera in a solid and reliable way. Beside the mechanical work, especially in mounting the camera next to the IMU, two lever arms have to be measured to mm accuracy: the lever arm from the GPS antenna to the IMU's calibrated centre, and the lever arm from the IMU centre to the camera projection centre. In fact, the measurement with a total station is not a difficult task, but the definition of the right centres and the need for rotation matrices can cause serious accuracy problems. The benefit of small- and medium-format cameras is that smaller aircraft can also be used; for these, a gyro-stabilized platform is recommended. This means the IMU must be mounted beside the camera on the stabilizer. The advantage is that the IMU can be used to control the platform; the problematic aspect is that the IMU-to-GPS-antenna lever arm is floating. In fact, an additional data stream must be handled: the movement values of the stabilizer, used to correct the floating lever arm distances. If the post-processing of the GPS/IMU data, taking the floating levers into account, delivers the expected result, the lever arms between the IMU and the camera can be applied.
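The role of the rotation matrices mentioned above can be sketched as follows: a lever arm measured in the camera/IMU body frame must be rotated by the current platform attitude before being added to the IMU position. The sketch uses yaw only for brevity (a full implementation chains roll, pitch and yaw); function names and values are illustrative, not from the paper.

```python
import numpy as np

def rot_z(yaw):
    """Rotation about the vertical axis (yaw-only, for brevity)."""
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

def antenna_position(imu_pos, yaw, lever_arm_body):
    """GPS antenna position = IMU position + R(attitude) @ lever arm,
    where lever_arm_body is the mm-accurate offset measured in the
    body frame (e.g. with a total station)."""
    return imu_pos + rot_z(yaw) @ lever_arm_body
```

On a stabilized platform the attitude between IMU and antenna changes continuously, which is why the "floating" lever arm needs the stabilizer's movement data as an extra input to this correction.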

  6. Image dissector camera system study

    NASA Technical Reports Server (NTRS)

    Howell, L.

    1984-01-01

    Various aspects of a rendezvous and docking system using an image dissector detector as compared to a GaAs detector were discussed. Investigation into a gimbaled scanning system is also covered and the measured video response curves from the image dissector camera are presented. Rendezvous will occur at ranges greater than 100 meters. The maximum range considered was 1000 meters. During docking, the range, range-rate, angle, and angle-rate to each reflector on the satellite must be measured. Docking range will be from 3 to 100 meters. The system consists of a CW laser diode transmitter and an image dissector receiver. The transmitter beam is amplitude modulated with three sine wave tones for ranging. The beam is coaxially combined with the receiver beam. Mechanical deflection of the transmitter beam, + or - 10 degrees in both X and Y, can be accomplished before or after it is combined with the receiver beam. The receiver will have a field-of-view (FOV) of 20 degrees and an instantaneous field-of-view (IFOV) of two milliradians (mrad) and will be electronically scanned in the image dissector. The increase in performance obtained from the GaAs photocathode is not needed to meet the present performance requirements.
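Tone ranging of the kind described above works by measuring the round-trip phase of each modulation tone: a single tone gives range only modulo its ambiguity interval c/(2f), and the coarser tones of a multi-tone system select the correct interval. A hedged single-tone sketch (the actual tone frequencies are not given in the abstract; 1.5 MHz is chosen here only so that one ambiguity interval covers the stated 3 to 100 m docking range):

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def tone_phase(range_m, f_hz):
    """Round-trip phase (radians, wrapped to [0, 2*pi)) accumulated by
    an amplitude-modulation tone over the two-way path."""
    return (4.0 * np.pi * f_hz * range_m / C) % (2.0 * np.pi)

def range_from_phase(phase, f_hz):
    """Range within one ambiguity interval c/(2f); in a multi-tone
    system the coarser tones resolve which interval applies."""
    return C * phase / (4.0 * np.pi * f_hz)
```

With f = 1.5 MHz the unambiguous interval c/(2f) is about 100 m, so the whole docking range maps to a unique phase.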

  7. Uncooled radiometric camera performance

    NASA Astrophysics Data System (ADS)

    Meyer, Bill; Hoelter, T.

    1998-07-01

    Thermal imaging equipment utilizing microbolometer detectors operating at room temperature has found widespread acceptance in both military and commercial applications. Uncooled camera products are becoming effective solutions to applications currently using traditional, photonic infrared sensors. The reduced power consumption and decreased mechanical complexity offered by uncooled cameras have enabled highly reliable, low-cost, hand-held instruments. Initially these instruments displayed only relative temperature differences, which limited their usefulness in applications such as thermography. Radiometrically calibrated microbolometer instruments are now available. The ExplorIR Thermography camera leverages the technology developed for Raytheon Systems Company's first production microbolometer imaging camera, the Sentinel. The ExplorIR camera has a demonstrated temperature measurement accuracy of 4 degrees Celsius or 4% of the measured value (whichever is greater) over scene temperature ranges of minus 20 degrees Celsius to 300 degrees Celsius (minus 20 degrees Celsius to 900 degrees Celsius for extended range models) and camera environmental temperatures of minus 10 degrees Celsius to 40 degrees Celsius. Direct temperature measurement with high resolution video imaging creates some unique challenges when using uncooled detectors. A temperature controlled, field-of-view limiting aperture (cold shield) is not typically included in the small volume dewars used for uncooled detector packages. The lack of a field-of-view shield allows a significant amount of extraneous radiation from the dewar walls and lens body to affect the sensor operation. In addition, the transmission of the Germanium lens elements is a function of ambient temperature. The ExplorIR camera design compensates for these environmental effects while maintaining the accuracy and dynamic range required by today's predictive maintenance and condition monitoring markets.
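The stated accuracy specification, 4 degrees Celsius or 4% of the measured value, whichever is greater, can be expressed directly (the function name is illustrative):

```python
def explorir_tolerance(measured_c):
    """Measurement tolerance in degrees Celsius per the specification
    quoted above: the larger of 4 degC and 4% of the measured value."""
    return max(4.0, 0.04 * abs(measured_c))
```

So a 300 degC scene reading carries a 12 degC tolerance, while any reading below 100 degC falls back to the 4 degC floor.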

  8. Divergent-ray projection method for measuring the flapping angle, lag angle, and torsional angle of a bumblebee wing

    NASA Astrophysics Data System (ADS)

    Zeng, Lijiang; Matsumoto, Hirokazu; Kawachi, Keiji

    1996-11-01

    A divergent-ray projection (DRP) method was developed for measuring the flapping angle, lag angle, and torsional angle of a bumblebee wing during beating motion. This new method can measure the spatial coordinates of an insect wing by digitizing the images that are projected by two divergent laser rays from different directions. The advantage of the DRP method is its ability to measure those three angles simultaneously using only one high-speed camera. The resolution of the DRP method can be changed easily by adjusting system parameters to meet the needs of different types of objects. The measurement results for these angles of a bumblebee wing prove the effectiveness of the DRP method in studying the flight performance of insects.

  9. Sensing driver awareness by combining fisheye camera and Kinect

    NASA Astrophysics Data System (ADS)

    Wuhe, Z.; Lei, Z.; Ning, D.

    2014-11-01

    In this paper, we propose a Driver's Awareness Catching System to sense the driver's awareness. The system consists of a fisheye camera and a Kinect. The Kinect, mounted inside the vehicle, is used to recognize and locate the 3D face of the driver. The fisheye camera, mounted outside the vehicle, is used to monitor the road. The relative pose between the two cameras is calibrated via a state-of-the-art method for calibrating cameras with non-overlapping fields of view. The camera system works as follows: First, the Kinect SDK released by Microsoft is used to track the driver's face and capture the eyes' location together with the sight direction. Second, the eyes' location and the sight direction are transformed to the coordinate system of the fisheye camera. Third, the corresponding view field is extracted from the fisheye image. As there is a small displacement between the driver's eyes and the optical center of the fisheye camera, this leads to a view-angle deviation. Finally, we performed a systematic analysis of the error distribution by numerical simulation and proved the feasibility of our camera system. We also built the camera system and achieved the desired effect in a real-world experiment.
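The view-angle deviation mentioned above is a parallax effect: the eye and the fisheye optical center look at the same target along slightly different rays. A minimal geometric sketch (a simple lateral-offset model; the numbers are illustrative, not from the paper):

```python
import math

def view_angle_deviation_deg(offset_m, target_dist_m):
    """Angular deviation between the driver's line of sight and the ray
    from the fisheye optical center to the same target point, for a
    lateral eye-to-camera offset and a target straight ahead."""
    return math.degrees(math.atan2(offset_m, target_dist_m))
```

For example, a 0.3 m eye-to-camera offset gives roughly half a degree of deviation for a target 30 m ahead, and the deviation shrinks as the target distance grows, which is why the error matters mainly for nearby objects.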

  10. The CAMCAO infrared camera

    NASA Astrophysics Data System (ADS)

    Amorim, Antonio; Melo, Antonio; Alves, Joao; Rebordao, Jose; Pinhao, Jose; Bonfait, Gregoire; Lima, Jorge; Barros, Rui; Fernandes, Rui; Catarino, Isabel; Carvalho, Marta; Marques, Rui; Poncet, Jean-Marc; Duarte Santos, Filipe; Finger, Gert; Hubin, Norbert; Huster, Gotthard; Koch, Franz; Lizon, Jean-Louis; Marchetti, Enrico

    2004-09-01

    The CAMCAO instrument is a high-resolution near-infrared (NIR) camera conceived to operate together with the new ESO Multi-conjugate Adaptive optics Demonstrator (MAD), with the goal of evaluating the feasibility of Multi-Conjugate Adaptive Optics (MCAO) techniques on the sky. It is a high-resolution, wide field of view (FoV) camera optimized to use the extended correction of the atmospheric turbulence provided by MCAO. While the first purpose of this camera is sky observation in the MAD setup, to validate the MCAO technology, in a second phase the CAMCAO camera is planned to attach directly to the VLT for scientific astrophysical studies. The camera is based on the 2kx2k HAWAII2 infrared detector controlled by an ESO external IRACE system, and includes standard IR band filters mounted on a positional filter wheel. The CAMCAO design requires that the optical components and the IR detector be kept at low temperatures in order to avoid emitting radiation and to lower detector noise in the analysis region. The cryogenic system includes an LN2 tank and a specially developed pulse tube cryocooler. Field and pupil cold stops are implemented to reduce the infrared background and the stray light. The CAMCAO optics provide diffraction-limited performance down to the J band, but the detector sampling fulfills the Nyquist criterion for the K band (2.2 μm).
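The K-band Nyquist claim can be checked with a short calculation: Nyquist sampling puts two pixels across one λ/D diffraction element. The 8.2 m VLT unit-telescope aperture assumed below is not stated in the abstract.

```python
import math

ARCSEC_PER_RAD = 180.0 / math.pi * 3600.0

def nyquist_pixel_scale_arcsec(wavelength_m, aperture_m):
    """Pixel scale giving two pixels per lambda/D resolution element."""
    return 0.5 * (wavelength_m / aperture_m) * ARCSEC_PER_RAD

# K band (2.2 um) on an assumed 8.2 m VLT aperture:
k_band_scale = nyquist_pixel_scale_arcsec(2.2e-6, 8.2)  # ~0.028 arcsec/pixel
```

Under these assumptions the detector pixel scale must be about 0.028 arcsec/pixel or finer to Nyquist-sample the K-band diffraction limit.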

  11. The Dark Energy Camera

    SciTech Connect

    Flaugher, B.

    2015-04-11

    The Dark Energy Camera is a new imager with a 2.2-degree diameter field of view mounted at the prime focus of the Victor M. Blanco 4-meter telescope on Cerro Tololo near La Serena, Chile. The camera was designed and constructed by the Dark Energy Survey Collaboration, and meets or exceeds the stringent requirements designed for the wide-field and supernova surveys for which the collaboration uses it. The camera consists of a five element optical corrector, seven filters, a shutter with a 60 cm aperture, and a CCD focal plane of 250-μm thick fully depleted CCDs cooled inside a vacuum Dewar. The 570 Mpixel focal plane comprises 62 2k x 4k CCDs for imaging and 12 2k x 2k CCDs for guiding and focus. The CCDs have 15μm x 15μm pixels with a plate scale of 0.263" per pixel. A hexapod system provides state-of-the-art focus and alignment capability. The camera is read out in 20 seconds with 6-9 electrons readout noise. This paper provides a technical description of the camera's engineering, construction, installation, and current status.
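The quoted pixel pitch and plate scale together imply the effective focal length of the Blanco prime focus with the corrector. This is a derived figure, not one stated in the abstract:

```python
import math

PIXEL_M = 15e-6          # 15 um pixel pitch, from the abstract
PLATE_SCALE = 0.263      # arcsec per pixel, from the abstract
ARCSEC_PER_RAD = 180.0 / math.pi * 3600.0

# small-angle relation: plate_scale [rad/pixel] = pixel_size / focal_length
focal_length_m = PIXEL_M * ARCSEC_PER_RAD / PLATE_SCALE  # ~11.8 m
```

An effective focal length near 11.8 m is consistent with a fast prime-focus corrector on a 4-meter primary.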

  12. CAOS-CMOS camera.

    PubMed

    Riza, Nabeel A; La Torre, Juan Pablo; Amin, M Junaid

    2016-06-13

    Proposed and experimentally demonstrated is the CAOS-CMOS camera design that combines the coded access optical sensor (CAOS) imager platform with the CMOS multi-pixel optical sensor. The unique CAOS-CMOS camera engages the classic CMOS sensor light staring mode with the time-frequency-space agile pixel CAOS imager mode within one programmable optical unit to realize a high dynamic range imager for extreme light contrast conditions. The experimentally demonstrated CAOS-CMOS camera is built using a digital micromirror device, a silicon point-photo-detector with a variable gain amplifier, and a silicon CMOS sensor with a maximum rated 51.3 dB dynamic range. White light imaging of three simultaneously viewed targets of different brightness, which is not possible with the CMOS sensor alone, is achieved by the CAOS-CMOS camera, demonstrating an 82.06 dB dynamic range. Applications for the camera include industrial machine vision, welding, laser analysis, automotive, night vision, surveillance and multispectral military systems. PMID:27410361
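
The dynamic-range figures quoted (51.3 dB for the CMOS sensor alone, 82.06 dB for the combined camera) follow the usual 20·log10 optical convention; a small sketch of the conversion between decibels and linear contrast ratio:

```python
import math

def dynamic_range_db(bright, dim):
    """Optical dynamic range in dB (20*log10 convention)."""
    return 20.0 * math.log10(bright / dim)

def contrast_ratio(db):
    """Linear irradiance ratio corresponding to a dB figure."""
    return 10.0 ** (db / 20.0)

print(f"51.3 dB  -> {contrast_ratio(51.3):,.0f}:1")   # CMOS sensor alone
print(f"82.06 dB -> {contrast_ratio(82.06):,.0f}:1")  # CAOS-CMOS demonstration
```

The roughly 30 dB gain corresponds to about a 35-fold increase in usable scene contrast.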

  13. The Dark Energy Camera

    NASA Astrophysics Data System (ADS)

    Flaugher, B.; Diehl, H. T.; Honscheid, K.; Abbott, T. M. C.; Alvarez, O.; Angstadt, R.; Annis, J. T.; Antonik, M.; Ballester, O.; Beaufore, L.; Bernstein, G. M.; Bernstein, R. A.; Bigelow, B.; Bonati, M.; Boprie, D.; Brooks, D.; Buckley-Geer, E. J.; Campa, J.; Cardiel-Sas, L.; Castander, F. J.; Castilla, J.; Cease, H.; Cela-Ruiz, J. M.; Chappa, S.; Chi, E.; Cooper, C.; da Costa, L. N.; Dede, E.; Derylo, G.; DePoy, D. L.; de Vicente, J.; Doel, P.; Drlica-Wagner, A.; Eiting, J.; Elliott, A. E.; Emes, J.; Estrada, J.; Fausti Neto, A.; Finley, D. A.; Flores, R.; Frieman, J.; Gerdes, D.; Gladders, M. D.; Gregory, B.; Gutierrez, G. R.; Hao, J.; Holland, S. E.; Holm, S.; Huffman, D.; Jackson, C.; James, D. J.; Jonas, M.; Karcher, A.; Karliner, I.; Kent, S.; Kessler, R.; Kozlovsky, M.; Kron, R. G.; Kubik, D.; Kuehn, K.; Kuhlmann, S.; Kuk, K.; Lahav, O.; Lathrop, A.; Lee, J.; Levi, M. E.; Lewis, P.; Li, T. S.; Mandrichenko, I.; Marshall, J. L.; Martinez, G.; Merritt, K. W.; Miquel, R.; Muñoz, F.; Neilsen, E. H.; Nichol, R. C.; Nord, B.; Ogando, R.; Olsen, J.; Palaio, N.; Patton, K.; Peoples, J.; Plazas, A. A.; Rauch, J.; Reil, K.; Rheault, J.-P.; Roe, N. A.; Rogers, H.; Roodman, A.; Sanchez, E.; Scarpine, V.; Schindler, R. H.; Schmidt, R.; Schmitt, R.; Schubnell, M.; Schultz, K.; Schurter, P.; Scott, L.; Serrano, S.; Shaw, T. M.; Smith, R. C.; Soares-Santos, M.; Stefanik, A.; Stuermer, W.; Suchyta, E.; Sypniewski, A.; Tarle, G.; Thaler, J.; Tighe, R.; Tran, C.; Tucker, D.; Walker, A. R.; Wang, G.; Watson, M.; Weaverdyck, C.; Wester, W.; Woods, R.; Yanny, B.; DES Collaboration

    2015-11-01

    The Dark Energy Camera is a new imager with a 2.°2 diameter field of view mounted at the prime focus of the Victor M. Blanco 4 m telescope on Cerro Tololo near La Serena, Chile. The camera was designed and constructed by the Dark Energy Survey Collaboration and meets or exceeds the stringent requirements designed for the wide-field and supernova surveys for which the collaboration uses it. The camera consists of a five-element optical corrector, seven filters, a shutter with a 60 cm aperture, and a charge-coupled device (CCD) focal plane of 250 μm thick fully depleted CCDs cooled inside a vacuum Dewar. The 570 megapixel focal plane comprises 62 2k × 4k CCDs for imaging and 12 2k × 2k CCDs for guiding and focus. The CCDs have 15 μm × 15 μm pixels with a plate scale of 0.″263 pixel-1. A hexapod system provides state-of-the-art focus and alignment capability. The camera is read out in 20 s with 6-9 electron readout noise. This paper provides a technical description of the camera's engineering, construction, installation, and current status.

  14. Satellite camera image navigation

    NASA Technical Reports Server (NTRS)

    Kamel, Ahmed A. (Inventor); Graul, Donald W. (Inventor); Savides, John (Inventor); Hanson, Charles W. (Inventor)

    1987-01-01

    Pixels within a satellite camera (1, 2) image are precisely located in terms of latitude and longitude on a celestial body, such as the earth, being imaged. A computer (60) on the earth generates models (40, 50) of the satellite's orbit and attitude, respectively. The orbit model (40) is generated from measurements of stars and landmarks taken by the camera (1, 2), and by range data. The orbit model (40) is an expression of the satellite's latitude and longitude at the subsatellite point, and of the altitude of the satellite, as a function of time, using as coefficients (K) the six Keplerian elements at epoch. The attitude model (50) is based upon star measurements taken by each camera (1, 2). The attitude model (50) is a set of expressions for the deviations in a set of mutually orthogonal reference optical axes (x, y, z) as a function of time, for each camera (1, 2). Measured data are fitted to the models (40, 50) using a walking least-squares algorithm. A transformation computer (66) transforms pixel coordinates as telemetered by the camera (1, 2) into earth latitude and longitude coordinates, using the orbit and attitude models (40, 50).
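
The patent fits measured star and landmark data to the orbit and attitude models with a "walking least squares" algorithm, whose details are not given here. Purely as an illustration, a standard recursive least-squares update with a forgetting factor (which slides, or "walks", the fit along the data stream) can be sketched on a hypothetical linear model:

```python
import numpy as np

def rls_update(theta, P, x, y, lam=1.0):
    """One recursive least-squares step: update parameters theta and
    covariance P with regressor x and scalar measurement y.  A
    forgetting factor lam < 1 discounts old data, sliding ('walking')
    the fit along the measurement stream."""
    Px = P @ x
    k = Px / (lam + x @ Px)              # gain vector
    theta = theta + k * (y - x @ theta)  # innovation update
    P = (P - np.outer(k, Px)) / lam
    return theta, P

# Toy use: recover y = 2 + 3t from noisy samples (synthetic data)
rng = np.random.default_rng(0)
theta, P = np.zeros(2), np.eye(2) * 1e3
for t in np.linspace(0.0, 1.0, 200):
    x = np.array([1.0, t])
    theta, P = rls_update(theta, P, x, 2.0 + 3.0 * t + rng.normal(0.0, 0.01))
print(theta)  # close to [2, 3]
```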

  15. Binocular Camera for cockpit visibility of general aviation aircraft

    NASA Astrophysics Data System (ADS)

    Barile, A. J.

    1981-04-01

    A history of cockpit visibility studies and requirements with regard to aircraft safety, human factors, collision avoidance, and accident investigations is presented. The Federal Aviation Administration's development of the Binocular Camera is reviewed, and the technical details of a new and improved camera are discussed. The Binocular Camera uses two 65 mm wide angle F6.8 lenses and covers an 88 1/2 deg field of vision. The camera produces images, representative of what the human eyes see before the brain integrates them into one, thus making it possible to analyze the effect of obstruction to vision. The improvements, applications, and uses of the camera in the research, development, and operations of general aviation aircraft are discussed.

  16. ROSAT wide field camera mirrors.

    PubMed

    Willingale, R

    1988-04-15

    The ROSAT wide field camera (WFC) is an XUV telescope operating in the 12-250-eV energy band. The mirror system utilizes Wolter-Schwarzschild type I (WS I) grazing incidence optics with a focal length of 525 mm, comprising three nested aluminum shells with an outermost diameter of 576 mm providing a geometric aperture area of 456 cm². The reflecting surfaces are electroless nickel plated and coated with gold to enhance their reflectivity in the XUV. The mirrors have undergone full aperture optical testing, narrow beam XUV testing, and full aperture XUV testing. Measurements of the reflectivity are compared to theoretical values derived from the optical constants of gold in the XUV range. Analysis of the focused distribution is used to estimate the surface roughness and figuring errors of the polished surfaces. The results are compared to the mechanical metrology data collected during manufacture of the shells and the power spectral density of the reflecting surfaces is found to have a power-law form. PMID:20531591

  17. Selective-imaging camera

    NASA Astrophysics Data System (ADS)

    Szu, Harold; Hsu, Charles; Landa, Joseph; Cha, Jae H.; Krapels, Keith A.

    2015-05-01

    How can we design cameras that image selectively in Full Electro-Magnetic (FEM) spectra? Without selective imaging, we cannot use, for example, ordinary tourist cameras to see through fire, smoke, or other obscurants contributing to creating a Visually Degraded Environment (VDE). This paper addresses a possible new design of selective-imaging cameras at firmware level. The design is consistent with physics of the irreversible thermodynamics of Boltzmann's molecular entropy. It enables imaging in appropriate FEM spectra for sensing through the VDE, and displaying in color spectra for Human Visual System (HVS). We sense within the spectra the largest entropy value of obscurants such as fire, smoke, etc. Then we apply a smart firmware implementation of Blind Sources Separation (BSS) to separate all entropy sources associated with specific Kelvin temperatures. Finally, we recompose the scene using specific RGB colors constrained by the HVS, by up/down shifting Planck spectra at each pixel and time.

  18. Solid state television camera

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The design, fabrication, and tests of a solid state television camera using a new charge-coupled imaging device are reported. An RCA charge-coupled device arranged in a 512 by 320 format and directly compatible with EIA format standards was the sensor selected. This is a three-phase, sealed surface-channel array that has 163,840 sensor elements, which employs a vertical frame transfer system for image readout. Included are test results of the complete camera system, circuit description and changes to such circuits as a result of integration and test, maintenance and operation section, recommendations to improve the camera system, and a complete set of electrical and mechanical drawing sketches.

  19. What convention is used for the illumination and view angles?

    Atmospheric Science Data Center

    2014-12-08

    ... Azimuth angles are measured clockwise from the direction of travel to local north. For both the Sun and cameras, azimuth describes the ... to the equator, because of its morning equator crossing time. Additionally, the difference in view and solar azimuth angle will be near ...

  20. 3-D Flow Visualization with a Light-field Camera

    NASA Astrophysics Data System (ADS)

    Thurow, B.

    2012-12-01

    Light-field cameras have received attention recently due to their ability to acquire photographs that can be computationally refocused after they have been acquired. In this work, we describe the development of a light-field camera system for 3D visualization of turbulent flows. The camera developed in our lab, also known as a plenoptic camera, uses an array of microlenses mounted next to an image sensor to resolve both the position and angle of light rays incident upon the camera. For flow visualization, the flow field is seeded with small particles that follow the fluid's motion and are imaged using the camera and a pulsed light source. The tomographic MART algorithm is then applied to the light-field data in order to reconstruct a 3D volume of the instantaneous particle field. 3D, 3C velocity vectors are then determined from a pair of 3D particle fields using conventional cross-correlation algorithms. As an illustration of the concept, 3D/3C velocity measurements of a turbulent boundary layer produced on the wall of a conventional wind tunnel are presented. Future experiments are planned to use the camera to study the influence of wall permeability on the 3-D structure of the turbulent boundary layer.
    Figure captions: (1) Schematic illustrating the concept of a plenoptic camera, where each pixel represents both the position and angle of light rays entering the camera; this information can be used to computationally refocus an image after it has been acquired. (2) Instantaneous 3D velocity field of a turbulent boundary layer determined using light-field data captured by a plenoptic camera.
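
The final step described above, cross-correlating a pair of reconstructed particle volumes to obtain displacement vectors, can be sketched with an FFT-based circular cross-correlation (a toy integer-voxel version; practical PIV codes interrogate small sub-volumes and do sub-voxel peak fitting):

```python
import numpy as np

def displacement_3d(vol_a, vol_b):
    """Integer-voxel shift between two particle volumes, taken from the
    peak of their FFT-based circular cross-correlation."""
    spec = np.fft.fftn(vol_a) * np.conj(np.fft.fftn(vol_b))
    corr = np.real(np.fft.ifftn(spec))
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Map wrap-around peak positions to signed shifts
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, vol_a.shape))

# Synthetic check: a random 'particle field' shifted by a known amount
rng = np.random.default_rng(1)
vol_b = rng.random((32, 32, 32))
vol_a = np.roll(vol_b, shift=(2, -1, 3), axis=(0, 1, 2))
print(displacement_3d(vol_a, vol_b))  # (2, -1, 3)
```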

  1. Artificial human vision camera

    NASA Astrophysics Data System (ADS)

    Goudou, J.-F.; Maggio, S.; Fagno, M.

    2014-10-01

    In this paper we present a real-time vision system modeling the human vision system. Our purpose is to inspire from human vision bio-mechanics to improve robotic capabilities for tasks such as objects detection and tracking. This work describes first the bio-mechanical discrepancies between human vision and classic cameras and the retinal processing stage that takes place in the eye, before the optic nerve. The second part describes our implementation of these principles on a 3-camera optical, mechanical and software model of the human eyes and associated bio-inspired attention model.

  2. Lightweight, Compact, Long Range Camera Design

    NASA Astrophysics Data System (ADS)

    Shafer, Donald V.

    1983-08-01

    The model 700 camera is the latest in a 30-year series of LOROP cameras developed by McDonnell Douglas Astronautics Company (MDAC) and their predecessor companies. The design achieves minimum size and weight and is optimized for low-contrast performance. The optical system includes a 66-inch focal length, f/5.6, apochromatic lens and three folding mirrors imaging on a 4.5-inch square format. A three-axis active stabilization system provides the capability for long exposure time and, hence, fine grain films can be used. The optical path forms a figure "4" behind the lens. In front of the lens is a 45° pointing mirror. This folded configuration contributed greatly to the lightweight and compact design. This sequential autocycle frame camera has three modes of operation with one, two, and three step positions to provide a choice of swath widths within the range of lateral coverage. The magazine/shutter assembly rotates in relationship with the pointing mirror and aircraft drift angle to maintain film format alignment with the flight path. The entire camera is angular rate stabilized in roll, pitch, and yaw. It also employs a lightweight, electro-magnetically damped, low-natural-frequency spring suspension for passive isolation from aircraft vibration inputs. The combined film transport and forward motion compensation (FMC) mechanism, which is operated by a single motor, is contained in a magazine that can, depending on accessibility which is installation dependent, be changed in flight. The design also stresses thermal control, focus control, structural stiffness, and maintainability. The camera is operated from a remote control panel. This paper describes the leading particulars and features of the camera as related to weight and configuration.

  3. Underwater camera with depth measurement

    NASA Astrophysics Data System (ADS)

    Wang, Wei-Chih; Lin, Keng-Ren; Tsui, Chi L.; Schipf, David; Leang, Jonathan

    2016-04-01

    The objective of this study is to develop an RGB-D (video + depth) camera that provides three-dimensional image data for use in the haptic feedback of a robotic underwater ordnance recovery system. Two camera systems were developed and studied. The first depth camera relies on structured light (as used by the Microsoft Kinect), where the displacement of an object is determined by variations of the geometry of a projected pattern. The other camera system is based on a Time of Flight (ToF) depth camera. The results of the structured light camera system show that the camera system requires a stronger light source with a similar operating wavelength and bandwidth to achieve a desirable working distance in water. This approach might not be robust enough for our proposed underwater RGB-D camera system, as it will require a complete re-design of the light source component. The ToF camera system, instead, allows an arbitrary placement of light source and camera. The intensity output of the broadband LED light source in the ToF camera system can be increased by putting the LEDs into an array configuration, and the LEDs can be modulated comfortably with any waveform and frequencies required by the ToF camera. In this paper, both cameras were evaluated and experiments were conducted to demonstrate the versatility of the ToF camera.
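
For a continuous-wave ToF camera of the kind described, depth follows from the phase shift of the modulated illumination; a minimal sketch, assuming a 20 MHz modulation frequency and a refractive index of about 1.33 for water (both illustrative values, not taken from the paper):

```python
import math

C_VACUUM = 299_792_458.0  # speed of light in vacuum, m/s

def tof_depth(phase_rad, mod_freq_hz, n=1.0):
    """Continuous-wave ToF: distance from the phase shift of modulated
    light; the light travels out and back, hence the factor 4*pi."""
    return (C_VACUUM / n) * phase_rad / (4.0 * math.pi * mod_freq_hz)

def ambiguity_range(mod_freq_hz, n=1.0):
    """Maximum unambiguous distance before the phase wraps at 2*pi."""
    return (C_VACUUM / n) / (2.0 * mod_freq_hz)

# Illustrative: 20 MHz modulation, quarter-cycle shift, underwater (n ~ 1.33)
print(f"depth:           {tof_depth(math.pi / 2.0, 20e6, n=1.33):.3f} m")
print(f"ambiguity range: {ambiguity_range(20e6, n=1.33):.3f} m")
```

Note that the slower speed of light in water shortens both the measured depth per unit phase and the unambiguous range relative to operation in air.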

  4. Photogrammetric camera calibration

    USGS Publications Warehouse

    Tayman, W.P.; Ziemann, H.

    1984-01-01

    Section 2 (Calibration) of the document "Recommended Procedures for Calibrating Photogrammetric Cameras and Related Optical Tests" from the International Archives of Photogrammetry, Vol. XIII, Part 4, is reviewed in the light of recent practical work, and suggestions for changes are made. These suggestions are intended as a basis for a further discussion. ?? 1984.

  5. Spas color camera

    NASA Technical Reports Server (NTRS)

    Toffales, C.

    1983-01-01

    The procedures to be followed in assessing the performance of the MOS color camera are defined. Aspects considered include: horizontal and vertical resolution; value of the video signal; gray scale rendition; environmental (vibration and temperature) tests; signal to noise ratios; and white balance correction.

  6. Imaging phoswich anger camera

    NASA Astrophysics Data System (ADS)

    Manchanda, R. K.; Sood, R. K.

    1991-08-01

    High angular resolution and low background are the primary requisites for detectors for future astronomy experiments in the low energy gamma-ray region. Scintillation counters are still the only available large area detector for studies in this energy range. Preliminary details of a large area phoswich Anger camera designed for coded aperture imaging are described, and its background and position characteristics are discussed.

  7. Communities, Cameras, and Conservation

    ERIC Educational Resources Information Center

    Patterson, Barbara

    2012-01-01

    Communities, Cameras, and Conservation (CCC) is the most exciting and valuable program the author has seen in her 30 years of teaching field science courses. In this citizen science project, students and community volunteers collect data on mountain lions ("Puma concolor") at four natural areas and public parks along the Front Range of Colorado.…

  8. Advanced Virgo phase cameras

    NASA Astrophysics Data System (ADS)

    van der Schaaf, L.; Agatsuma, K.; van Beuzekom, M.; Gebyehu, M.; van den Brand, J.

    2016-05-01

    A century after the prediction of gravitational waves, detectors have reached the sensitivity needed to proof their existence. One of them, the Virgo interferometer in Pisa, is presently being upgraded to Advanced Virgo (AdV) and will come into operation in 2016. The power stored in the interferometer arms raises from 20 to 700 kW. This increase is expected to introduce higher order modes in the beam, which could reduce the circulating power in the interferometer, limiting the sensitivity of the instrument. To suppress these higher-order modes, the core optics of Advanced Virgo is equipped with a thermal compensation system. Phase cameras, monitoring the real-time status of the beam constitute a critical component of this compensation system. These cameras measure the phases and amplitudes of the laser-light fields at the frequencies selected to control the interferometer. The measurement combines heterodyne detection with a scan of the wave front over a photodetector with pin-hole aperture. Three cameras observe the phase front of these laser sidebands. Two of them monitor the in-and output of the interferometer arms and the third one is used in the control of the aberrations introduced by the power recycling cavity. In this paper the working principle of the phase cameras is explained and some characteristic parameters are described.
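
The heterodyne measurement described above amounts to recovering the amplitude and phase of a beat note at a known frequency; a minimal single-point sketch using quadrature (I/Q) demodulation on synthetic data (the actual Advanced Virgo signal chain is considerably more involved):

```python
import numpy as np

def iq_demodulate(signal, t, f_beat):
    """Amplitude and phase of a beat note A*cos(2*pi*f*t + phi),
    recovered by mixing with quadrature references and averaging
    (a crude low-pass filter) over an integer number of cycles."""
    i = 2.0 * np.mean(signal * np.cos(2.0 * np.pi * f_beat * t))
    q = 2.0 * np.mean(signal * np.sin(2.0 * np.pi * f_beat * t))
    return np.hypot(i, q), np.arctan2(-q, i)

# Synthetic beat note: amplitude 0.7, phase 0.4 rad, 1 kHz, 10 full cycles
t = np.arange(0.0, 0.01, 1e-6)
s = 0.7 * np.cos(2.0 * np.pi * 1e3 * t + 0.4)
amp, phase = iq_demodulate(s, t, 1e3)
print(amp, phase)  # ~0.7, ~0.4
```

Scanning such a demodulator over the wave front, as the pin-hole scan does, yields the phase maps the cameras produce.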

  9. The LSST Camera Overview

    SciTech Connect

    Gilmore, Kirk; Kahn, Steven A.; Nordby, Martin; Burke, David; O'Connor, Paul; Oliver, John; Radeka, Veljko; Schalk, Terry; Schindler, Rafe; /SLAC

    2007-01-10

    The LSST camera is a wide-field optical (0.35-1 μm) imager designed to provide a 3.5 degree FOV with better than 0.2 arcsecond sampling. The detector format will be a circular mosaic providing approximately 3.2 Gigapixels per image. The camera includes a filter mechanism and shuttering capability. It is positioned in the middle of the telescope where cross-sectional area is constrained by optical vignetting and heat dissipation must be controlled to limit thermal gradients in the optical beam. The fast, f/1.2 beam will require tight tolerances on the focal plane mechanical assembly. The focal plane array operates at a temperature of approximately -100 C to achieve desired detector performance. The focal plane array is contained within an evacuated cryostat, which incorporates detector front-end electronics and thermal control. The cryostat lens serves as an entrance window and vacuum seal for the cryostat. Similarly, the camera body lens serves as an entrance window and gas seal for the camera housing, which is filled with a suitable gas to provide the operating environment for the shutter and filter change mechanisms. The filter carousel can accommodate 5 filters, each 75 cm in diameter, for rapid exchange without external intervention.

  10. Jack & the Video Camera

    ERIC Educational Resources Information Center

    Charlan, Nathan

    2010-01-01

    This article narrates how the use of video camera has transformed the life of Jack Williams, a 10-year-old boy from Colorado Springs, Colorado, who has autism. The way autism affected Jack was unique. For the first nine years of his life, Jack remained in his world, alone. Functionally non-verbal and with motor skill problems that affected his…

  11. Anger Camera Firmware

    Energy Science and Technology Software Center (ESTSC)

    2010-11-19

    The firmware is responsible for the operation of the Anger Camera Electronics, calculation of position, time of flight, and digital communications. It provides a first-stage analysis of 48 analog signals that have been converted to digital values using A/D converters.

  12. Make a Pinhole Camera

    ERIC Educational Resources Information Center

    Fisher, Diane K.; Novati, Alexander

    2009-01-01

    On Earth, using ordinary visible light, one can create a single image of light recorded over time. Of course a movie or video is light recorded over time, but it is a series of instantaneous snapshots, rather than light and time both recorded on the same medium. A pinhole camera, which is simple to make out of ordinary materials and using ordinary…

  13. Image Sensors Enhance Camera Technologies

    NASA Technical Reports Server (NTRS)

    2010-01-01

    In the 1990s, a Jet Propulsion Laboratory team led by Eric Fossum researched ways of improving complementary metal-oxide semiconductor (CMOS) image sensors in order to miniaturize cameras on spacecraft while maintaining scientific image quality. Fossum s team founded a company to commercialize the resulting CMOS active pixel sensor. Now called the Aptina Imaging Corporation, based in San Jose, California, the company has shipped over 1 billion sensors for use in applications such as digital cameras, camera phones, Web cameras, and automotive cameras. Today, one of every three cell phone cameras on the planet feature Aptina s sensor technology.

  14. The Martian Atmosphere as seen by the OSIRIS camera

    NASA Astrophysics Data System (ADS)

    Moissl, R.; Pajola, M.; Määttänen, A.; Küppers, M.

    2013-09-01

    Despite the long time that has passed since the observations, only a few studies based on the data from the wide- (WAC) and narrow- (NAC) angle camera systems of OSIRIS have been published to date. In this paper we present results on the observations of the Martian limb acquired by the OSIRIS [1] instrument on board the ESA mission Rosetta during its swing-by maneuver around February 25, 2007, on the way to comet 67P/Churyumov-Gerasimenko, during the onset of the very active dust storm season of Mars year 28 (at Ls ~190). Although OSIRIS captured the planet only during a relatively short time interval of several hours, the obtained global view and the spectral coverage, from the UV (245 nm) over the full visible range to the near IR (1000 nm), allow for a valuable global overview of the state of the Martian atmosphere. Image acquisition started on February 24 around 18:00 UTC from a distance of about 260,000 km and continued until 04:51 UTC on February 25, at a distance of 105,000 km. Closest approach to the planet occurred at 01:54 UTC on February 25 at a distance of 250 km. All images have been manually co-registered with the help of SPICE data, and vertical profiles have been extracted over the limb in intervals of ~0.5 degrees (see Figures 1 and 2).

  15. 15. ELEVATED CAMERA STAND, SHOWING LINE OF CAMERA STANDS PARALLEL ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    15. ELEVATED CAMERA STAND, SHOWING LINE OF CAMERA STANDS PARALLEL TO SLED TRACK. Looking west southwest down Camera Road. - Edwards Air Force Base, South Base Sled Track, Edwards Air Force Base, North of Avenue B, between 100th & 140th Streets East, Lancaster, Los Angeles County, CA

  16. Dual cameras acquisition and display system of retina-like sensor camera and rectangular sensor camera

    NASA Astrophysics Data System (ADS)

    Cao, Nan; Cao, Fengmei; Lin, Yabin; Bai, Tingzhu; Song, Shengyu

    2015-04-01

    For a new kind of retina-like sensor camera and a traditional rectangular sensor camera, a dual-camera acquisition and display system needs to be built. We introduce the principle and the development of the retina-like sensor. Image coordinate transformation and interpolation based on sub-pixel interpolation must be realized for the retina-like sensor's special pixel distribution. The hardware platform is composed of the retina-like sensor camera, the rectangular sensor camera, an image grabber, and a PC. Combining the MIL and OpenCV libraries, the software was written in VC++ on Visual Studio 2010. Experimental results show that the system realizes acquisition and display for both cameras.
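
The coordinate transformation and sub-pixel interpolation mentioned above can be illustrated for a log-polar pixel layout, a common model for retina-like sensors (the actual sensor's pixel distribution may differ); a sketch that resamples a ring/spoke image onto a Cartesian grid with bilinear interpolation:

```python
import numpy as np

def logpolar_to_cartesian(lp, r_min, r_max, out_size):
    """Resample a log-polar image lp[ring, spoke] onto a Cartesian grid
    with bilinear interpolation.  The ring index grows logarithmically
    with radius and the spoke index linearly with angle; this is an
    illustrative layout, not the actual sensor geometry."""
    n_rings, n_spokes = lp.shape
    c = (out_size - 1) / 2.0
    ys, xs = np.mgrid[0:out_size, 0:out_size]
    dx, dy = xs - c, ys - c
    r = np.hypot(dx, dy)
    theta = np.mod(np.arctan2(dy, dx), 2.0 * np.pi)
    valid = (r >= r_min) & (r <= r_max)
    # Fractional ring (u) and spoke (v) coordinates for every output pixel
    u = (np.log(np.where(valid, r, r_min) / r_min)
         / np.log(r_max / r_min) * (n_rings - 1))
    v = theta / (2.0 * np.pi) * n_spokes
    u0 = np.clip(u.astype(int), 0, n_rings - 2)
    iv = v.astype(int)
    fu, fv = u - u0, v - iv
    v0, v1 = iv % n_spokes, (iv + 1) % n_spokes  # angular wrap-around
    interp = (lp[u0, v0] * (1 - fu) * (1 - fv) + lp[u0 + 1, v0] * fu * (1 - fv)
              + lp[u0, v1] * (1 - fu) * fv + lp[u0 + 1, v1] * fu * fv)
    out = np.zeros((out_size, out_size))
    out[valid] = interp[valid]
    return out

# A constant log-polar image should map to a constant annulus
img = logpolar_to_cartesian(np.full((16, 64), 5.0), 2.0, 15.0, 32)
print(np.allclose(img[img != 0], 5.0))  # True
```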

  17. Combustion pinhole camera system

    DOEpatents

    Witte, Arvel B.

    1984-02-21

    A pinhole camera system utilizing a sealed optical-purge assembly which provides optical access into a coal combustor or other energy conversion reactors. The camera system basically consists of a focused-purge pinhole optical port assembly, a conventional TV vidicon receiver, an external, variable density light filter which is coupled electronically to the vidicon automatic gain control (agc). The key component of this system is the focused-purge pinhole optical port assembly which utilizes a purging inert gas to keep debris from entering the port and a lens arrangement which transfers the pinhole to the outside of the port assembly. One additional feature of the port assembly is that it is not flush with the interior of the combustor.

  18. Combustion pinhole camera system

    DOEpatents

    Witte, A.B.

    1984-02-21

    A pinhole camera system is described utilizing a sealed optical-purge assembly which provides optical access into a coal combustor or other energy conversion reactors. The camera system basically consists of a focused-purge pinhole optical port assembly, a conventional TV vidicon receiver, an external, variable density light filter which is coupled electronically to the vidicon automatic gain control (agc). The key component of this system is the focused-purge pinhole optical port assembly which utilizes a purging inert gas to keep debris from entering the port and a lens arrangement which transfers the pinhole to the outside of the port assembly. One additional feature of the port assembly is that it is not flush with the interior of the combustor. 2 figs.

  19. The NEAT Camera Project

    NASA Technical Reports Server (NTRS)

    Newburn, Ray L., Jr.

    1995-01-01

    The NEAT (Near Earth Asteroid Tracking) camera system consists of a camera head with a 6.3 cm square 4096 x 4096 pixel CCD, fast electronics, and a Sun Sparc 20 data and control computer with dual CPUs, 256 Mbytes of memory, and 36 Gbytes of hard disk. The system was designed for optimum use with an Air Force GEODSS (Ground-based Electro-Optical Deep Space Surveillance) telescope. The GEODSS telescopes have 1 m f/2.15 objectives of the Ritchey-Chretian type, designed originally for satellite tracking. Installation of NEAT began July 25 at the Air Force Facility on Haleakala, a 3000 m peak on Maui in Hawaii.

  20. LSST Camera Optics

    SciTech Connect

    Olivier, S S; Seppala, L; Gilmore, K; Hale, L; Whistler, W

    2006-06-05

    The Large Synoptic Survey Telescope (LSST) is a unique, three-mirror, modified Paul-Baker design with an 8.4m primary, a 3.4m secondary, and a 5.0m tertiary feeding a camera system that includes corrector optics to produce a 3.5 degree field of view with excellent image quality (<0.3 arcsecond 80% encircled diffracted energy) over the entire field from blue to near infra-red wavelengths. We describe the design of the LSST camera optics, consisting of three refractive lenses with diameters of 1.6m, 1.0m and 0.7m, along with a set of interchangeable, broad-band, interference filters with diameters of 0.75m. We also describe current plans for fabricating, coating, mounting and testing these lenses and filters.

  1. Depth estimation and camera calibration of a focused plenoptic camera for visual odometry

    NASA Astrophysics Data System (ADS)

    Zeller, Niclas; Quint, Franz; Stilla, Uwe

    2016-08-01

    This paper presents new and improved methods of depth estimation and camera calibration for visual odometry with a focused plenoptic camera. For depth estimation we adapt an algorithm previously used in structure-from-motion approaches to work with images of a focused plenoptic camera. In the raw image of a plenoptic camera, scene patches are recorded in several micro-images under slightly different angles. This leads to a multi-view stereo problem. To reduce the complexity, we divide this into multiple binocular stereo problems. For each pixel with sufficient gradient we estimate a virtual (uncalibrated) depth based on local intensity error minimization. The estimated depth is characterized by the variance of the estimate and is subsequently updated with the estimates from other micro-images. Updating is performed in a Kalman-like fashion. The result of depth estimation in a single image of the plenoptic camera is a probabilistic depth map, where each depth pixel consists of an estimated virtual depth and a corresponding variance. Since the resulting image of the plenoptic camera contains two planes, the optical image and the depth map, camera calibration is divided into two separate sub-problems. The optical path is calibrated based on a traditional calibration method. For calibrating the depth map we introduce two novel model-based methods, which define the relation between the virtual depth, estimated from the light-field image, and the metric object distance. These two methods are compared to a well-known curve-fitting approach. Both model-based methods show significant advantages compared to the curve-fitting method. For visual odometry we fuse the probabilistic depth map gained from one shot of the plenoptic camera with the depth data gained by finding stereo correspondences between subsequent synthesized intensity images of the plenoptic camera. These images can be synthesized as totally focused, which makes the search for stereo correspondences more robust.
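
The Kalman-like update of per-pixel depth hypotheses described above reduces, for two independent estimates, to an inverse-variance weighted mean; a minimal sketch:

```python
def fuse_depth(d1, var1, d2, var2):
    """Fuse two independent depth hypotheses: inverse-variance weighted
    mean, written in gain form to mirror a Kalman measurement update."""
    k = var1 / (var1 + var2)       # gain: trust d2 more when var1 is large
    d = d1 + k * (d2 - d1)
    var = (1.0 - k) * var1         # equals 1 / (1/var1 + 1/var2)
    return d, var

# Hypothetical virtual-depth estimates from two micro-images
d, var = fuse_depth(2.0, 0.04, 2.2, 0.01)
print(d, var)  # ~2.16, ~0.008
```

The fused variance is always smaller than either input variance, which is why accumulating estimates across micro-images sharpens the probabilistic depth map.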

  2. Streak camera receiver definition study

    NASA Technical Reports Server (NTRS)

    Johnson, C. B.; Hunkler, L. T., Sr.; Letzring, S. A.; Jaanimagi, P.

    1990-01-01

    Detailed streak camera definition studies were made as a first step toward full flight qualification of a dual channel picosecond resolution streak camera receiver for the Geoscience Laser Altimeter and Ranging System (GLRS). The streak camera receiver requirements are discussed as they pertain specifically to the GLRS system, and estimates of the characteristics of the streak camera are given, based upon existing and near-term technological capabilities. Important problem areas are highlighted, and possible corresponding solutions are discussed.

  3. Automated Camera Array Fine Calibration

    NASA Technical Reports Server (NTRS)

    Clouse, Daniel; Padgett, Curtis; Ansar, Adnan; Cheng, Yang

    2008-01-01

    Using aerial imagery, the JPL FineCalibration (JPL FineCal) software automatically tunes a set of existing CAHVOR camera models for an array of cameras. The software finds matching features in the overlap region between images from adjacent cameras, and uses these features to refine the camera models. It is not necessary to take special imagery of a known target and no surveying is required. JPL FineCal was developed for use with an aerial, persistent surveillance platform.
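    The refinement idea, matched features in the overlap region driving a model update, can be illustrated in a heavily simplified form by a least-squares alignment. The real software refines full CAHVOR camera models; the 2-D offset and function name below are illustrative only:

```python
import numpy as np

def refine_offset(pts_a, pts_b):
    """Least-squares 2-D offset mapping features in camera A's overlap
    region onto the matching features seen by camera B (a stand-in for
    the full CAHVOR parameter refinement done by JPL FineCal)."""
    pts_a = np.asarray(pts_a, dtype=float)
    pts_b = np.asarray(pts_b, dtype=float)
    # For a pure-translation model, the least-squares solution is the
    # mean of the per-match displacement vectors.
    return (pts_b - pts_a).mean(axis=0)

matches_a = [(10, 10), (20, 15), (30, 40)]  # pixel coords in camera A
matches_b = [(12, 11), (22, 16), (32, 41)]  # same features in camera B
offset = refine_offset(matches_a, matches_b)
```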

  4. Orbiter Camera Payload System

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Components for an orbiting camera payload system (OCPS) include the large format camera (LFC), a gas supply assembly, and ground test, handling, and calibration hardware. The LFC, a high-resolution large format photogrammetric camera for use in the cargo bay of the space transport system, is also adaptable to use on an RB-57 aircraft or on a free-flyer satellite. Carrying 4000 feet of film, the LFC is usable from the visible to the near IR, at V/h rates from 11 to 41 milliradians per second, overlaps of 10, 60, 70 or 80 percent, and exposure times from 4 to 32 milliseconds. With a 12 inch focal length it produces a 9 by 18 inch format (long dimension in line of flight) with full-format low-contrast resolution of 88 lines per millimeter (AWAR), full-format distortion of less than 14 microns, and a complement of 45 reseau marks and 12 fiducial marks. The weight of the OCPS as supplied, fully loaded, is 944 pounds, and power dissipation is 273 watts average when in operation, 95 watts in standby. The LFC contains an internal exposure sensor, or will respond to external command. It is able to photograph starfields for inflight calibration upon command.

  5. Hemispherical Laue camera

    DOEpatents

    Li, James C. M.; Chu, Sungnee G.

    1980-01-01

    A hemispherical Laue camera comprises a crystal sample mount for positioning a sample to be analyzed at the center of the sphere of a hemispherical, X-radiation-sensitive film cassette, a collimator, a stationary or rotating sample mount, and a set of standard spherical projection spheres. X-radiation generated from an external source is directed through the collimator to impinge onto the single crystal sample on the stationary mount. The diffracted beam is recorded on the hemispherical X-radiation-sensitive film mounted inside the hemispherical film cassette in either transmission or back-reflection geometry. The distances travelled by X-radiation diffracted from the crystal to the hemispherical film are the same for all crystal planes which satisfy Bragg's Law. The recorded diffraction spots, or Laue spots, on the film thereby preserve both the symmetry information of the crystal structure and the relative intensities, which are directly related to the relative structure factors of the crystal orientations. The diffraction pattern on the exposed film is compared with the known diffraction pattern on one of the standard spherical projection spheres for a specific crystal structure to determine the orientation of the crystal sample. By replacing the stationary sample support with a rotating sample mount, the hemispherical Laue camera can be used for crystal structure determination in a manner previously provided by conventional Debye-Scherrer cameras.

  6. Gamma ray camera

    DOEpatents

    Perez-Mendez, V.

    1997-01-21

    A gamma ray camera is disclosed for detecting rays emanating from a radiation source such as an isotope. The gamma ray camera includes a sensor array formed of a visible light crystal for converting incident gamma rays to a plurality of corresponding visible light photons, and a photosensor array responsive to the visible light photons in order to form an electronic image of the radiation therefrom. The photosensor array is adapted to record an integrated amount of charge proportional to the incident gamma rays closest to it, and includes a transparent metallic layer and a photodiode consisting of a p-i-n structure formed on one side of the transparent metallic layer and comprising an upper p-type layer, an intermediate layer, and a lower n-type layer. In the preferred mode, the scintillator crystal is composed essentially of a cesium iodide (CsI) crystal, preferably doped with a predetermined amount of impurity, and the upper p-type layer, intermediate layer, and n-type layer are essentially composed of hydrogenated amorphous silicon (a-Si:H). The gamma ray camera further includes a collimator interposed between the radiation source and the sensor array, and a readout circuit formed on one side of the photosensor array. 6 figs.

  7. Gamma ray camera

    DOEpatents

    Perez-Mendez, Victor

    1997-01-01

    A gamma ray camera for detecting rays emanating from a radiation source such as an isotope. The gamma ray camera includes a sensor array formed of a visible light crystal for converting incident gamma rays to a plurality of corresponding visible light photons, and a photosensor array responsive to the visible light photons in order to form an electronic image of the radiation therefrom. The photosensor array is adapted to record an integrated amount of charge proportional to the incident gamma rays closest to it, and includes a transparent metallic layer and a photodiode consisting of a p-i-n structure formed on one side of the transparent metallic layer and comprising an upper p-type layer, an intermediate layer, and a lower n-type layer. In the preferred mode, the scintillator crystal is composed essentially of a cesium iodide (CsI) crystal, preferably doped with a predetermined amount of impurity, and the upper p-type layer, intermediate layer, and n-type layer are essentially composed of hydrogenated amorphous silicon (a-Si:H). The gamma ray camera further includes a collimator interposed between the radiation source and the sensor array, and a readout circuit formed on one side of the photosensor array.

  8. Camera Trajectory from Wide Baseline Images

    NASA Astrophysics Data System (ADS)

    Havlena, M.; Torii, A.; Pajdla, T.

    2008-09-01

    Camera trajectory estimation, which is closely related to the structure-from-motion computation, is one of the fundamental tasks in computer vision. Reliable camera trajectory estimation plays an important role in 3D reconstruction, self-localization, and object recognition. There are essential issues for reliable camera trajectory estimation, for instance, the choice of the camera and its geometric projection model, camera calibration, image feature detection and description, and robust 3D structure computation. Most approaches rely on classical perspective cameras because of the simplicity of their projection models and the ease of their calibration. However, classical perspective cameras offer only a limited field of view, and thus occlusions and sharp camera turns may cause consecutive frames to look completely different when the baseline becomes longer. This makes image feature matching very difficult (or impossible), and camera trajectory estimation fails under such conditions. These problems can be avoided if omnidirectional cameras, e.g. with a fish-eye lens convertor, are used. The hardware we use in practice is a Nikon FC-E9 mounted via a mechanical adaptor onto a Kyocera Finecam M410R digital camera. The Nikon FC-E9 is a megapixel omnidirectional add-on convertor with a 180° view angle which provides images of photographic quality. The Kyocera Finecam M410R delivers 2272×1704 images at 3 frames per second. The resulting combination yields a circular view of diameter 1600 pixels in the image. Since consecutive frames of the omnidirectional camera often share a common region in 3D space, image feature matching is often feasible. On the other hand, the calibration of these cameras is non-trivial and is crucial for the accuracy of the resulting 3D reconstruction. We calibrate omnidirectional cameras off-line using the state-of-the-art technique and Mičušík's two-parameter model, which links the radius of the image point r to the
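    As a sketch, the two-parameter fish-eye model attributed to Mičušík relates the image radius r to the ray angle θ roughly as θ = a·r / (1 + b·r²). The parameter values below are illustrative, not calibration results from the paper:

```python
def radius_to_angle(r, a, b):
    """Two-parameter fish-eye model (Micusik-style): angle theta between
    the optical axis and the incoming ray, as a function of the radial
    distance r of the image point from the image center."""
    return a * r / (1.0 + b * r * r)

# The image center (r = 0) maps to the optical axis (theta = 0);
# theta grows monotonically with r over the usable image circle.
theta = radius_to_angle(0.0, a=1.3, b=1e-7)
```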

  9. ETR BUILDING, TRA642, INTERIOR. BASEMENT. CAMERA IS IN SAME POSITION ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ETR BUILDING, TRA-642, INTERIOR. BASEMENT. CAMERA IS IN SAME POSITION AS ID-33-G-98 BUT ANGLED TO SHOW FAR END OF CORRIDOR AND OTHER EXPERIMENTAL GEAR. CAMERA FACES WEST. INL NEGATIVE NO. HD46-30-3. Mike Crane, Photographer, 2/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  10. Visual Feedback Stabilization of Balancing Tasks with Camera Misalignment

    NASA Astrophysics Data System (ADS)

    Hirata, Kentaro; Mizuno, Takashi

    In this paper, we consider visual feedback stabilization that tolerates small camera misalignment. Specifically, a balancing task with a cart-pendulum system using camera images is examined. Such a task is known to rely heavily on detection of the vertical direction, and the angle measurement error due to camera misalignment could be fatal for stabilization. From a mathematical model of the measurement error, the effect of the misalignment is naturally represented by an affine perturbation to the coefficient matrix of the output equation. Motivated by this fact, a special type of robust dynamic output feedback stabilization against polytopic uncertainty is investigated. By solving the related BMI (bilinear matrix inequality), one can design a controller that tolerates the camera misalignment to some extent. The result is verified via experiments.

  11. Posture metrology for aerospace camera in the assembly of spacecraft

    NASA Astrophysics Data System (ADS)

    Yang, ZaiHua; Yang, Song; Wan, Bile; Pan, Tingyao; Long, Changyu

    2016-01-01

    During the spacecraft assembly process, the posture of the aerospace camera relative to the spacecraft coordinate system needs to be measured precisely, because the posture data are very important for Earth observation. In order to measure the angles between the camera optical axis and the spacecraft coordinate system's three axes x, y, z, a measurement scheme was designed. The scheme was based on the principle of space intersection measurement with theodolites. Three theodolites were used to collimate the camera axis and two faces of a base cube, respectively. Then, through aiming at each other, a measurement network was built. Finally, the posture of the camera was measured. The error analysis and measurement experiments showed that the precision can reach 6″. This method has been used in the assembly of the satellite GF-2 with satisfactory results.

  12. Utilization of consumer level digital cameras in astronomy

    NASA Astrophysics Data System (ADS)

    Páta, Petr; Fliegel, Karel; Klíma, Miloš; Blažek, Martin; Řeřábek, Martin

    2010-08-01

    This paper presents a study of the possible utilization of digital single-lens reflex (DSLR) cameras in astronomy. DSLRs have a great advantage over professional equipment in cost efficiency, with comparable usability for selected purposes. The quality of the electro-optical system in a DSLR camera determines the areas where it can be used with acceptable precision. First, a set of camera parameters important for astronomical use is introduced. The color filter array (CFA) structure, demosaicing algorithm, image sensor spectral properties, and noise and transfer characteristics are among the most important of these parameters, and they are analyzed further in the paper. Compression of astronomical images using the KLT approach is also described. The potential impact of these parameters on position and photometric measurements is presented based on the analysis and on measurements with a wide-angle lens. The prospective use of a consumer DSLR camera as a substitute for expensive devices is discussed.

  13. Role of Optical Coherence Tomography in Assessing Anterior Chamber Angles

    PubMed Central

    Kochupurakal, Reema Thomas; Jha, Kirti Nath; Rajalakshmi, A.R.; Nagarajan, Swathi; Ezhumalai, G.

    2016-01-01

    Introduction: Gonioscopy is the gold standard in assessing anterior chamber angles. However, interobserver variations are common and there is a need for a reliable objective method of assessment. Aim: To compare the anterior chamber angle by gonioscopy and Spectral Domain Optical Coherence Tomography (SD-OCT) in individuals with a shallow anterior chamber. Materials and Methods: This comparative observational study was conducted in a rural tertiary multi-speciality teaching hospital. A total of 101 eyes of 54 patients with a shallow anterior chamber on slit lamp evaluation were included. The anterior chamber angle was graded by gonioscopy using the Shaffer grading system. Angles were also assessed by SD-OCT with the Trabecular Iris Angle (TIA) and Angle Opening Distance (AOD). The Chi-square test, sensitivity, specificity, and positive and negative predictive values were used to find the correlation between OCT parameters and gonioscopy grading. Results: Females represented 72.7% of the sample. The mean age was 53.93 ± 8.24 years and the mean anterior chamber depth was 2.47 ± 0.152 mm. Shaffer grade ≤ 2 was identified in 95 (94%) superior, 42 (41.5%) inferior, 65 (64.3%) nasal and 57 (56.4%) temporal quadrants. Cut-off values of TIA ≤ 22° and AOD ≤ 290 μm were taken as narrow angles on SD-OCT. TIA ≤ 22° was found in 88 (92.6%) nasal and 87 (87%) temporal angles. AOD ≤ 290 μm was found in 73 (76.8%) nasal and 83 (83%) temporal quadrants. Sensitivity in detecting narrow angles was 90.7% and 82.2% for TIA and AOD, while specificity was 11.7% and 23.4%, respectively. Conclusion: Individuals were found to have narrow angles more often with SD-OCT. Sensitivity was high and specificity was low in detecting narrow angles compared to gonioscopy, making it an unreliable tool for screening. PMID:27190851
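    The sensitivity and specificity figures quoted above follow from a standard 2×2 confusion table against the gonioscopy reference. A minimal sketch with made-up counts, not the study's data:

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity and specificity from a 2x2 confusion table:
    tp/fn counted among gonioscopy-narrow eyes, tn/fp among
    gonioscopy-open eyes (counts here are illustrative only)."""
    sensitivity = tp / (tp + fn)  # narrow eyes correctly flagged
    specificity = tn / (tn + fp)  # open eyes correctly passed
    return sensitivity, specificity

# High sensitivity with very low specificity, the pattern the
# study reports for the SD-OCT cut-offs:
sens, spec = sens_spec(tp=90, fn=10, tn=20, fp=80)
```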

  14. Wide-angle imaging system with fiberoptic components providing angle-dependent virtual material stops

    NASA Technical Reports Server (NTRS)

    Vaughan, Arthur H. (Inventor)

    1993-01-01

    A strip imaging wide angle optical system is provided. The optical system is provided with a 'virtual' material stop to avoid aberrational effects inherent in wide angle optical systems. The optical system includes a spherical mirror section for receiving light from a 180 deg strip or arc of a target image. Light received by the spherical mirror section is reflected to a frustoconical mirror section for subsequent rereflection to a row of optical fibers. Each optical fiber transmits a portion of the received light to a detector. The optical system exploits the narrow cone of acceptance associated with optical fibers to substantially eliminate vignetting effects inherent in wide angle systems. Further, the optical system exploits the narrow cone of acceptance of the optical fibers to substantially limit spherical aberration. The optical system is ideally suited for any application wherein a 180 deg strip image needs to be detected, and is particularly well adapted for use in hostile environments such as planetary exploration.

  15. Multispectral Photometry of the Moon and Absolute Calibration of the Clementine UV/Vis Camera

    NASA Astrophysics Data System (ADS)

    Hillier, John K.; Buratti, Bonnie J.; Hill, Kathryn

    1999-10-01

    We present a multispectral photometric study of the Moon between solar phase angles of 0 and 85°. Using Clementine images obtained between 0.4 and 1.0 μm, we produce a comprehensive study of the lunar surface containing the following results: (1) empirical photometric functions for the spectral range and viewing and illumination geometries mentioned, (2) photometric modeling that derives the physical properties of the upper regolith and includes a detailed study of the causes of the lunar opposition surge, (3) an absolute calibration of the Clementine UV/Vis camera. The calibration procedure given on the Clementine calibration web site produces reflectances relative to a halon standard, which furthermore appear significantly higher than those seen in ground-based observations. By comparing Clementine observations with prior ground-based observations of 15 sites on the Moon, we have determined a good absolute calibration of the Clementine UV/Vis camera. A correction factor of 0.532 has been determined to convert the web site (www.planetary.brown.edu/clementine/calibration.html) reflectances to absolute values. From the calibrated data, we calculate empirical phase functions useful for performing photometric corrections to observations of the Moon between solar phase angles of 0 and 85° and in the spectral range 0.4 to 1.0 μm. Finally, the calibrated data are used to fit a version of Hapke's photometric model modified to incorporate a new formulation, developed in this paper, of the lunar opposition surge which includes coherent backscatter. Recent studies of the lunar opposition effect have yielded contradictory results as to the mechanism responsible: shadow hiding, coherent backscatter, or both. We find that most of the surge can be explained by shadow hiding with a halfwidth of ˜8°. However, for the brightest regions (the highlands at 0.75-1.0 μm) a small additional narrow component (halfwidth of <2°) of total amplitude ˜1/6 to 1/4 that of the shadow hiding surge is

  16. Evidence for narrow structure in the analyzing power of the ³He(p⃗,d)X reaction

    SciTech Connect

    L. Santi; M. Barlett; D. Ciskowski; R. Garfagnini; M. M. Gazzaly; G. W. Hoffmann; K. W. Jones; M. A. Nasser, G. Pauletta; C. Smith; N. Tanaka; R. Whitney

    1988-11-01

    The differential cross section and the analyzing power Ay have been measured for the ³He(p⃗,d)X reaction at a deuteron angle of 22° (lab) and a proton momentum of 1.46 GeV/c. Evidence of narrow structure in the missing-mass dependence of Ay was observed. The masses of the maxima are compared with other reports of narrow resonance-like structure and with the predictions of theory. Significant correspondence is found.

  17. 980 nm narrow linewidth Yb-doped phosphate fiber laser

    NASA Astrophysics Data System (ADS)

    Li, Pingxue; Yao, Yifei; Hu, Haowei; Chi, Junjie; Yang, Chun; Zhao, Ziqiang; Zhang, Guangju

    2014-12-01

    A narrow-linewidth ytterbium (Yb)-doped phosphate fiber laser based on a fiber Bragg grating (FBG) operating around 980 nm is reported. Two different kinds of cavity are applied to obtain the 980 nm narrow-linewidth output. One cavity consists of a high-reflection FBG with 0.35 nm bandwidth and the Yb-doped phosphate fiber end cleaved at a 0° angle, which generates a maximum output power of 25 mW. The other resonator is composed of a single-mode Yb-doped phosphate fiber and a pair of FBGs. Over 10.7 mW of stable continuous-wave output is obtained with two longitudinal modes at 980 nm. A detailed analysis and discussion of the results is given.

  18. Water–Soluble Narrow Line Radicals for Dynamic Nuclear Polarization

    PubMed Central

    Haze, Olesya; Corzilius, Björn; Smith, Albert A.; Griffin, Robert G.; Swager, Timothy M.

    2012-01-01

    The synthesis of air-stable, highly water-soluble organic radicals containing a 1,3-bisdiphenylene-2-phenylallyl (BDPA) core is reported. A sulfonated derivative, SA-BDPA, retains the narrow EPR linewidth (<30 MHz at 5 T) of the parent BDPA in highly concentrated glycerol/water solutions (40 mM), which enables its use as a polarizing agent for solid effect dynamic nuclear polarization (SE DNP). A sensitivity enhancement of 110 was obtained in high-field magic-angle-spinning nuclear magnetic resonance (MAS NMR) experiments. The ease of synthesis and high maximum enhancements obtained with the BDPA-based radicals constitute a major advance over the trityl-type narrow-line polarizing agents. PMID:22917088

  19. Accurate camera calibration method specialized for virtual studios

    NASA Astrophysics Data System (ADS)

    Okubo, Hidehiko; Yamanouchi, Yuko; Mitsumine, Hideki; Fukaya, Takashi; Inoue, Seiki

    2008-02-01

    Virtual studio is a popular technology for TV programs that makes it possible to synchronize computer graphics (CG) with real-shot images under camera motion. Since high geometrical matching accuracy between CG and real-shot images cannot normally be expected from a real-time system, directors sometimes compromise so that the problem does not become visible. We therefore developed a hybrid camera calibration method and CG generating system to achieve accurate geometrical matching of CG and real shots in a virtual studio. Our calibration method is intended for a camera system on a platform and tripod with rotary encoders that can measure pan/tilt angles. To solve for the camera model and initial pose, we enhanced the bundle adjustment algorithm to fit the camera model, using the pan/tilt data as known parameters and optimizing all other parameters to be invariant against the pan/tilt values. This initialization yields a highly accurate camera position and orientation consistent with any pan/tilt values. We also created a CG generator that implements the lens distortion function with GPU programming. By applying the lens distortion parameters obtained in the camera calibration process, we obtained good compositing results.

  20. Cryogenic Detectors (Narrow Field Instruments)

    NASA Astrophysics Data System (ADS)

    Hoevers, H.; Verhoeve, P.

    Two cryogenic imaging spectrometer arrays are currently considered as focal plane instruments for XEUS. The narrow field imager 1 (NFI 1) will cover the energy range from 0.05 to 3 keV with an energy resolution of 2 eV, or better, at 500 eV. A second narrow field imager (NFI 2) covers the energy range from 1 to 15 keV with an energy resolution of 2 eV (at 1 keV) and 5 eV (at 7 keV), creating some overlap with part of the NFI 1 energy window. Both narrow field imagers have a 0.5 arcmin field of view. Their imaging capabilities are matched to the XEUS optics of 2 to 5 arcsec, leading to 1 arcsec pixels. The detector arrays will be cooled by a closed-cycle system comprising a mechanical cooler with a base temperature of 2.5 K and either a low-temperature ³He sorption pump providing the very-low-temperature stage and/or an Adiabatic Demagnetization Refrigerator (ADR). The ADR cooler is explicitly needed to cool the NFI 2 array. For narrow field imager 1, a 48 × 48 element array of superconducting tunnel junctions (STJs) is currently envisaged. Its operating temperature is in the range between 30 and 350 mK. Small single Ta STJs (20-50 μm on a side) have shown 3.5 eV (FWHM) resolution at E = 525 eV, and small arrays have been successfully demonstrated (6 × 6 pixels) or are currently being tested (10 × 12 pixels). Alternatively, a prototype Distributed Read-Out Imaging Device (DROID), consisting of a linear superconducting Ta absorber of 20 × 100 μm² including a 20 × 20 μm STJ for readout at either end, has shown a measured energy resolution of 2.4 eV (FWHM) at E = 500 eV. Simulations involving the diffusion properties as well as loss and tunnel rates have shown that the performance can be further improved by slight modifications to the geometry, and that the size of the DROIDs can be increased to 0.5-1.0 mm without loss in energy resolution. The relatively large areas and good energy resolution compared to single STJs make DROIDs good candidates for the

  1. DEVICE CONTROLLER, CAMERA CONTROL

    Energy Science and Technology Software Center (ESTSC)

    1998-07-20

    This is a C++ application that is the server for the camera control system. Devserv drives serial devices, such as cameras and videoswitchers used in a videoconference, upon request from a client such as the ccint program. Devserv listens on UDP ports for clients to make network connections. After a client connects and sends a request to control a device (such as to pan, tilt, or zoom a camera, or do picture-in-picture with a videoswitcher), devserv formats the request into an RS232 message appropriate for the device and sends this message over the serial port to which the device is connected. Devserv then reads the reply from the device from the serial port, and then formats and sends via multicast a status message. In addition, devserv periodically multicasts status or description messages so that all clients connected to the multicast channel know what devices are supported, their ranges of motion, and the current position. The software design employs a class hierarchy such that an abstract base class for devices can be subclassed into classes for various device categories (e.g. sonyevid30, canonvcc4, panasonicwjmx50, etc.), which are further subclassed into classes for specific devices. The devices currently supported are the Sony EVI-D30, Canon VCC1, Canon VCC3, and Canon VCC4 cameras and the Panasonic WJ-MX50 videoswitcher. However, developers can extend the class hierarchy to support other devices.
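    The described class hierarchy, an abstract device base class translating generic requests into device-specific serial messages, could be sketched as follows (in Python rather than the application's C++; class names, method names, and message formats are hypothetical):

```python
from abc import ABC, abstractmethod

class Device(ABC):
    """Abstract base for serial devices, mirroring the devserv design:
    each subclass turns a generic client request into the byte string
    that its hardware expects on the serial port."""

    @abstractmethod
    def format_command(self, request: str) -> bytes:
        """Translate a generic request into a device-specific message."""

class SonyEviD30(Device):
    def format_command(self, request: str) -> bytes:
        # The real server would emit this camera's binary protocol;
        # a readable placeholder is used here.
        return f"VISCA:{request}".encode()

class PanasonicWJMX50(Device):
    def format_command(self, request: str) -> bytes:
        return f"MX50:{request}".encode()

# The server can dispatch to any Device without knowing its type:
cmd = SonyEviD30().format_command("pan_left")
```

    New hardware is supported by adding another subclass, without touching the dispatch code, which is the extensibility point the record describes.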

  2. Adaptive compressive sensing camera

    NASA Astrophysics Data System (ADS)

    Hsu, Charles; Hsu, Ming K.; Cha, Jae; Iwamura, Tomo; Landa, Joseph; Nguyen, Charles; Szu, Harold

    2013-05-01

    We have embedded an Adaptive Compressive Sensing (ACS) algorithm on a Charge-Coupled-Device (CCD) camera, based on the simple concept that each pixel is a charge bucket whose charges come from the Einstein photoelectric conversion effect. Applying the manufacturing design principle, we allow altering each working component by at most one step. We then simulated what such a camera could do for real-world persistent surveillance, taking into account diurnal, all-weather, and seasonal variations. The data storage savings are immense, and the order of magnitude of the saving is inversely proportional to the target angular speed. We designed two new CCD camera components. Owing to mature CMOS (complementary metal-oxide-semiconductor) technology, the on-chip Sample and Hold (SAH) circuitry can be designed as dual Photon Detector (PD) analog circuitry for change detection that predicts skipping or going forward at a sufficient sampling frame rate. For an admitted frame, a purely random sparse matrix [Φ] is implemented at each bucket-pixel level to bias the charge transport voltage toward neighboring buckets or not; if not, the charge goes to the ground drainage. Since a snapshot image is not a video, we could not apply the usual MPEG video compression and Huffman entropy codec, nor the powerful WaveNet wrapper, at the sensor level. We compare (i) pre-processing: FFT, thresholding of significant Fourier-mode components, and inverse FFT to check PSNR; and (ii) post-processing image recovery, done selectively by a CDT&D adaptive version of linear programming with L1 minimization and L2 similarity. For (ii) the SAH circuitry must determine, in new-frame selection, the degree of information (d.o.i.) K(t), which dictates the purely random linear sparse combination of measurement data à la [Φ]M,N: M(t) = K(t) log N(t).
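    The measurement budget M(t) = K(t) log N(t) can be illustrated with a toy random projection. The dense Gaussian matrix below is a generic compressive-sensing stand-in, not the on-chip sparse [Φ] of the abstract:

```python
import numpy as np

def compressive_measure(x, k):
    """Project a length-N signal onto M = ceil(K * log N) random
    measurements, the budget M(t) = K(t) log N(t) quoted in the
    abstract; the Gaussian Phi here is illustrative."""
    n = x.size
    m = int(np.ceil(k * np.log(n)))
    rng = np.random.default_rng(0)          # fixed seed for repeatability
    phi = rng.standard_normal((m, n)) / np.sqrt(m)
    return phi @ x, m

x = np.zeros(1024)
x[[3, 97, 500]] = 1.0                       # K = 3 sparse scene change
y, m = compressive_measure(x, k=3)          # 21 measurements, not 1024
```

    A sparse recovery solver (the L1 minimization of step (ii)) would then reconstruct x from the short vector y.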

  3. Phoenix Robotic Arm Camera

    NASA Astrophysics Data System (ADS)

    Keller, H. U.; Goetz, W.; Hartwig, H.; Hviid, S. F.; Kramm, R.; Markiewicz, W. J.; Reynolds, R.; Shinohara, C.; Smith, P.; Tanner, R.; Woida, P.; Woida, R.; Bos, B. J.; Lemmon, M. T.

    2008-10-01

    The Phoenix Robotic Arm Camera (RAC) is a variable-focus color camera mounted to the Robotic Arm (RA) of the Phoenix Mars Lander. It is designed to acquire both close-up images of the Martian surface and microscopic images (down to a scale of 23 μm/pixel) of material collected in the RA scoop. The mounting position at the end of the Robotic Arm allows the RAC to be actively positioned for imaging of targets not easily seen by the Stereo Surface Imager (SSI), such as excavated trench walls and targets under the Lander structure. Color information is acquired by illuminating the target with red, green, and blue light-emitting diodes. Digital terrain models (DTM) can be generated from RAC images acquired from different view points. This can provide high-resolution stereo information about fine details of the trench walls. The large stereo baseline possible with the arm can also provide a far-field DTM. The primary science objectives of the RAC are the search for subsurface soil/ice layering at the landing site and the characterization of scoop samples prior to delivery to other instruments on board Phoenix. The RAC shall also provide low-resolution panoramas in support of SSI activities and acquire images of the Lander deck for instrument and Lander check out. The camera design was inherited from the unsuccessful Mars Polar Lander mission (1999) and further developed for the (canceled) Mars Surveyor 2001 Lander (MSL01). Extensive testing and partial recalibration qualified the MSL01 RAC flight model for integration into the Phoenix science payload.

  4. DEVICE CONTROLLER, CAMERA CONTROL

    SciTech Connect

    Perry, Marcia

    1998-07-20

    This is a C++ application that is the server for the camera control system. Devserv drives serial devices, such as cameras and videoswitchers used in a videoconference, upon request from a client such as the ccint program. Devserv listens on UDP ports for clients to make network connections. After a client connects and sends a request to control a device (such as to pan, tilt, or zoom a camera, or do picture-in-picture with a videoswitcher), devserv formats the request into an RS232 message appropriate for the device and sends this message over the serial port to which the device is connected. Devserv then reads the reply from the device from the serial port, and then formats and sends via multicast a status message. In addition, devserv periodically multicasts status or description messages so that all clients connected to the multicast channel know what devices are supported, their ranges of motion, and the current position. The software design employs a class hierarchy such that an abstract base class for devices can be subclassed into classes for various device categories (e.g. sonyevid30, canonvcc4, panasonicwjmx50, etc.), which are further subclassed into classes for specific devices. The devices currently supported are the Sony EVI-D30, Canon VCC1, Canon VCC3, and Canon VCC4 cameras and the Panasonic WJ-MX50 videoswitcher. However, developers can extend the class hierarchy to support other devices.

  5. Neutron Imaging Camera

    NASA Technical Reports Server (NTRS)

    Hunter, Stanley; deNolfo, G. A.; Barbier, L. M.; Link, J. T.; Son, S.; Floyd, S. R.; Guardala, N.; Skopec, M.; Stark, B.

    2008-01-01

    The Neutron Imaging Camera (NIC) is based on the Three-dimensional Track Imager (3DTI) technology developed at GSFC for gamma-ray astrophysics applications. The 3DTI, a large-volume time-projection chamber, provides accurate (approximately 0.4 mm resolution) 3-D tracking of charged particles. The incident direction of fast neutrons, En > 0.5 MeV, is reconstructed from the momenta and energies of the proton and triton fragments resulting from ³He(n,p)³H interactions in the 3DTI volume. The performance of the NIC from laboratory and accelerator tests is presented.
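    The reconstruction of the incident neutron direction from the charged fragments is, at its core, momentum conservation. A minimal sketch, assuming the ³He target is at rest and using simple vector bookkeeping (not the instrument's actual reconstruction code):

```python
import numpy as np

def neutron_direction(p_proton, p_triton):
    """Unit vector of the incident neutron from the vector sum of the
    proton and triton momenta measured after a 3He(n,p)3H interaction
    (momentum conservation with the 3He target taken as at rest)."""
    p_n = np.asarray(p_proton, dtype=float) + np.asarray(p_triton, dtype=float)
    return p_n / np.linalg.norm(p_n)

# Fragment momenta (arbitrary units) whose transverse components cancel:
d = neutron_direction([0.3, 0.1, 0.0], [0.1, -0.1, 0.0])
```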

  6. 91. 22'X34' original blueprint, Variable-Angle Launcher, 'CONNECTING BRIDGE, REAR VIEW ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    91. 22'X34' original blueprint, Variable-Angle Launcher, 'CONNECTING BRIDGE, REAR VIEW CAMERA HOUSE ASSEMBLY' drawn at 3/8=1'-0', 3'=1'-0'. (BUORD Sketch # 209042). - Variable Angle Launcher Complex, Variable Angle Launcher, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  7. Angled Layers in Super Resolution

    NASA Technical Reports Server (NTRS)

    2004-01-01

    Researchers used a special imaging technique with the panoramic camera on NASA's Mars Exploration Rover Opportunity to get as detailed a look as possible at a target region near the eastern foot of 'Burns Cliff.' The intervening terrain was too difficult for driving the rover closer. The target is the boundary between two sections of layered rock. The layers in the lower section (left) run at a marked angle to the layers in the next higher section (right).

    This view is the product of a technique called super resolution. It was generated from data acquired on sol 288 of Opportunity's mission (Nov. 14, 2004) from a position along the southeast wall of 'Endurance Crater.' Resolution slightly higher than normal for the panoramic camera was synthesized for this view by combining 17 separate images of this scene, each one 'dithered' or pointed slightly differently from the previous one. Computer manipulation of the individual images was then used to generate a new synthetic view of the scene in a process known mathematically as iterative deconvolution, but referred to informally as super resolution. Similar methods have been used to enhance the resolution of images from the Mars Pathfinder mission and the Hubble Space Telescope.
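
    A toy version of why dithered frames carry sub-pixel information can be sketched with shift-and-add, a simpler cousin of the iterative deconvolution actually used for this product (the function, integer sub-pixel offsets, and grid scale are illustrative assumptions):

```python
def shift_and_add(frames, offsets, scale=2):
    """Toy super-resolution by shift-and-add: place each low-resolution frame
    onto an upsampled grid at its known dither offset and average overlaps.
    frames are 2-D lists; offsets are (dy, dx) in upsampled-grid pixels."""
    h, w = len(frames[0]), len(frames[0][0])
    acc = [[0.0] * (w * scale) for _ in range(h * scale)]
    cnt = [[0] * (w * scale) for _ in range(h * scale)]
    for frame, (dy, dx) in zip(frames, offsets):
        for y in range(h):
            for x in range(w):
                acc[y * scale + dy][x * scale + dx] += frame[y][x]
                cnt[y * scale + dy][x * scale + dx] += 1
    return [[a / c if c else 0.0 for a, c in zip(ar, cr)]
            for ar, cr in zip(acc, cnt)]

# One 2x2 frame placed at offset (0, 0) on a 4x4 grid:
hi_res = shift_and_add([[[1.0, 2.0], [3.0, 4.0]]], [(0, 0)])
```

    With several frames at different offsets, the interleaved samples fill the finer grid, which is the information the deconvolution step exploits.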

  8. LRO Camera Imaging of the Moon: Apollo 17 and other Sites for Ground Truth

    NASA Astrophysics Data System (ADS)

    Jolliff, B. L.; Wiseman, S. M.; Robinson, M. S.; Lawrence, S.; Denevi, B. W.; Bell, J. F.

    2009-12-01

    One of the fundamental goals of the Lunar Reconnaissance Orbiter (LRO) is the determination of mineralogic and compositional distributions and their relation to geologic features on the Moon’s surface. Through a combination of imaging with the LRO narrow-angle cameras and wide-angle camera (NAC, WAC), very fine-scale geologic features are resolved with better than meter-per-pixel resolution (NAC) and correlated to spectral variations mapped with the lower resolution, 7-band WAC (400-m/pix, ultraviolet bands centered at 321 and 360 nm; 100-m/pix, visible bands centered at 415, 566, 604, 643, and 689 nm). Keys to understanding spectral variations in terms of composition, and relationships between compositional variations and surface geology, are ground-truth sites where surface compositions and mineralogy, as well as geology and geologic history, are well known. The Apollo 17 site is especially useful because the site geology includes a range of features from high-Ti mare basalts to Serenitatis-Basin-related massifs containing basin impact-melt breccia and feldspathic highlands materials, and a regional black and orange pyroclastic deposit. Moreover, relative and absolute ages of these features are known. In addition to rock samples, astronauts collected well-documented soil samples at 22 different sample locations across this diverse area. Many of these sample sites can be located in the multispectral data using the co-registered NAC images. Digital elevation data are used to normalize illumination geometry and thus fully exploit the multispectral data and compare derived compositional parameters for different geologic units. Regolith characteristics that are known in detail from the Apollo 17 samples, such as maturity and petrography of mineral, glass, and lithic components, contribute to spectral variations and are considered in the assessment of spectral variability at the landing site. In this work, we focus on variations associated with the ilmenite content

  9. Mars Science Laboratory Engineering Cameras

    NASA Technical Reports Server (NTRS)

    Maki, Justin N.; Thiessen, David L.; Pourangi, Ali M.; Kobzeff, Peter A.; Lee, Steven W.; Dingizian, Arsham; Schwochert, Mark A.

    2012-01-01

    NASA's Mars Science Laboratory (MSL) Rover, which launched to Mars in 2011, is equipped with a set of 12 engineering cameras. These cameras are build-to-print copies of the Mars Exploration Rover (MER) cameras, which were sent to Mars in 2003. The engineering cameras weigh less than 300 grams each and use less than 3 W of power. Images returned from the engineering cameras are used to navigate the rover on the Martian surface, deploy the rover robotic arm, and ingest samples into the rover sample processing system. The navigation cameras (Navcams) are mounted to a pan/tilt mast and have a 45-degree square field of view (FOV) with a pixel scale of 0.82 mrad/pixel. The hazard avoidance cameras (Hazcams) are body-mounted to the rover chassis in the front and rear of the vehicle and have a 124-degree square FOV with a pixel scale of 2.1 mrad/pixel. All of the cameras utilize a frame-transfer CCD (charge-coupled device) with a 1024x1024 imaging region and red/near IR bandpass filters centered at 650 nm. The MSL engineering cameras are grouped into two sets of six: one set of cameras is connected to rover computer A and the other set is connected to rover computer B. The MSL rover carries 8 Hazcams and 4 Navcams.

  10. Refraction of narrow probing light beam in multilayer optical fiber

    NASA Astrophysics Data System (ADS)

    Mirovitskaya, S. D.; Kudryavtsev, D. L.

    1984-11-01

    For an experimental design and performance evaluation of optical fibers, it is necessary to establish an analytical relation between the refraction angle of a probing light beam and the geometrical parameters of the fiber. Here the problem is solved by application of Snell's law for a narrow light beam, first to a double-layer fiber consisting of a gradient core inside a homogeneous sheath and then to a triple-layer fiber consisting of a gradient core inside a double-layer sheath.
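
    For the simplest (flat, homogeneous-layer) case, Snell's law makes n_i sin(theta_i) invariant across interfaces, so the beam angle in each layer follows directly from the entry angle. A sketch of that invariant only; the paper's fiber geometry is cylindrical and its core is a gradient medium, which this does not capture:

```python
import math

def refraction_angles(indices, entry_angle_deg):
    """Beam angle (from the normal, in degrees) inside each flat layer,
    using the Snell invariant n_i * sin(theta_i) = n_0 * sin(theta_0).
    indices[0] is the medium the beam enters from."""
    s = indices[0] * math.sin(math.radians(entry_angle_deg))
    return [math.degrees(math.asin(s / n)) for n in indices]

# A narrow beam entering at 30 degrees from air into two glass layers:
angles = refraction_angles([1.0, 1.5, 1.45], 30.0)
```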

  11. Stereoscopic camera design

    NASA Astrophysics Data System (ADS)

    Montgomery, David J.; Jones, Christopher K.; Stewart, James N.; Smith, Alan

    2002-05-01

    It is clear from the literature that the majority of work in stereoscopic imaging is directed towards the development of modern stereoscopic displays. As costs come down, wider public interest in this technology is expected to increase. This new technology would require new methods of image formation. Advances in stereo computer graphics will of course lead to the creation of new stereo computer games, graphics in films etc. However, the consumer would also like to see real-world stereoscopic images, pictures of family, holiday snaps etc. Such scenery would have wide ranges of depth to accommodate and would need also to cope with moving objects, such as cars, and in particular other people. Thus, the consumer acceptance of auto/stereoscopic displays and 3D in general would be greatly enhanced by the existence of a quality stereoscopic camera. This paper will cover an analysis of existing stereoscopic camera designs and show that they can be categorized into four different types, with inherent advantages and disadvantages. A recommendation is then made with regard to 3D consumer still and video photography. The paper will go on to discuss this recommendation and describe its advantages and how it can be realized in practice.

  12. Perception of Perspective Angles.

    PubMed

    Erkelens, Casper J

    2015-06-01

    We perceive perspective angles, that is, angles that have an orientation in depth, differently from what they are in physical space. Extreme examples are angles between rails of a railway line or between lane dividers of a long and straight road. In this study, subjects judged perspective angles between bars lying on the floor of the laboratory. Perspective angles were also estimated from pictures taken from the same point of view. Converging and diverging angles were judged to test three models of visual space. Four subjects evaluated the perspective angles by matching them to nonperspective angles, that is, angles between the legs of a compass oriented in the frontal plane. All subjects judged both converging and diverging angles larger than the physical angle and smaller than the angles in the proximal stimuli. A model of shallow visual space describes the results. According to the model, lines parallel to visual lines, vanishing at infinity in physical space, converge to visual lines in visual space. The perceived shape of perspective angles is incompatible with the perceived length and width of the bars. The results have significance for models of visual perception and practical implications for driving and flying in poor visibility conditions. PMID:27433312
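
    The geometric effect behind these judgments, that the proximal (image-plane) angle of a physical angle oriented in depth can greatly exceed the physical angle, can be checked with a small pinhole-projection sketch (the camera model, coordinates, and function name are illustrative, not the study's apparatus):

```python
import math

def projected_angle(apex, p1, p2, f=1.0):
    """Angle (degrees) between two rays at an apex after pinhole projection
    onto the image plane (camera at origin looking along +z, focal length f).
    Points are (x, y, z) in camera coordinates."""
    def proj(p):
        x, y, z = p
        return (f * x / z, f * y / z)
    ax, ay = proj(apex)
    v1 = (proj(p1)[0] - ax, proj(p1)[1] - ay)
    v2 = (proj(p2)[0] - ax, proj(p2)[1] - ay)
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    return math.degrees(math.acos(dot / (math.hypot(*v1) * math.hypot(*v2))))

# A 90-degree angle lying on a floor one unit below the camera, opening
# away from the viewer:
a = projected_angle((0.0, -1.0, 4.0), (1.0, -1.0, 5.0), (-1.0, -1.0, 5.0))
```

    Here the physical angle at the apex is 90 degrees, while its projection comes out near 152 degrees, in line with the paper's point that proximal-stimulus angles exceed physical ones.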

  13. Perception of Perspective Angles

    PubMed Central

    2015-01-01

    We perceive perspective angles, that is, angles that have an orientation in depth, differently from what they are in physical space. Extreme examples are angles between rails of a railway line or between lane dividers of a long and straight road. In this study, subjects judged perspective angles between bars lying on the floor of the laboratory. Perspective angles were also estimated from pictures taken from the same point of view. Converging and diverging angles were judged to test three models of visual space. Four subjects evaluated the perspective angles by matching them to nonperspective angles, that is, angles between the legs of a compass oriented in the frontal plane. All subjects judged both converging and diverging angles larger than the physical angle and smaller than the angles in the proximal stimuli. A model of shallow visual space describes the results. According to the model, lines parallel to visual lines, vanishing at infinity in physical space, converge to visual lines in visual space. The perceived shape of perspective angles is incompatible with the perceived length and width of the bars. The results have significance for models of visual perception and practical implications for driving and flying in poor visibility conditions. PMID:27433312

  14. Object recognition through turbulence with a modified plenoptic camera

    NASA Astrophysics Data System (ADS)

    Wu, Chensheng; Ko, Jonathan; Davis, Christopher

    2015-03-01

    Atmospheric turbulence adds accumulated distortion to images obtained by cameras and surveillance systems. When the turbulence grows stronger or when the object is further away from the observer, increasing the recording device resolution helps little to improve the quality of the image. Many sophisticated methods to correct the distorted images have been invented, such as using a known feature on or near the target object to perform a deconvolution process, or use of adaptive optics. However, most of the methods depend heavily on the object's location, and optical ray propagation through the turbulence is not directly considered. Alternatively, selecting a lucky image over many frames provides a feasible solution, but at the cost of time. In our work, we propose an innovative approach to improving image quality through turbulence by making use of a modified plenoptic camera. This type of camera adds a micro-lens array to a traditional high-resolution camera to form a semi-camera array that records duplicate copies of the object as well as "superimposed" turbulence at slightly different angles. By performing several steps of image reconstruction, turbulence effects will be suppressed to reveal more details of the object independently (without finding references near the object). Meanwhile, the redundant information obtained by the plenoptic camera raises the possibility of performing lucky image algorithmic analysis with fewer frames, which is more efficient. In our work, the details of our modified plenoptic cameras and image processing algorithms will be introduced. The proposed method can be applied to coherently illuminated objects as well as incoherently illuminated objects. Our result shows that the turbulence effect can be effectively suppressed by the plenoptic camera in the hardware layer and a reconstructed "lucky image" can help the viewer identify the object even when a "lucky image" by ordinary cameras is not achievable.
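
    "Lucky image" selection is conventionally done by ranking frames with a sharpness metric and keeping the least-degraded one. A minimal sketch using Laplacian variance, a generic focus measure, not the authors' reconstruction pipeline:

```python
def sharpness(img):
    """Variance of a discrete 4-neighbour Laplacian over interior pixels:
    a common proxy for how little blur (e.g. from turbulence) a frame has."""
    h, w = len(img), len(img[0])
    lap = [img[y + 1][x] + img[y - 1][x] + img[y][x + 1] + img[y][x - 1]
           - 4 * img[y][x]
           for y in range(1, h - 1) for x in range(1, w - 1)]
    mean = sum(lap) / len(lap)
    return sum((v - mean) ** 2 for v in lap) / len(lap)

def lucky_frame(frames):
    """Pick the 'lucky' frame: the one with the highest sharpness score."""
    return max(frames, key=sharpness)
```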

  15. Finding success in ACA narrow networks.

    PubMed

    Daly, Rich

    2015-12-01

    Health systems should carefully consider the specific details of their local market before deciding to launch a narrow network plan or to join an existing insurer's narrow network. Key steps to take in the evaluation process include: Determining insurer interest in forming a true partnership. Assessing capability for greater efficiency. Assessing insurer priorities. Obtaining access to enrollee data. Identifying capabilities that differentiate it from other narrow networks. PMID:26793944

  16. Circuitry for Angle Measurements

    NASA Technical Reports Server (NTRS)

    Currie, J. R.; Kissel, R. R.

    1983-01-01

    Angle resolver pulsed and read under microprocessor control. Pulse generator excites resolver windings with dual slope pulse. System sequentially reads sine and cosine windings. Microprocessor determines angle through which resolver shaft turned from reference angle. Suitable applications include rate tables, antenna direction controllers, and machine tools.
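
    The microprocessor step described above, recovering the shaft angle from the sequentially read sine and cosine windings, is essentially a four-quadrant arctangent. A minimal sketch, assuming idealized, equal-amplitude winding readings:

```python
import math

def resolver_angle(sine_reading, cosine_reading):
    """Shaft angle in degrees (0-360) from the sine and cosine winding
    amplitudes, via a four-quadrant arctangent."""
    return math.degrees(math.atan2(sine_reading, cosine_reading)) % 360.0
```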

  17. Transmission electron microscope CCD camera

    DOEpatents

    Downing, Kenneth H.

    1999-01-01

    In order to improve the performance of a CCD camera on a high voltage electron microscope, an electron decelerator is inserted between the microscope column and the CCD. This arrangement optimizes the interaction of the electron beam with the scintillator of the CCD camera while retaining optimization of the microscope optics and of the interaction of the beam with the specimen. Changing the electron beam energy between the specimen and camera allows both to be optimized.

  18. Anger perceptually and conceptually narrows cognitive scope.

    PubMed

    Gable, Philip A; Poole, Bryan D; Harmon-Jones, Eddie

    2015-07-01

    For the last 50 years, research investigating the effect of emotions on the scope of cognitive processing was based on models proposing that affective valence determined cognitive scope. More recently, our motivational intensity model suggests that this past work had confounded valence with motivational intensity. Research derived from this model supports the idea that motivational intensity, rather than affective valence, explains much of the variance in the effect emotions have on cognitive scope. However, the motivational intensity model is limited in that the empirical work has examined only positive affects high in approach and negative affects high in avoidance motivation. Thus, perhaps only approach-positive and avoidance-negative states narrow cognitive scope. The present research was designed to clarify these conceptual issues by examining the effect of anger, a negatively valenced approach-motivated state, on cognitive scope. Results revealed that anger narrowed attentional scope relative to a neutral state and that attentional narrowing to anger was similar to the attentional narrowing caused by high approach-motivated positive affects (Study 1). This narrowing of attention was related to trait approach motivation (Studies 2 and 3). Anger also narrowed conceptual cognitive categorization (Study 4). Narrowing of categorization related to participants' approach motivation toward anger stimuli. Together, these results suggest that anger, an approach-motivated negative affect, narrows perceptual and conceptual cognitive scope. More broadly, these results support the conceptual model that motivational intensity per se, rather than approach-positive and avoidance-negative states, causes a narrowing of cognitive scope. PMID:26011662

  19. Optimal Number of Angle Images for Calculating Anterior Angle Volume and Iris Volume Measurements

    PubMed Central

    Blieden, Lauren S.; Chuang, Alice Z.; Baker, Laura A.; Bell, Nicholas P.; Fuller, Timothy S.; Mankiewicz, Kimberly A.; Feldman, Robert M.

    2015-01-01

    Purpose. We determined the optimal number of angle images required to obtain reliable measurements of trabecular-iris circumferential volume (TICV) and iris volume (IV) using swept-source Fourier domain anterior segment optical coherence tomography (SSFD-ASOCT) scans in narrow angle eyes. Methods. Scleral spur landmarks (SSL) were manually identified on ASOCT angle images from 128 meridians from each of 24 eyes with chronic primary angle closure (PAC) spectrum of disease. The anterior and posterior corneal curves, and the anterior and posterior iris surfaces were identified automatically by the anterior chamber analysis and interpretation (ACAI) software, then manually examined and edited by the reader if required. Trabecular-iris circumferential volume at 750 μm from SSL (TICV750) and IV were subsequently calculated using varying numbers of angle images. Threshold error was determined to be less than the lower 95% confidence limit of mean absolute percent error (MAPE) of the change in TICV or IV resulting from laser peripheral iridotomy, which would be 17% for TICV and 5% for IV, based on previous studies. The optimal number of angle images was the smallest number of images where MAPE was less than this threshold for TICV and IV. Results. A total of 32 equally-spaced angle images (16 meridians) was required to estimate TICV750 and 16 angle images (8 meridians) to estimate IV. Both were within 4.6% and 1.6% of MAPE, respectively. Conclusions. It is possible to determine TICV and IV parameters reliably in narrow angles without evaluating all 128 meridians obtained with SSFD-ASOCT. PMID:25829412
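
    The selection criterion described, the smallest image count whose mean absolute percent error (MAPE) against the full 128-meridian reference stays under the clinical threshold, can be sketched as follows (function names and toy numbers are illustrative, not study data):

```python
def mape(reference, estimates):
    """Mean absolute percent error of reduced-image estimates against the
    full-scan reference values (one pair per eye)."""
    return 100.0 * sum(abs(e - r) / r
                       for r, e in zip(reference, estimates)) / len(reference)

def optimal_image_count(reference, estimates_by_count, threshold_pct):
    """Smallest number of angle images whose MAPE stays under the threshold
    (17% for TICV750 and 5% for IV in the study); None if none qualifies."""
    for n in sorted(estimates_by_count):
        if mape(reference, estimates_by_count[n]) < threshold_pct:
            return n
    return None
```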

  20. Neutron Imaging Camera

    NASA Technical Reports Server (NTRS)

    Hunter, Stanley D.; DeNolfo, Georgia; Floyd, Sam; Krizmanic, John; Link, Jason; Son, Seunghee; Guardala, Noel; Skopec, Marlene; Stark, Robert

    2008-01-01

    We describe the Neutron Imaging Camera (NIC) being developed for DTRA applications by NASA/GSFC and NSWC/Carderock. The NIC is based on the Three-dimensional Track Imager (3-DTI) technology developed at GSFC for gamma-ray astrophysics applications. The 3-DTI, a large volume time-projection chamber, provides accurate, approximately 0.4 mm resolution, 3-D tracking of charged particles. The incident direction of fast neutrons, En > 0.5 MeV, is reconstructed from the momenta and energies of the proton and triton fragments resulting from 3He(n,p)3H interactions in the 3-DTI volume. We present the angular and energy resolution performance of the NIC derived from accelerator tests.

  1. A Motionless Camera

    NASA Technical Reports Server (NTRS)

    1994-01-01

    Omniview, a motionless, noiseless, exceptionally versatile camera was developed for NASA as a receiving device for guiding space robots. The system can see in one direction and provide as many as four views simultaneously. Developed by Omniview, Inc. (formerly TRI) under a NASA Small Business Innovation Research (SBIR) grant, the system's image transformation electronics produce a real-time image from anywhere within a hemispherical field. Lens distortion is removed, and a corrected "flat" view appears on a monitor. Key elements are a high resolution charge coupled device (CCD), image correction circuitry and a microcomputer for image processing. The system can be adapted to existing installations. Applications include security and surveillance, teleconferencing, imaging, virtual reality, broadcast video and military operations. Omniview technology is now called IPIX. The company was founded in 1986 as TeleRobotics International, became Omniview in 1995, and changed its name to Interactive Pictures Corporation in 1997.

  2. Narrow band 3 × 3 Mueller polarimetric endoscopy

    PubMed Central

    Qi, Ji; Ye, Menglong; Singh, Mohan; Clancy, Neil T.; Elson, Daniel S.

    2013-01-01

    Mueller matrix polarimetric imaging has shown potential in tissue diagnosis but is challenging to implement endoscopically. In this work, a narrow band 3 × 3 Mueller matrix polarimetric endoscope was designed by rotating the endoscope to generate 0°, 45° and 90° linearly polarized illumination and positioning a rotating filter wheel in front of the camera containing three polarisers to permit polarization state analysis for backscattered light. The system was validated with a rotating linear polarizer and a diffuse reflection target. Initial measurements of 3 × 3 Mueller matrices on a rat are demonstrated, followed by matrix decomposition into the depolarization and retardance matrices for further analysis. Our work shows the feasibility of implementing polarimetric imaging in a rigid endoscope conveniently and economically in order to reveal diagnostic information. PMID:24298405

  3. LRO Camera Imaging of Potential Landing Sites in the South Pole-Aitken Basin

    NASA Astrophysics Data System (ADS)

    Jolliff, B. L.; Wiseman, S. M.; Gibson, K. E.; Lauber, C.; Robinson, M.; Gaddis, L. R.; Scholten, F.; Oberst, J.; LROC Science; Operations Team

    2010-12-01

    We show results of WAC (Wide Angle Camera) and NAC (Narrow Angle Camera) imaging of candidate landing sites within the South Pole-Aitken (SPA) basin of the Moon obtained by the Lunar Reconnaissance Orbiter during the first full year of operation. These images enable a greatly improved delineation of geologic units, determination of unit thicknesses and stratigraphy, and detailed surface characterization that has not been possible with previous data. WAC imaging encompasses the entire SPA basin, located within an area ranging from ~ 130-250 degrees east longitude and ~15 degrees south latitude to the South Pole, at different incidence angles, with the specific range of incidence dependent on latitude. The WAC images show morphology and surface detail at better than 100 m per pixel, with spatial coverage and quality unmatched by previous data sets. NAC images reveal details at the sub-meter pixel scale that enable new ways to evaluate the origins and stratigraphy of deposits. Key among new results is the capability to discern extents of ancient volcanic deposits that are covered by later crater ejecta (cryptomare) [see Petro et al., this conference] using new, complementary color data from Kaguya and Chandrayaan-1. Digital topographic models derived from WAC and NAC geometric stereo coverage show broad intercrater-plains areas where slopes are acceptably low for high-probability safe landing [see Archinal et al., this conference]. NAC images allow mapping and measurement of small, fresh craters that excavated boulders and thus provide information on surface roughness and depth to bedrock beneath regolith and plains deposits. We use these data to estimate deposit thickness in areas of interest for landing and potential sample collection to better understand the possible provenance of samples. 
Also, small regions containing fresh impact craters and their associated boulder fields are readily identified by their bright ejecta patterns and marked as lander keep-out zones.

  4. Reflectance characteristics of the Viking lander camera reference test charts

    NASA Technical Reports Server (NTRS)

    Wall, S. D.; Burcher, E. E.; Jabson, D. J.

    1975-01-01

    Reference test charts provide radiometric, colorimetric, and spatial resolution references for the Viking lander cameras on Mars. Reflectance measurements of these references are described, including the absolute bidirectional reflectance of the radiometric references and the relative spectral reflectance of both radiometric and colorimetric references. Results show that the bidirectional reflectance of the radiometric references is Lambertian to within + or - 7% for incidence angles between 20 deg and 60 deg, and that their spectral reflectance is constant with wavelength to within + or - 5% over the spectral range of the cameras. Estimated accuracy of the measurements is + or - 0.05 in relative spectral reflectance.
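
    The Lambertian check reported here amounts to verifying that measured bidirectional reflectance is constant with incidence angle to within a tolerance. A sketch of that test (the 7% figure comes from the abstract; the function and data are illustrative):

```python
def is_lambertian(reflectances, tolerance=0.07):
    """True if every measured reflectance (one per incidence angle) stays
    within the fractional tolerance of the mean, i.e. the surface is
    approximately Lambertian over those angles."""
    mean = sum(reflectances) / len(reflectances)
    return all(abs(r - mean) / mean <= tolerance for r in reflectances)
```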

  5. Comparison and evaluation of datasets for off-angle iris recognition

    NASA Astrophysics Data System (ADS)

    Kurtuncu, Osman M.; Cerme, Gamze N.; Karakaya, Mahmut

    2016-05-01

    In this paper, we investigated the publicly available iris recognition datasets and their data capture procedures in order to determine if they are suitable for stand-off iris recognition research. The majority of iris recognition datasets include only frontal iris images. Even if a few datasets include off-angle iris images, the frontal and off-angle iris images are not captured at the same time. The comparison of the frontal and off-angle iris images shows not only differences in the gaze angle but also changes in pupil dilation and accommodation. In order to isolate the effect of the gaze angle from other challenging issues, including dilation and accommodation, the frontal and off-angle iris images should be captured at the same time by two different cameras. Therefore, we developed an iris image acquisition platform using two cameras, where one camera captures the frontal iris image and the other captures the iris image from off-angle. Based on the comparison of Hamming distances between frontal and off-angle iris images captured with the two-camera setup and the one-camera setup, we observed that the Hamming distance in the two-camera setup is less than in the one-camera setup, with differences ranging from 0.05 to 0.001. These results show that, in order to obtain accurate results in off-angle iris recognition research, a two-camera setup is necessary to distinguish the challenging issues from each other.
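
    The Hamming distance used to compare iris images is conventionally the fraction of disagreeing iris-code bits, counted only over bits marked valid in both occlusion masks. A sketch of that standard score (not the authors' specific implementation):

```python
def iris_hamming_distance(code_a, code_b, mask_a, mask_b):
    """Fractional Hamming distance between two binary iris codes, counted
    only over bit positions valid in both masks. 0.0 means identical codes;
    unrelated irises typically score near 0.5."""
    valid = [i for i in range(len(code_a)) if mask_a[i] and mask_b[i]]
    differing = sum(code_a[i] != code_b[i] for i in valid)
    return differing / len(valid)
```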

  6. A fringe shadow method for measuring flapping angle and torsional angle of a dragonfly wing

    NASA Astrophysics Data System (ADS)

    Zeng, Lijiang; Matsumoto, Hirokazu; Kawachi, Keiji

    1996-05-01

    A fringe shadow (FS) method was developed for measuring the flapping angle and torsional angle of a dragonfly wing during beating motion. This new method involves two sets of fringe patterns projected onto the wing from orthogonal directions. The torsional angle is determined using the length of the shadow of the wing chord that is cast by the two sets of fringe patterns. The flapping angle is determined using the shadowgraph of the wing projected by a laser. The advantages of the FS method are its capability (i) to measure the flapping angle and torsional angle of a dragonfly wing simultaneously using only one high-speed camera and (ii) to recognize the spanwise position of a section from the number of fringes, without having to use diffuse marks that are common in current methods. The resolution of the FS method can be changed easily by adjusting the fringe spacing. The measurement results for the torsional angle and flapping angle of a dragonfly wing prove the effectiveness of the FS method in studying the flight performance of dragonflies.
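
    Under a simplified reading of the shadow geometry, if the wing chord of length L casts a projection (shadow) of length s, the torsional angle is acos(s/L). A sketch of that relation only; it idealizes the fringe-shadow method to a single orthogonal projection:

```python
import math

def torsional_angle_deg(chord_length, shadow_length):
    """Torsional angle recovered from the projected (shadow) length of the
    wing chord: theta = acos(s / L), with s <= L."""
    return math.degrees(math.acos(shadow_length / chord_length))
```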

  7. Camera Calibration for Uav Application Using Sensor of Mobile Camera

    NASA Astrophysics Data System (ADS)

    Takahashi, Y.; Chikatsu, H.

    2015-05-01

    Recently, 3D measurements using small unmanned aerial vehicles (UAVs) have increased in Japan, because small UAVs are easily available at low cost and analysis software can easily create 3D models. However, small UAVs have a problem: they have very short flight times and a small payload. In particular, as the payload of a small UAV increases, its flight time decreases. Therefore, it is advantageous to use lightweight sensors in small UAVs. A mobile camera is lightweight and has many sensors, such as an accelerometer, a magnetic field sensor, and a gyroscope. Moreover, these sensors can be used simultaneously. Therefore, the authors think that the problems of small UAVs can be solved using the mobile camera. The authors executed camera calibration using a test target to evaluate the sensor values measured by a mobile camera. Consequently, the authors confirmed accuracy comparable to normal camera calibration.

  8. The underwater camera calibration based on virtual camera lens distortion

    NASA Astrophysics Data System (ADS)

    Qin, Dahui; Mao, Ting; Cheng, Peng; Zhang, Zhiliang

    2011-08-01

    Machine vision is becoming more and more popular underwater. It is a challenge to calibrate a camera underwater because of the complicated light ray paths through the water and air environments. In this paper we first analyzed the characteristics of the camera when light passes from air to water. We then proposed a new method that uses a high-order camera distortion model to compensate for the deviation caused by light refraction as the ray passes through the water and air media. Finally, experimental results show that the high-order distortion model can simulate the effect of underwater light refraction on the camera's image during underwater calibration.
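
    Polynomial radial distortion, the kind of high-order lens model used here to absorb refraction at the water-air interface, scales each image point by a polynomial in its squared radius. A sketch with two radial terms (the coefficients are illustrative, not calibrated values):

```python
def radial_distort(x, y, k1, k2):
    """Apply the radial distortion model r' = r * (1 + k1*r^2 + k2*r^4) to
    normalized image coordinates (x, y); in the paper's approach, refraction
    effects are absorbed into the high-order coefficients."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale
```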

  9. Narrow-band radiation wavelength measurement by processing digital photographs in RAW format

    SciTech Connect

    Kraiskii, A V; Mironova, T V; Sultanov, T T

    2012-12-31

    The technique of measuring the mean wavelength of narrow-band radiation in the 455 - 625-nm range using the image of the emitting surface is presented. The data from the camera array unprocessed by the built-in processor (RAW format) are used. The method is applied for determining the parameters of response of holographic sensors. Depending on the wavelength and brightness of the image fragment, the mean square deviation of the wavelength amounts to 0.3 - 3 nm. (experimental techniques)

  10. Comparison of different camera calibration approaches for underwater applications.

    PubMed

    Silvatti, Amanda Piaia; Dias, Fabio Augusto Salve; Cerveri, Pietro; Barros, Ricardo M L

    2012-04-01

    The purpose of this study was to compare three camera calibration approaches applied to underwater applications: (1) static control points with nonlinear DLT; (2) moving wand with nonlinear camera model and bundle adjustment; (3) moving plate with nonlinear camera model. The DVideo kinematic analysis system was used for underwater data acquisition. The system consisted of two gen-locked Basler cameras working at 100 Hz, with wide angle lenses that were enclosed in housings. The accuracy of the methods was compared in a dynamic rigid bar test (acquisition volume-4.5×1×1.5 m(3)). The mean absolute errors were 6.19 mm for the nonlinear DLT, 1.16 mm for the wand calibration, 1.20 mm for the 2D plate calibration using 8 control points and 0.73 mm for the 2D plane calibration using 16 control points. The results of the wand and 2D plate camera calibration methods were less associated to the rigid body position in the working volume and provided better accuracy than the nonlinear DLT. Wand and 2D plate camera calibration methods presented similar and highly accurate results, being alternatives for underwater 3D motion analysis. PMID:22284990

  11. A view of the ET camera on STS-112

    NASA Technical Reports Server (NTRS)

    2002-01-01

    KENNEDY SPACE CENTER, FLA. - A closeup view of the camera mounted on the external tank of Space Shuttle Atlantis. The color video camera mounted to the top of Atlantis' external tank will provide a view of the front and belly of the orbiter and a portion of the solid rocket boosters (SRBs) and external tank during the launch of Atlantis on mission STS-112. It will offer the STS-112 team an opportunity to monitor the shuttle's performance from a new angle. The camera will be turned on fifteen minutes prior to launch and will show the orbiter and solid rocket boosters on the launch pad. The video will be downlinked from the external tank during flight to several NASA data-receiving sites and then relayed to the live television broadcast. The camera is expected to operate for about 15 minutes following liftoff. At liftoff, viewers will see the shuttle clearing the launch tower and, at two minutes after liftoff, see the right SRB separate from the external tank. When the external tank separates from Atlantis about eight minutes into the flight, the camera is expected to continue its live feed for about six more minutes although NASA may be unable to pick up the camera's signal because the tank may have moved out of range.

  12. A view of the ET camera on STS-112

    NASA Technical Reports Server (NTRS)

    2002-01-01

    KENNEDY SPACE CENTER, FLA. - A view of the camera mounted on the external tank of Space Shuttle Atlantis. The color video camera mounted to the top of Atlantis' external tank will provide a view of the front and belly of the orbiter and a portion of the solid rocket boosters (SRBs) and external tank during the launch of Atlantis on mission STS-112. It will offer the STS-112 team an opportunity to monitor the shuttle's performance from a new angle. The camera will be turned on fifteen minutes prior to launch and will show the orbiter and solid rocket boosters on the launch pad. The video will be downlinked from the external tank during flight to several NASA data-receiving sites and then relayed to the live television broadcast. The camera is expected to operate for about 15 minutes following liftoff. At liftoff, viewers will see the shuttle clearing the launch tower and, at two minutes after liftoff, see the right SRB separate from the external tank. When the external tank separates from Atlantis about eight minutes into the flight, the camera is expected to continue its live feed for about six more minutes although NASA may be unable to pick up the camera's signal because the tank may have moved out of range.

  13. Bundle Adjustment for Multi-Camera Systems with Points at Infinity

    NASA Astrophysics Data System (ADS)

    Schneider, J.; Schindler, F.; Läbe, T.; Förstner, W.

    2012-07-01

    We present a novel approach to rigorous bundle adjustment for omnidirectional and multi-view cameras, which enables an efficient maximum-likelihood estimation with image and scene points at infinity. Multi-camera systems are used to increase resolution, to combine cameras with different spectral sensitivities (Z/I DMC, Vexcel Ultracam) or, like omnidirectional cameras, to augment the effective aperture angle (Blom Pictometry, Rollei Panoscan Mark III). Multi-camera systems are also gaining importance for the acquisition of complex 3D structures. To stabilize camera orientations, especially rotations, one should generally use points at the horizon, observed over long periods of time, within the bundle adjustment, which classical bundle adjustment programs cannot handle. We use a minimal representation of homogeneous coordinates for image and scene points. Instead of eliminating the scale factor of the homogeneous vectors by Euclidean normalization, we normalize the homogeneous coordinates spherically. This way we can use images of omnidirectional cameras with a single viewpoint, such as fisheye cameras, and scene points which are far away or at infinity. We demonstrate the feasibility and the potential of our approach on real data taken with a single camera, the stereo camera FinePix Real 3D W3 from Fujifilm and the multi-camera system Ladybug 3 from Point Grey.
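
    The spherical normalization at the heart of the abstract can be sketched in a few lines. Unlike Euclidean normalization (dividing by the last homogeneous coordinate), it stays well defined when that coordinate is zero, i.e. for points at infinity; the numerical values below are illustrative only:

    ```python
    import numpy as np

    def spherical_normalize(x):
        """Normalize a homogeneous vector onto the unit sphere.

        Euclidean normalization x / x[-1] fails when x[-1] == 0
        (a point at infinity); dividing by the norm does not.
        """
        return x / np.linalg.norm(x)

    finite = np.array([2.0, 4.0, 2.0])        # image point (1, 2) in homogeneous form
    at_infinity = np.array([3.0, 4.0, 0.0])   # a direction: point at infinity

    print(spherical_normalize(finite))        # well defined
    print(spherical_normalize(at_infinity))   # also well defined: [0.6, 0.8, 0.0]
    ```

    In the paper this representation lets both fisheye image rays and far-away scene points enter the same maximum-likelihood estimation without special cases.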

  14. Critical Heat Flux In Inclined Rectangular Narrow Long Channel

    SciTech Connect

    J. L. Rempe; S. W. Noh; Y. H. Kim; K. Y. Suh; F.B.Cheung; S. B. Kim

    2005-05-01

    In the TMI-2 accident, the lower part of the reactor pressure vessel overheated and then cooled down rather rapidly, as was later identified in a vessel investigation project. This pointed to the feasibility of gap cooling. For this reason, several investigations were performed to determine the critical heat flux (CHF) from the standpoint of in-vessel retention. The experiments investigate the general boiling phenomena and the triggering mechanism for the CHF in a narrow gap, using a 5 × 105 mm^2 crevice-type heater assembly and de-mineralized water. The test parameters include a gap size of 5 mm and surface orientation angles from the downward-facing position (180°) to the vertical position (90°). The orientation angle affects the bubble layer and escape from the narrow gap. The CHF is less than that in a shorter channel, compared with previous experiments having a heated length of 35 mm in the copper test section.

  15. Digital synchroballistic schlieren camera for high-speed photography of bullets and rocket sleds

    NASA Astrophysics Data System (ADS)

    Buckner, Benjamin D.; L'Esperance, Drew

    2013-08-01

    A high-speed digital streak camera designed for simultaneous high-resolution color photography and focusing schlieren imaging is described. The camera uses a computer-controlled galvanometer scanner to achieve synchroballistic imaging through a narrow slit. Full color 20 megapixel images of a rocket sled moving at 480 m/s and of projectiles fired at around 400 m/s were captured, with high-resolution schlieren imaging in the latter cases, using conventional photographic flash illumination. The streak camera can achieve a line rate for streak imaging of up to 2.4 million lines/s.
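
    Synchroballistic imaging requires the line-scan rate to match the speed of the object's image as it crosses the slit. A rough sketch of that matching condition follows; the magnification and pixel pitch are hypothetical values chosen for illustration, not figures from the paper:

    ```python
    def required_line_rate(object_speed_m_s, magnification, pixel_pitch_m):
        """Line rate at which the scan keeps pace with the moving image.

        The image of the object moves at object_speed * magnification in the
        focal plane; one line must be read out each time the image advances
        by one pixel pitch.
        """
        image_speed = object_speed_m_s * magnification
        return image_speed / pixel_pitch_m

    # e.g. a 480 m/s rocket sled, hypothetical 1:100 magnification, 5 um pixels
    rate = required_line_rate(480.0, 1 / 100, 5e-6)
    print(f"{rate:.0f} lines/s")  # 960000
    ```

    Under these assumed numbers the required rate (~1 M lines/s) sits comfortably within the camera's quoted 2.4 M lines/s capability.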

  16. Automatic feature extraction for panchromatic Mars Global Surveyor Mars Orbiter camera imagery

    NASA Astrophysics Data System (ADS)

    Plesko, Catherine S.; Brumby, Steven P.; Leovy, Conway B.

    2002-01-01

    The Mars Global Surveyor Mars Orbiter Camera (MOC) has produced tens of thousands of images, which contain a wealth of information about the surface of the planet Mars. Current manual analysis techniques are inadequate for the comprehensive analysis of such a large dataset, while development of handwritten feature extraction algorithms is laborious and expensive. This project investigates application of an automatic feature extraction approach to analysis of the MOC narrow angle panchromatic dataset, using an evolutionary computation software package called GENIE. GENIE uses a genetic algorithm to assemble feature extraction tools from low-level image operators. Each generated tool is evaluated against training data provided by the user. The best tools in each generation are allowed to 'reproduce' to produce the next generation, and the population of tools is permitted to evolve until it converges to a solution or reaches a level of performance specified by the user. Craters are one of the most scientifically interesting and most numerous features in the MOC data set, and present a wide range of shapes at many spatial scales. We now describe preliminary results on development of a crater finder algorithm using the GENIE software.
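
    GENIE's evolutionary loop — score candidate tools against user-supplied training data, let the best "reproduce," mutate, repeat — can be caricatured on a 1-D toy "image." The operator set, parameters, and data below are invented for illustration and are far simpler than GENIE's actual low-level image operators:

    ```python
    import random
    import numpy as np

    rng = random.Random(0)

    # toy low-level operators: the building blocks of a candidate tool
    OPS = {
        "smooth":   lambda a: np.convolve(a, np.ones(3) / 3, mode="same"),
        "gradient": lambda a: np.abs(np.gradient(a)),
        "negate":   lambda a: 1.0 - a,
        "square":   lambda a: a * a,
    }

    def run_tool(tool, image):
        for name in tool:            # a tool is a sequence of operator names
            image = OPS[name](image)
        return image

    def fitness(tool, image, target):
        # agreement between the tool's thresholded output and the training mask
        out = run_tool(tool, image) > 0.5
        return np.mean(out == target)

    # synthetic 1-D "image" with one bright feature, and its training mask
    image = np.zeros(50)
    image[20:30] = 1.0
    target = image > 0.5

    pop = [[rng.choice(list(OPS)) for _ in range(3)] for _ in range(20)]
    for gen in range(30):
        pop.sort(key=lambda t: fitness(t, image, target), reverse=True)
        survivors = pop[:10]                  # the best tools "reproduce"
        children = []
        for t in survivors:
            child = list(t)
            child[rng.randrange(len(child))] = rng.choice(list(OPS))  # mutation
            children.append(child)
        pop = survivors + children

    best = pop[0]
    print(best, fitness(best, image, target))
    ```

    GENIE additionally evolves tool topology and runs on real multi-band imagery, but the select-and-mutate skeleton is the same.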

  17. Infants Experience Perceptual Narrowing for Nonprimate Faces

    ERIC Educational Resources Information Center

    Simpson, Elizabeth A.; Varga, Krisztina; Frick, Janet E.; Fragaszy, Dorothy

    2011-01-01

    Perceptual narrowing--a phenomenon in which perception is broad from birth, but narrows as a function of experience--has previously been tested with primate faces. In the first 6 months of life, infants can discriminate among individual human and monkey faces. Though the ability to discriminate monkey faces is lost after about 9 months, infants…

  18. Do narrow {Sigma}-hypernuclear states exist?

    SciTech Connect

    Chrien, R.E.

    1995-12-31

    Reports of narrow states in {Sigma}-hypernucleus production have appeared from time to time. The present experiment is a repeat of the first and seemingly most definitive such experiment, that on a target of {sup 9}Be, but with much better statistics. No narrow states were observed.

  19. Viewing angle analysis of integral imaging

    NASA Astrophysics Data System (ADS)

    Wang, Hong-Xia; Wu, Chun-Hong; Yang, Yang; Zhang, Lan

    2007-12-01

    Integral imaging (II) is a technique capable of displaying 3D images with continuous parallax in full natural color. Due to its outstanding advantages it is becoming one of the most promising techniques for next-generation three-dimensional TV (3DTV) and visualization. However, most conventional integral images are restricted by a narrow viewing angle. One reason is that the range in which a reconstructed integral image can be displayed with consistent parallax is limited. The other is that the aperture of the system is finite. To date, many methods to enhance the viewing angle of integral images have been proposed. Nevertheless, except for Ren's MVW (Maximum Viewing Width), most of these methods involve complex hardware and modifications of the optical system, which usually bring other disadvantages, make operation more difficult, and raise the cost of the system. In order to simplify optical systems, this paper systematically analyzes the viewing angle of traditional integral images instead of modified ones. For the sake of cost, the research was based on computer-generated integral images (CGII). The analysis shows clearly how the viewing angle can be enhanced and how image overlap or image flipping can be avoided. The result also promotes the development of optical instruments. Based on the theoretical analysis, preliminary calculations were done to demonstrate how other viewing properties closely related to the viewing angle, such as viewing distance, viewing zone, and lens pitch, affect the viewing angle.
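
    A common first-order expression for the viewing angle of a lens-array integral display is θ = 2·arctan(p / 2g), where p is the lens pitch and g the gap between the lens array and the display plane. This simple formula and the numbers below are an illustrative assumption, not taken from the paper:

    ```python
    import math

    def viewing_angle_deg(lens_pitch_mm, gap_mm):
        """Full viewing angle of one elemental-image/lens pair.

        Beyond this angle, a viewer sees the neighboring elemental image
        through the lens, producing the image flipping the paper discusses.
        """
        return math.degrees(2 * math.atan(lens_pitch_mm / (2 * gap_mm)))

    # hypothetical 1 mm pitch lens array, 3 mm from the display
    print(viewing_angle_deg(1.0, 3.0))  # ~18.9 degrees
    ```

    The formula makes the trade-off explicit: a larger pitch or a smaller gap widens the viewing angle, at the expense of spatial resolution or depth range.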

  20. Digital Cameras for Student Use.

    ERIC Educational Resources Information Center

    Simpson, Carol

    1997-01-01

    Describes the features, equipment and operations of digital cameras and compares three different digital cameras for use in education. Price, technology requirements, features, transfer software, and accessories for the Kodak DC25, Olympus D-200L and Casio QV-100 are presented in a comparison table. (AEF)

  1. The "All Sky Camera Network"

    ERIC Educational Resources Information Center

    Caldwell, Andy

    2005-01-01

    In 2001, the "All Sky Camera Network" came to life as an outreach program to connect the Denver Museum of Nature and Science (DMNS) exhibit "Space Odyssey" with Colorado schools. The network is comprised of cameras placed strategically at schools throughout Colorado to capture fireballs--rare events that produce meteorites. Meteorites have great…

  2. An Educational PET Camera Model

    ERIC Educational Resources Information Center

    Johansson, K. E.; Nilsson, Ch.; Tegner, P. E.

    2006-01-01

    Positron emission tomography (PET) cameras are now in widespread use in hospitals. A model of a PET camera has been installed in Stockholm House of Science and is used to explain the principles of PET to school pupils as described here.

  3. Radiation camera motion correction system

    DOEpatents

    Hoffer, P.B.

    1973-12-18

    The device determines the ratio of the intensity of radiation received by a radiation camera from two separate portions of the object. A correction signal is developed to maintain this ratio at a substantially constant value and this correction signal is combined with the camera signal to correct for object motion. (Official Gazette)

  4. SEOS frame camera applications study

    NASA Technical Reports Server (NTRS)

    1974-01-01

    A research and development satellite is discussed which will provide opportunities for observation of transient phenomena that fall within the fixed viewing circle of the spacecraft. The evaluation of possible applications for frame cameras, for SEOS, are studied. The computed lens characteristics for each camera are listed.

  5. Airborne ballistic camera tracking systems

    NASA Technical Reports Server (NTRS)

    Redish, W. L.

    1976-01-01

    An operational airborne ballistic camera tracking system was tested for operational and data reduction feasibility. The acquisition and data processing requirements of the system are discussed. Suggestions for future improvements are also noted. A description of the data reduction mathematics is outlined. Results from a successful reentry test mission are tabulated. The test mission indicated that airborne ballistic camera tracking systems are feasible.

  6. Multi-PSPMT scintillation camera

    SciTech Connect

    Pani, R.; Pellegrini, R.; Trotta, G.; Scopinaro, F.; Soluri, A.; Vincentis, G. de; Scafe, R.; Pergola, A.

    1999-06-01

    Gamma ray imaging is usually accomplished by the use of a relatively large scintillating crystal coupled to either a number of photomultipliers (PMTs) (Anger Camera) or to a single large Position Sensitive PMT (PSPMT). Recently the development of new diagnostic techniques, such as scintimammography and radio-guided surgery, have highlighted a number of significant limitations of the Anger camera in such imaging procedures. In this paper a dedicated gamma camera is proposed for clinical applications with the aim of improving image quality by utilizing detectors with an appropriate size and shape for the part of the body under examination. This novel scintillation camera is based upon an array of PSPMTs (Hamamatsu R5900-C8). The basic concept of this camera is identical to the Anger Camera with the exception of the substitution of PSPMTs for the PMTs. In this configuration it is possible to use the high resolution of the PSPMTs and still correctly position events lying between PSPMTs. In this work the test configuration is a 2 by 2 array of PSPMTs. Some advantages of this camera are: spatial resolution less than 2 mm FWHM, good linearity, thickness less than 3 cm, light weight, lower cost than equivalent area PSPMT, large detection area when coupled to scintillating arrays, small dead boundary zone (< 3 mm) and flexibility in the shape of the camera.
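
    The Anger positioning logic that this camera inherits locates each scintillation event at the signal-weighted centroid of the photomultiplier outputs, with the summed signal giving the deposited energy. A toy sketch with invented PMT positions and signals:

    ```python
    import numpy as np

    # hypothetical PMT positions (mm) and measured signals for one event
    pmt_x = np.array([-30.0, -10.0, 10.0, 30.0])
    pmt_y = np.array([-10.0, 10.0, -10.0, 10.0])
    signals = np.array([0.1, 0.5, 0.3, 0.1])

    total = signals.sum()
    x = (pmt_x * signals).sum() / total   # signal-weighted centroid
    y = (pmt_y * signals).sum() / total
    energy = total                        # summed signal ~ deposited energy

    print(x, y)  # ~(-2.0, 2.0)
    ```

    In the PSPMT array the same centroiding is applied to the position-sensitive anode signals, which is what allows events falling between tubes to be placed correctly.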

  7. Mars Exploration Rover engineering cameras

    USGS Publications Warehouse

    Maki, J.N.; Bell, J.F., III; Herkenhoff, K. E.; Squyres, S. W.; Kiely, A.; Klimesh, M.; Schwochert, M.; Litwin, T.; Willson, R.; Johnson, Aaron H.; Maimone, M.; Baumgartner, E.; Collins, A.; Wadsworth, M.; Elliot, S.T.; Dingizian, A.; Brown, D.; Hagerott, E.C.; Scherr, L.; Deen, R.; Alexander, D.; Lorre, J.

    2003-01-01

    NASA's Mars Exploration Rover (MER) Mission will place a total of 20 cameras (10 per rover) onto the surface of Mars in early 2004. Fourteen of the 20 cameras are designated as engineering cameras and will support the operation of the vehicles on the Martian surface. Images returned from the engineering cameras will also be of significant importance to the scientific community for investigative studies of rock and soil morphology. The Navigation cameras (Navcams, two per rover) are a mast-mounted stereo pair, each with a 45° square field of view (FOV) and an angular resolution of 0.82 milliradians per pixel (mrad/pixel). The Hazard Avoidance cameras (Hazcams, four per rover) are a body-mounted, front- and rear-facing set of stereo pairs, each with a 124° square FOV and an angular resolution of 2.1 mrad/pixel. The Descent camera (one per rover), mounted to the lander, has a 45° square FOV and will return images with spatial resolutions of ~4 m/pixel. All of the engineering cameras utilize broadband visible filters and 1024 x 1024 pixel detectors. Copyright 2003 by the American Geophysical Union.
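
    The quoted angular resolutions can be roughly cross-checked by dividing each field of view by the 1024-pixel detector width. The simple estimate below ignores distortion and the exact active area, which is presumably why it comes out slightly below the quoted Navcam figure of 0.82 mrad/pixel:

    ```python
    import math

    def mrad_per_pixel(fov_deg, pixels):
        """First-order angular resolution: FOV in radians per detector pixel."""
        return math.radians(fov_deg) / pixels * 1000.0

    print(round(mrad_per_pixel(45, 1024), 2))   # ~0.77 (Navcam; quoted 0.82)
    print(round(mrad_per_pixel(124, 1024), 2))  # ~2.11 (Hazcam; quoted 2.1)
    ```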

  8. Sublimation of icy aggregates in the coma of comet 67P/Churyumov-Gerasimenko detected with the OSIRIS cameras onboard Rosetta.

    NASA Astrophysics Data System (ADS)

    Gicquel, A.; Vincent, J.-B.; Agarwal, J.; A'Hearn, M. F.; Bertini, I.; Bodewits, D.; Sierks, H.; Lin, Z.-Y.; Barbieri, C.; Lamy, P. L.; Rodrigo, R.; Koschny, D.; Rickman, H.; Keller, H. U.; Barucci, M. A.; Bertaux, J.-L.; Besse, S.; Cremonese, G.; Da Deppo, V.; Davidsson, B.; Debei, S.; Deller, J.; De Cecco, M.; Frattin, E.; El-Maarry, M. R.; Fornasier, S.; Fulle, M.; Groussin, O.; Gutiérrez, P. J.; Gutiérrez-Marquez, P.; Güttler, C.; Höfner, S.; Hofmann, M.; Hu, X.; Hviid, S. F.; Ip, W.-H.; Jorda, L.; Knollenberg, J.; Kovacs, G.; Kramm, J.-R.; Kührt, E.; Küppers, M.; Lara, L. M.; Lazzarin, M.; Moreno, J. J. Lopez; Lowry, S.; Marzari, F.; Masoumzadeh, N.; Massironi, M.; Moreno, F.; Mottola, S.; Naletto, G.; Oklay, N.; Pajola, M.; Pommerol, A.; Preusker, F.; Scholten, F.; Shi, X.; Thomas, N.; Toth, I.; Tubiana, C.

    2016-08-01

    Beginning in March 2014, the OSIRIS (Optical, Spectroscopic, and Infrared Remote Imaging System) cameras began capturing images of the nucleus and coma (gas and dust) of comet 67P/Churyumov-Gerasimenko using both the wide angle camera (WAC) and the narrow angle camera (NAC). The many observations taken since July of 2014 have been used to study the morphology, location, and temporal variation of the comet's dust jets. We analyzed the dust monitoring observations shortly after the southern vernal equinox on May 30 and 31, 2015 with the WAC at the heliocentric distance R_h = 1.53 AU, where it is possible to observe that the jet rotates with the nucleus. We found that the decline of brightness as a function of the distance of the jet is much steeper than the background coma, which is a first indication of sublimation. We adapted a model of sublimation of icy aggregates and studied the effect as a function of the physical properties of the aggregates (composition and size). The major finding of this article was that through the sublimation of the aggregates of dirty grains (radius a between 5 μm and 50 μm) we were able to completely reproduce the radial brightness profile of a jet beyond 4 km from the nucleus. To reproduce the data we needed to inject a number of aggregates between 8.5 × 10^13 and 8.5 × 10^10 for a = 5 μm and 50 μm respectively, or an initial mass of H2O ice of around 22 kg.

  9. Narrow band gap amorphous silicon semiconductors

    DOEpatents

    Madan, A.; Mahan, A.H.

    1985-01-10

    Disclosed is a narrow band gap amorphous silicon semiconductor comprising an alloy of amorphous silicon and a band gap narrowing element selected from the group consisting of Sn, Ge, and Pb, with an electron donor dopant selected from the group consisting of P, As, Sb, Bi and N. The process for producing the narrow band gap amorphous silicon semiconductor comprises the steps of forming an alloy comprising amorphous silicon and at least one of the aforesaid band gap narrowing elements in amount sufficient to narrow the band gap of the silicon semiconductor alloy below that of amorphous silicon, and also utilizing sufficient amounts of the aforesaid electron donor dopant to maintain the amorphous silicon alloy as an n-type semiconductor.

  10. LISS-4 camera for Resourcesat

    NASA Astrophysics Data System (ADS)

    Paul, Sandip; Dave, Himanshu; Dewan, Chirag; Kumar, Pradeep; Sansowa, Satwinder Singh; Dave, Amit; Sharma, B. N.; Verma, Anurag

    2006-12-01

    The Indian Remote Sensing Satellites use indigenously developed high-resolution cameras for generating data related to vegetation, landform/geomorphic and geological boundaries. Data from this camera is used for preparing maps at 1:12500 scale for national-level policy development for town planning, vegetation etc. The LISS-4 Camera was launched onboard the Resourcesat-1 satellite by ISRO in 2003. LISS-4 is a high-resolution multi-spectral camera with three spectral bands, a resolution of 5.8 m and a swath of 23 km from 817 km altitude. The panchromatic mode provides a swath of 70 km and 5-day revisit. This paper briefly discusses the configuration of the LISS-4 Camera of Resourcesat-1, its onboard performance and also the changes in the camera being developed for Resourcesat-2. The LISS-4 camera images the earth in push-broom mode. It is designed around a three-mirror unobscured telescope, three linear 12-K CCDs and associated electronics for each band. Three spectral bands are realized by splitting the focal plane in the along-track direction using an isosceles prism. High-speed camera electronics is designed for each detector with 12-bit digitization and digital double sampling of the video. Seven bits of data, selected from the 10 MSBs by telecommand, are transmitted. The total dynamic range of the sensor covers up to 100% albedo. The camera structure has heritage from IRS-1C/D. The optical elements are precisely glued to specially designed flexure mounts. The camera is assembled onto a rotating deck on the spacecraft to facilitate +/- 26° steering in the pitch-yaw plane. The camera is held on the spacecraft in a stowed condition before deployment. The excellent imagery from the LISS-4 Camera onboard Resourcesat-1 is routinely used worldwide. A second such camera is being developed for the Resourcesat-2 launch in 2007 with similar performance. The camera electronics is optimized and miniaturized; the size and weight are reduced to one third, and the power to half, of the values in Resourcesat-1.
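
    The quoted figures are self-consistent: a 12,000-pixel detector line at 5.8 m per pixel spans roughly the stated 70 km panchromatic swath. (That the 23 km multispectral swath would then correspond to a ~4,000-pixel sub-segment is an inference, not stated in the abstract.)

    ```python
    pixels = 12000   # one 12-K linear CCD
    gsd_m = 5.8      # ground sample distance per pixel

    swath_km = pixels * gsd_m / 1000.0
    print(round(swath_km, 1))  # 69.6, consistent with the quoted ~70 km swath
    ```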

  11. IMAX camera (12-IML-1)

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The IMAX camera system is used to record on-orbit activities of interest to the public. Because of the extremely high resolution of the IMAX camera, projector, and audio systems, the audience is afforded a motion picture experience unlike any other. IMAX and OMNIMAX motion picture systems were designed to create motion picture images of superior quality and audience impact. The IMAX camera is a 65 mm, single-lens, reflex-viewing design with a 15-perforation-per-frame horizontal pull-across. The frame size is 2.06 x 2.77 inches. Film travels through the camera at a rate of 336 feet per minute when the camera is running at the standard 24 frames/sec.
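
    The quoted numbers can be cross-checked against each other: at 24 frames/s, 336 feet of film per minute works out to 2.8 inches of film per frame, consistent with the 2.77-inch frame dimension plus a small inter-frame margin:

    ```python
    film_speed_ft_min = 336.0   # quoted film transport rate
    frames_per_s = 24.0         # standard frame rate

    # feet/min -> inches/s -> inches per frame
    inches_per_frame = film_speed_ft_min * 12.0 / 60.0 / frames_per_s
    print(round(inches_per_frame, 2))  # 2.8
    ```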

  12. Coherent infrared imaging camera (CIRIC)

    SciTech Connect

    Hutchinson, D.P.; Simpson, M.L.; Bennett, C.A.; Richards, R.K.; Emery, M.S.; Crutcher, R.I.; Sitter, D.N. Jr.; Wachter, E.A.; Huston, M.A.

    1995-07-01

    New developments in 2-D, wide-bandwidth HgCdTe (MCT) and GaAs quantum-well infrared photodetectors (QWIP) coupled with Monolithic Microwave Integrated Circuit (MMIC) technology are now making focal plane array coherent infrared (IR) cameras viable. Unlike conventional IR cameras which provide only thermal data about a scene or target, a coherent camera based on optical heterodyne interferometry will also provide spectral and range information. Each pixel of the camera, consisting of a single photo-sensitive heterodyne mixer followed by an intermediate frequency amplifier and illuminated by a separate local oscillator beam, constitutes a complete optical heterodyne receiver. Applications of coherent IR cameras are numerous and include target surveillance, range detection, chemical plume evolution, monitoring stack plume emissions, and wind shear detection.
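
    For the range and wind-shear applications mentioned, the intermediate-frequency beat in each heterodyne pixel carries the Doppler shift f = 2v/λ of light scattered from a target moving along the line of sight. A sketch with a hypothetical velocity and a CO2-laser wavelength (the abstract does not state the operating wavelength):

    ```python
    def doppler_shift_hz(velocity_m_s, wavelength_m):
        """Heterodyne beat frequency from a scatterer moving along the line of sight."""
        return 2.0 * velocity_m_s / wavelength_m

    # hypothetical: 5 m/s wind speed at a 10.6 um CO2-laser wavelength
    print(f"{doppler_shift_hz(5.0, 10.6e-6):.0f} Hz")  # ~943 kHz
    ```

    Frequencies in this range fall within the bandwidth of the MMIC intermediate-frequency amplifiers the paper describes.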

  13. Proportional counter radiation camera

    DOEpatents

    Borkowski, C.J.; Kopp, M.K.

    1974-01-15

    A gas-filled proportional counter camera that images photon emitting sources is described. A two-dimensional, position-sensitive proportional multiwire counter is provided as the detector. The counter consists of a high-voltage anode screen sandwiched between orthogonally disposed planar arrays of multiple parallel strung, resistively coupled cathode wires. Two terminals from each of the cathode arrays are connected to separate timing circuitry to obtain separate X and Y coordinate signal values from pulse shape measurements to define the position of an event within the counter arrays which may be recorded by various means for data display. The counter is further provided with a linear drift field which effectively enlarges the active gas volume of the counter and constrains the recoil electrons produced from ionizing radiation entering the counter to drift perpendicularly toward the planar detection arrays. A collimator is interposed between a subject to be imaged and the counter to transmit only the radiation from the subject which has a perpendicular trajectory with respect to the planar cathode arrays of the detector. (Official Gazette)

  14. Cameras for digital microscopy.

    PubMed

    Spring, Kenneth R

    2013-01-01

    This chapter reviews the fundamental characteristics of charge-coupled devices (CCDs) and related detectors, outlines the relevant parameters for their use in microscopy, and considers promising recent developments in the technology of detectors. Electronic imaging with a CCD involves three stages: interaction of a photon with the photosensitive surface, storage of the liberated charge, and readout or measurement of the stored charge. The most demanding applications in fluorescence microscopy may require as much as four orders of magnitude greater sensitivity. The image in the present-day light microscope is usually acquired with a CCD camera. The CCD is composed of a large matrix of photosensitive elements (often referred to as "pixels," shorthand for picture elements) which simultaneously capture an image over the entire detector surface. The light-intensity information for each pixel is stored as electronic charge and is converted to an analog voltage by a readout amplifier. This analog voltage is subsequently converted to a numerical value by a digitizer situated on the CCD chip, or very close to it. In complementary metal oxide semiconductor (CMOS) sensors, several (three to six) amplifiers are required for each pixel, and to date, uniform images with a homogeneous background have been a problem because of the inherent difficulties of balancing the gain in all of the amplifiers. CMOS sensors also exhibit relatively high noise associated with the requisite high-speed switching. Both of these deficiencies are being addressed, and sensor performance is nearing that required for scientific imaging. PMID:23931507

  15. Lights, Camera, Courtroom? Should Trials Be Televised?

    ERIC Educational Resources Information Center

    Kirtley, Jane E.; Brothers, Thomas W.; Veal, Harlan K.

    1999-01-01

    Presents three differing perspectives from American Bar Association members on whether television cameras should be allowed in the courtroom. Contends that cameras should be allowed with differing degrees of certainty: cameras truly open the courts to the public; cameras must be strategically placed; and cameras should be used only with the…

  16. Wide Angle View of Arsia Mons Volcano

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Arsia Mons (above) is one of the largest volcanoes known. This shield volcano is part of an aligned trio known as the Tharsis Montes--the others are Pavonis Mons and Ascraeus Mons. Arsia Mons is rivaled only by Olympus Mons in terms of its volume. The summit of Arsia Mons is more than 9 kilometers (5.6 miles) higher than the surrounding plains. The crater--or caldera--at the volcano summit is approximately 110 km (68 mi) across. This view of Arsia Mons was taken by the red and blue wide angle cameras of the Mars Global Surveyor Mars Orbiter Camera (MOC) system. Bright water ice clouds (the whitish/bluish wisps) hang above the volcano--a common sight every martian afternoon in this region. Arsia Mons is located at 120° west longitude and 9° south latitude. Illumination is from the left.

  17. Camera evidence: visibility analysis through a multicamera viewpoint

    NASA Astrophysics Data System (ADS)

    Bajuk, Mark

    1992-06-01

    A major criterion in the design of backhoes (and other heavy machinery) is the ability of the operator to see all critical portions of the vehicle and the surrounding environment. Computer graphics provides a method for analyzing this ability prior to the building of full-scale wooden models. By placing the computer graphic camera at the operator's eyepoint, designers can detect poor placement of supports, blind spots, etc. In this type of analysis, the camera becomes an active, yet somewhat imperfect, participant in our understanding of what an operator of the backhoe 'sees'. In order to simulate a backhoe operator's vision from within a cab, one needs to expand the angle of view of the camera to mimic unfocused, peripheral vision. A traditional wide-angle lens creates extreme distortions that are not present in 'natural' vision, and is therefore hardly an adequate representation. The solution we arrived at uses seven cameras fanned out horizontally in order to capture a relatively undistorted 155 degree angle of view. In addition, another camera displays and numerically analyzes the percentage of the loader bucket visible and blocked. These two views are presented simultaneously in order to address both the 'naturalistic' and quantitative needs of the designers, as well as to point to the incompleteness of any one representation of a scene. In the next phase of this project we will bring this type of analysis into a machine environment more conducive to interactivity: a backhoe simulator with levers to control the vehicle and bucket positions, viewed through a virtual reality environment.

  18. The Advanced Camera for Surveys

    NASA Astrophysics Data System (ADS)

    Clampin, M.; Ford, H. C.; Feldman, P.; Golimowski, D.; Tsvetanov, Z.; Bartko, F.; Brown, B.; Burrows, C.; Hartig, G.; Postman, M.; Rafal, M.; Sparks, B.; White, R.; Crocker, J.; Bely, P.; Cheng, E.; Krebs, C.; Kimble, R.; Neff, S.; Illingworth, G.; Lesser, M.; Broadhurst, T.; Miley, G.; Lajeunesse, T.; Woodruff, B.

    1998-01-01

    The Advanced Camera for Surveys (ACS) is to be installed in the Hubble Space Telescope (HST) during the third HST servicing mission in December 1999. The ACS comprises three cameras, each designed to achieve specific goals. The first, the Wide Field Camera, will be a high-throughput, wide-field (200" x 204"), visible to near-IR camera that is half critically sampled at 500 nm. The second, the High Resolution Camera (HRC), is critically sampled at 500 nm, and has a 26" x 29" field of view. The HRC optical path includes a coronagraph which will improve the HST's contrast near bright objects by a factor of 10. The third camera is a far-ultraviolet, Solar-Blind Camera (SBC), with a field of 26" x 29". ACS will increase HST's capability for imaging surveys and discovery by at least a factor of 10. We give an overview of the ACS design and discuss the image quality of the optics and the performance of the CCD and MAMA detectors. The plans for the GTO science program and the specific scientific capabilities of the instrument are reviewed.

  19. CCD Camera Observations

    NASA Astrophysics Data System (ADS)

    Buchheim, Bob; Argyle, R. W.

    One night late in 1918, astronomer William Milburn, observing the region of Cassiopeia from Reverend T.H.E.C. Espin's observatory in Tow Law (England), discovered a hitherto unrecorded double star (Wright 1993). He reported it to Rev. Espin, who measured the pair using his 24-in. reflector: the fainter star was 6.0 arcsec from the primary, at position angle 162.4° (i.e. the fainter star was south-by-southeast from the primary) (Espin 1919). Some time later, it was recognized that the astrograph of the Vatican Observatory had taken an image of the same star-field a dozen years earlier, in late 1906. At that earlier epoch, the fainter star had been separated from the brighter one by only 4.8 arcsec, at position angle 186.2° (i.e. almost due south). Were these stars a binary pair, or were they just two unrelated stars sailing past each other? Some additional measurements might have begun to answer this question. If the secondary star was following a curved path, that would be a clue of orbital motion; if it followed a straight-line path, that would be a clue that these are just two stars passing in the night. Unfortunately, nobody took the trouble to re-examine this pair for almost a century, until the 2MASS astrometric/photometric survey recorded it in late 1998. After almost another decade, this amateur astronomer took some CCD images of the field in 2007, and added another data point on the star's trajectory, as shown in Fig. 15.1.
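
    From the two epochs quoted above, the relative motion of the pair can be computed by converting each (separation, position angle) measurement to Cartesian offsets; position angle is conventionally measured from north through east:

    ```python
    import math

    def to_xy(rho_arcsec, pa_deg):
        """Separation/position-angle to (east, north) offsets in arcsec."""
        pa = math.radians(pa_deg)
        return rho_arcsec * math.sin(pa), rho_arcsec * math.cos(pa)

    x1, y1 = to_xy(4.8, 186.2)   # Vatican astrograph epoch, late 1906
    x2, y2 = to_xy(6.0, 162.4)   # Espin's measurement, 1918
    motion = math.hypot(x2 - x1, y2 - y1)
    print(round(motion, 2))      # ~2.5 arcsec of relative motion in ~12 years
    ```

    A relative motion of roughly 0.2 arcsec per year is exactly the kind of quantity the follow-up measurements described in the chapter are meant to track.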

  20. Vision Sensors and Cameras

    NASA Astrophysics Data System (ADS)

    Hoefflinger, Bernd

    Silicon charge-coupled-device (CCD) imagers have been and are a specialty market ruled by a few companies for decades. Based on CMOS technologies, active-pixel sensors (APS) began to appear in 1990 at the 1 μm technology node. These pixels allow random access, global shutters, and they are compatible with focal-plane imaging systems combining sensing and first-level image processing. The progress towards smaller features and towards ultra-low leakage currents has provided reduced dark currents and μm-size pixels. All chips offer Mega-pixel resolution, and many have very high sensitivities equivalent to ASA 12,800. As a result, HDTV video cameras will become a commodity. Because charge-integration sensors suffer from a limited dynamic range, significant processing effort is spent on multiple exposure and piece-wise analog-digital conversion to reach ranges >10,000:1. The fundamental alternative is log-converting pixels with an eye-like response. This offers a range of almost a million to 1, constant contrast sensitivity and constant colors, important features in professional, technical and medical applications. 3D retino-morphic stacking of sensing and processing on top of each other is being revisited with sub-100 nm CMOS circuits and with TSV technology. With sensor outputs directly on top of neurons, neural focal-plane processing will regain momentum, and new levels of intelligent vision will be achieved. The industry push towards thinned wafers and TSV enables backside-illuminated and other pixels with a 100% fill-factor. 3D vision, which relies on stereo or on time-of-flight, high-speed circuitry, will also benefit from scaled-down CMOS technologies both because of their size as well as their higher speed.
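
    The two dynamic ranges quoted — >10,000:1 for extended charge-integration pixels and almost a million to 1 for log-converting pixels — translate directly into decibels:

    ```python
    import math

    def dyn_range_db(contrast_ratio):
        """Dynamic range of an intensity ratio expressed in decibels."""
        return 20 * math.log10(contrast_ratio)

    print(dyn_range_db(10_000))     # 80 dB: charge integration with extensions
    print(dyn_range_db(1_000_000))  # 120 dB: log-converting, eye-like pixel
    ```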

  1. A liquid xenon radioisotope camera.

    NASA Technical Reports Server (NTRS)

    Zaklad, H.; Derenzo, S. E.; Muller, R. A.; Smadja, G.; Smits, R. G.; Alvarez, L. W.

    1972-01-01

    A new type of gamma-ray camera is discussed that makes use of electron avalanches in liquid xenon and is currently under development. It is shown that such a radioisotope camera promises many advantages over any other existing gamma-ray cameras. Spatial resolution better than 1 mm and counting rates higher than one million counts/sec are possible. An energy resolution of 11% FWHM has recently been achieved with a collimated Hg-203 source using a parallel-plate ionization chamber containing a Frisch grid.

  2. Dark energy survey and camera

    SciTech Connect

    William Wester

    2004-08-16

    The authors describe the Dark Energy Survey and Camera. The survey will image 5000 sq. deg. in the southern sky to collect 300 million galaxies, 30,000 galaxy clusters and 2000 Type Ia supernovae. They expect to derive a value for the dark energy equation-of-state parameter, w, to a precision of 5% by combining four distinct measurement techniques. They describe the mosaic camera that will consist of CCDs with enhanced sensitivity in the near infrared. The camera will be mounted at the prime focus of the 4m Blanco telescope.

  3. Three-dimensional Camera Phone

    NASA Astrophysics Data System (ADS)

    Iizuka, Keigo

    2004-12-01

    An inexpensive technique for realizing a three-dimensional (3D) camera phone display is presented. Light from the liquid-crystal screen of a camera phone is linearly polarized, and its direction of polarization is easily manipulated by a cellophane sheet used as a half-waveplate. The novel 3D camera phone display is made possible solely by optical components without resorting to computation, so that the 3D image is displayed in real time. Quality of the original image is not sacrificed in the process of converting it into a 3D image.
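
    The cellophane trick relies on a standard polarization identity: a half-wave plate with its fast axis at angle θ maps linear polarization at angle α to 2θ − α. A minimal sketch of that relation (not code from the paper; the function name is illustrative):

```python
def hwp_output_angle(pol_deg, fast_axis_deg):
    """Linear polarization at pol_deg, after a half-wave plate whose fast
    axis is at fast_axis_deg, emerges linearly polarized at
    2*fast_axis - pol (angles are modulo 180 degrees)."""
    return (2.0 * fast_axis_deg - pol_deg) % 180.0
```

    With the cellophane's fast axis at 45° to the LCD's polarization, the two views' polarizations end up orthogonal, which is what lets simple polarizing glasses separate the stereo pair without any computation.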

  4. Observation of Planetary Motion Using a Digital Camera

    ERIC Educational Resources Information Center

    Meyn, Jan-Peter

    2008-01-01

    A digital SLR camera with a standard lens (50 mm focal length, f/1.4) on a fixed tripod is used to obtain photographs of the sky which contain stars down to 8th apparent magnitude. The angle of view is large enough to ensure visual identification of the photograph with a large sky region in a stellar map. The resolution is sufficient to…

  5. Mass movement slope streaks imaged by the Mars Orbiter Camera

    NASA Astrophysics Data System (ADS)

    Sullivan, Robert; Thomas, Peter; Veverka, Joseph; Malin, Michael; Edgett, Kenneth S.

    2001-10-01

    Narrow, fan-shaped dark streaks on steep Martian slopes were originally observed in Viking Orbiter images, but a definitive explanation was not possible because of resolution limitations. Pictures acquired by the Mars Orbiter Camera (MOC) aboard the Mars Global Surveyor (MGS) spacecraft show innumerable examples of dark slope streaks distributed widely, but not uniformly, across the brighter equatorial regions, as well as individual details of these features that were not visible in Viking Orbiter data. Dark slope streaks (as well as much rarer bright slope streaks) represent one of the most widespread and easily recognized styles of mass movement currently affecting the Martian surface. New dark streaks have formed since Viking and even during the MGS mission, confirming earlier suppositions that higher contrast dark streaks are younger, and fade (brighten) with time. The darkest slope streaks represent ~10% contrast with surrounding slope materials. No small outcrops supplying dark material (or bright material, for bright streaks) have been found at streak apexes. Digitate downslope ends indicate slope streak formation involves a ground-hugging flow subject to deflection by minor topographic obstacles. The model we favor explains most dark slope streaks as scars from dust avalanches following oversteepening of air fall deposits. This process is analogous to terrestrial avalanches of oversteepened dry, loose snow which produce shallow avalanche scars with similar morphologies. Low angles of internal friction, typically 10°-30° for terrestrial loess and clay materials, suggest that mass movement of (low-cohesion) Martian dusty air fall is possible on a wide range of gradients. Martian gravity, presumed low density of the air fall deposits, and thin (unresolved by MOC) failed layer depths imply extremely low cohesive strength at time of failure, consistent with expectations for an air fall deposit of dust particles. As speed increases during a dust avalanche, a

  6. Solar angle reference manual

    SciTech Connect

    Sibson, R.

    1983-01-01

    The introduction is the only text in the volume; the rest of the book contains easy-to-use graphical methods for building design and construction using solar energy. Isogonic charts and solar angle diagrams are included.

  7. Reading Angles in Maps

    ERIC Educational Resources Information Center

    Izard, Véronique; O'Donnell, Evan; Spelke, Elizabeth S.

    2014-01-01

    Preschool children can navigate by simple geometric maps of the environment, but the nature of the geometric relations they use in map reading remains unclear. Here, children were tested specifically on their sensitivity to angle. Forty-eight children (age 47:15-53:30 months) were presented with fragments of geometric maps, in which angle sections…

  8. Performance of new low-cost 1/3" security cameras for meteor surveillance

    NASA Astrophysics Data System (ADS)

    Samuels, Dave; Wray, James; Gural, Peter S.; Jenniskens, Peter

    2014-02-01

    It has been almost 5 years since the CAMS (Cameras for All-sky Meteor Surveillance) system specifications were designed for video meteor surveillance. CAMS has been based on a relatively expensive black-and-white Watec WAT-902H2 Ultimate camera, which uses a 1/2" sensor. In this paper, we investigate whether new, lower-cost color cameras based on smaller 1/3" sensors can perform adequately for CAMS. We did not expect them to equal or outperform the sensitivity of the Watec 1/2" camera for the same field of view, but the goal was to see if they could perform within the tolerances of the sensitivity requirements for the CAMS project. Their lower cost brings deployment of meteor surveillance cameras within reach of amateur astronomers and makes it possible to deploy many more cameras to increase yield. The lens focal length is matched to the elevation angle of the camera to maintain an image scale and spatial resolution close to that of the standard CAMS camera and lens combination, crucial for obtaining sufficiently accurate orbital elements. An all-sky array based on 16 such cameras, to be operated from a single computer, was built and the performance of individual cameras was tested.
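
    The focal-length/elevation matching works because the slant range to the meteor ablation layer grows as 1/sin(elevation), so a longer lens is needed at lower elevations to hold the pixel footprint at the layer constant. A rough sketch; the pixel pitch, layer altitude, and target footprint below are assumed illustrative values, not CAMS specifications:

```python
import math

PIXEL_PITCH_UM = 4.75    # assumed pixel pitch of a 1/3" sensor (micrometres)
LAYER_KM = 90.0          # typical meteor ablation altitude (km)

def required_focal_mm(elevation_deg, target_footprint_m=40.0):
    """Focal length (mm) that keeps one pixel's footprint at the meteor
    layer near target_footprint_m for a camera aimed at elevation_deg."""
    slant_m = LAYER_KM * 1e3 / math.sin(math.radians(elevation_deg))
    # footprint = slant_range * (pixel_pitch / f)  =>  f = slant * pitch / footprint
    return slant_m * PIXEL_PITCH_UM * 1e-6 / target_footprint_m * 1e3
```

    Halving the elevation's sine doubles the slant range, hence the required focal length, which is why a single lens choice cannot serve every camera in an all-sky array.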

  9. X-ray Pinhole Camera Measurements

    SciTech Connect

    Nelson, D. S.; Berninger, M. J.; Flores, P. A.; Good, D. E.; Henderson, D. J.; Hogge, K. W.; Huber, S. R.; Lutz, S. S.; Mitchell, S. E.; Howe, R. A.; Mitton, C. V.; Molina, I.; Bozman, D. R.; Cordova, S. R.; Mitchell, D. R.; Oliver, B. V.; Ormond, E. C.

    2013-07-01

    The development of the rod pinch diode [1] has led to high-resolution radiography for dynamic events such as explosive tests. Rod pinch diodes use a small diameter anode rod, which extends through the aperture of a cathode plate. Electrons borne off the aperture surface can self-insulate and pinch onto the tip of the rod, creating an intense, small x-ray source (Primary Pinch). This source has been utilized as the main diagnostic on numerous experiments that include high-value, single-shot events. In such applications there is an emphasis on machine reliability, x-ray reproducibility, and x-ray quality [2]. In tests with the baseline rod pinch diode, we have observed that an additional pinch (Secondary Pinch) occurs at the interface near the anode rod and the rod holder. This suggests that stray electrons exist that are not associated with the Primary Pinch. In this paper we present measurements on both pinches using an x-ray pinhole camera. The camera is placed downstream of the Primary Pinch at an angle of 60° with respect to the diode centerline. This diagnostic will be employed to diagnose x-ray reproducibility and quality. In addition, we will investigate the performance of hybrid diodes relating to the formation of the Primary and Secondary Pinches.

  10. Focal Plane Metrology for the LSST Camera

    SciTech Connect

    Rasmussen, Andrew P.; Hale, Layton; Kim, Peter; Lee, Eric; Perl, Martin; Schindler, Rafe; Takacs, Peter; Thurston, Timothy (SLAC)

    2007-01-10

    Meeting the science goals for the Large Synoptic Survey Telescope (LSST) translates into a demanding set of imaging performance requirements for the optical system over a wide (3.5°) field of view. In turn, meeting those imaging requirements necessitates maintaining precise control of the focal plane surface (10 µm P-V) over the entire field of view (640 mm diameter) at the operating temperature (T ≈ -100 °C) and over the operational elevation angle range. We briefly describe the hierarchical design approach for the LSST Camera focal plane and the baseline design for assembling the flat focal plane at room temperature. Preliminary results of gravity load and thermal distortion calculations are provided, and early metrological verification of candidate materials under cold thermal conditions is presented. A detailed, generalized method for stitching together sparse metrology data originating from differential, non-contact metrological data acquisition spanning multiple (non-continuous) sensor surfaces making up the focal plane, is described and demonstrated. Finally, we describe some in situ alignment verification alternatives, some of which may be integrated into the camera's focal plane.

  11. Narrow Absorption Components in Be Star Winds

    NASA Technical Reports Server (NTRS)

    Grady, C. A.

    1985-01-01

    The stars omega Ori (B2 IIIe), 66 Oph (B2 IVe), and 59 Cyg (B1.5 IVe) are discussed. The extent to which the narrow absorption components in these Be stars differs from narrow components in the O stars and gamma Cas is explored. Any models or mechanisms for the formation of narrow absorption features in the UV resonance lines of Be star spectra must account for the presence of multiple narrow absorption features which are variable in number, radial velocity, and strength. Models predicting a high and low density structure to the stellar wind caused by instabilities in a flow driven by radiation pressure or by variable mass loss may be more successful in describing the behavior of winds in early Be stars. These models appear to be capable of producing single absorption components in the velocity range observed for O stars and very early Be stars.

  12. Narrow Vertical Caves: Mapping Volcanic Fissure Geometries

    NASA Astrophysics Data System (ADS)

    Parcheta, C.; Nash, J.; Parness, A.; Mitchell, K. L.; Pavlov, C. A.

    2015-10-01

    Volcanic conduits are difficult to quantify, but their geometry fundamentally influences how eruptions occur. We robotically map old fissure conduits - elongated narrow cracks in the ground that transported magma to the surface during an eruption.

  13. Comparative morphometry of coxal joint angles.

    PubMed

    Sugiyama, S; Fujiwara, K

    1997-10-01

    The angles related to the coxal joints were comparatively studied in four-limbed walking animals and two-limbed ones including man and birds. Between animals with both types of walking, no significant difference was observed in the neck-shaft angles (NSA), which was equivalent to the acetabulum angles (ACA) at the connection of the femoral head with the acetabulum. The anteversion angles (AVA) were equivalent to the horizontal ACA. Canine species showed two different forms of the femoral neck with or without modification by the femoral AVA, probably being breed-specific and nutrition-dependent. In the narrow-striped wallaby as well as avian species, the femoral head showed a postversion with a minus-version angle for lifting the body axis in the frontal and upward direction to hold the whole body weight on the hind-limbs, in particular at the anterior part of the acetabulum. In man, the connection between the femur and acetabulum greatly varied among individuals, possibly according to differences in the life style. PMID:9353634

  14. Hot Wax Sweeps Debris From Narrow Passages

    NASA Technical Reports Server (NTRS)

    Ricklefs, Steven K.

    1990-01-01

    Safe and effective technique for removal of debris and contaminants from narrow passages involves entrainment of undesired material in thermoplastic casting material. Semisolid wax slightly below melting temperature pushed along passage by pressurized nitrogen to remove debris. Devised to clean out fuel passages in main combustion chamber of Space Shuttle main engine. Also applied to narrow, intricate passages in internal-combustion-engine blocks, carburetors, injection molds, and other complicated parts.

  15. Overview of Neutrino Mixing Models and Their Mixing Angle Predictions

    SciTech Connect

    Albright, Carl H.

    2009-11-01

    An overview of neutrino-mixing models is presented with emphasis on the types of horizontal flavor and vertical family symmetries that have been invoked. Distributions for the mixing angles of many models are displayed. Ways to differentiate among the models and to narrow the list of viable models are discussed.

  16. Auto-preview camera orientation for environment perception on a mobile robot

    NASA Astrophysics Data System (ADS)

    Radovnikovich, Micho; Vempaty, Pavan K.; Cheok, Ka C.

    2010-01-01

    Using wide-angle or omnidirectional camera lenses to increase a mobile robot's field of view introduces nonlinearity in the image due to the 'fish-eye' effect. This complicates distance perception, and increases image processing overhead. Using multiple cameras avoids the fish-eye complications, but involves using more electrical and processing power to interface them to a computer. Being able to control the orientation of a single camera, both of these disadvantages are minimized while still allowing the robot to preview a wider area. In addition, controlling the orientation allows the robot to optimize its environment perception by only looking where the most useful information can be discovered. In this paper, a technique is presented that creates a two dimensional map of objects of interest surrounding a mobile robot equipped with a panning camera on a telescoping shaft. Before attempting to negotiate a difficult path planning situation, the robot takes snapshots at different camera heights and pan angles and then produces a single map of the surrounding area. Distance perception is performed by making calibration measurements of the camera and applying coordinate transformations to project the camera's findings into the vehicle's coordinate frame. To test the system, obstacles and lines were placed to form a chicane. Several snapshots were taken with different camera orientations, and the information from each were stitched together to yield a very useful map of the surrounding area for the robot to use to plan a path through the chicane.
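
    The distance-perception step described above, projecting the camera's findings into the vehicle frame, can be sketched for the simplest case: locating the ground intersection of the optical axis of a camera at a known height, tilt, and pan angle. The function and its parameters are illustrative, not the authors' implementation:

```python
import math

def optical_axis_ground_point(tilt_deg, pan_deg, cam_height_m):
    """Vehicle-frame (forward, left) coordinates, in metres, of the point
    where a mast camera's optical axis meets flat ground. The camera sits
    cam_height_m above the ground, tilted down by tilt_deg, panned by
    pan_deg from the vehicle's forward axis."""
    ground_range = cam_height_m / math.tan(math.radians(tilt_deg))
    a = math.radians(pan_deg)
    # rotate the camera-frame range vector by the pan angle
    return (ground_range * math.cos(a), ground_range * math.sin(a))
```

    Repeating such a projection for every detected obstacle pixel, at each snapshot's height and pan angle, yields points in one common vehicle frame that can be merged into a single map.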

  17. An Inexpensive Digital Infrared Camera

    ERIC Educational Resources Information Center

    Mills, Allan

    2012-01-01

    Details are given for the conversion of an inexpensive webcam to a camera specifically sensitive to the near infrared (700-1000 nm). Some experiments and practical applications are suggested and illustrated. (Contains 9 figures.)

  18. Mars Orbiter Camera Views the 'Face on Mars' - Best View from Viking

    NASA Technical Reports Server (NTRS)

    1998-01-01

    Shortly after midnight Sunday morning (5 April 1998 12:39 AM PST), the Mars Orbiter Camera (MOC) on the Mars Global Surveyor (MGS) spacecraft successfully acquired a high resolution image of the 'Face on Mars' feature in the Cydonia region. The image was transmitted to Earth on Sunday, and retrieved from the mission computer data base Monday morning (6 April 1998). The image was processed at the Malin Space Science Systems (MSSS) facility at 9:15 AM and the raw image immediately transferred to the Jet Propulsion Laboratory (JPL) for release to the Internet. The images shown here were subsequently processed at MSSS.

    The picture was acquired 375 seconds after the spacecraft's 220th close approach to Mars. At that time, the 'Face', located at approximately 40.8° N, 9.6° W, was 275 miles (444 km) from the spacecraft. The 'morning' sun was 25° above the horizon. The picture has a resolution of 14.1 feet (4.3 meters) per pixel, making it ten times higher resolution than the best previous image of the feature, which was taken by the Viking Mission in the mid-1970s. The full image covers an area 2.7 miles (4.4 km) wide and 25.7 miles (41.5 km) long.

    This Viking Orbiter image is one of the best Viking pictures of the area Cydonia where the 'Face' is located. Marked on the image are the 'footprint' of the high resolution (narrow angle) Mars Orbiter Camera image and the area seen in enlarged views (dashed box). See PIA01440-1442 for these images in raw and processed form.

    Malin Space Science Systems and the California Institute of Technology built the MOC using spare hardware from the Mars Observer mission. MSSS operates the camera from its facilities in San Diego, CA. The Jet Propulsion Laboratory's Mars Surveyor Operations Project operates the Mars Global Surveyor spacecraft with its industrial partner, Lockheed Martin Astronautics, from facilities in Pasadena, CA and Denver, CO.

  19. The future of consumer cameras

    NASA Astrophysics Data System (ADS)

    Battiato, Sebastiano; Moltisanti, Marco

    2015-03-01

    In the last two decades, multimedia devices, and in particular imaging devices (camcorders, tablets, mobile phones, etc.), have spread dramatically. Moreover, the increase in their computational performance, combined with higher storage capability, allows them to process large amounts of data. In this paper an overview of the current trends in the consumer camera market and technology is given, with some details about the recent past (from the digital still camera up to today) and forthcoming key issues.

  20. Solid State Television Camera (CID)

    NASA Technical Reports Server (NTRS)

    Steele, D. W.; Green, W. T.

    1976-01-01

    The design, development, and testing of a charge injection device (CID) camera using a 244x248 element array are described. A number of video signal processing functions are included which maximize the output video dynamic range while retaining the inherently good resolution response of the CID. Some of the unique features of the camera are: low-light-level performance, high S/N ratio, antiblooming, low geometric distortion, sequential scanning, and AGC.

  1. Two-Step Camera Calibration Method Developed for Micro UAV'S

    NASA Astrophysics Data System (ADS)

    Gašparović, M.; Gajski, D.

    2016-06-01

    The development of unmanned aerial vehicles (UAVs) and the continuous price reduction of unmanned systems attracted us to this research. Professional measuring systems are dozens of times more expensive, and often heavier, than "amateur", non-metric UAVs. For this reason, we tested the DJI Phantom 2 Vision Plus UAV. The Phantom's smaller mass and velocity give it less kinetic energy than professional measurement platforms, which makes it potentially less dangerous for use in populated areas. In this research, we wanted to investigate the capabilities of such a non-metric UAV and find the procedures under which this kind of UAV may be used for photogrammetric survey. It is important to emphasize that the UAV is equipped with an ultra-wide-angle camera with a 14 MP sensor. Calibration of such cameras is a complex process. A new two-step process is presented and developed, and its results are compared with the standard one-step camera calibration procedure. The two-step process first removes distortion from all images, and then uses these images in phototriangulation with self-calibration. The paper presents statistical indicators which show that the proposed two-step process is a better and more accurate procedure for calibrating those types of cameras than standard one-step calibration. We therefore suggest the two-step calibration process as the standard for ultra-wide-angle cameras on unmanned aircraft.
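
    Removing distortion before triangulation, as in step one of a two-step process, requires inverting the lens distortion model. A minimal sketch using a two-coefficient radial (Brown-type) model and fixed-point iteration on one normalized image point; the coefficients and helper name are illustrative, and the paper's actual model may differ:

```python
def undistort_point(xd, yd, k1, k2, iterations=10):
    """Iteratively invert the radial distortion model
    (xd, yd) = (xu, yu) * (1 + k1*r^2 + k2*r^4), where r^2 = xu^2 + yu^2,
    for one point in normalized (focal-length-scaled) image coordinates."""
    xu, yu = xd, yd                      # distorted point as initial guess
    for _ in range(iterations):
        r2 = xu * xu + yu * yu
        factor = 1.0 + k1 * r2 + k2 * r2 * r2
        xu, yu = xd / factor, yd / factor
    return xu, yu
```

    For moderate distortion the iteration is a contraction and converges in a handful of steps; strongly distorted ultra-wide-angle lenses are exactly where a pre-undistortion step like this simplifies the subsequent self-calibrating phototriangulation.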

  2. Fundus Camera Guided Photoacoustic Ophthalmoscopy

    PubMed Central

    Liu, Tan; Li, Hao; Song, Wei; Jiao, Shuliang; Zhang, Hao F.

    2014-01-01

    Purpose To demonstrate the feasibility of fundus camera guided photoacoustic ophthalmoscopy (PAOM) system and its multimodal imaging capabilities. Methods We integrated PAOM and a fundus camera consisting of a white-light illuminator and a high-sensitivity, high-speed CCD. The fundus camera captures both retinal anatomy and PAOM illumination at the same time to provide a real-time feedback when we position the PAOM illuminating light. We applied the integrated system to image rat eyes in vivo and used full-spectrum, visible (VIS), and near infrared (NIR) illuminations in fundus photography. Results Both albino and pigmented rat eyes were imaged in vivo. During alignment, different trajectories of PAOM laser scanning were successfully visualized by the fundus camera, which reduced the PAOM alignment time from several minutes to 30 s. In albino eyes, in addition to retinal vessels, main choroidal vessels were observed using VIS-illumination, which is similar to PAOM images. In pigmented eyes, the radial striations of retinal nerve fiber layer were visualized by fundus photography using full-spectrum illumination; meanwhile, PAOM imaged both retinal vessels and the retinal pigmented epithelium melanin distribution. Conclusions The results demonstrated that PAOM can be well-integrated with fundus camera without affecting its functionality. The fundus camera guidance is faster and easier comparing with our previous work. The integrated system also set the stage for the next-step verification between oximetry methods based on PAOM and fundus photography. PMID:24131226

  3. Streak camera dynamic range optimization

    SciTech Connect

    Wiedwald, J.D.; Lerche, R.A.

    1987-09-01

    The LLNL optical streak camera is used by the Laser Fusion Program in a wide range of applications. Many of these applications require a large recorded dynamic range. Recent work has focused on maximizing the dynamic range of the streak camera recording system. For our streak cameras, image intensifier saturation limits the upper end of the dynamic range. We have developed procedures to set the image intensifier gain such that the system dynamic range is maximized. Specifically, the gain is set such that a single streak tube photoelectron is recorded with an exposure of about five times the recording system noise. This ensures detection of single photoelectrons, while not consuming intensifier or recording system dynamic range through excessive intensifier gain. The optimum intensifier gain has been determined for two types of film and for a lens-coupled CCD camera. We have determined that by recording the streak camera image with a CCD camera, the system is shot-noise limited up to the onset of image intensifier nonlinearity. When recording on film, the film determines the noise at high exposure levels. There is discussion of the effects of slit width and image intensifier saturation on dynamic range. 8 refs.

  4. Wide Dynamic Range CCD Camera

    NASA Astrophysics Data System (ADS)

    Younse, J. M.; Gove, R. J.; Penz, P. A.; Russell, D. E.

    1984-11-01

    A liquid crystal attenuator (LCA) operated as a variable neutral density filter has been attached to a charge-coupled device (CCD) imager to extend the dynamic range of a solid-state TV camera by an order of magnitude. Many applications are best served by a camera with a dynamic range of several thousand. For example, outside security systems must operate unattended with "dawn-to-dusk" lighting conditions. Although this can be achieved with available auto-iris lens assemblies, more elegant solutions which provide the small size, low power, high reliability advantages of solid state technology are now available. This paper will describe one such unique way of achieving these dynamic ranges using standard optics by making the CCD imager's glass cover a controllable neutral density filter. The liquid crystal attenuator's structure and theoretical properties for this application will be described along with measured transmittance. A small integrated TV camera which utilizes a "virtual-phase" CCD sensor coupled to a LCA will be described and test results for a number of the camera's optical and electrical parameters will be given. These include the following camera parameters: dynamic range, Modulation Transfer Function (MTF), spectral response, and uniformity. Also described will be circuitry which senses the ambient scene illuminance and automatically provides feedback signals to appropriately adjust the transmittance of the LCA. Finally, image photographs using this camera, under various scene illuminations, will be shown.
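
    The feedback circuitry described above can be sketched as a simple proportional update of the attenuator transmittance driven by the sensed scene level. The target level, loop gain, and transmittance limits below are assumed values for illustration, not the camera's actual circuit constants:

```python
def lca_transmittance_update(t_current, sensor_dn, target_dn=2000.0,
                             gain=1e-4, t_min=0.001, t_max=1.0):
    """One feedback step: nudge the liquid-crystal attenuator transmittance
    so the mean sensor signal moves toward target_dn, clamped to the
    attenuator's usable transmittance range."""
    t_new = t_current * (1.0 - gain * (sensor_dn - target_dn))
    return max(t_min, min(t_max, t_new))
```

    Iterating this step as lighting changes from dawn to dusk keeps the CCD within its native dynamic range while the LCA absorbs the remaining orders of magnitude.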

  5. The virtual gamma camera room.

    PubMed

    Penrose, J M; Trowbridge, E A; Tindale, W B

    1996-05-01

    The installation of a gamma camera is time-consuming and costly and, once installed, the camera position is unlikely to be altered during its working life. Poor choice of camera position therefore has long-term consequences. Additional equipment such as collimators and carts, the operator's workstation and wall-mounted display monitors must also be situated to maximize access and ease of use. The layout of a gamma camera room can be optimized prior to installation by creating a virtual environment. Super-Scape VRT software running on an upgraded 486 PC microprocessor was used to create a 'virtual camera room'. The simulation included an operator's viewpoint and a controlled tour of the room. Equipment could be repositioned as required, allowing potential problems to be identified at the design stage. Access for bed-ridden patients, operator ergonomics, operator and patient visibility were addressed. The display can also be used for patient education. Creation of a virtual environment is a valuable tool which allows different camera systems to be compared interactively in terms of dimensions, extent of movement and use of a defined space. Such a system also has applications in radiopharmacy design and simulation. PMID:8736511

  6. Photoelectric angle converter

    NASA Astrophysics Data System (ADS)

    Podzharenko, Volodymyr A.; Kulakov, Pavlo I.

    2001-06-01

    A photoelectric angle-of-rotation transmitter is presented whose output voltage is a linear function of the input angle. The transmitter uses a linear phototransducer based on a photodiode-operational-amplifier pair, whose output voltage is a linear function of the area of the illuminated photosensitive layer, together with a specially shaped light-flux modulator that makes this area a linear function of the angle of rotation. The transmitter has good frequency properties and can be used for dynamic measurements of angular velocity and angle of rotation, in precision drive systems and automatic control systems.

  7. The MC and LFC cameras. [metric camera (MC); large format camera (LFC)

    NASA Technical Reports Server (NTRS)

    Norton, Clarice L.; Schroeder, Manfried; Mollberg, Bernard

    1986-01-01

    The characteristics of the shuttle-borne Large Format Camera are listed. The LFC focal plane format was 23 by 46 cm, double the usual size, thereby acquiring approximately double the ground area. Forward motion compensation was employed. With the stable platform (shuttle) it was possible to use the slow exposure, high resolution, Kodak aerial films; 3414 and 3412 black and white, SO-242 color, and SO-131 aerochrome infrared. The camera was designed to maintain stability during varying temperature extremes of space.

  8. The importance of craniovertebral and cervicomedullary angles in cervicogenic headache

    PubMed Central

    Çoban, Gökçen; Çöven, İlker; Çifçi, Bilal Egemen; Yıldırım, Erkan; Yazıcı, Ayşe Canan; Horasanlı, Bahriye

    2014-01-01

    PURPOSE Many studies have indicated that cervicogenic headache may originate from the cervical structures innervated by the upper cervical spinal nerves. To date, no study has investigated whether narrowing of the craniovertebral angle (CVA) or cervicomedullary angle (CMA) affects the three upper cervical spinal nerves. The aim of this study was to investigate the effect of CVA and/or CMA narrowing on the occurrence of cervicogenic headache. MATERIALS AND METHODS Two hundred and five patients diagnosed with cervicogenic headache were included in the study. The pain scores of patients were determined using a visual analog scale. The nonheadache control group consisted of 40 volunteers. CVA and CMA values were measured on sagittal T2-weighted magnetic resonance imaging (MRI), on two occasions by two radiologists. Angle values and categorized pain scores were compared statistically between the groups. RESULTS Intraobserver and interobserver agreement was over 97% for all measurements. Pain scores increased with decreasing CVA and CMA values. Mean angle values were significantly different among the pain categories (P < 0.001). The pain score was negatively correlated with CMA (Spearman correlation coefficient, rs, −0.676; P < 0.001) and CVA values (rs, −0.725; P < 0.001). CONCLUSION CVA or CMA narrowing affects the occurrence of cervicogenic headache. There is an inverse relationship between the angle values and pain scores. PMID:24317332

  9. Reading Angles in Maps

    PubMed Central

    Izard, Véronique; O'Donnell, Evan; Spelke, Elizabeth S.

    2013-01-01

    Preschool children can navigate by simple geometric maps of the environment, but the nature of the geometric relations they use in map reading remains unclear. Here, children were tested specifically on their sensitivity to angle. Forty-eight children (age 47:15–53:30 months) were presented with fragments of geometric maps, in which angle sections appeared without any relevant length or distance information. Children were able to read these map fragments and compare 2D to 3D angles. However, this ability appeared both variable and fragile among the youngest children of the sample. These findings suggest that 4-year-old children begin to form an abstract concept of angle that applies both to 2D and 3D displays and that serves to interpret novel spatial symbols. PMID:23647223

  10. Angles, Time, and Proportion

    ERIC Educational Resources Information Center

    Pagni, David L.

    2005-01-01

    This article describes an investigation making connections between the time on an analog clock and the angle between the minute hand and the hour hand. It was posed by a middle school mathematics teacher. (Contains 8 tables and 6 figures.)
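
    The clock-angle relation behind the investigation is simple arithmetic: the minute hand sweeps 6° per minute, while the hour hand sweeps 30° per hour plus 0.5° per minute of drift. A short sketch (not from the article):

```python
def hands_angle(hour, minute):
    """Smaller angle, in degrees, between the hour and minute hands
    of an analog clock showing hour:minute."""
    minute_angle = 6.0 * minute                       # 360 deg / 60 min
    hour_angle = 30.0 * (hour % 12) + 0.5 * minute    # 360 deg / 12 h, plus drift
    diff = abs(hour_angle - minute_angle) % 360.0
    return min(diff, 360.0 - diff)
```

    The 0.5° drift term is the proportional-reasoning step students often miss: at 3:30 the hands are 75° apart, not 90°.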

  11. Reading angles in maps.

    PubMed

    Izard, Véronique; O'Donnell, Evan; Spelke, Elizabeth S

    2014-01-01

    Preschool children can navigate by simple geometric maps of the environment, but the nature of the geometric relations they use in map reading remains unclear. Here, children were tested specifically on their sensitivity to angle. Forty-eight children (age 47:15-53:30 months) were presented with fragments of geometric maps, in which angle sections appeared without any relevant length or distance information. Children were able to read these map fragments and compare two-dimensional to three-dimensional angles. However, this ability appeared both variable and fragile among the youngest children of the sample. These findings suggest that 4-year-old children begin to form an abstract concept of angle that applies both to two-dimensional and three-dimensional displays and that serves to interpret novel spatial symbols. PMID:23647223

  12. Solder wetting kinetics in narrow V-grooves

    SciTech Connect

    Yost, F.G.; Rye, R.R.; Mann, J.A. Jr.

    1997-12-01

    Experiments are performed to observe capillary flow in grooves cut into copper surfaces. Flow kinetics of two liquids, 1-heptanol and eutectic Sn-Pb solder, are modeled with modified Washburn kinetics and compared to flow data. It is shown that both liquids flow parabolically in narrow V-grooves, and the data scale as predicted by the modified Washburn model. The early portions of the flow kinetics are characterized by curvature in the length vs time relationship which is not accounted for in the modified Washburn model. This effect is interpreted in terms of a dynamic contact angle. It is concluded that under conditions of rapid flow, solder spreading can be understood as a simple fluid flow process. Slower kinetics, e.g. solder droplet spreading on flat surfaces, may be affected by subsidiary chemical processes such as reaction.
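The parabolic kinetics described above (flow length growing as the square root of time, i.e. L^2 = k*t in Washburn-type models) can be checked against flow data with a one-parameter linear fit. A minimal sketch with synthetic data; the value of k is invented for illustration and is not from the paper:

```python
import numpy as np

# Synthetic flow-front positions obeying L^2 = k * t (hypothetical k)
k_true = 0.8                        # mm^2/s, assumed value for illustration
t = np.linspace(1.0, 100.0, 50)     # time samples, s
L = np.sqrt(k_true * t)             # flow length, mm

# Fit k by linear least squares on L^2 vs t (a line through the origin)
k_fit = float(np.sum(t * L**2) / np.sum(t**2))
```

A real data set would show the early-time curvature the authors attribute to a dynamic contact angle, so the fit would be applied to the later, parabolic portion of the record.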

  13. Narrow-field radiometry in a quasi-isotropic atmosphere

    NASA Technical Reports Server (NTRS)

    Holmes, A.; Palmer, J. M.; Tomasko, M. G.

    1979-01-01

    If a radiometer having a narrow field of view is used to measure the radiance of a source such as a quasi-isotropic atmosphere, a knowledge of the out-of-field responsivity is critical. For example, if a radiometer with a field of view of 5 deg (full-angle) has a relative responsivity of 0.0001 for the out-of-field radiation, the contribution of the out-of-field radiation (assuming an isotropic source subtending 2 steradians) is 10.5% of the total signal. Either the stray light suppression of the radiometer must be extremely high or methods of determining the out-of-field response must be developed. A description of one method of determining the effect of out-of-field response and its application to a planetary atmospheric radiometer is presented.

  14. Tailored slice selection in solid-state MRI by DANTE under magic-echo line narrowing

    NASA Astrophysics Data System (ADS)

    Matsui, Shigeru; Masumoto, Hidefumi; Hashimoto, Takeyuki

    2007-06-01

    We propose a method of slice selection in solid-state MRI by combining DANTE selective excitation with magic-echo (ME) line narrowing. The DANTE RF pulses applied at the ME peaks practically do not interfere with the ME line narrowing in the combined ME DANTE sequence. This allows straightforward tailoring of the slice profile simply by introducing an appropriate modulation, such as a sinc modulation, into the flip angles of the applied DANTE RF pulses. The utility of the method has been demonstrated by preliminary experiments performed on a test sample of adamantane.

  15. 'Magic Angle Precession'

    SciTech Connect

    Binder, Bernd

    2008-01-21

    An advanced and exact geometric description of nonlinear precession dynamics modeling very accurately natural and artificial couplings showing Lorentz symmetry is derived. In the linear description it is usually ignored that the geometric phase of relativistic motion couples back to the orbital motion providing for a non-linear recursive precession dynamics. The high coupling strength in the nonlinear case is found to be a gravitomagnetic charge proportional to the precession angle and angular velocity generated by geometric phases, which are induced by high-speed relativistic rotations and are relevant to propulsion technologies but also to basic interactions. In the quantum range some magic precession angles indicating strong coupling in a phase-locked chaotic system are identified, emerging from a discrete time dynamical system known as the cosine map showing bifurcations at special precession angles relevant to heavy nuclei stability. The 'Magic Angle Precession' (MAP) dynamics can be simulated and visualized by cones rolling in or on each other, where the apex and precession angles are indexed by spin, charge or precession quantum numbers, and corresponding magic angles. The most extreme relativistic warping and twisting effect is given by the Dirac spinor half spin constellation with 'Hyperdiamond' MAP, which resembles quark confinement.

  16. Sub-Camera Calibration of a Penta-Camera

    NASA Astrophysics Data System (ADS)

    Jacobsen, K.; Gerke, M.

    2016-03-01

    Penta cameras, consisting of a nadir camera and four inclined cameras, are becoming more and more popular, having the advantage of also imaging facades in built-up areas from four directions. Such system cameras require a boresight calibration of the geometric relation of the cameras to each other, but also a calibration of the sub-cameras. Based on data sets of the ISPRS/EuroSDR benchmark for multi-platform photogrammetry, the inner orientation of the IGI Penta DigiCAM used has been analyzed. The required image coordinates of the blocks Dortmund and Zeche Zollern were determined by Pix4Dmapper and independently adjusted and analyzed with the program system BLUH. Dense matching by Pix4Dmapper provided 4.1 million image points in 314 images and 3.9 million image points in 248 images, respectively. With up to 19 and 29 images per object point, respectively, the images are well connected; nevertheless, the high numbers of images per object point are concentrated in the block centres, while the inclined images outside the block centres are satisfactorily but not very strongly connected. This leads to very high values in the Student (T) test of the finally used additional parameters; in other words, the additional parameters are highly significant. The estimated radial symmetric distortion of the nadir sub-camera corresponds to the laboratory calibration by IGI, but radial symmetric distortions exceeding 5 µm remain for the inclined cameras, even though these were described as negligible based on the laboratory calibration. Radial and tangential effects at the image corners are limited but still present. Remarkable angular affine systematic image errors can be seen, especially in the block Zeche Zollern. Such deformations are unusual for digital matrix cameras, but they can be caused by the correlation between inner and exterior orientation if only parallel flight lines are used. With the exception of the angular affinity, the systematic image errors for corresponding

  17. Photometric Calibration of Consumer Video Cameras

    NASA Technical Reports Server (NTRS)

    Suggs, Robert; Swift, Wesley, Jr.

    2007-01-01

    Equipment and techniques have been developed to implement a method of photometric calibration of consumer video cameras for imaging of objects that are sufficiently narrow or sufficiently distant to be optically equivalent to point or line sources. Heretofore, it has been difficult to calibrate consumer video cameras, especially in cases of image saturation, because they exhibit nonlinear responses with dynamic ranges much smaller than those of scientific-grade video cameras. The present method not only takes this difficulty in stride but also makes it possible to extend effective dynamic ranges to several powers of ten beyond saturation levels. The method will likely be primarily useful in astronomical photometry. There are also potential commercial applications in medical and industrial imaging of point or line sources in the presence of saturation. This development was prompted by the need to measure brightnesses of debris in amateur video images of the breakup of the Space Shuttle Columbia. The purpose of these measurements is to use the brightness values to estimate relative masses of debris objects. In most of the images, the brightness of the main body of Columbia was found to exceed the dynamic ranges of the cameras. A similar problem arose a few years ago in the analysis of video images of Leonid meteors. The present method is a refined version of the calibration method developed to solve the Leonid calibration problem. In this method, one performs an end-to-end calibration of the entire imaging system, including not only the imaging optics and imaging photodetector array but also analog tape recording and playback equipment (if used) and any frame grabber or other analog-to-digital converter (if used). To automatically incorporate the effects of nonlinearity and any other distortions into the calibration, the calibration images are processed in precisely the same manner as are the images of meteors, space-shuttle debris, or other objects that one seeks to

  18. Cross-ratio-based line scan camera calibration using a planar pattern

    NASA Astrophysics Data System (ADS)

    Li, Dongdong; Wen, Gongjian; Qiu, Shaohua

    2016-01-01

    A flexible new technique is proposed to calibrate the geometric model of line scan cameras. In this technique, the line scan camera is rigidly coupled to a calibrated frame camera to establish a pair of stereo cameras. The linear displacements and rotation angles between the two cameras are fixed but unknown. This technique only requires the pair of stereo cameras to observe a specially designed planar pattern shown at a few (at least two) different orientations. At each orientation, a stereo pair is obtained including a linear array image and a frame image. Radial distortion of the line scan camera is modeled. The calibration scheme includes two stages. First, point correspondences are established from the pattern geometry and the projective invariance of cross-ratio. Second, with a two-step calibration procedure, the intrinsic parameters of the line scan camera are recovered from several stereo pairs together with the rigid transform parameters between the pair of stereo cameras. Both computer simulation and real data experiments are conducted to test the precision and robustness of the calibration algorithm, and very good calibration results have been obtained. Compared with classical techniques which use three-dimensional calibration objects or controllable moving platforms, our technique is affordable and flexible in close-range photogrammetric applications.
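The projective invariance of the cross-ratio, which the first calibration stage relies on for establishing point correspondences, is easy to demonstrate numerically. A minimal sketch with hypothetical point coordinates and a 1D projective map standing in for the camera projection (function names are mine):

```python
def cross_ratio(a, b, c, d):
    """Cross-ratio (a, b; c, d) of four collinear points given by 1D coordinates."""
    return ((a - c) * (b - d)) / ((a - d) * (b - c))

def homography_1d(x, h):
    """Apply a 1D projective map x -> (h00*x + h01) / (h10*x + h11)."""
    return (h[0][0] * x + h[0][1]) / (h[1][0] * x + h[1][1])
```

Mapping four collinear points through any such homography leaves their cross-ratio unchanged, which is what lets known ratios on the planar pattern be matched against measured image coordinates.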

  19. The study of dual camera 3D coordinate vision measurement system using a special probe

    NASA Astrophysics Data System (ADS)

    Liu, Shugui; Peng, Kai; Zhang, Xuefei; Zhang, Haifeng; Huang, Fengshan

    2006-11-01

    Because of its high precision and convenient operation, the single-probe vision coordinate measuring machine has become a research focus in the machine vision industry. In general, such a visual system can be set up conveniently with just one CCD camera and a probe. However, the cost of the system becomes prohibitive when top-performance hardware, such as the CCD camera and image capture card, must be used to obtain high axis-oriented measurement precision. In this paper, a new dual-CCD-camera vision coordinate measurement system based on a redundancy principle is proposed to achieve high precision at moderate cost. The two CCD cameras are placed with their optical axes at approximately 90 degrees, so that each camera forms a sub-system with the probe. With the help of the probe, the inner and outer parameters of the cameras are first calibrated, and the system is then set up using the redundancy technique. Because the axis-oriented error, which is large and always present in a single-camera system, is eliminated between the two sub-systems, the system achieves high-precision measurement. Comparison of the results with those from a CMM shows that the proposed system has excellent stability and precision, with an uncertainty within +/-0.1 mm in the x, y, and z directions over a distance of 2 m, using two common CCD cameras.

  20. Distributed consensus on camera pose.

    PubMed

    Jorstad, Anne; DeMenthon, Daniel; Wang, I-Jeng; Burlina, Philippe

    2010-09-01

    Our work addresses pose estimation in a distributed camera framework. We examine how processing cameras can best reach a consensus about the pose of an object when they are each given a model of the object, defined by a set of point coordinates in the object frame of reference. The cameras can only see a subset of the object feature points in the midst of background clutter points, not knowing which image points match with which object points, nor which points are object points or background points. The cameras individually recover a prediction of the object's pose using their knowledge of the model, and then exchange information with their neighbors, performing consensus updates locally to obtain a single estimate consistent across all cameras, without requiring a common centralized processor. Our main contributions are: 1) we present a novel algorithm performing consensus updates in 3-D world coordinates penalized by a 3-D model, and 2) we perform a thorough comparison of our method with other current consensus methods. Our method is consistently the most accurate, and we confirm that the existing consensus method based upon calculating the Karcher mean of rotations is also reliable and fast. Experiments on simulated and real imagery are reported. PMID:20363678
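The Karcher mean of rotations mentioned in the comparison can be computed by the standard iterative tangent-space averaging scheme: map each rotation into the tangent space at the current estimate, average, and step along the exponential map. A sketch using only NumPy; the tolerance and iteration count are arbitrary choices of mine, not values from the paper:

```python
import numpy as np

def log_so3(R):
    """Matrix log of a rotation: returns the axis-angle vector."""
    theta = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    if theta < 1e-10:
        return np.zeros(3)
    w = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    return theta * w / (2.0 * np.sin(theta))

def exp_so3(w):
    """Rodrigues' formula: axis-angle vector to rotation matrix."""
    theta = np.linalg.norm(w)
    if theta < 1e-10:
        return np.eye(3)
    k = w / theta
    K = np.array([[0, -k[2], k[1]], [k[2], 0, -k[0]], [-k[1], k[0], 0]])
    return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)

def karcher_mean(rotations, iters=50):
    """Iterative Riemannian (Karcher) mean of a list of rotation matrices."""
    mu = rotations[0].copy()
    for _ in range(iters):
        # Average the tangent vectors of all rotations at the current estimate
        avg = np.mean([log_so3(mu.T @ R) for R in rotations], axis=0)
        if np.linalg.norm(avg) < 1e-12:
            break
        mu = mu @ exp_so3(avg)
    return mu
```

For rotations about a common axis this reduces to averaging the angles; in general it gives the rotation minimizing the sum of squared geodesic distances, which is what the consensus update aims for.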

  1. The Clementine longwave infrared camera

    SciTech Connect

    Priest, R.E.; Lewis, I.T.; Sewall, N.R.; Park, H.S.; Shannon, M.J.; Ledebuhr, A.G.; Pleasance, L.D.; Massie, M.A.; Metschuleit, K.

    1995-04-01

    The Clementine mission provided the first ever complete, systematic surface mapping of the moon from the ultra-violet to the near-infrared regions. More than 1.7 million images of the moon, earth and space were returned from this mission. The longwave-infrared (LWIR) camera supplemented the UV/Visible and near-infrared mapping cameras, providing limited strip coverage of the moon and giving insight into the thermal properties of the soils. This camera provided approximately 100 m spatial resolution at 400 km periselene, and a 7 km across-track swath. This 2.1 kg camera, using a 128 x 128 Mercury-Cadmium-Telluride (MCT) FPA, viewed thermal emission of the lunar surface and lunar horizon in the 8.0 to 9.5 µm wavelength region. A description of this light-weight, low-power LWIR camera along with a summary of lessons learned is presented. Design goals and preliminary on-orbit performance estimates are addressed in terms of meeting the mission's primary objective of flight qualifying the sensors for future Department of Defense flights.

  2. CARTOGAM: a portable gamma camera

    NASA Astrophysics Data System (ADS)

    Gal, O.; Izac, C.; Lainé, F.; Nguyen, A.

    1997-02-01

    The gamma camera is designed to map radioactive sources against a visible background in quasi real time. The device is intended to locate sources from a distance during the preparation of interventions in active areas of nuclear installations, making it possible to optimize interventions, especially at the dosimetric level. The camera consists of a double-cone collimator, a scintillator and an intensified CCD camera. This detection chain forms both gamma images and visible images. Even though it is wrapped in a Denal shield, the camera is still portable (mass < 15 kg) and compact (external diameter = 8 cm). The angular resolution is of the order of one degree for gamma rays of 1 MeV. In a few minutes, the device can measure a dose rate of 10 μGy/h delivered, for instance, by a 90 mCi source of 60Co located 10 m from the detector. The first images recorded in the laboratory are presented and illustrate the performance obtained with this camera.

  3. Traditional gamma cameras are preferred.

    PubMed

    DePuey, E Gordon

    2016-08-01

    Although the new solid-state dedicated cardiac cameras provide excellent spatial and energy resolution and allow for markedly reduced SPECT acquisition times and/or injected radiopharmaceutical activity, they have some distinct disadvantages compared to traditional sodium iodide SPECT cameras. They are expensive. Attenuation correction is not available. Cardio-focused collimation, advantageous to increase depth-dependent resolution and myocardial count density, accentuates diaphragmatic attenuation and scatter from subdiaphragmatic structures. Although supplemental prone imaging is therefore routinely advised, many patients cannot tolerate it. Moreover, very large patients cannot be accommodated in the solid-state camera gantries. Since data are acquired simultaneously with an arc of solid-state detectors around the chest, no temporally dependent "rotating" projection images are obtained. Therefore, patient motion can be neither detected nor corrected. In contrast, traditional sodium iodide SPECT cameras provide rotating projection images to allow technologists and physicians to detect and correct patient motion and to accurately detect the position of soft tissue attenuators and to anticipate associated artifacts. Very large patients are easily accommodated. Low-dose x-ray attenuation correction is widely available. Also, relatively inexpensive low-count density software is provided by many vendors, allowing shorter SPECT acquisition times and reduced injected activity approaching that achievable with solid-state cameras. PMID:27072004

  4. Video camera use at nuclear power plants

    SciTech Connect

    Estabrook, M.L.; Langan, M.O.; Owen, D.E.

    1990-08-01

    A survey of US nuclear power plants was conducted to evaluate video camera use in plant operations and to determine the equipment used and the benefits realized. Basic closed-circuit television (CCTV) camera systems are described and video camera operation principles are reviewed. Plant approaches for implementing video camera use are discussed, as are equipment selection issues such as setting task objectives, radiation effects on cameras, and the use of disposable cameras. Specific plant applications are presented and the video equipment used is described. The benefits of video camera use, mainly reduced radiation exposure and increased productivity, are discussed and quantified. 15 refs., 6 figs.

  5. Method of rotation angle measurement in machine vision based on calibration pattern with spot array

    SciTech Connect

    Li Weimin; Jin Jing; Li Xiaofeng; Li Bin

    2010-02-20

    We propose a method of high-precision rotation angle measurement in machine vision. An area scan CCD camera, an imaging lens, and a calibration pattern with a spot array make up the measurement device. The calibration pattern is mounted on the rotating part, and the CCD camera is set at a certain distance from the rotating components. The coordinates of the spots on the calibration pattern are acquired from the image of the pattern captured by the CCD camera. At the initial position of the calibration pattern, the camera is calibrated with the spot array and a mathematical model of the CCD camera's distortion error is built. With the coordinate-rotation measurement equation, the rotation angle of the spot array is detected. In the theoretical simulation, noise of different levels is added to the coordinates of the spot array. The experimental results show that the device can measure the rotation angle precisely with a noncontact method. The standard deviation of the rotation angle measurement is smaller than 3 arcsec. The device can measure both micro-angles and large angles.

  6. Method of rotation angle measurement in machine vision based on calibration pattern with spot array.

    PubMed

    Li, Weimin; Jin, Jing; Li, Xiaofeng; Li, Bin

    2010-02-20

    We propose a method of high-precision rotation angle measurement in machine vision. An area scan CCD camera, an imaging lens, and a calibration pattern with a spot array make up the measurement device. The calibration pattern is mounted on the rotating part, and the CCD camera is set at a certain distance from the rotating components. The coordinates of the spots on the calibration pattern are acquired from the image of the pattern captured by the CCD camera. At the initial position of the calibration pattern, the camera is calibrated with the spot array and a mathematical model of the CCD camera's distortion error is built. With the coordinate-rotation measurement equation, the rotation angle of the spot array is detected. In the theoretical simulation, noise of different levels is added to the coordinates of the spot array. The experimental results show that the device can measure the rotation angle precisely with a noncontact method. The standard deviation of the rotation angle measurement is smaller than 3 arcsec. The device can measure both micro-angles and large angles. PMID:20174168
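The core computation, recovering a rotation angle from spot coordinates observed before and after the part rotates, has a simple closed form once centroids are removed. A sketch of that step only; this is a generic least-squares estimate of my own, not the paper's full method, which additionally calibrates and corrects lens distortion:

```python
import numpy as np

def rotation_angle_2d(P, Q):
    """Least-squares rotation angle (radians) taking point set P to Q.

    P, Q: (N, 2) arrays of corresponding spot centroids, e.g. before and
    after the part rotates. Subtracting the centroids removes any
    translation so that only the rotation remains.
    """
    P0 = P - P.mean(axis=0)
    Q0 = Q - Q.mean(axis=0)
    # Closed form: angle of the 2D cross-covariance between the two sets
    num = np.sum(P0[:, 0] * Q0[:, 1] - P0[:, 1] * Q0[:, 0])
    den = np.sum(P0[:, 0] * Q0[:, 0] + P0[:, 1] * Q0[:, 1])
    return np.arctan2(num, den)
```

Because many spots contribute to the sums, centroid noise averages out, which is what makes arcsecond-level repeatability plausible with sub-pixel spot detection.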

  7. Discovery of a narrow line quasar

    NASA Technical Reports Server (NTRS)

    Stocke, J.; Liebert, J.; Maccacaro, T.; Griffiths, R. E.; Steiner, J. E.

    1982-01-01

    A stellar object is reported which, while having X-ray and optical luminosities typical of quasars, has narrow permitted and forbidden emission lines over the observed spectral range. The narrow-line spectrum is high-excitation, the Balmer lines seem to be recombinational, and a redder optical spectrum than that of most quasars is exhibited, despite detection as a weak radio source. The object does not conform to the relationships between H-beta parameters and X-ray flux previously claimed for a large sample of the active galactic nuclei. Because reddish quasars with narrow lines, such as the object identified, may not be found by the standard techniques for the discovery of quasars, the object may be a prototype of a new class of quasars analogous to high-luminosity Seyfert type 2 galaxies. It is suggested that these objects cannot comprise more than 10% of all quasars.

  8. Narrowing of intersensory speech perception in infancy.

    PubMed

    Pons, Ferran; Lewkowicz, David J; Soto-Faraco, Salvador; Sebastián-Gallés, Núria

    2009-06-30

    The conventional view is that perceptual/cognitive development is an incremental process of acquisition. Several striking findings have revealed, however, that the sensitivity to non-native languages, faces, vocalizations, and music that is present early in life declines as infants acquire experience with native perceptual inputs. In the language domain, the decline in sensitivity is reflected in a process of perceptual narrowing that is thought to play a critical role during the acquisition of a native-language phonological system. Here, we provide evidence that such a decline also occurs in infant response to multisensory speech. We found that infant intersensory response to a non-native phonetic contrast narrows between 6 and 11 months of age, suggesting that the perceptual system becomes increasingly more tuned to key native-language audiovisual correspondences. Our findings lend support to the notion that perceptual narrowing is a domain-general as well as a pan-sensory developmental process. PMID:19541648

  9. Congenital narrowing of the cervical spinal canal.

    PubMed Central

    Kessler, J T

    1975-01-01

    The clinical and laboratory findings in six patients with congenital narrowing of the cervical spinal canal and neurological symptoms are described. A variable age of onset and an entirely male occurrence were found. Signs and symptoms of spinal cord dysfunction predominated in all but one patient. Symptoms were produced in five patients by increased physical activity alone. Congenital narrowing of the cervical spinal canal may result in cord compression without a history of injury and occasionally without evidence of significant bony degenerative changes. The clinical features may be distinguishable from those found in cervical spondylosis without congenital narrowing. Intermittent claudication of the cervical spinal cord appears to be an important feature of this syndrome. Surgery improved four out of five people. PMID:1219087

  10. Research on the application of single camera stereo vision sensor in three-dimensional point measurement

    NASA Astrophysics Data System (ADS)

    Feng, Xiao-feng; Pan, Di-fu

    2015-09-01

    A single camera stereo vision sensor model based on planar mirror imaging is proposed for measuring a three-dimensional point. The model consists of a CCD camera and a planar mirror. Using planar mirror reflections of a scene, a picture with parallax is obtained by shooting the target object and its virtual image. This is equivalent to shooting the target object from different angles with the camera and the virtual camera in the planar mirror, so it has the function of binocular stereo vision. In addition, the measurement theory of the three-dimensional point is discussed. The mathematical model of a single camera stereo vision sensor is established, the intrinsic and extrinsic parameters are calibrated, and the corresponding experiment has been done. The experimental results show that the measuring method is convenient and effective; it also has the advantages of simple structure, convenient adjustment, and is especially suitable for short-distance measurement with high precision.
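The geometric idea above, that the mirror creates a second, virtual viewpoint, rests on reflecting points (including the camera centre) across the mirror plane. A minimal sketch of the reflection operator; the plane parameterization and function name are my own, not the paper's notation:

```python
import numpy as np

def reflect_point(x, n, d):
    """Reflect point x across the plane {p : n . p = d}, with n a unit normal.

    Applied to the camera centre this gives the centre of the virtual
    camera seen in the mirror; applied to scene points it gives their
    mirror images.
    """
    n = np.asarray(n, dtype=float)
    n = n / np.linalg.norm(n)
    return x - 2.0 * (np.dot(n, x) - d) * n
```

Reflecting twice returns the original point, which is the involution property that makes the real camera and its mirror image a consistent stereo pair.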

  11. Versatility of the CFR algorithm for limited angle reconstruction

    SciTech Connect

    Fujieda, I.; Heiskanen, K.; Perez-Mendez, V.

    1990-04-01

    The constrained Fourier reconstruction (CFR) algorithm and the iterative reconstruction-reprojection (IRR) algorithm are evaluated based on their accuracy for three types of limited angle reconstruction problems. The CFR algorithm performs better for problems such as X-ray CT imaging of a nuclear reactor core with one large data gap due to structural blocking of the source and detector pair. For gated heart imaging by X-ray CT, or radioisotope distribution imaging by PET or SPECT using a polygonal array of gamma cameras with insensitive gaps between camera boundaries, the IRR algorithm has a slight advantage over the CFR algorithm, but the difference is not significant.

  12. WIDE-FIELD ASTRONOMICAL MULTISCALE CAMERAS

    SciTech Connect

    Marks, Daniel L.; Brady, David J.

    2013-05-15

    In order to produce sufficiently low aberrations with a large aperture, telescopes have a limited field of view. Because of this narrow field, large areas of the sky at any given time go unobserved. We propose several telescopes based on monocentric reflective, catadioptric, and refractive objectives that may be scaled to wide fields of view and achieve 1.1 arcsec resolution, which in most locations is the practical seeing limit of the atmosphere. The reflective and Schmidt catadioptric objectives have relatively simple configurations and enable large fields to be captured at the expense of obscuration of the mirror by secondary optics, a defect that may be managed by image plane design. The refractive telescope design has no obscuration, but the objective has substantial bulk. The refractive design is a 38 gigapixel camera consisting of a single monocentric objective and 4272 microcameras. Monocentric multiscale telescopes, with their wide fields of view, may observe phenomena that might otherwise go unnoticed, such as supernovae, glint from orbital space debris, and near-Earth objects.

  13. Wide field camera observations of Baade's Window

    NASA Technical Reports Server (NTRS)

    Holtzman, Jon A.; Light, R. M.; Baum, William A.; Worthey, Guy; Faber, S. M.; Hunter, Deidre A.; O'Neil, Earl J., Jr.; Kreidl, Tobias J.; Groth, E. J.; Westphal, James A.

    1993-01-01

    We have observed a field in Baade's Window using the Wide Field Camera (WFC) of the Hubble Space Telescope (HST) and obtain V- and I-band photometry down to V approximately 22.5. These data go several magnitudes fainter than previously obtained from the ground. The location of the break in the luminosity function suggests that there are a significant number of intermediate age (less than 10 Gyr) stars in the Galactic bulge. This conclusion rests on the assumptions that the extinction towards our field is similar to that seen in other parts of Baade's Window, that the distance to the bulge is approximately 8 kpc, and that we can determine fairly accurate zero points for the HST photometry. Changes in any one of these assumptions could increase the inferred age, but a conspiracy of lower reddening, a shorter distance to the bulge, and/or photometric zero-point errors would be needed to imply a population entirely older than 10 Gyr. We infer an initial mass function slope for the main-sequence stars, and find that it is consistent with that measured in the solar neighborhood; unfortunately, the slope is poorly constrained because we sample only a narrow range of stellar mass and because of uncertainties in the observed luminosity function at the faint end.

  14. Compact narrow linewidth semiconductor laser module

    NASA Astrophysics Data System (ADS)

    Poulin, M.; Ayotte, S.; Latrasse, C.; Painchaud, Y.; Cliche, J. F.; Babin, A.; Aubé, M.; Picard, M.; Costin, F.; Têtu, M.; Lafrance, G.

    2009-05-01

    Frequency noise reduction of semiconductor lasers using electrical feedback from an optical frequency discriminator is an efficient and simple approach to realizing narrow linewidth lasers. Such lasers are of great interest for applications such as LIDAR, RF photonics and interferometric sensing. In this paper, we review the technological choices made by TeraXion in the realization of its Narrow Linewidth Laser modules. The method decreases the linewidth of DFB lasers from several hundred kHz to a few kHz. We present the work in progress to integrate such a system into a miniature package and to incorporate advanced functionalities such as multi-laser phase locking.

  15. Cameras for semiconductor process control

    NASA Technical Reports Server (NTRS)

    Porter, W. A.; Parker, D. L.

    1977-01-01

    The application of X-ray topography to semiconductor process control is described, considering the novel features of the high-speed camera and the difficulties associated with this technique. The most significant results on the effects of material defects on device performance are presented, including results obtained using wafers processed entirely within this institute. Defects were identified using the X-ray camera and correlated with probe data. Temperature-dependent effects of material defects are also included. Recent applications and improvements of X-ray topographs of silicon-on-sapphire and gallium arsenide are presented, with a description of a real-time TV system prototype and of the most recent vacuum chuck design. We also discuss our promotion of the camera's use by various semiconductor manufacturers.

  16. Camera-on-a-Chip

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Jet Propulsion Laboratory's research on a second generation, solid-state image sensor technology has resulted in the Complementary Metal- Oxide Semiconductor Active Pixel Sensor (CMOS), establishing an alternative to the Charged Coupled Device (CCD). Photobit Corporation, the leading supplier of CMOS image sensors, has commercialized two products of their own based on this technology: the PB-100 and PB-300. These devices are cameras on a chip, combining all camera functions. CMOS "active-pixel" digital image sensors offer several advantages over CCDs, a technology used in video and still-camera applications for 30 years. The CMOS sensors draw less energy, they use the same manufacturing platform as most microprocessors and memory chips, and they allow on-chip programming of frame size, exposure, and other parameters.

  17. Aerial camera auto focusing system

    NASA Astrophysics Data System (ADS)

    Wang, Xuan; Lan, Gongpu; Gao, Xiaodong; Liang, Wei

    2012-10-01

    Before an aerial photographic task, the camera must first be focused to compensate for the defocus caused by changes in temperature, pressure, etc. A new method of aerial camera auto focusing is proposed that combines traditional photoelectric auto-collimation with image processing. First, the basic principles of optical auto-collimation and image processing are introduced. Second, the limitations of each are illustrated and the benefits of the new method are detailed. Then the basic principle, the system composition and the implementation of the new method are presented. Finally, the data collection platform is set up and the focus evaluation function curve is drawn. The results show that the method can be used for aerial camera focusing and suits the trend toward miniaturized, lightweight aviation equipment. This paper supports further work on accurate, automatic focusing.

  18. Dark Energy Camera for Blanco

    SciTech Connect

    Binder, Gary A.; /Caltech /SLAC

    2010-08-25

    In order to make accurate measurements of dark energy, a system is needed to monitor the focus and alignment of the Dark Energy Camera (DECam) to be located on the Blanco 4m Telescope for the upcoming Dark Energy Survey. One new approach under development is to fit out-of-focus star images to a point spread function from which information about the focus and tilt of the camera can be obtained. As a first test of a new algorithm using this idea, simulated star images produced from a model of DECam in the optics software Zemax were fitted. Then, real images from the Mosaic II imager currently installed on the Blanco telescope were used to investigate the algorithm's capabilities. A number of problems with the algorithm were found, and more work is needed to understand its limitations and improve its capabilities so it can reliably predict camera alignment and focus.
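
    The fitting algorithm itself is not given in the abstract; as an illustrative stand-in, a defocused star can be approximated by a circular Gaussian whose width tracks focus, and that width can be recovered by linear least squares after taking logarithms (all names below are hypothetical, and the real DECam donut model is more elaborate):

```python
import numpy as np

def fit_gaussian_width(img):
    """Estimate the width (sigma) of a circular Gaussian star image.

    Taking logs linearizes the model: ln I = c + b1*x + b2*y + a*(x^2 + y^2),
    with a = -1/(2*sigma^2), which a linear least-squares solve recovers.
    """
    y, x = np.indices(img.shape)
    mask = img > img.max() * 0.05  # use only well-exposed pixels
    X = np.column_stack([np.ones(mask.sum()), x[mask], y[mask],
                         x[mask]**2 + y[mask]**2])
    coef, *_ = np.linalg.lstsq(X, np.log(img[mask]), rcond=None)
    return float(np.sqrt(-1.0 / (2.0 * coef[3])))

# Synthetic defocused star with sigma = 3 pixels.
y, x = np.indices((41, 41))
star = np.exp(-((x - 20)**2 + (y - 20)**2) / (2 * 3.0**2))
print(round(fit_gaussian_width(star), 2))  # 3.0
```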

  19. Perceptual Color Characterization of Cameras

    PubMed Central

    Vazquez-Corral, Javier; Connah, David; Bertalmío, Marcelo

    2014-01-01

    Color camera characterization, mapping outputs from the camera sensors to an independent color space such as XYZ, is an important step in the camera processing pipeline. Until now, this procedure has been primarily solved by using a 3 × 3 matrix obtained via a least-squares optimization. In this paper, we propose to use the spherical sampling method, recently published by Finlayson et al., to perform a perceptual color characterization. In particular, we search for the 3 × 3 matrix that minimizes three different perceptual errors, one pixel-based and two spatially based. For the pixel-based case, we minimize the CIE ΔE error, while for the spatial-based case, we minimize both the S-CIELAB error and the CID error measure. Our results demonstrate an improvement of approximately 3% for the ΔE error, 7% for the S-CIELAB error and 13% for the CID error measures. PMID:25490586
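
    The least-squares baseline the paper starts from can be sketched in a few lines. The training data below are synthetic stand-ins for measured chart patches; the paper's contribution is then to perturb this matrix (via spherical sampling) to minimize perceptual rather than squared error:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training data: camera RGB responses and reference XYZ values
# for 24 colour patches (one per row). Real data would come from a chart.
M_true = np.array([[0.41, 0.36, 0.18],
                   [0.21, 0.72, 0.07],
                   [0.02, 0.12, 0.95]])
rgb = rng.random((24, 3))
xyz = rgb @ M_true.T

# Baseline characterization: the 3x3 matrix minimizing squared XYZ error.
M, *_ = np.linalg.lstsq(rgb, xyz, rcond=None)
M = M.T  # so that xyz ≈ M @ rgb for a single pixel

print(np.allclose(M, M_true))  # True on this noiseless synthetic data
```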

  20. The GISMO-2 Bolometer Camera

    NASA Technical Reports Server (NTRS)

    Staguhn, Johannes G.; Benford, Dominic J.; Fixsen, Dale J.; Hilton, Gene; Irwin, Kent D.; Jhabvala, Christine A.; Kovacs, Attila; Leclercq, Samuel; Maher, Stephen F.; Miller, Timothy M.; Moseley, Samuel H.; Sharp, Elemer H.; Wollack, Edward J.

    2012-01-01

    We present the concept for the GISMO-2 bolometer camera, which we are building for background-limited operation at the IRAM 30 m telescope on Pico Veleta, Spain. GISMO-2 will operate simultaneously in the 1 mm and 2 mm atmospheric windows. The 1 mm channel uses a 32 x 40 TES-based Backshort Under Grid (BUG) bolometer array; the 2 mm channel operates with a 16 x 16 BUG array. The camera utilizes almost the full field of view provided by the telescope. The optical design of GISMO-2 was strongly influenced by our experience with the GISMO 2 mm bolometer camera, which is successfully operating at the 30 m telescope. GISMO is accessible to the astronomical community through the regular IRAM call for proposals.

  1. Geiger-mode ladar cameras

    NASA Astrophysics Data System (ADS)

    Yuan, Ping; Sudharsanan, Rengarajan; Bai, Xiaogang; Boisvert, Joseph; McDonald, Paul; Labios, Eduardo; Morris, Bryan; Nicholson, John P.; Stuart, Gary M.; Danny, Harrison; Van Duyne, Stephen; Pauls, Greg; Gaalema, Stephen

    2011-06-01

    The performance of Geiger-mode LAser Detection and Ranging (LADAR) cameras is primarily defined by individual pixel attributes, such as dark count rate (DCR), photon detection efficiency (PDE), jitter, and crosstalk. However, for the expanding range of LADAR imaging applications, other factors, such as image uniformity, component tolerance, manufacturability, reliability, and operational features, have to be considered. Recently we have developed new 32×32 and 32×128 Read-Out Integrated Circuits (ROICs) for LADAR applications. With multiple filter and absorber structures, the 50-μm-pitch arrays demonstrate pixel crosstalk below the 100 ppm level while maintaining a PDE greater than 40% at 4 V overbias. Besides the improved epitaxial and process uniformity of the APD arrays, the new ROICs implement a Non-uniform Bias (NUB) circuit providing 4-bit bias-voltage tunability over a 2.5 V range to bias each pixel individually. All these features greatly increase the performance uniformity of the LADAR camera. Cameras based on these ROICs were integrated with a data acquisition system developed by Boeing DES. The 32×32 version has a range gate of up to 7 μs and can cover a range window of about 1 km with 14-bit, 0.5 ns timing resolution. The 32×128 camera can be operated at a frame rate of up to 20 kHz with 14-bit, 0.3 ns timing resolution through a full CameraLink interface. The performance of the 32×32 LADAR camera has been demonstrated in a series of field tests on various vehicles.
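
    The quoted range window and range resolution follow from two-way time of flight, R = c·t/2. A quick check of the numbers above:

```python
C = 299_792_458.0  # speed of light, m/s

def gate_to_range(t_seconds):
    """One-way range covered by a two-way time-of-flight window."""
    return C * t_seconds / 2.0

print(gate_to_range(7e-6))    # ~1049 m: the ~1 km window of the 32x32 camera
print(gate_to_range(0.5e-9))  # ~0.075 m: range bin from 0.5 ns timing resolution
```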

  2. Multi-Angle View of the Canary Islands

    NASA Technical Reports Server (NTRS)

    2000-01-01

    A multi-angle view of the Canary Islands in a dust storm, 29 February 2000. At left is a true-color image taken by the Multi-angle Imaging SpectroRadiometer (MISR) instrument on NASA's Terra satellite. This image was captured by the MISR camera looking at a 70.5-degree angle to the surface, ahead of the spacecraft. The middle image was taken by the MISR downward-looking (nadir) camera, and the right image is from the aftward 70.5-degree camera. The images are reproduced using the same radiometric scale, so variations in brightness, color, and contrast represent true variations in surface and atmospheric reflectance with angle. Windblown dust from the Sahara Desert is apparent in all three images, and is much brighter in the oblique views. This illustrates how MISR's oblique imaging capability makes the instrument a sensitive detector of dust and other particles in the atmosphere. Data for all channels are presented in a Space Oblique Mercator map projection to facilitate their co-registration. The images are about 400 km (250 miles) wide, with a spatial resolution of about 1.1 kilometers (1,200 yards). North is toward the top. MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is a division of the California Institute of Technology.

  3. A testbed for wide-field, high-resolution, gigapixel-class cameras

    NASA Astrophysics Data System (ADS)

    Kittle, David S.; Marks, Daniel L.; Son, Hui S.; Kim, Jungsang; Brady, David J.

    2013-05-01

    The high resolution and wide field of view (FOV) of the AWARE (Advanced Wide FOV Architectures for Image Reconstruction and Exploitation) gigapixel class cameras present new challenges in calibration, mechanical testing, and optical performance evaluation. The AWARE system integrates an array of micro-cameras in a multiscale design to achieve gigapixel sampling at video rates. Alignment and optical testing of the micro-cameras is vital in compositing engines, which require pixel-level accurate mappings over the entire array of cameras. A testbed has been developed to automatically calibrate and measure the optical performance of the entire camera array. This testbed utilizes translation and rotation stages to project a ray into any micro-camera of the AWARE system. A spatial light modulator is projected through a telescope to form an arbitrary object space pattern at infinity. This collimated source is then reflected by an elevation stage mirror for pointing through the aperture of the objective into the micro-optics and eventually the detector of the micro-camera. Different targets can be projected with the spatial light modulator for measuring the modulation transfer function (MTF) of the system, fiducials in the overlap regions for registration and compositing, distortion mapping, illumination profiles, thermal stability, and focus calibration. The mathematics of the testbed mechanics are derived for finding the positions of the stages to achieve a particular incident angle into the camera, along with calibration steps for alignment of the camera and testbed coordinate axes. Measurement results for the AWARE-2 gigapixel camera are presented for MTF, focus calibration, illumination profile, fiducial mapping across the micro-camera for registration and distortion correction, thermal stability, and alignment of the camera on the testbed.
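
    The testbed mathematics are only summarized above; as a hedged sketch of the kind of relation involved, a desired unit ray direction maps to two pointing-stage angles through spherical coordinates. The axis convention below is an assumption for illustration, not the AWARE testbed's actual derivation:

```python
import math

def ray_to_stage_angles(dx, dy, dz):
    """Convert a ray direction into azimuth/elevation stage angles (degrees).

    Assumed convention: +z along the optical axis, azimuth rotates about z,
    elevation tilts up from the xy-plane.
    """
    az = math.degrees(math.atan2(dy, dx))
    el = math.degrees(math.asin(dz / math.hypot(dx, dy, dz)))
    return az, el

az, el = ray_to_stage_angles(1.0, 1.0, math.sqrt(2.0))
print(round(az, 1), round(el, 1))  # 45.0 45.0
```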

  4. A testbed for wide-field, high-resolution, gigapixel-class cameras.

    PubMed

    Kittle, David S; Marks, Daniel L; Son, Hui S; Kim, Jungsang; Brady, David J

    2013-05-01

    The high resolution and wide field of view (FOV) of the AWARE (Advanced Wide FOV Architectures for Image Reconstruction and Exploitation) gigapixel class cameras present new challenges in calibration, mechanical testing, and optical performance evaluation. The AWARE system integrates an array of micro-cameras in a multiscale design to achieve gigapixel sampling at video rates. Alignment and optical testing of the micro-cameras is vital in compositing engines, which require pixel-level accurate mappings over the entire array of cameras. A testbed has been developed to automatically calibrate and measure the optical performance of the entire camera array. This testbed utilizes translation and rotation stages to project a ray into any micro-camera of the AWARE system. A spatial light modulator is projected through a telescope to form an arbitrary object space pattern at infinity. This collimated source is then reflected by an elevation stage mirror for pointing through the aperture of the objective into the micro-optics and eventually the detector of the micro-camera. Different targets can be projected with the spatial light modulator for measuring the modulation transfer function (MTF) of the system, fiducials in the overlap regions for registration and compositing, distortion mapping, illumination profiles, thermal stability, and focus calibration. The mathematics of the testbed mechanics are derived for finding the positions of the stages to achieve a particular incident angle into the camera, along with calibration steps for alignment of the camera and testbed coordinate axes. Measurement results for the AWARE-2 gigapixel camera are presented for MTF, focus calibration, illumination profile, fiducial mapping across the micro-camera for registration and distortion correction, thermal stability, and alignment of the camera on the testbed. PMID:23742532

  5. Investigation of ionization mechanism of extended narrow line region.

    NASA Astrophysics Data System (ADS)

    Hashimoto, T. H.; Iye, M. I.; Aoki, K. A.

    2006-08-01

    The narrow line regions of active galaxies have the potential to provide key information about the nature of the central energy sources, the triggering and fueling of the activity, and the dynamical and/or chemical evolution of the narrow line region (NLR). However, if we are to use them in this way, it is crucial to understand the dominant physical mechanism of the emission line region. Since Dopita et al. (1995, 1996) suggested that their radiative shock model is capable of explaining to some extent the spectrum of Seyfert 2 galaxies, detailed observations of individual NLRs or extended narrow line regions (ENLRs) have been performed to clarify the actual ionization mechanism for individual objects. It is probably fair to say that at present, although individual studies may favor gas models photoionized by nonthermal power-law photons or other ionization scenarios, no clear-cut answer has yet emerged. For the purpose of investigating the importance of shock ionization around the NLR, we performed optical long-slit spectroscopic observations of the Seyfert 2 galaxy NGC 7319, using the Subaru telescope during guaranteed time of the Faint Object Camera And Spectrograph (FOCAS). We confirmed that NGC 7319 has an ENLR (~4 kpc) in relatively good alignment with the radio components, and found that the distribution of the observed regions in the "diagnostic diagrams" (line-ratio versus line-ratio plots) is approximately consistent with radiative shock ionization models that assume a shock front velocity of about 500 km/s. In addition, we discovered a clear anti-correlation between the indicators of gas excitation ([OIII]5007/Hbeta and [SII](6716+6731)/Halpha) and the velocity dispersion over the whole observed ENLR. No photoionization model can explain this gas kinematics-excitation relation, and the anti-correlation is strong evidence of radiative shock ionization, probably induced by interaction between the jet plasma and NLR clouds. The evidence of the importance of jet-induced shock ionization even

  6. Narrow-Band Applications of Communications Satellites.

    ERIC Educational Resources Information Center

    Cowlan, Bert; Horowitz, Andrew

    This paper attempts to describe the advantages of "narrow-band" applications of communications satellites for education. It begins by discussing the general controversy surrounding the use of satellites in education, by placing the concern within the larger context of the general debate over the uses of new technologies in education, and by…

  7. Adverse effects of prohibiting narrow provider networks.

    PubMed

    Howard, David H

    2014-08-14

    Many insurers participating in the new insurance exchanges are controlling costs by offering plans with narrow provider networks. Proposed regulations would promote network adequacy, but a pro-provider stance may not be inherently pro-consumer or even pro-patient. PMID:25119604

  8. Method for shaping and aiming narrow beams. [sonar mapping and target identification

    NASA Technical Reports Server (NTRS)

    Heyser, R. C. (Inventor)

    1981-01-01

    A sonar method and apparatus is described which utilizes a linear frequency chirp in a transmitter/receiver having a correlator to synthesize a narrow beamwidth pattern from otherwise broadbeam transducers when there is relative velocity between the transmitter/receiver and the target. The chirp is produced in a generator with bandwidth, B, and time, T, chosen so that the time-bandwidth product, TB, is increased for a narrower angle. A replica of the chirp produced in the generator is time delayed and Doppler shifted for use as a reference in the receiver for correlation of received chirps from targets. This reference is Doppler shifted to select targets preferentially, thereby not only synthesizing a narrow beam but also aiming the beam in azimuth and elevation.
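
    The benefit of a large time-bandwidth product can be illustrated with a matched-filter (correlation) sketch: a long chirp of duration T and bandwidth B compresses to a mainlobe of width on the order of 1/B. This shows only the compression principle, not the patent's Doppler-shifted reference for aiming the beam:

```python
import numpy as np

fs = 1e6                 # sample rate, Hz
T = 2e-3                 # chirp duration, s
B = 100e3                # swept bandwidth, Hz
t = np.arange(0, T, 1 / fs)
chirp = np.cos(2 * np.pi * 0.5 * (B / T) * t**2)  # linear 0 -> B Hz sweep

# Correlating against a replica (matched filtering) compresses the 2 ms
# pulse to a mainlobe of width on the order of 1/B = 10 microseconds.
env = np.abs(np.correlate(chirp, chirp, mode="full"))
mainlobe_s = np.sum(env > env.max() / 2) / fs  # half-peak width, seconds

print(round(T * B))        # 200: time-bandwidth product
print(mainlobe_s < 5 / B)  # True: compressed far below the pulse length T
```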

  9. Plasmonic Brewster Angle: Broadband Extraordinary Transmission through Optical Gratings

    NASA Astrophysics Data System (ADS)

    Alù, Andrea; D'Aguanno, Giuseppe; Mattiucci, Nadia; Bloemer, Mark J.

    2011-03-01

    Extraordinary optical transmission through metallic gratings is a well established effect based on the collective resonance of corrugated screens. Being based on plasmonic resonances, its bandwidth is inherently narrow, in particular, for thick screens and narrow apertures. We introduce here a different mechanism to achieve total transmission through an otherwise opaque screen, based on an ultrabroadband tunneling that can span from dc to the visible range at a given incidence angle. This phenomenon effectively represents the equivalent of Brewster transmission for plasmonic and opaque screens.

  10. Exploring the Moon at High-Resolution: First Results From the Lunar Reconnaissance Orbiter Camera (LROC)

    NASA Astrophysics Data System (ADS)

    Robinson, Mark; Hiesinger, Harald; McEwen, Alfred; Jolliff, Brad; Thomas, Peter C.; Turtle, Elizabeth; Eliason, Eric; Malin, Mike; Ravine, A.; Bowman-Cisneros, Ernest

    The Lunar Reconnaissance Orbiter (LRO) spacecraft was launched on an Atlas V 401 rocket from the Cape Canaveral Air Force Station Launch Complex 41 on June 18, 2009. After spending four days in Earth-Moon transit, the spacecraft entered a three month commissioning phase in an elliptical 30×200 km orbit. On September 15, 2009, LRO began its planned one-year nominal mapping mission in a quasi-circular 50 km orbit. A multi-year extended mission in a fixed 30×200 km orbit is optional. The Lunar Reconnaissance Orbiter Camera (LROC) consists of a Wide Angle Camera (WAC) and two Narrow Angle Cameras (NACs). The WAC is a 7-color push-frame camera, which images the Moon at 100 and 400 m/pixel in the visible and UV, respectively, while the two NACs are monochrome narrow-angle linescan imagers with 0.5 m/pixel spatial resolution. LROC was specifically designed to address two of the primary LRO mission requirements and six other key science objectives, including 1) assessment of meter- and smaller-scale features in order to select safe sites for potential lunar landings near polar resources and elsewhere on the Moon; 2) acquire multi-temporal synoptic 100 m/pixel images of the poles during every orbit to unambiguously identify regions of permanent shadow and permanent or near permanent illumination; 3) meter-scale mapping of regions with permanent or near-permanent illumination of polar massifs; 4) repeat observations of potential landing sites and other regions to derive high resolution topography; 5) global multispectral observations in seven wavelengths to characterize lunar resources, particularly ilmenite; 6) a global 100-m/pixel basemap with incidence angles (60°-80°) favorable for morphological interpretations; 7) sub-meter imaging of a variety of geologic units to characterize their physical properties, the variability of the regolith, and other key science questions; 8) meter-scale coverage overlapping with Apollo-era panoramic images (1-2 m/pixel) to document

  12. NFIRAOS: TMT narrow field near-infrared facility adaptive optics

    NASA Astrophysics Data System (ADS)

    Herriot, Glen; Hickson, Paul; Ellerbroek, B. L.; Andersen, D. A.; Davidge, T.; Erickson, D. A.; Powell, I. P.; Clare, R.; Gilles, L.; Boyer, C.; Smith, M.; Saddlemyer, L.; Véran, J.-P.

    2006-06-01

    Although many of the instruments planned for the TMT (Thirty Meter Telescope) have their own closely-coupled adaptive optics systems, TMT will also have a facility Adaptive Optics (AO) system, NFIRAOS, feeding three instruments on the Nasmyth platform. This Narrow-Field Infrared Adaptive Optics System employs conventional deformable mirrors with large diameters of about 300 mm. The requirements for NFIRAOS include a 1.0-2.5 micron wavelength range, a 30 arcsecond diameter science field of view (FOV), excellent sky coverage, and diffraction-limited atmospheric turbulence compensation (specified at 133 nm RMS including residual telescope and science instrument errors). The reference design for NFIRAOS includes six sodium laser guide stars over a 70 arcsecond FOV, and multiple infrared tip/tilt sensors and a natural guide star focus sensor within the instruments. Larger telescopes require greater deformable mirror (DM) stroke. Although initially NFIRAOS will correct a 10 arcsecond science field, it uses two deformable mirrors in series, partly to provide sufficient stroke for atmospheric correction over the 30 m telescope aperture, but mainly to improve sky coverage by sharpening near-IR natural guide stars over a 2 arcminute diameter "technical" field. The planned upgrade to full performance includes replacing the ground-conjugated DM with one of higher actuator density, and using a deformable telescope secondary mirror as a "woofer." NFIRAOS feeds three live instruments: a near-infrared integral-field imaging spectrograph, a near-infrared echelle spectrograph, and, after upgrading NFIRAOS to full multi-conjugation, a wide-field (30 arcsecond) infrared camera.

  13. Measuring Distances Using Digital Cameras

    ERIC Educational Resources Information Center

    Kendal, Dave

    2007-01-01

    This paper presents a generic method of calculating accurate horizontal and vertical object distances from digital images taken with any digital camera and lens combination, where the object plane is parallel to the image plane or tilted in the vertical plane. This method was developed for a project investigating the size, density and spatial…
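
    The geometry underlying the parallel-plane case is similar triangles through the lens: D/H = f/h. A minimal sketch (the function name and unit choices are illustrative assumptions, not the paper's formulation):

```python
def object_distance(focal_len_mm, object_height_m, image_height_mm):
    """Distance to an object whose plane is parallel to the image plane,
    by similar triangles: D / H = f / h, so D = f * H / h.
    Millimetres cancel, leaving the answer in metres."""
    return focal_len_mm * object_height_m / image_height_mm

# A 1.8 m object imaged 3.6 mm tall through a 50 mm lens:
print(object_distance(50.0, 1.8, 3.6))  # 25.0 (metres)
```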

  14. Gamma-ray camera flyby

    SciTech Connect

    2010-01-01

    Animation based on an actual classroom demonstration of the prototype CCI-2 gamma-ray camera's ability to image a hidden radioactive source, a cesium-137 line source, in three dimensions. For more information see http://newscenter.lbl.gov/feature-stories/2010/06/02/applied-nuclear-physics/.

  15. Camera assisted multimodal user interaction

    NASA Astrophysics Data System (ADS)

    Hannuksela, Jari; Silvén, Olli; Ronkainen, Sami; Alenius, Sakari; Vehviläinen, Markku

    2010-01-01

    Since more processing power and new sensing and display technologies are already available in mobile devices, there has been increased interest in building systems that communicate via different modalities such as speech, gesture, expression, and touch. In context-identification-based user interfaces, these independent modalities are combined to create new ways for users to interact with hand-helds. While these are unlikely to completely replace traditional interfaces, they will considerably enrich and improve the user experience and task performance. We demonstrate a set of novel user interface concepts that rely on the built-in sensors of modern mobile devices for recognizing the context and sequences of actions. In particular, we use the camera to detect whether the user is watching the device, for instance, to make the decision to turn on the display backlight. In our approach the motion sensors are first employed for detecting the handling of the device. Then, based on ambient illumination information provided by a light sensor, the cameras are turned on. The frontal camera is used for face detection, while the back camera provides supplemental contextual information. The subsequent applications triggered by the context can be, for example, image capturing or bar code reading.
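
    The decision chain described above (motion sensors, then light sensor, then frontal camera) can be caricatured in a few lines of logic; the lux threshold and the dark-environment fallback are invented for illustration and are not the authors' rules:

```python
def should_light_display(device_handled, ambient_lux, face_detected):
    """Toy decision chain: motion sensing gates everything, the light
    sensor decides whether face detection is feasible, and the frontal
    camera's face detection makes the final call."""
    if not device_handled:
        return False      # motion sensors saw no handling
    if ambient_lux < 5.0:
        return True       # assumed fallback: too dark for the camera
    return face_detected  # light up only if the user is looking

print(should_light_display(True, 300.0, True))   # True
print(should_light_display(False, 300.0, True))  # False
```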

  16. Television Camera Operator. Student's Manual.

    ERIC Educational Resources Information Center

    Grimes, L. A., Jr.

    This student manual is one in a series of individualized instructional materials for use under the supervision of an instructor. The self-contained manual was developed for persons training to become television camera operators. Each assignment has all the information needed, including a list of objectives that should be met and exercise questions…

  17. The Camera Comes to Court.

    ERIC Educational Resources Information Center

    Floren, Leola

    After the Lindbergh kidnapping trial in 1935, the American Bar Association sought to eliminate electronic equipment from courtroom proceedings. Eventually, all but two states adopted regulations applying that ban to some extent, and a 1965 Supreme Court decision encouraged the banning of television cameras at trials as well. Currently, some states…

  18. Camera lens adapter magnifies image

    NASA Technical Reports Server (NTRS)

    Moffitt, F. L.

    1967-01-01

    Polaroid Land camera with an illuminated 7-power magnifier adapted to the lens, photographs weld flaws. The flaws are located by inspection with a 10-power magnifying glass and then photographed with this device, thus providing immediate pictorial data for use in remedial procedures.

  19. High speed multiwire photon camera

    NASA Technical Reports Server (NTRS)

    Lacy, Jeffrey L. (Inventor)

    1991-01-01

    An improved multiwire proportional counter camera having particular utility in the field of clinical nuclear medicine imaging. The detector utilizes direct coupled, low impedance, high speed delay lines, the segments of which are capacitor-inductor networks. A pile-up rejection test is provided to reject confused events otherwise caused by multiple ionization events occurring during the readout window.

  20. High speed multiwire photon camera

    NASA Technical Reports Server (NTRS)

    Lacy, Jeffrey L. (Inventor)

    1989-01-01

    An improved multiwire proportional counter camera having particular utility in the field of clinical nuclear medicine imaging. The detector utilizes direct coupled, low impedance, high speed delay lines, the segments of which are capacitor-inductor networks. A pile-up rejection test is provided to reject confused events otherwise caused by multiple ionization events occurring during the readout window.

  1. Making Films without a Camera.

    ERIC Educational Resources Information Center

    Cox, Carole

    1980-01-01

    Describes draw-on filmmaking as an exciting way to introduce children to the plastic, fluid nature of the film medium, to develop their appreciation and understanding of divergent cinematic techniques and themes, and to invite them into the dream world of filmmaking without the need for a camera. (AEA)

  2. Stratoscope 2 integrating television camera

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The development, construction, test and delivery of an integrating television camera for use as the primary data sensor on Flight 9 of Stratoscope 2 is described. The system block diagrams are presented along with the performance data, and definition of the interface of the telescope with the power, telemetry, and communication system.

  3. Development of filter exchangeable 3CCD camera for multispectral imaging acquisition

    NASA Astrophysics Data System (ADS)

    Lee, Hoyoung; Park, Soo Hyun; Kim, Moon S.; Noh, Sang Ha

    2012-05-01

    There are many methods to acquire multispectral images, but a dynamic band-selective, area-scan multispectral camera has not been developed yet. This research focused on the development of a filter-exchangeable 3CCD camera modified from a conventional 3CCD camera. The camera consists of an F-mount lens, an image splitter without dichroic coating, three bandpass filters, three image sensors, a filter-exchangeable frame, and an electric circuit for parallel image signal processing; in addition, firmware and application software were developed. Remarkable improvements compared to a conventional 3CCD camera are the redesigned image splitter and the filter-exchangeable frame. Computer simulation was required to visualize the path of rays inside the prism when redesigning the image splitter, and the dimensions of the splitter were determined by simulation with options of BK7 glass and non-dichroic coating. These properties were chosen to obtain full-wavelength rays on all film planes. The image splitter was verified with two line lasers of narrow waveband. The filter-exchangeable frame is designed so that bandpass filters can be swapped without changing the displacement of the image sensors on the film plane. The developed 3CCD camera was evaluated in the detection of scab and bruise on Fuji apples. As a result, the filter-exchangeable 3CCD camera could provide meaningful functionality for various multispectral applications that need to exchange bandpass filters.

  4. Payload topography camera of Chang'e-3

    NASA Astrophysics Data System (ADS)

    Yu, Guo-Bin; Liu, En-Hai; Zhao, Ru-Jin; Zhong, Jie; Zhou, Xiang-Dong; Zhou, Wu-Lin; Wang, Jin; Chen, Yuan-Pei; Hao, Yong-Jie

    2015-11-01

    Chang'e-3 was China's first soft-landing lunar probe that achieved a successful roving exploration on the Moon. A topography camera functioning as the lander's “eye” was one of the main scientific payloads installed on the lander. It was composed of a camera probe, an electronic component that performed image compression, and a cable assembly. Its exploration mission was to obtain optical images of the lunar topography in the landing zone for investigation and research. It also observed rover movement on the lunar surface and finished taking pictures of the lander and rover. After starting up successfully, the topography camera obtained static images and video of rover movement from different directions, 360° panoramic pictures of the lunar surface around the lander from multiple angles, and numerous pictures of the Earth. All images of the rover, lunar surface, and the Earth were clear, and those of the Chinese national flag were recorded in true color. This paper describes the exploration mission, system design, working principle, quality assessment of image compression, and color correction of the topography camera. Finally, test results from the lunar surface are provided to serve as a reference for scientific data processing and application.

  5. Camera calibration approach based on adaptive active target

    NASA Astrophysics Data System (ADS)

    Zhang, Yalin; Zhou, Fuqiang; Deng, Peng

    2011-12-01

    Aiming at calibrating cameras on site, where the lighting condition is hardly controlled and the quality of target images declines when the angle between camera and target changes, an adaptive active target is designed and a camera calibration approach based on the target is proposed. The adaptive active target, in which LEDs are embedded, is flat and provides active feature points, so the brightness of each feature point can be modified by adjusting the electric current, judging from thresholds on image feature criteria. In order to extract image features accurately, the concept of subpixel-precise thresholding is also proposed: it converts the discrete representation of the digital image to a continuous function by bilinear interpolation, and the sub-pixel contours are acquired from the intersection of the continuous function with an appropriately selected threshold. Based on analysis of the relationship between the image features and the brightness of the target, the area ratio of convex hulls and the grey-value variance are adopted as the criteria. Experimental results revealed that the adaptive active target accommodates well to changing illumination in the environment, and that the camera calibration approach based on the adaptive active target can achieve a high level of accuracy, fitting well for image targeting in various industrial sites.
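
    The subpixel-precise thresholding idea can be illustrated in one dimension: between two pixels straddling the threshold, linear interpolation (the 1-D analogue of the bilinear case described above) locates the crossing at sub-pixel precision. A minimal sketch:

```python
def subpixel_crossings(profile, threshold):
    """Sub-pixel positions where a sampled intensity profile crosses a
    threshold, by linear interpolation between neighbouring pixels."""
    out = []
    for i in range(len(profile) - 1):
        a, b = profile[i], profile[i + 1]
        if (a - threshold) * (b - threshold) < 0:  # sign change = crossing
            out.append(i + (threshold - a) / (b - a))
    return out

# The 40 -> 90 step crosses 50 one fifth of the way past pixel 2.
print(subpixel_crossings([10, 10, 40, 90, 100], 50))  # [2.2]
```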

  6. Characterization of a PET Camera Optimized for Prostate Imaging

    SciTech Connect

    Huber, Jennifer S.; Choong, Woon-Seng; Moses, William W.; Qi, Jinyi; Hu, Jicun; Wang, G. C.; Wilson, David; Oh, Sang; Huesman, Ronald H.; Derenzo, Stephen E.

    2005-11-11

    We present the characterization of a positron emission tomograph for prostate imaging that centers a patient between a pair of external curved detector banks (ellipse: 45 cm minor, 70 cm major axis). The distance between detector banks adjusts to allow patient access and to position the detectors as closely as possible for maximum sensitivity with patients of various sizes. Each bank is composed of two axial rows of 20 HR+ block detectors, for a total of 80 detectors in the camera. The individual detectors are angled in the transaxial plane to point towards the prostate to reduce resolution degradation in that region. The detectors are read out by modified HRRT data acquisition electronics. Compared to a standard whole-body PET camera, our dedicated prostate camera has the same sensitivity and resolution, less background (fewer randoms and a lower scatter fraction), and a lower cost. We have completed construction of the camera. Characterization data and reconstructed images of several phantoms are shown. Sensitivity for a point source in the center is 946 cps/μCi. Spatial resolution is 4 mm FWHM in the central region.

  7. Casting and Angling.

    ERIC Educational Resources Information Center

    Little, Mildred J.; Bunting, Camille

    The self-contained packet contains background information, lesson plans, 15 transparency and student handout masters, drills and games, 2 objective examinations, and references for teaching a 15-day unit on casting and angling to junior high and senior high school students, either as part of a regular physical education program or as a club…

  8. The Rainbow Angle.

    ERIC Educational Resources Information Center

    Sims, B.

    1978-01-01

    Two articles in the "Scientific American" form the background of this note. The rainbow angle for the primary bow of a monochromatic Cartesian rainbow is calculated. Special projects for senior high school students could be patterned after this quantitative study. (MP)
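    The quantitative study the note suggests is a short numerical exercise: for one internal reflection in a spherical drop, a ray incident at angle i (refracting at r, with sin i = n sin r) is deviated by D = 180° + 2i - 4r, and the rainbow angle is 180° minus the minimum deviation. A sketch, assuming n = 1.333 for water at a single wavelength:

```python
import numpy as np

n = 1.333  # refractive index of water (assumed, monochromatic)
i = np.radians(np.linspace(0.01, 89.99, 100000))  # incidence angles
r = np.arcsin(np.sin(i) / n)                      # Snell's law
D = np.pi + 2 * i - 4 * r                         # deviation after one internal reflection
rainbow_angle = 180.0 - np.degrees(D.min())       # angle of the primary bow
print(f"primary rainbow angle = {rainbow_angle:.1f} deg")  # 42.1 deg
```

The minimum of D concentrates the scattered light, which is why the bow appears at a sharp angle; repeating the calculation for other n values reproduces the color spread of a real rainbow.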

  9. An Iterative Angle Trisection

    ERIC Educational Resources Information Center

    Muench, Donald L.

    2007-01-01

    The problem of angle trisection continues to fascinate people even though it has long been known that it can't be done with straightedge and compass alone. However, for practical purposes, a good iterative procedure can get you as close as you want. In this note, we present such a procedure. Using only straightedge and compass, our procedure…
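    One well-known iterative procedure of this kind uses only bisection: since 1/4 + 1/16 + 1/64 + ... = 1/3, repeatedly bisecting the part of the angle not yet accounted for converges to one third of it. The recurrence below is our illustration of that principle, not necessarily the authors' exact construction; its fixed point is theta/3 and the error halves at every step:

```python
import math

def trisect(theta, iterations=20):
    """Approximate theta/3 using only halving: each step bisects the
    remaining portion of the angle, so the error halves each time."""
    x = 0.0
    for _ in range(iterations):
        x = (theta - x) / 2.0   # fixed point: x = theta / 3
    return x

print(trisect(60.0))  # -> 20.0 to within about 2e-5 degrees
```

Each step is a compass-and-straightedge bisection, so the whole procedure stays within the classical tools; only the exact limit is out of reach.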

  10. Yaw Angle Demonstration

    NASA Technical Reports Server (NTRS)

    1994-01-01

    The Large Angle Magnetic Suspension Test Fixture (LAMSTF) is a five-degree-of-freedom repulsive-force magnetic suspension system designed to study the control of objects over large magnetic gaps. A digital control algorithm uses 6 sets of laser-sheet sensors and 5 control coils to position a cylinder 3' above the plane of the electromagnets.

  11. Casting and Angling.

    ERIC Educational Resources Information Center

    Smith, Julian W.

    As part of a series of books and pamphlets on outdoor education, this manual consists of easy-to-follow instructions for fishing activities dealing with casting and angling. The manual may be used as a part of the regular physical education program in schools and colleges or as a club activity for the accomplished weekend fisherman or the…

  12. Boiling Visualization and Critical Heat Flux Phenomena In Narrow Rectangular Gap

    SciTech Connect

    J. J. Kim; Y. H. Kim; S. J. Kim; S. W. Noh; K. Y. Suh; J. Rempe; F. B. Cheung; S. B. Kim

    2004-12-01

    An experimental study was performed to investigate the pool boiling critical heat flux (CHF) on one-dimensional inclined rectangular channels with narrow gaps by changing the orientation of a copper test heater assembly. In a pool of saturated water at atmospheric pressure, the test parameters included gap sizes of 1, 2, 5, and 10 mm and surface orientation angles from the downward-facing position (180 degrees) to the vertical position (90 degrees).

  13. Recent advances in digital camera optics

    NASA Astrophysics Data System (ADS)

    Ishiguro, Keizo

    2012-10-01

    The digital camera market has expanded enormously in the last ten years. The zoom lens is the key factor determining digital camera body size and image quality. Its technologies have built on several analog advances, including aspherical-lens manufacturing methods and image-stabilization mechanisms; Panasonic is a pioneer of both. I will review trends in zoom-lens optics and the original optical technologies of Panasonic's "LUMIX" digital cameras, as well as the optics of 3D camera systems, and conclude with the likely future direction of digital cameras.

  14. Measurement of the surface wavelength distribution of narrow-band radiation by a colorimetric method

    SciTech Connect

    Kraiskii, A V; Mironova, T V; Sultanov, T T

    2010-09-10

    A method is suggested for determining the wavelength of narrow-band light from a digital photograph of a radiating surface. The digital camera used should be appropriately calibrated. The accuracy of the wavelength measurement is better than 1 nm. The method was tested on the yellow doublet of the mercury spectrum and on the adjacent continuum of an incandescent lamp's radiation spectrum. The suggested method was used to study the homogeneity of holographic sensor swelling in stationary and transient cases. (laser applications and other topics in quantum electronics)
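    The colorimetric idea can be sketched as a lookup: calibrate the camera with known narrow-band sources, record their intensity-normalized chromaticities, then match an unknown source to the nearest calibration point. The calibration numbers below are invented placeholders, not data from the paper:

```python
import numpy as np

# Hypothetical calibration table (wavelength in nm -> raw camera RGB);
# the numbers are invented placeholders, not data from the paper.
calibration = {
    570.0: (200, 180, 20),
    577.0: (210, 160, 15),   # around the Hg yellow doublet
    589.0: (220, 130, 10),
    600.0: (230, 100, 8),
}

def chromaticity(rgb):
    """Normalize out overall intensity, keeping only colour."""
    rgb = np.asarray(rgb, dtype=float)
    return rgb / rgb.sum()

def estimate_wavelength(rgb):
    """Nearest calibration point in chromaticity space."""
    c = chromaticity(rgb)
    return min(calibration,
               key=lambda wl: np.linalg.norm(chromaticity(calibration[wl]) - c))

print(estimate_wavelength((420, 320, 30)))  # 2x the 577 nm entry -> 577.0
```

With a dense calibration and interpolation between neighbouring chromaticities instead of a nearest-neighbour lookup, sub-nanometre accuracy of the kind reported becomes plausible.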

  15. Replacing 16-mm film cameras with high-definition digital cameras

    NASA Astrophysics Data System (ADS)

    Balch, Kris S.

    1995-09-01

    For many years, 16 mm film cameras have been used in severe environments, including Hy-G automotive sleds, airborne gun cameras, range tracking, and other hazardous settings. The companies and government agencies using these cameras need to replace them with a more cost-effective solution. Film-based cameras still produce the best resolving capability; however, film development time, chemical disposal, recurring media cost, and faster digital analysis are factors driving the desire for a 16 mm film camera replacement. This paper will describe a new camera from Kodak that has been designed to replace 16 mm high-speed film cameras.

  16. Resolution limitations and optimization of the LLNL streak camera focus

    SciTech Connect

    Lerche, R.A.; Griffith, R.L.

    1987-09-01

    The RCA C73435 image tube is biased at voltages far from its original design in the LLNL ultrafast (10 ps) streak camera. Its output resolution at streak camera operating potentials has been measured as a function of input slit width, incident-light wavelength, and focus-grid voltage. The temporal resolution is insensitive to focus-grid voltage for a narrow (100 μm) input slit, but is very sensitive to focus-grid voltage for a wide (2 mm) input slit. At the optimum wide-slit focus voltage, temporal resolution is insensitive to slit width. Spatial resolution is nearly independent of focus-grid voltage for values that give good temporal resolution. Both temporal and spatial resolution depend on the incident-light wavelength. Data for 1.06-μm light show significantly better focusing than for 0.53-μm light. Streak camera operation is simulated with a computer program that calculates photoelectron trajectories. Electron ray tracing describes all of the observed effects of slit width, incident-light wavelength, and focus-grid voltage on output resolution. 7 refs.

  17. a Method for Self-Calibration in Satellite with High Precision of Space Linear Array Camera

    NASA Astrophysics Data System (ADS)

    Liu, Wei; Qian, Fangming; Miao, Yuzhe; Wang, Rongjian

    2016-06-01

    At present, the on-orbit calibration of the geometric parameters of a space surveying camera is usually processed with data from a ground calibration field after the images are captured. The entire process is complicated and lengthy and cannot monitor and calibrate the geometric parameters in real time. From a large number of on-orbit calibrations, we found that, owing to the influence of many factors (e.g., weather), it is often difficult to capture images of the ground calibration field, so regular calibration using field data cannot be ensured. This article proposes a real-time self-calibration method for a space linear array camera on a satellite using the optical autocollimation principle. A collimating light source and small matrix-array CCD devices are installed inside the load system of the satellite; these share the same light path as the linear array camera. We extract the location changes of the cross marks on the matrix-array CCD to determine the real-time variations in the focal length and angle parameters of the linear array camera, so the on-orbit status of the camera is obtained rapidly. On one hand, the camera's changes can be tracked accurately and its attitude adjusted in a timely manner to ensure optimal photography; on the other hand, self-calibration of the camera aboard the satellite can be realized quickly, which improves the efficiency and reliability of photogrammetric processing.
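    The readout rests on the standard autocollimator relation: a tilt alpha of the reflecting element deflects the returned collimated beam by 2*alpha, so the cross mark moves by d ≈ 2*alpha*f on the monitoring CCD. A hedged sketch of the conversion; the pixel pitch and focal length are invented values, and whether the paper's geometry matches this exactly is our assumption:

```python
import math

def tilt_from_shift(shift_px, pixel_pitch_um=9.0, focal_length_mm=500.0):
    """Convert cross-mark displacement on the monitoring CCD into a
    tilt angle in arcseconds; the reflection doubles the tilt."""
    d = shift_px * pixel_pitch_um * 1e-6     # displacement in metres
    f = focal_length_mm * 1e-3               # focal length in metres
    alpha = d / (2.0 * f)                    # small-angle approximation
    return math.degrees(alpha) * 3600.0

print(f"{tilt_from_shift(1.0):.2f} arcsec per pixel of mark motion")  # 1.86
```

A sub-pixel centroid of the cross mark would push the resolvable tilt well below one arcsecond for these assumed parameters.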

  18. Surveillance of the plant growth using the camera image

    NASA Astrophysics Data System (ADS)

    Fujiwara, Nobuyuki; Terada, Kenji

    2005-12-01

    In this paper, we propose a method for monitoring plant growth using camera images. The method can observe the condition of plants raised in a greenhouse. A plate known as HORIBA is prepared for trapping harmful insects. An image of the plate is captured by the camera at a resolution of 1280×960 and processed. First, the regions of harmful insects (flies) are extracted from the plate image using color information. Next, template matching is performed to examine shape correlation at four different angles; with four templates this yields 16 results, whose logical sum is used for the estimate. Experimental results are also presented in this paper.

  19. Search for atmospheric holes with the Viking cameras

    SciTech Connect

    Frank, L.A.; Sigwarth, J.B.; Craven, J.D. )

    1989-12-01

    Images taken with the two ultraviolet cameras on board the Viking spacecraft were examined for evidence of transient decreases of Earth's ultraviolet dayglow. Comparison of near-limb observations of dayglow intensities with those at smaller angles to the nadir with the camera sensitive to OI 130.4 nm emissions supports the existence of transient decreases in the near-nadir dayglow. However, the amount of near-nadir imaging is severely limited and only several significant events are found. More decisive confirmation of the existence of such transient decreases must await a larger survey from another spacecraft. The diameters of these regions as detected with Viking are approximately 50 to 100 km. Occurrence frequencies, intensity decreases, and dimensions for these clusters of darkened pixels are similar to those previously reported for such events, or atmospheric holes, as seen in images of the ultraviolet dayglow with Dynamics Explorer 1.

  20. A Different Angle on Perspective

    ERIC Educational Resources Information Center

    Frantz, Marc

    2012-01-01

    When a plane figure is photographed from different viewpoints, lengths and angles appear distorted. Hence it is often assumed that lengths, angles, protractors, and compasses have no place in projective geometry. Here we describe a sense in which certain angles are preserved by projective transformations. These angles can be constructed with…

  1. The role of contact angle on unstable flow formation during infiltration and drainage in wettable porous media

    NASA Astrophysics Data System (ADS)

    Wallach, Rony; Margolis, Michal; Graber, Ellen R.

    2013-10-01

    The impact of contact angle on 2-D spatial and temporal water-content distribution during infiltration and drainage was experimentally studied. The 0.3-0.5 mm fraction of a quartz dune sand was treated and turned subcritically repellent (contact angle of 33°, 48°, 56°, and 75° for S33, S48, S56, and S75, respectively). The media were packed uniformly in transparent flow chambers and water was supplied to the surface as a point source at different rates (1-20 ml/min). A sequence of gray-value images was taken by CCD camera during infiltration and subsequent drainage; gray values were converted to volumetric water content by water volume balance. Narrow and long plumes with water accumulation behind the downward moving wetting front (tip) and negative water gradient above it (tail) developed in the S56 and S75 media during infiltration at lower water application rates. The plumes became bulbous with spatially uniform water-content distribution as water application rates increased. All plumes in these media propagated downward at a constant rate during infiltration and did not change their shape during drainage. In contrast, regular plume shapes were observed in the S33 and S48 media at all flow rates, and drainage profiles were nonmonotonic with a transition plane at the depth that water reached during infiltration. Given that the studied media have similar pore-size distributions, the conclusion is that imbibition hindered by the nonzero contact angle induced pressure buildup at the wetting front (dynamic water-entry value) that controlled the plume shape and internal water-content distribution during infiltration and drainage.

  2. Creep turns linear in narrow ferromagnetic nanostrips

    PubMed Central

    Leliaert, Jonathan; Van de Wiele, Ben; Vansteenkiste, Arne; Laurson, Lasse; Durin, Gianfranco; Dupré, Luc; Van Waeyenberge, Bartel

    2016-01-01

    The motion of domain walls in magnetic materials is a typical example of a creep process, usually characterised by a stretched exponential velocity-force relation. By performing large-scale micromagnetic simulations, and analyzing an extended 1D model which takes the effects of finite temperatures and material defects into account, we show that this creep scaling law breaks down in sufficiently narrow ferromagnetic strips. Our analysis of current-driven transverse domain wall motion in disordered Permalloy nanostrips reveals instead a creep regime with a linear dependence of the domain wall velocity on the applied field or current density. This originates from the essentially point-like nature of domain walls moving in narrow, line-like disordered nanostrips. An analogous linear relation is found also by analyzing existing experimental data on field-driven domain wall motion in perpendicularly magnetised media. PMID:26843125

  3. Current Propagation in Narrow Bipolar Pulses

    NASA Astrophysics Data System (ADS)

    Watson, S. S.; Marshall, T. C.

    2005-12-01

    We model the observed electric fields of a particular narrow bipolar pulse (NBP) published in Eack [2004]. We assume an exponential growth of current carriers due to a runaway breakdown avalanche and show that this leads to a corresponding increase in current. With specific input values for discharge altitude, length, current, and propagation velocity, the model does a good job of reproducing the observed near and far electric field. The ability of the model to reproduce the observed electric fields is an indication that our assumptions concerning the runaway avalanche may be correct, and this indication is further strengthened by the inability of the simple transmission line model to reproduce simultaneously both the near and far electric fields. Eack, K. B. (2004), Electrical characteristics of narrow bipolar events, Geophys. Res. Lett., 31, L20102, doi:10.1029/2004GL021117.

  4. Creep turns linear in narrow ferromagnetic nanostrips.

    PubMed

    Leliaert, Jonathan; Van de Wiele, Ben; Vansteenkiste, Arne; Laurson, Lasse; Durin, Gianfranco; Dupré, Luc; Van Waeyenberge, Bartel

    2016-01-01

    The motion of domain walls in magnetic materials is a typical example of a creep process, usually characterised by a stretched exponential velocity-force relation. By performing large-scale micromagnetic simulations, and analyzing an extended 1D model which takes the effects of finite temperatures and material defects into account, we show that this creep scaling law breaks down in sufficiently narrow ferromagnetic strips. Our analysis of current-driven transverse domain wall motion in disordered Permalloy nanostrips reveals instead a creep regime with a linear dependence of the domain wall velocity on the applied field or current density. This originates from the essentially point-like nature of domain walls moving in narrow, line-like disordered nanostrips. An analogous linear relation is found also by analyzing existing experimental data on field-driven domain wall motion in perpendicularly magnetised media. PMID:26843125

  5. Creep turns linear in narrow ferromagnetic nanostrips

    NASA Astrophysics Data System (ADS)

    Leliaert, Jonathan; van de Wiele, Ben; Vansteenkiste, Arne; Laurson, Lasse; Durin, Gianfranco; Dupré, Luc; van Waeyenberge, Bartel

    2016-02-01

    The motion of domain walls in magnetic materials is a typical example of a creep process, usually characterised by a stretched exponential velocity-force relation. By performing large-scale micromagnetic simulations, and analyzing an extended 1D model which takes the effects of finite temperatures and material defects into account, we show that this creep scaling law breaks down in sufficiently narrow ferromagnetic strips. Our analysis of current-driven transverse domain wall motion in disordered Permalloy nanostrips reveals instead a creep regime with a linear dependence of the domain wall velocity on the applied field or current density. This originates from the essentially point-like nature of domain walls moving in narrow, line-like disordered nanostrips. An analogous linear relation is found also by analyzing existing experimental data on field-driven domain wall motion in perpendicularly magnetised media.

  6. Narrow-Line Seyfert 1 Galaxies

    NASA Technical Reports Server (NTRS)

    Leighly, Karen M.

    2000-01-01

    The primary work during this year has been the analysis and interpretation of our HST spectra from two extreme Narrow-line Seyfert 1 galaxies (NLS1s) Infrared Astronomy Satellite (IRAS) 13224-3809 and 1H 0707-495. This work has been presented as an invited talk at the workshop entitled "Observational and theoretical progress in the Study of Narrow-line Seyfert 1 Galaxies" held in Bad Honnef, Germany December 8-11, as a contributed talk at the January 2000 AAS meeting in Atlanta, Georgia, and as a contributed talk at the workshop "Probing the Physics of Active Galactic Nuclei by Multiwavelength Monitoring" held at Goddard Space Flight Center June 20-22, 2000.

  7. Pumice-pumice collisions and the effect of the impact angle

    NASA Astrophysics Data System (ADS)

    Cagnoli, B.; Manga, M.

    2003-06-01

    Using a high-speed video camera, we studied oblique collisions of lapilli-size pumice cylinders (with no rotation before impact) on flat pumice targets. Our results show that the rebound angle, the ratios of the components of velocities and the energy loss vary with the impact angle. In particular, in collisions with an average yaw angle approximately equal to zero, we observed relatively larger rebound angles at small and large impact angles and smaller values in between (the angles are measured from the horizontal surfaces of the targets). We observed also that the ratio of the normal components of velocities decreases and the ratio of the horizontal components increases when the impact angle increases. Furthermore, the ratio of the kinetic energy after to that before collisions, in general, decreases when the impact angle increases. Thus, our experiments reveal features that could be useful in modelling pumice-pumice collisions in geophysical flows.

  8. Multiwatts narrow linewidth fiber Raman amplifiers.

    PubMed

    Feng, Yan; Taylor, Luke; Bonaccini Calia, Domenico

    2008-07-21

    An 1178 nm laser with up to 4.8 W of output power and an approximately 10 MHz linewidth is obtained by Raman amplification of a distributed-feedback diode laser in standard single-mode fibers pumped by an 1120 nm Yb fiber laser. More than 10% efficiency and 27 dB of amplification are achieved, limited by the onset of stimulated Brillouin scattering. The ratio of the Raman to the Brillouin gain coefficient of a fiber is identified as a figure of merit for building a narrow-linewidth fiber Raman amplifier. PMID:18648406

  9. Contact Angle Measurements Using a Simplified Experimental Setup

    ERIC Educational Resources Information Center

    Lamour, Guillaume; Hamraoui, Ahmed; Buvailo, Andrii; Xing, Yangjun; Keuleyan, Sean; Prakash, Vivek; Eftekhari-Bafrooei, Ali; Borguet, Eric

    2010-01-01

    A basic and affordable experimental apparatus is described that measures the static contact angle of a liquid drop in contact with a solid. The image of the drop is made with a simple digital camera by taking a picture that is magnified by an optical lens. The profile of the drop is then processed with ImageJ free software. The ImageJ contact…
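    The measurement the article describes can be approximated in a few lines: fit a circle to the extracted drop profile (a spherical-cap assumption, reasonable for small drops) and read the contact angle where the circle meets the baseline. A sketch on synthetic profile points; the Kasa least-squares circle fit is our choice, not necessarily what the ImageJ workflow does:

```python
import numpy as np

def fit_circle(x, y):
    """Algebraic (Kasa) least-squares circle fit; returns (xc, yc, R)."""
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    b = x**2 + y**2
    (xc, yc, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    return xc, yc, np.sqrt(c + xc**2 + yc**2)

def contact_angle_deg(x, y):
    """Contact angle of a circular-cap drop resting on the line y = 0."""
    _, yc, R = fit_circle(x, y)
    # Centre on the baseline -> hemisphere (90 deg); centre above -> > 90 deg.
    return 90.0 + np.degrees(np.arcsin(np.clip(yc / R, -1.0, 1.0)))

# Synthetic 60-degree drop: unit circle centred below the baseline.
theta = 60.0
yc_true = -np.cos(np.radians(theta))
phi = np.linspace(np.radians(90 - theta), np.radians(90 + theta), 200)
x, y = np.cos(phi), yc_true + np.sin(phi)
print(f"{contact_angle_deg(x, y):.1f} deg")  # recovers 60.0
```

For drops large enough to be flattened by gravity the spherical-cap model breaks down and a tangent fit near the contact point is the safer choice.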

  10. The wide field/planetary camera

    NASA Technical Reports Server (NTRS)

    Westphal, J. A.; Baum, W. A.; Code, A. D.; Currie, D. G.; Danielson, G. E.; Gunn, J. E.; Kelsall, T. F.; Kristian, J. A.; Lynds, C. R.; Seidelmann, P. K.

    1982-01-01

    A wide suite of potential astronomical and solar system scientific studies using the Wide Field/Planetary Camera on Space Telescope is described. The expected performance of the camera as it approaches final assembly and testing is also detailed.

  11. Advanced camera for the Hubble Space Telescope

    NASA Astrophysics Data System (ADS)

    Ford, Holland C.; Feldman, Paul D.; Golimowski, David A.; Tsvetanov, Zlatan; Bartko, Frank; Crocker, James H.; Bely, Pierre Y.; Brown, Robert A.; Burrows, Christopher J.; Clampin, Mark; Hartig, George F.; Postman, Marc; Rafal, Marc D.; Sparks, William B.; White, Richard L.; Broadhurst, Tom; Illingworth, Garth; Kelly, Tim; Woodruff, Robert A.; Cheng, Edward; Kimble, Randy A.; Krebs, Carolyn A.; Neff, Susan G.; Lesser, Michael P.; Miley, George

    1996-10-01

    The Advanced Camera for the Hubble Space Telescope will have three cameras. The first, the Wide Field Camera, will be a high throughput (45% at 700 nm, including the HST optical telescope assembly), wide field (200″ × 204″), optical and I-band camera that is half critically sampled at 500 nm. The second, the High Resolution Camera (HRC), is critically sampled at 500 nm, and has a 26″ × 29″ field of view and 25% throughput at 600 nm. The HRC optical path will include a coronagraph which will improve the HST contrast near bright objects by a factor of approximately 10. The third camera is a far ultraviolet, Solar-Blind Camera that has a relatively high throughput (6% at 121.6 nm) over a 26″ × 29″ field of view. The Advanced Camera for Surveys will increase HST's capability for surveys and discovery by at least a factor of ten.

  12. Lytro camera technology: theory, algorithms, performance analysis

    NASA Astrophysics Data System (ADS)

    Georgiev, Todor; Yu, Zhan; Lumsdaine, Andrew; Goma, Sergio

    2013-03-01

    The Lytro camera is the first implementation of a plenoptic camera for the consumer market. We consider it a successful example of the miniaturization aided by the increase in computational power characterizing mobile computational photography. The plenoptic camera approach to radiance capture uses a microlens array as an imaging system focused on the focal plane of the main camera lens. This paper analyzes the performance of the Lytro camera from a system-level perspective, treating the camera as a black box, and uses our interpretation of the image data it saves. We present our findings based on our interpretation of the Lytro camera file structure, image calibration, and image rendering; in this context, artifacts and final image resolution are discussed.

  13. Angles in the Sky?

    NASA Astrophysics Data System (ADS)

    Behr, Bradford

    2005-09-01

    Tycho Brahe lived and worked in the late 1500s before the telescope was invented. He made highly accurate observations of the positions of planets, stars, and comets using large angle-measuring devices of his own design. You can use his techniques to observe the sky as well. For example, the degree, a common unit of measurement in astronomy, can be measured by holding your fist at arm's length up to the sky. Open your fist and observe the distance across the sky covered by the width of your pinky fingernail. That is, roughly, a degree! After some practice, and knowing that one degree equals four minutes, you can measure elapsed time by measuring the angle of the distance that the Moon appears to have moved and multiplying that number by four. You can also figure distances and sizes of things. These are not precise measurements, but rough estimates that can give you a "close-enough" answer.
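    The rules of thumb here reduce to simple arithmetic: the sky turns 360° in 24 hours, i.e. 1° per 4 minutes of clock time, and an object's angular size in radians is roughly its true size divided by its distance. A sketch (the Moon figures are standard values, not from the article):

```python
import math

def elapsed_minutes(angle_moved_deg):
    """Sky rotation: 360 deg / 24 h means 1 deg per 4 minutes."""
    return angle_moved_deg * 4.0

def angular_size_deg(true_size, distance):
    """Small-angle estimate (same units for size and distance)."""
    return math.degrees(true_size / distance)

print(elapsed_minutes(5.0))                          # 5 deg of drift -> 20.0 minutes
print(round(angular_size_deg(3474.0, 384400.0), 2))  # Moon diameter (km) -> 0.52 deg
```

The half-degree result for the Moon is why it spans about half a pinky-nail width at arm's length.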

  14. Line Narrowing in Solid-State Proton NMR with Acquisition Delay

    NASA Astrophysics Data System (ADS)

    Fung, B. M.; Tong, Tat-Hung; Dollase, Thilo; Magnuson, Matthew L.

    Organic solids have extensive proton-proton dipolar interactions, and their 1H NMR linewidths are very large even with magic-angle spinning at moderate speeds. Recently it has been shown that substantial narrowing of the proton linewidths of organic solids can be achieved by using single-pulse excitation with acquisition delay or spin echo [S. Ding and C. A. McDowell, J. Magn. Reson. A111, 212 (1994); 115, 141 (1995); 117, 171 (1995)]. This interesting line-narrowing phenomenon has been further examined through the study of several amino acids, their deuterated analogs, and some aromatic compounds. The results confirm that narrow proton peaks are observed with long acquisition delay, and the peaks appear in the appropriate chemical-shift ranges for organic protons (0-10 ppm with respect to tetramethylsilane). However, except for some special cases, the observed peaks cannot be assigned to individual types of protons based on chemical-shift considerations only. To explore the reason for the line narrowing, the effect of acquisition delay on the 19F linewidth of CaF2 was also studied and compared with that on the 1H linewidths of organic solids. It is suggested that the broad proton peak in an organic solid is a superposition of numerous transitions. These transitions have different linewidths, and the narrow peaks in the spectrum remain observable with long acquisition delays.

  15. Analysis and protection of stray light for the space camera at geosynchronous orbit

    NASA Astrophysics Data System (ADS)

    Jin, Xiaorui; Lin, Li

    2012-11-01

    Stray light is the general term for all abnormal transmission of light in an optical system, and its influence depends on the optical system's structure. A large-area-array camera at geosynchronous orbit faces especially serious stray-light effects, particularly at small incidence angles of sunlight on the system, so a detailed stray-light analysis of the basic optical configuration is needed. This paper presents the influence of stray light on a space camera and the necessity of eliminating it, and briefly defines the stray-light coefficient and PST (point source transmittance). In TracePro, the impact of sunlight incident at different angles on the space camera was analyzed, with the stray-light coefficient as the quantitative measure. The design principles of the inner and outer hoods for an R-C (Ritchey-Chrétien) optical system are presented, and on this basis hoods for the primary and secondary mirrors were designed to reduce stray-light interference with the space camera. Finally, when sunlight is incident on the space camera at angles greater than 3°, the stray-light coefficient is less than 2%, which meets the engineering requirements.

  16. Automatic Kappa Angle Estimation for Air Photos Based on Phase Only Correlation

    NASA Astrophysics Data System (ADS)

    Xiong, Z.; Stanley, D.; Xin, Y.

    2016-06-01

    Approximate values of the exterior orientation parameters are needed for air photo bundle adjustment. Usually the airborne GPS/IMU can provide initial values for the camera position and attitude angles. However, in some cases the camera's attitude angles are not available, owing to the lack of an IMU or other reasons. In this case, the kappa angle needs to be estimated for each photo before bundle adjustment. The kappa angle can be obtained from Ground Control Points (GCPs) in the photo. Unfortunately, enough GCPs are not always available. To overcome this problem, an algorithm was developed to automatically estimate the kappa angle for air photos based on the phase only correlation technique. This function has been embedded in PCI software. Extensive experiments show that this algorithm is fast, reliable, and stable.
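    Phase-only correlation is easy to demonstrate for a pure translation (kappa estimation builds a rotation search on the same peak-finding idea, e.g. via log-polar resampling): normalize the cross-power spectrum to unit magnitude and locate the peak of its inverse FFT. A minimal translation-only sketch, not PCI's implementation:

```python
import numpy as np

def phase_correlation_shift(a, b):
    """Estimate the cyclic (row, col) shift d with a == np.roll(b, d, axis=(0, 1))."""
    Fa, Fb = np.fft.fft2(a), np.fft.fft2(b)
    cross = Fa * np.conj(Fb)
    cross /= np.abs(cross) + 1e-12           # keep phase only
    corr = np.fft.ifft2(cross).real          # ideally a single sharp peak
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Fold peaks in the upper half of each axis to negative shifts.
    return tuple(int(p) - s if p > s // 2 else int(p)
                 for p, s in zip(peak, corr.shape))

rng = np.random.default_rng(0)
a = rng.random((64, 64))
b = np.roll(a, shift=(5, -3), axis=(0, 1))
print(phase_correlation_shift(b, a))  # (5, -3)
```

Discarding the magnitude makes the peak insensitive to illumination differences between the two photos, which is why the method is robust for registration.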

  17. Laser angle sensor

    NASA Technical Reports Server (NTRS)

    Pond, C. R.; Texeira, P. D.

    1985-01-01

    A laser angle measurement system was designed and fabricated for NASA Langley Research Center. The instrument is a fringe counting interferometer that monitors the pitch attitude of a model in a wind tunnel. A laser source and detector are mounted above the model. Interference fringes are generated by a small passive element on the model. The fringe count is accumulated and displayed by a processor in the wind tunnel control room. This report includes optical and electrical schematics, system maintenance and operation procedures.

  18. Angle states in quantum mechanics

    NASA Astrophysics Data System (ADS)

    de la Torre, A. C.; Iguain, J. L.

    1998-12-01

    Angle states and angle operators are defined for a system with arbitrary angular momentum. They provide a reasonable formalization of the concept of angle provided that we accept that the angular orientation is quantized. The angle operator is the generator of boosts in angular momentum and is, almost everywhere, linearly related to the logarithm of the shift operator. Angle states for fermions and bosons behave differently under parity transformation.

  19. Small Angle Neutron Scattering

    SciTech Connect

    Urban, Volker S

    2012-01-01

    Small Angle Neutron Scattering (SANS) probes structural details at the nanometer scale in a non-destructive way. This article gives an introduction to scientists who have no prior small-angle scattering knowledge, but who seek a technique that allows elucidating structural information in challenging situations that thwart approaches by other methods. SANS is applicable to a wide variety of materials including metals and alloys, ceramics, concrete, glasses, polymers, composites and biological materials. Isotope and magnetic interactions provide unique methods for labeling and contrast variation to highlight specific structural features of interest. In situ studies of a material's responses to temperature, pressure, shear, magnetic and electric fields, etc., are feasible as a result of the high penetrating power of neutrons. SANS provides statistical information on significant structural features averaged over the probed sample volume, and one can use SANS to quantify with high precision the structural details that are observed, for example, in electron microscopy. Neutron scattering is non-destructive; there is no need to cut specimens into thin sections, and neutrons penetrate deeply, providing information on the bulk material, free from surface effects. The basic principles of a SANS experiment are fairly simple, but the measurement, analysis and interpretation of small angle scattering data involves theoretical concepts that are unique to the technique and that are not widely known. This article includes a concise description of the basics, as well as practical know-how that is essential for a successful SANS experiment.
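    The length scales SANS probes follow from the momentum transfer q = (4π/λ) sin(θ), where 2θ is the full scattering angle; a feature of size d scatters near q ≈ 2π/d. A quick estimate, with a typical cold-neutron wavelength assumed rather than taken from this article:

```python
import math

def q_inv_angstrom(wavelength_A, two_theta_deg):
    """Momentum transfer q = 4*pi*sin(theta)/lambda in inverse Angstrom,
    where two_theta is the full scattering angle."""
    theta = math.radians(two_theta_deg) / 2.0
    return 4.0 * math.pi * math.sin(theta) / wavelength_A

lam = 6.0  # assumed cold-neutron wavelength in Angstrom
for two_theta in (0.1, 1.0, 10.0):
    q = q_inv_angstrom(lam, two_theta)
    print(f"2theta = {two_theta:4.1f} deg -> q = {q:.4f} 1/A, d ~ {2 * math.pi / q:.0f} A")
```

Scanning the scattering angle over two decades thus covers structures from a few nanometres up to several hundred nanometres, the range quoted for SANS.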

  20. Electronographic cameras for space astronomy.

    NASA Technical Reports Server (NTRS)

    Carruthers, G. R.; Opal, C. B.

    1972-01-01

    Magnetically-focused electronographic cameras have been under development at the Naval Research Laboratory for use in far-ultraviolet imagery and spectrography, primarily in astronomical and optical-geophysical observations from sounding rockets and space vehicles. Most of this work has been with cameras incorporating internal optics of the Schmidt or wide-field all-reflecting types. More recently, we have begun development of electronographic spectrographs incorporating an internal concave grating, operating at normal or grazing incidence. We also are developing electronographic image tubes of the conventional end-window photocathode type, for far-ultraviolet imagery at the focus of a large space telescope, with image formats up to 120 mm in diameter.

  1. Combustion pinhole-camera system

    DOEpatents

    Witte, A.B.

    1982-05-19

    A pinhole camera system is described utilizing a sealed optical-purge assembly which provides optical access into a coal combustor or other energy conversion reactors. The camera system consists of a focused-purge pinhole optical port assembly, a conventional TV vidicon receiver, and an external variable-density light filter which is coupled electronically to the vidicon automatic gain control (agc). The key component of this system is the focused-purge pinhole optical port assembly, which utilizes a purging inert gas to keep debris from entering the port and a lens arrangement which transfers the pinhole to the outside of the port assembly. One additional feature of the port assembly is that it is not flush with the interior of the combustor.

  2. The Dark Energy Survey Camera

    NASA Astrophysics Data System (ADS)

    Flaugher, Brenna

    2012-03-01

    The Dark Energy Survey Collaboration has built the Dark Energy Camera (DECam), a 3 square degree, 520 Megapixel CCD camera which is being mounted on the Blanco 4-meter telescope at CTIO. DECam will be used to carry out the 5000 sq. deg. Dark Energy Survey, using 30% of the telescope time over a 5 year period. During the remainder of the time, and after the survey, DECam will be available as a community instrument. Construction of DECam is complete. The final components were shipped to Chile in Dec. 2011 and post-shipping checkout is in progress in Dec-Jan. Installation and commissioning on the telescope are taking place in 2012. A summary of lessons learned and an update of the performance of DECam and the status of the DECam installation and commissioning will be presented.
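    The quoted field of view and pixel count imply a plate scale of roughly 0.27 arcsec/pixel, close to DECam's published value of about 0.26 arcsec/pixel. A quick back-of-the-envelope check (illustrative only, not from the record):

```python
import math

field_deg2 = 3.0      # quoted field of view, square degrees
n_pixels = 520e6      # quoted pixel count

field_arcsec2 = field_deg2 * 3600.0**2             # 1 deg = 3600 arcsec
pixel_scale = math.sqrt(field_arcsec2 / n_pixels)  # arcsec per pixel, ~0.27
```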

  3. A 10-microm infrared camera.

    PubMed

    Arens, J F; Jernigan, J G; Peck, M C; Dobson, C A; Kilk, E; Lacy, J; Gaalema, S

    1987-09-15

    An IR camera has been built at the University of California at Berkeley for astronomical observations. The camera has been used primarily for high angular resolution imaging at mid-IR wavelengths. It has been tested at the University of Arizona 61- and 90-in. telescopes near Tucson and the NASA Infrared Telescope Facility on Mauna Kea, HI. In the observations the system has been used as an imager with interference coated and Fabry-Perot filters. These measurements have demonstrated a sensitivity consistent with photon shot noise, showing that the system is limited by the radiation from the telescope and atmosphere. Measurements of read noise, crosstalk, and hysteresis have been made in our laboratory. PMID:20490151

  4. Corrective Optics For Camera On Telescope

    NASA Technical Reports Server (NTRS)

    Macenka, Steven A.; Meinel, Aden B.

    1994-01-01

    Assembly of tilted, aspherical circularly symmetric mirrors used as corrective optical subsystem for camera mounted on telescope exhibiting both large spherical wave-front error and inherent off-axis astigmatism. Subsystem provides unobscured camera aperture and diffraction-limited camera performance, despite large telescope aberrations. Generic configuration applied in other optical systems in which aberrations deliberately introduced into telescopes and corrected in associated cameras. Concept of corrective optical subsystem provides designer with additional degrees of freedom used to optimize optical system.

  5. SPEIR: A Ge Compton Camera

    SciTech Connect

    Mihailescu, L; Vetter, K M; Burks, M T; Hull, E L; Craig, W W

    2004-02-11

    The SPEctroscopic Imager for {gamma}-Rays (SPEIR) is a new concept of a compact {gamma}-ray imaging system of high efficiency and spectroscopic resolution with a 4-{pi} field-of-view. The system behind this concept employs double-sided segmented planar Ge detectors accompanied by the use of list-mode photon reconstruction methods to create a sensitive, compact Compton scatter camera.

  6. Graphic design of pinhole cameras

    NASA Technical Reports Server (NTRS)

    Edwards, H. B.; Chu, W. P.

    1979-01-01

    The paper describes a graphic technique for the analysis and optimization of pinhole size and focal length. The technique is based on the use of the transfer function of optical elements described by Scott (1959) to construct the transfer function of a circular pinhole camera. This transfer function is the response of a component or system to a pattern of lines having a sinusoidally varying radiance at varying spatial frequencies. Some specific examples of graphic design are presented.
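    The pinhole-size tradeoff that such a design technique optimizes is the balance between geometric blur (which grows with hole diameter) and diffraction blur (which grows as the hole shrinks). The paper's method is graphical; the sketch below instead uses the classic closed-form balance with Rayleigh's often-quoted constant, purely as an illustration:

```python
import math

def optimal_pinhole_diameter(wavelength_m: float, focal_length_m: float,
                             k: float = 1.9) -> float:
    """Balance geometric blur (~d) against diffraction blur (~lambda*f/d):
    d = k * sqrt(lambda * f).  k ~ 1.9 is Rayleigh's commonly quoted
    constant; the exact value depends on the resolution criterion chosen.
    """
    return k * math.sqrt(wavelength_m * focal_length_m)

# Illustrative: green light (550 nm), 100 mm focal length -> ~0.45 mm hole
d = optimal_pinhole_diameter(550e-9, 0.1)
```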

  7. Empirical Study on Designing of Gaze Tracking Camera Based on the Information of User's Head Movement.

    PubMed

    Pan, Weiyuan; Jung, Dongwook; Yoon, Hyo Sik; Lee, Dong Eun; Naqvi, Rizwan Ali; Lee, Kwan Woo; Park, Kang Ryoung

    2016-01-01

    Gaze tracking is the technology that identifies a region in space that a user is looking at. Most previous non-wearable gaze tracking systems use a near-infrared (NIR) light camera with an NIR illuminator. Depending on the kind of camera lens used, the viewing angle and depth-of-field (DOF) of a gaze tracking camera can differ, which affects the performance of the gaze tracking system. Nevertheless, to the best of our knowledge, most previous researchers implemented gaze tracking cameras without ground truth information for determining the optimal viewing angle and DOF of the camera lens. Eye-tracker manufacturers might also use ground truth information, but they do not make it public. Therefore, researchers and developers of gaze tracking systems cannot refer to such information when implementing their systems. We address this problem by providing an empirical study in which we design an optimal gaze tracking camera based on experimental measurements of the amount and velocity of users' head movements. Based on our results and analyses, researchers and developers might be able to more easily implement an optimal gaze tracking system. Experimental results show that our gaze tracking system achieves high performance in terms of accuracy, user convenience and interest. PMID:27589768
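    One quantity such a design study must relate is the camera's viewing angle to the region of head movement it has to cover at a given working distance. A simple geometric sketch (illustrative numbers, not the authors' procedure):

```python
import math

def required_fov_deg(coverage_width_m: float, camera_distance_m: float) -> float:
    """Horizontal viewing angle needed so a head-movement region of the given
    width stays in frame at the given camera distance:
    FOV = 2 * atan(w / (2 * z)).
    """
    return math.degrees(2.0 * math.atan(coverage_width_m /
                                        (2.0 * camera_distance_m)))

# Illustrative: +/-15 cm of lateral head movement at a 60 cm viewing distance
fov = required_fov_deg(0.30, 0.60)   # ~28 degrees of horizontal FOV
```

    A wider lens covers more head movement but spreads fewer pixels over the eye region, which is exactly the tradeoff the measured head-movement data helps resolve.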

  8. Ortho-Rectification of Narrow Band Multi-Spectral Imagery Assisted by Dslr RGB Imagery Acquired by a Fixed-Wing Uas

    NASA Astrophysics Data System (ADS)

    Rau, J.-Y.; Jhan, J.-P.; Huang, C.-Y.

    2015-08-01

    Miniature Multiple Camera Array (MiniMCA-12) is a frame-based multilens/multispectral sensor composed of 12 lenses with narrow band filters. Due to its small size and light weight, it is suitable to mount on an Unmanned Aerial System (UAS) for acquiring high spectral, spatial and temporal resolution imagery used in various remote sensing applications. However, because its spectral bandwidth is only 10 nm, the resulting images have low resolution and signal-to-noise ratio, which are not suitable for image matching and digital surface model (DSM) generation. Moreover, since the spectral correlation among the 12 bands of MiniMCA images is low, it is difficult to perform tie-point matching and aerial triangulation at the same time. In this study, we thus propose the use of a DSLR camera to assist automatic aerial triangulation of MiniMCA-12 imagery and to produce a higher spatial resolution DSM for MiniMCA-12 ortho-image generation. Depending on the maximum payload weight of the UAS used, these two kinds of sensors can be carried at the same time or individually. In this study, we adopt a fixed-wing UAS to carry a Canon EOS 5D Mark2 DSLR camera and a MiniMCA-12 multi-spectral camera. To perform automatic aerial triangulation between the DSLR camera and the MiniMCA-12, we choose one master band from MiniMCA-12 whose spectral range overlaps with that of the DSLR camera. However, because the lenses of MiniMCA-12 have different perspective centers and viewing angles, the original 12 channels exhibit a significant band misregistration effect. Thus, the first issue encountered is to reduce this band misregistration. Because all 12 MiniMCA lenses are frame-based, with spatial offsets smaller than 15 cm and almost 98% image overlap, we propose a modified projective transformation (MPT) method together with two systematic error correction procedures to register all 12 bands of imagery in the same image space. 
It means that those 12 bands of images acquired at

  9. Coaxial fundus camera for ophthalmology

    NASA Astrophysics Data System (ADS)

    de Matos, Luciana; Castro, Guilherme; Castro Neto, Jarbas C.

    2015-09-01

    A fundus camera for ophthalmology is a high definition device which needs to meet low light illumination of the human retina, high resolution in the retina and reflection-free imaging. Those constraints make its optical design very sophisticated, but the most difficult requirements to comply with are the reflection-free illumination and the final alignment, due to the high number of non-coaxial optical components in the system. Reflections of the illumination, both in the objective and at the cornea, mask image quality, and poor alignment makes the sophisticated optical design useless. In this work we developed a totally coaxial optical system for a non-mydriatic fundus camera. The illumination is performed by a LED ring, coaxial with the optical system and composed of IR and visible LEDs. The illumination ring is projected by the objective lens onto the cornea. The objective, LED illuminator, and CCD lens are coaxial, making the final alignment easy to perform. The CCD + capture lens module is a CCTV camera with autofocus and zoom built in, added to a 175 mm focal length doublet corrected for infinity, making the system easily operated and very compact.

  10. Solid-state array cameras.

    PubMed

    Strull, G; List, W F; Irwin, E L; Farnsworth, D L

    1972-05-01

    Over the past few years there has been growing interest shown in the rapidly maturing technology of totally solid-state imaging. This paper presents a synopsis of developments made in this field at the Westinghouse ATL facilities with emphasis on row-column organized monolithic arrays of diffused junction phototransistors. The complete processing sequence applicable to the fabrication of modern high-density arrays is described from wafer ingot preparation to final sensor testing. Special steps found necessary for high yield processing, such as surface etching prior to both sawing and lapping, are discussed along with the rationale behind their adoption. Camera systems built around matrix array photosensors are presented in a historical time-wise progression beginning with the first 50 x 50 element converter developed in 1965 and running through the most recent 400 x 500 element system delivered in 1972. The freedom of mechanical architecture made available to system designers by solid-state array cameras is noted from the description of a bare-chip packaged cubic inch camera. Hybrid scan systems employing one-dimensional line arrays are cited, and the basic tradeoffs to their use are listed. PMID:20119094

  11. Unassisted 3D camera calibration

    NASA Astrophysics Data System (ADS)

    Atanassov, Kalin; Ramachandra, Vikas; Nash, James; Goma, Sergio R.

    2012-03-01

    With the rapid growth of 3D technology, 3D image capture has become a critical part of the 3D feature set on mobile phones. 3D image quality is affected by the scene geometry as well as on-the-device processing. An automatic 3D system usually assumes known camera poses accomplished by factory calibration using a special chart. In real life settings, pose parameters estimated by factory calibration can be negatively impacted by movements of the lens barrel due to shaking, focusing, or camera drop. If any of these factors displaces the optical axes of either or both cameras, vertical disparity might exceed the maximum tolerable margin and the 3D user may experience eye strain or headaches. To make 3D capture more practical, one needs to consider unassisted (on arbitrary scenes) calibration. In this paper, we propose an algorithm that relies on detection and matching of keypoints between left and right images. Frames containing erroneous matches, along with frames with insufficiently rich keypoint constellations, are detected and discarded. Roll, pitch, yaw, and scale differences between left and right frames are then estimated. The algorithm performance is evaluated in terms of the remaining vertical disparity as compared to the maximum tolerable vertical disparity.
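    The vertical-disparity screening the abstract describes can be illustrated with a toy helper. Both the function and the data here are hypothetical, not the paper's algorithm: given the y-coordinates of matched keypoints in the left and right frames, it reports the worst vertical disparity and whether it stays inside a tolerance expressed as a fraction of image height:

```python
def max_vertical_disparity(matches, image_height, tolerance_frac=0.01):
    """Screen matched keypoint pairs (left_y, right_y) for vertical disparity,
    the quantity identified as the cause of eye strain in 3D viewing.
    Returns (worst disparity in pixels, True if within tolerance).
    Hypothetical helper for illustration only.
    """
    worst = max(abs(ly - ry) for (ly, ry) in matches)
    return worst, worst <= tolerance_frac * image_height

# Hypothetical matched keypoint y-coordinates in a 1080-pixel-tall frame
pairs = [(100.0, 101.5), (240.2, 241.0), (512.0, 511.2)]
worst, ok = max_vertical_disparity(pairs, image_height=1080)
```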

  12. Interference-induced angle-independent acoustical transparency

    SciTech Connect

    Qi, Lehua; Yu, Gaokun Wang, Ning; Wang, Xinlong; Wang, Guibo

    2014-12-21

    It is revealed that the Fano-like interference leads to the extraordinary acoustic transmission through a slab metamaterial of thickness much smaller than the wavelength, with each unit cell consisting of a Helmholtz resonator and a narrow subwavelength slit. More importantly, both the theoretical analysis and experimental measurement show that the angle-independent acoustical transparency can be realized by grafting a Helmholtz resonator and a quarter-wave resonator to the wall of a narrow subwavelength slit in each unit cell of a slit array. The observed phenomenon results from the interferences between the waves propagating in the slit, those re-radiated by the Helmholtz resonator, and those re-radiated by the quarter-wave resonator. The proposed design may find its applications in designing angle-independent acoustical filters and controlling the phase of the transmitted waves.

  13. 21 CFR 886.1120 - Ophthalmic camera.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 8 2013-04-01 2013-04-01 false Ophthalmic camera. 886.1120 Section 886.1120 Food... DEVICES OPHTHALMIC DEVICES Diagnostic Devices § 886.1120 Ophthalmic camera. (a) Identification. An ophthalmic camera is an AC-powered device intended to take photographs of the eye and the surrounding...

  14. 21 CFR 886.1120 - Opthalmic camera.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 8 2012-04-01 2012-04-01 false Opthalmic camera. 886.1120 Section 886.1120 Food... DEVICES OPHTHALMIC DEVICES Diagnostic Devices § 886.1120 Opthalmic camera. (a) Identification. An ophthalmic camera is an AC-powered device intended to take photographs of the eye and the surrounding...

  15. 21 CFR 886.1120 - Ophthalmic camera.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 8 2014-04-01 2014-04-01 false Ophthalmic camera. 886.1120 Section 886.1120 Food... DEVICES OPHTHALMIC DEVICES Diagnostic Devices § 886.1120 Ophthalmic camera. (a) Identification. An ophthalmic camera is an AC-powered device intended to take photographs of the eye and the surrounding...

  16. 21 CFR 886.1120 - Opthalmic camera.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Opthalmic camera. 886.1120 Section 886.1120 Food... DEVICES OPHTHALMIC DEVICES Diagnostic Devices § 886.1120 Opthalmic camera. (a) Identification. An ophthalmic camera is an AC-powered device intended to take photographs of the eye and the surrounding...

  17. 16 CFR 501.1 - Camera film.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 16 Commercial Practices 1 2011-01-01 2011-01-01 false Camera film. 501.1 Section 501.1 Commercial... 500 § 501.1 Camera film. Camera film packaged and labeled for retail sale is exempt from the net... should be expressed, provided: (a) The net quantity of contents on packages of movie film and bulk...

  18. 16 CFR 501.1 - Camera film.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 16 Commercial Practices 1 2010-01-01 2010-01-01 false Camera film. 501.1 Section 501.1 Commercial... 500 § 501.1 Camera film. Camera film packaged and labeled for retail sale is exempt from the net... should be expressed, provided: (a) The net quantity of contents on packages of movie film and bulk...

  19. 16 CFR 501.1 - Camera film.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 16 Commercial Practices 1 2012-01-01 2012-01-01 false Camera film. 501.1 Section 501.1 Commercial... 500 § 501.1 Camera film. Camera film packaged and labeled for retail sale is exempt from the net... should be expressed, provided: (a) The net quantity of contents on packages of movie film and bulk...

  20. 16 CFR 501.1 - Camera film.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 16 Commercial Practices 1 2014-01-01 2014-01-01 false Camera film. 501.1 Section 501.1 Commercial... 500 § 501.1 Camera film. Camera film packaged and labeled for retail sale is exempt from the net... should be expressed, provided: (a) The net quantity of contents on packages of movie film and bulk...

  1. 16 CFR 501.1 - Camera film.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 16 Commercial Practices 1 2013-01-01 2013-01-01 false Camera film. 501.1 Section 501.1 Commercial... 500 § 501.1 Camera film. Camera film packaged and labeled for retail sale is exempt from the net... should be expressed, provided: (a) The net quantity of contents on packages of movie film and bulk...

  2. 21 CFR 892.1110 - Positron camera.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Positron camera. 892.1110 Section 892.1110 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES RADIOLOGY DEVICES Diagnostic Devices § 892.1110 Positron camera. (a) Identification. A positron camera is a device intended to image...

  3. Speckle Camera Imaging of the Planet Pluto

    NASA Astrophysics Data System (ADS)

    Howell, Steve B.; Horch, Elliott P.; Everett, Mark E.; Ciardi, David R.

    2012-10-01

    We have obtained optical wavelength (692 nm and 880 nm) speckle imaging of the planet Pluto and its largest moon Charon. Using our DSSI speckle camera attached to the Gemini North 8 m telescope, we collected high resolution imaging with an angular resolution of ~20 mas, a value at the Gemini-N telescope diffraction limit. We have produced for this binary system the first speckle reconstructed images, from which we can measure not only the orbital separation and position angle for Charon, but also the diameters of the two bodies. Our measurements of these parameters agree, within the uncertainties, with the current best values for Pluto and Charon. The Gemini-N speckle observations of Pluto are presented to illustrate the capabilities of our instrument and the robust production of high accuracy, high spatial resolution reconstructed images. We hope our results will suggest additional applications of high resolution speckle imaging for other objects within our solar system and beyond. Based on observations obtained at the Gemini Observatory, which is operated by the Association of Universities for Research in Astronomy, Inc., under a cooperative agreement with the National Science Foundation on behalf of the Gemini partnership: the National Science Foundation (United States), the Science and Technology Facilities Council (United Kingdom), the National Research Council (Canada), CONICYT (Chile), the Australian Research Council (Australia), Ministério da Ciência, Tecnologia e Inovação (Brazil) and Ministerio de Ciencia, Tecnología e Innovación Productiva (Argentina).
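    The quoted ~20 mas resolution is consistent with the Rayleigh diffraction limit of Gemini North's 8.1 m primary at the two observing wavelengths; a quick check (not part of the record):

```python
import math

def diffraction_limit_mas(wavelength_m: float, aperture_m: float) -> float:
    """Rayleigh diffraction limit 1.22 * lambda / D, in milliarcseconds."""
    theta_rad = 1.22 * wavelength_m / aperture_m
    return math.degrees(theta_rad) * 3600.0 * 1000.0

# Gemini North's primary mirror is 8.1 m in diameter
r692 = diffraction_limit_mas(692e-9, 8.1)   # ~21.5 mas at 692 nm
r880 = diffraction_limit_mas(880e-9, 8.1)   # ~27.3 mas at 880 nm
```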

  4. Bio-inspired hemispherical compound eye camera

    NASA Astrophysics Data System (ADS)

    Xiao, Jianliang; Song, Young Min; Xie, Yizhu; Malyarchuk, Viktor; Jung, Inhwa; Choi, Ki-Joong; Liu, Zhuangjian; Park, Hyunsung; Lu, Chaofeng; Kim, Rak-Hwan; Li, Rui; Crozier, Kenneth B.; Huang, Yonggang; Rogers, John A.

    2014-03-01

    Compound eyes in arthropods demonstrate distinct imaging characteristics from human eyes, with wide angle field of view, low aberrations, high acuity to motion and infinite depth of field. Artificial imaging systems with similar geometries and properties are of great interest for many applications. However, the challenges in building such systems with hemispherical, compound apposition layouts cannot be met through established planar sensor technologies and conventional optics. We present our recent progress in combining optics, materials, mechanics and integration schemes to build fully functional artificial compound eye cameras. Nearly full hemispherical shapes (about 160 degrees) with densely packed artificial ommatidia were realized. The number of ommatidia (180) is comparable to those of the eyes of fire ants and bark beetles. The devices combine elastomeric compound optical elements with deformable arrays of thin silicon photodetectors, which were fabricated in the planar geometries and then integrated and elastically transformed to hemispherical shapes. Imaging results and quantitative ray-tracing-based simulations illustrate key features of operation. These general strategies seem to be applicable to other compound eye devices, such as those inspired by moths and lacewings (refracting superposition eyes), lobster and shrimp (reflecting superposition eyes), and houseflies (neural superposition eyes).

  5. Cooling the dark energy camera instrument

    SciTech Connect

    Schmitt, R.L.; Cease, H.; DePoy, D.; Diehl, H.T.; Estrada, J.; Flaugher, B.; Kuhlmann, S.; Onal, Birce; Stefanik, A.; /Fermilab

    2008-06-01

    DECam, the camera for the Dark Energy Survey (DES), is undergoing general design and component testing. For an overview see DePoy, et al in these proceedings. For a description of the imager, see Cease, et al in these proceedings. The CCD instrument will be mounted at the prime focus of the CTIO Blanco 4m telescope. The instrument temperature will be 173K with a heat load of 113W. In similar applications, cooling CCD instruments at the prime focus has been accomplished by three general methods. Liquid nitrogen reservoirs have been constructed to operate in any orientation, pulse tube cryocoolers have been used when tilt angles are limited, and Joule-Thompson or Stirling cryocoolers have been used with smaller heat loads. Gifford-MacMahon cooling has been used at the Cassegrain but not at the prime focus. For DES, the combined requirements of high heat load, temperature stability, low vibration, operation in any orientation, liquid nitrogen cost and limited space available led to the design of a pumped, closed loop, circulating nitrogen system. At zenith the instrument will be twelve meters above the pump/cryocooler station. This cooling system is expected to have a 10,000 hour maintenance interval. This paper will describe the engineering basis including the thermal model, unbalanced forces, cooldown time, and the single- and two-phase flow models.

  6. Silicone Contamination Camera Developed for Shuttle Payloads

    NASA Technical Reports Server (NTRS)

    1996-01-01

    On many shuttle missions, silicone contamination from unknown sources from within or external to the shuttle payload bay has been a chronic problem plaguing experiment payloads. There is currently a wide range of silicone usage on the shuttle. Silicones are used to coat the shuttle tiles to enhance their ability to shed rain, and over 100 kg of RTV 560 silicone is used to seal white tiles to the shuttle surfaces. Silicones are also used in electronic components, potting compounds, and thermal control blankets. Efforts to date to identify and eliminate the sources of silicone contamination have not been highly successful and have created much controversy. To identify the sources of silicone contamination on the space shuttle, the NASA Lewis Research Center developed a contamination camera. This specially designed pinhole camera utilizes low-Earth-orbit atomic oxygen to develop a picture that identifies sources of silicone contamination on shuttle-launched payloads. The volatile silicone species travel through the aperture of the pinhole camera, and since volatile silicone species lose their hydrocarbon functionalities under atomic oxygen attack, the silicone adheres to the substrate as SiO_x. This glassy deposit should be spatially arranged in the image of the sources of silicone contamination. To view the contamination image, one can use ultrasensitive thickness measurement techniques, such as scanning variable-angle ellipsometry, to map the surface topography of the camera's substrate. The demonstration of a functional contamination camera would resolve the controversial debate concerning the amount and location of contamination sources, would allow corrective actions to be taken, and would demonstrate a useful tool for contamination documentation on future shuttle payloads, with near negligible effect on cost and weight.

  7. Image quality testing of assembled IR camera modules

    NASA Astrophysics Data System (ADS)

    Winters, Daniel; Erichsen, Patrik

    2013-10-01

    Infrared (IR) camera modules for the LWIR (8-12 µm) that combine IR imaging optics with microbolometer focal plane array (FPA) sensors with readout electronics are becoming more and more a mass market product. At the same time, steady improvements in sensor resolution in the higher priced markets raise the requirement for imaging performance of objectives and the proper alignment between objective and FPA. This puts pressure on camera manufacturers and system integrators to assess the image quality of finished camera modules in a cost-efficient and automated way for quality control or during end-of-line testing. In this paper we present recent development work done in the field of image quality testing of IR camera modules. This technology provides a wealth of additional information in contrast to the more traditional test methods like minimum resolvable temperature difference (MRTD) which give only a subjective overall test result. Parameters that can be measured are image quality via the modulation transfer function (MTF) for broadband or with various bandpass filters on- and off-axis and optical parameters like e.g. effective focal length (EFL) and distortion. If the camera module allows for refocusing the optics, additional parameters like best focus plane, image plane tilt, auto-focus quality, chief ray angle etc. can be characterized. Additionally, the homogeneity and response of the sensor with the optics can be characterized in order to calculate the appropriate tables for non-uniformity correction (NUC). The technology can also be used to control active alignment methods during mechanical assembly of optics to high resolution sensors. Other important points that are discussed are the flexibility of the technology to test IR modules with different form factors, electrical interfaces and last but not least the suitability for fully automated measurements in mass production.

  8. MTF measurement and imaging quality evaluation of digital camera with slanted-edge method

    NASA Astrophysics Data System (ADS)

    Xiang, Chunchang; Chen, Xinhua; Chen, Yuheng; Zhou, Jiankang; Shen, Weimin

    2010-11-01

    Modulation Transfer Function (MTF) is the spatial frequency response of imaging systems and now serves as an objective merit of performance for evaluating both the quality of the lens and the camera. The slanted-edge method and its principle for measuring the MTF of a digital camera are introduced in this paper. A setup and software for testing digital cameras were respectively established and developed. Measurement results with different tilt angles of the knife edge are compared to discuss the influence of the tilt angle. Careful denoising of the knife-edge image is also performed to decrease the noise sensitivity of the knife-edge measurement. Comparisons have been made between the testing results gained by the slanted-edge method and the grating target technique, and their deviation is analyzed.
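    The slanted-edge processing chain (edge-spread function → line-spread function → Fourier magnitude) can be sketched in a few lines. This toy version assumes the ESF has already been projected along the edge and supersampled, and omits the windowing, binning, and correction factors a real implementation (e.g. per ISO 12233) needs:

```python
import math

def mtf_from_edge(esf):
    """Differentiate the edge-spread function to the line-spread function,
    then take the magnitude of its discrete Fourier transform, normalised
    to 1 at zero frequency.  Toy illustration of the slanted-edge chain.
    """
    lsf = [esf[i + 1] - esf[i] for i in range(len(esf) - 1)]
    n = len(lsf)
    mtf = []
    for k in range(n // 2):
        re = sum(lsf[i] * math.cos(2 * math.pi * k * i / n) for i in range(n))
        im = sum(lsf[i] * math.sin(2 * math.pi * k * i / n) for i in range(n))
        mtf.append(math.hypot(re, im))
    return [m / mtf[0] for m in mtf]

# An ideal step edge gives an impulse LSF and hence a flat MTF of 1.0
mtf = mtf_from_edge([0.0] * 8 + [1.0] * 8)
```

    Any real edge is blurred by the optics and sensor, so the measured MTF rolls off with frequency; the roll-off curve is the figure of merit the paper measures.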

  9. Hubble Space Telescope, Wide Field Planetary Camera

    NASA Technical Reports Server (NTRS)

    1981-01-01

    This illustration is a diagram of the Hubble Space Telescope's (HST's) Wide Field Planetary Camera (WF/PC), one of the five Scientific Instruments. The WF/PC uses a four-sided pyramid mirror to split a light image into quarters. It then focuses each quadrant onto one of two sets of four sensors. The sensors are charge-coupled detectors and function as the electronic equivalent of extremely sensitive photographic plates. The WF/PC operates in two modes: the Wide-Field mode that will view 7.2-arcmin sections of the sky, and the Planetary mode that will look at narrower fields of view, such as planets or areas within other galaxies. The purpose of the HST, the most complex and sensitive optical telescope ever made, is to study the cosmos from a low-Earth orbit. By placing the telescope in space, astronomers are able to collect data that is free of the Earth's atmosphere. The HST detects objects 25 times fainter than the dimmest objects seen from Earth and provides astronomers with an observable universe 250 times larger than visible from ground-based telescopes, perhaps as far away as 14 billion light-years. The HST views galaxies, stars, planets, comets, possibly other solar systems, and even unusual phenomena such as quasars, with 10 times the clarity of ground-based telescopes. The HST was deployed from the Space Shuttle Discovery (STS-31 mission) into Earth orbit in April 1990. The Marshall Space Flight Center had responsibility for design, development, and construction of the HST. The Perkin-Elmer Corporation, in Danbury, Connecticut, developed the optical system and guidance sensors.

  10. Pinhole Camera For Viewing Electron Beam Materials Processing

    NASA Astrophysics Data System (ADS)

    Rushford, M. C.; Kuzmenko, P. J.

    1986-10-01

    surface intensity variation is contrast-reduced if the observation wavelength is a narrow band as far to the red as high-intensity blooming will allow an observable picture. A three-eyepiece camera allows an image plane where photo-gray glass functions as a neutral-density filter only over the high-intensity portion of the image, thus reducing blooming. This system is enclosed in a water-cooled housing which can dissipate 15 watts/cm2, keeping the camera below 40 degrees Celsius. Single frames of video output are acquired for feature enhancement and location by a Data Translation DT2803 image processing board housed in an IBM PC.

  11. LDEF yaw and pitch angle estimates

    NASA Technical Reports Server (NTRS)

    Banks, Bruce A.; Gebauer, Linda

    1992-01-01

    Quantification of the LDEF yaw and pitch misorientations is crucial to the knowledge of atomic oxygen exposure of samples placed on LDEF. Video camera documentation of the LDEF spacecraft prior to grapple attachment, atomic oxygen shadows on experiment trays and longerons, and a pinhole atomic oxygen camera placed on LDEF provided sources of documentation of the yaw and pitch misorientation. Based on uncertainty-weighted averaging of data, the LDEF yaw offset was found to be 8.1 plus or minus 0.6 degrees, allowing higher atomic oxygen exposure of row 12 than initially anticipated. The LDEF pitch angle offset was found to be 0.8 plus or minus 0.4 degrees, such that the space end was tipped forward toward the direction of travel. The resulting consequences of the yaw and pitch misorientation of LDEF on the atomic oxygen fluence is a factor of 2.16 increase for samples located on row 12, and a factor of 1.18 increase for samples located on the space end compared to that which would be expected for perfect orientation.
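    The "uncertainty-weighted averaging" used to combine the documentation sources is standard inverse-variance weighting. A minimal sketch with hypothetical yaw estimates (the record does not give the individual source values):

```python
def weighted_mean(values, sigmas):
    """Inverse-variance (uncertainty-weighted) average of measurements,
    returning the combined estimate and its standard error.
    """
    weights = [1.0 / s**2 for s in sigmas]
    total = sum(weights)
    mean = sum(w * v for w, v in zip(weights, values)) / total
    return mean, (1.0 / total) ** 0.5

# Hypothetical yaw estimates (degrees) from three documentation sources,
# each with its own uncertainty -- illustrative numbers only
mean, err = weighted_mean([8.4, 7.9, 8.0], [1.0, 0.8, 1.2])
```

    More precise sources pull the combined estimate toward themselves, and the combined standard error is always smaller than the best single source's.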

  12. Mini gamma camera, camera system and method of use

    DOEpatents

    Majewski, Stanislaw; Weisenberger, Andrew G.; Wojcik, Randolph F.

    2001-01-01

    A gamma camera comprising, essentially and in order from the front outer or gamma ray impinging surface: 1) a collimator, 2) a scintillator layer, 3) a light guide, 4) an array of position-sensitive, high-resolution photomultiplier tubes, and 5) printed circuitry for receipt of the output of the photomultipliers. There is also described a system wherein the output supplied by the high-resolution, position-sensitive photomultiplier tubes is communicated to: a) a digitizer and b) a computer where it is processed using advanced image processing techniques and a specific algorithm to calculate the center of gravity of any abnormality observed during imaging, and c) optional image display and telecommunications ports.

  13. Passive Millimeter Wave Camera (PMMWC) at TRW

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Engineers at TRW, Redondo Beach, California, inspect the Passive Millimeter Wave Camera, a weather-piercing camera designed to 'see' through fog, clouds, smoke and dust. Operating in the millimeter wave portion of the electromagnetic spectrum, the camera creates visual-like video images of objects, people, runways, obstacles and the horizon. A demonstration camera (shown in photo) has been completed and is scheduled for checkout tests and flight demonstration. Engineer (left) holds a compact, lightweight circuit board containing 40 complete radiometers, including antenna, monolithic millimeter wave integrated circuit (MMIC) receivers and signal processing and readout electronics that forms the basis for the camera's 1040-element focal plane array.

  15. HHEBBES! All sky camera system: status update

    NASA Astrophysics Data System (ADS)

    Bettonvil, F.

    2015-01-01

    A status update is given of the HHEBBES! All sky camera system. HHEBBES!, an automatic camera for capturing bright meteor trails, is based on a DSLR camera and a liquid crystal chopper for measuring the angular velocity. The purpose of the system is to (a) recover meteorites and (b) identify origin/parent bodies. In 2015, two new cameras were rolled out: BINGO!, similar to HHEBBES! and also in the Netherlands, and POgLED, in Serbia. BINGO! is the first camera equipped with a longer focal length fisheye lens, to further increase the accuracy. Several minor improvements have been made, and the data reduction pipeline was used for processing two prominent Dutch fireballs.

  16. Advances in fast 2D camera data handling and analysis on NSTX

    SciTech Connect

    Davis, W. M.; Patel, R. I.; Boeglin, W. U.; Roquemore, A. L.; Maqueda, R. J.; Zweben, S. J.

    2010-07-01

    The use of fast 2D cameras on NSTX continues to grow. There are 6 cameras with the capability of taking up to 1–2 gigabytes (GBs) of data apiece during each plasma shot on the National Spherical Torus Experiment (NSTX). Efficient storage and retrieval of this data remains a challenge. Performance comparisons are presented for reading data stored in MDSplus, using both compressed data and segmented records, and direct access I/O with different read sizes. Encouragingly, fast 2D camera data provides considerable insight into plasma complexities, such as small-scale turbulence and particle transport. The last part of this paper is an example of more recent uses: dual cameras looking at the same region of the plasma from different angles, which can provide trajectories of incandescent particles in 3D. A laboratory simulation of the 3D trajectories is presented, as well as corresponding data from NSTX plasma where glowing dust particles can be followed.
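The dual-camera trajectory reconstruction described above can be sketched as midpoint triangulation: each camera back-projects a particle detection into a 3D ray, and the particle position is the point closest to both rays. The camera positions and ray directions below are hypothetical; the actual NSTX analysis uses calibrated camera models.

```python
import numpy as np

def closest_point_between_rays(p1, d1, p2, d2):
    """Midpoint of the shortest segment between two rays p_i + t_i * d_i."""
    d1, d2 = d1 / np.linalg.norm(d1), d2 / np.linalg.norm(d2)
    # Normal equations for minimizing |(p1 + t1*d1) - (p2 + t2*d2)|
    a = np.array([[d1 @ d1, -d1 @ d2],
                  [d1 @ d2, -d2 @ d2]])
    b = np.array([(p2 - p1) @ d1, (p2 - p1) @ d2])
    t1, t2 = np.linalg.solve(a, b)
    return 0.5 * ((p1 + t1 * d1) + (p2 + t2 * d2))

# Two cameras at different positions viewing the same glowing particle.
p = closest_point_between_rays(np.array([0.0, 0.0, 0.0]), np.array([1.0, 1.0, 0.0]),
                               np.array([2.0, 0.0, 0.0]), np.array([-1.0, 1.0, 0.0]))
print(p)   # the two rays intersect at (1, 1, 0)
```

With noisy real detections the rays do not intersect exactly; the midpoint then gives a least-squares position, and the residual segment length indicates matching quality.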

  17. Analysis of Pictures Taken with an Underwater Camera

    NASA Astrophysics Data System (ADS)

    Biezeveld, Hubert; Elsinga, Nienke; Harmsen, Floor-Jolijn; Koopman, Rose

    2005-03-01

    In the Dutch high school system students are required to carry out a research project on a subject of their own choosing. During the section on optics, the teacher (Hubert) mentioned that a fish sees the world above the water in a cone with half-angle equal to the critical angle for the air/water interface (49°). This follows from Snell's law and has been discussed in some detail by Jearl Walker. He describes the intersection of this cone with the surface of the water as the "window" through which the fish sees the outside world. A related paper has appeared in this journal.2 Stimulated by the teacher's remark, three students (Nienke, Floor-Jolijn, and Rose) made photographs with an underwater camera in the local swimming pool in Hoorn.
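The half-angle of the fish's "window" follows directly from Snell's law at the air/water interface; a quick check (the refractive index of water is the standard assumed value):

```python
import math

n_water = 1.33   # refractive index of water (assumed standard value)
n_air = 1.00

# Snell's law: rays from the whole sky refract into a cone whose half-angle
# is the critical angle, sin(theta_c) = n_air / n_water.
theta_c = math.degrees(math.asin(n_air / n_water))
print(f"half-angle of the window: {theta_c:.1f} degrees")   # ~48.8, i.e. the 49 quoted
```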

  18. Laser angle measurement system

    NASA Technical Reports Server (NTRS)

    Pond, C. R.; Texeira, P. D.; Wilbert, R. E.

    1980-01-01

    The design and fabrication of a laser angle measurement system is described. The instrument is a fringe counting interferometer that monitors the pitch attitude of a model in a wind tunnel. A laser source and detector are mounted above the model. Interference fringes are generated by a small passive element on the model. The fringe count is accumulated and displayed by a processor in the wind tunnel control room. Optical and electrical schematics, system maintenance and operation procedures are included, and the results of a demonstration test are given.
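The abstract does not give the optical geometry, but the fringe-counting idea can be sketched under a simple assumption: each fringe corresponds to half a wavelength of differential path change across a baseline b on the model, so one fringe represents a tilt of roughly lambda/(2b). Both the wavelength and baseline below are assumed values.

```python
import math

wavelength = 633e-9   # He-Ne laser wavelength, m (assumed)
baseline = 0.05       # separation of the interfering paths on the model, m (assumed)

def pitch_from_fringes(count):
    # One fringe ~ lambda/2 of differential path over the baseline,
    # i.e. a tilt increment of lambda / (2 * baseline) radians.
    return math.degrees(count * wavelength / (2 * baseline))

print(f"{pitch_from_fringes(1000):.3f} degrees of pitch for 1000 fringes")
```

The small angle-per-fringe increment is what gives such interferometers their resolution; the processor's job is simply to accumulate a signed fringe count.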

  19. Initial laboratory evaluation of color video cameras

    SciTech Connect

    Terry, P L

    1991-01-01

    Sandia National Laboratories has considerable experience with monochrome video cameras used in alarm assessment video systems. Most of these systems, used for perimeter protection, were designed to classify rather than identify an intruder. Monochrome cameras are adequate for that application and were selected over color cameras because of their greater sensitivity and resolution. There is a growing interest in the identification function of security video systems for both access control and insider protection. Color information is useful for identification purposes, and color camera technology is rapidly changing. Thus, Sandia National Laboratories established an ongoing program to evaluate color solid-state cameras. Phase one resulted in the publishing of a report titled 'Initial Laboratory Evaluation of Color Video Cameras' (SAND-91-2579). It gave a brief discussion of imager chips and color cameras and monitors, described the camera selection, detailed traditional test parameters and procedures, and gave the results of the evaluation of twelve cameras. In phase two, six additional cameras were tested by the traditional methods and all eighteen cameras were tested by newly developed methods. This report details both the traditional and newly developed test parameters and procedures, and gives the results of both evaluations.

  20. 3D camera tracking from disparity images

    NASA Astrophysics Data System (ADS)

    Kim, Kiyoung; Woo, Woontack

    2005-07-01

    In this paper, we propose a robust camera tracking method that uses disparity images computed from the known parameters of a 3D camera and multiple epipolar constraints. We assume that the baselines between the lenses of the 3D camera and the intrinsic parameters are known. The proposed method reduces the camera motion uncertainty encountered during camera tracking. Specifically, we first obtain corresponding feature points between the initial lenses using a normalized correlation method, and from these matches we compute disparity images. When the camera moves, the corresponding feature points obtained from each lens of the 3D camera are robustly tracked via the Kanade-Lucas-Tomasi (KLT) tracking algorithm. Secondly, the relative pose parameters of each lens are calculated via Essential matrices, which are computed from the Fundamental matrix obtained with the normalized 8-point algorithm and a RANSAC scheme. Then, we determine the scale factor of the translation matrix by d-motion; this is required because the camera motion obtained from the Essential matrix is only determined up to scale. Finally, we optimize camera motion using multiple epipolar constraints between lenses and d-motion constraints computed from disparity images. The proposed method can be widely adopted in Augmented Reality (AR) applications, 3D reconstruction using 3D cameras, and surveillance systems that need not only depth information but also camera motion parameters in real time.
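The step of obtaining an Essential matrix from a Fundamental matrix can be sketched as E = K^T F K followed by projection onto the space of valid Essential matrices (singular values (s, s, 0)). The intrinsics K and the demo geometry below are hypothetical, not taken from the paper.

```python
import numpy as np

# Hypothetical pinhole intrinsics (focal length 800 px, principal point 320,240).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

def essential_from_fundamental(F, K):
    E = K.T @ F @ K
    # A valid Essential matrix has two equal singular values and one zero;
    # enforce that via SVD projection before decomposing into R and t.
    U, s, Vt = np.linalg.svd(E)
    return U @ np.diag([1.0, 1.0, 0.0]) @ Vt

# Demo: build F from a known E (pure sideways translation t=(1,0,0), R=I),
# then recover E and check its singular values.
E_true = np.array([[0.0, 0.0, 0.0],
                   [0.0, 0.0, -1.0],
                   [0.0, 1.0, 0.0]])
Kinv = np.linalg.inv(K)
F = Kinv.T @ E_true @ Kinv
print(np.linalg.svd(essential_from_fundamental(F, K), compute_uv=False))
```

In practice F itself would come from noisy matches via the normalized 8-point algorithm with RANSAC, as the abstract describes; the SVD projection then absorbs the estimation noise.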

  1. Initial laboratory evaluation of color video cameras

    NASA Astrophysics Data System (ADS)

    Terry, P. L.

    1991-12-01

    Sandia National Laboratories has considerable experience with monochrome video cameras used in alarm assessment video systems. Most of these systems, used for perimeter protection, were designed to classify rather than identify an intruder. Monochrome cameras are adequate for that application and were selected over color cameras because of their greater sensitivity and resolution. There is a growing interest in the identification function of security video systems for both access control and insider protection. Color information is useful for identification purposes, and color camera technology is rapidly changing. Thus, Sandia National Laboratories established an ongoing program to evaluate color solid-state cameras. Phase one resulted in the publishing of a report titled, 'Initial Laboratory Evaluation of Color Video Cameras (SAND--91-2579).' It gave a brief discussion of imager chips and color cameras and monitors, described the camera selection, detailed traditional test parameters and procedures, and gave the results of the evaluation of twelve cameras. In phase two, six additional cameras were tested by the traditional methods and all eighteen cameras were tested by newly developed methods. This report details both the traditional and newly developed test parameters and procedures, and gives the results of both evaluations.

  2. Phenology cameras observing boreal ecosystems of Finland

    NASA Astrophysics Data System (ADS)

    Peltoniemi, Mikko; Böttcher, Kristin; Aurela, Mika; Kolari, Pasi; Tanis, Cemal Melih; Linkosalmi, Maiju; Loehr, John; Metsämäki, Sari; Nadir Arslan, Ali

    2016-04-01

    Cameras have become useful tools for monitoring the seasonality of ecosystems. Low-cost cameras facilitate validation of other measurements and allow extracting some key ecological features and moments from image time series. We installed a network of phenology cameras at selected ecosystem research sites in Finland. Cameras were installed above, at, and/or below the canopies. The current network hosts cameras taking time-lapse images in coniferous and deciduous forests as well as at open wetlands, thus offering possibilities to monitor various phenological and time-associated events and elements. In this poster, we present our camera network and give examples of the use of image series for research. We will show results on the stability of camera-derived color signals and, based on that, discuss the applicability of cameras in monitoring time-dependent phenomena. We will also present results from comparisons between camera-derived color signal time series and daily satellite-derived time series (NDVI, NDWI, and fractional snow cover) from the Moderate Resolution Imaging Spectroradiometer (MODIS) at selected spruce and pine forests and in a wetland. We will discuss the applicability of cameras in supporting phenological observations derived from satellites, considering the possibility for cameras to monitor both above- and below-canopy phenology and snow.
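The abstract does not name the specific colour index used; one widely used camera colour signal in phenology networks is the green chromatic coordinate, GCC = G / (R + G + B), averaged over a region of interest. A minimal sketch (the index choice is an assumption, not confirmed by the source):

```python
import numpy as np

def gcc(image):
    """Green chromatic coordinate of an RGB image, averaged over all pixels."""
    r = image[..., 0].astype(float)
    g = image[..., 1].astype(float)
    b = image[..., 2].astype(float)
    total = r + g + b
    # Guard against division by zero on fully black pixels.
    return float(np.mean(g / np.where(total == 0, 1, total)))

img = np.zeros((4, 4, 3), dtype=np.uint8)
img[..., 1] = 120            # a pure-green test patch
print(gcc(img))              # 1.0 for a purely green image
```

Tracking GCC through a time-lapse series yields the seasonal greenness curve that can then be compared against MODIS NDVI.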

  3. Narrow line photoassociation in an optical lattice.

    PubMed

    Zelevinsky, T; Boyd, M M; Ludlow, A D; Ido, T; Ye, J; Ciuryło, R; Naidon, P; Julienne, P S

    2006-05-26

    With ultracold 88Sr in a 1D magic wavelength optical lattice, we performed narrow-line photoassociation spectroscopy near the 1S0 - 3P1 intercombination transition. Nine least-bound vibrational molecular levels associated with the long-range 0u and 1u potential energy surfaces were measured and identified. A simple theoretical model accurately describes the level positions and treats the effects of the lattice confinement on the line shapes. The measured resonance strengths show that optical tuning of the ground state scattering length should be possible without significant atom loss. PMID:16803171

  4. The Flow of Gases in Narrow Channels

    NASA Technical Reports Server (NTRS)

    Rasmussen, R E H

    1951-01-01

    Measurements were made of the flow of gases through various narrow channels a few microns wide at average pressures from 0.00003 to 40 cm. Hg. The flow rate, defined as the product of pressure and volume rate of flow at unit pressure difference, first decreased linearly with decrease in mean pressure in the channel, in agreement with laminar-flow theory, reached a minimum when the mean path length was approximately equal to the channel width, and then increased to a constant value. The product of flow rate and square root of molecular number was approximately the same function of mean path length for all gases for a given channel.
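The crossover the measurements show, with a flow minimum where the mean free path matches the channel width, is governed by the Knudsen number Kn = lambda/w. A sketch using the hard-sphere mean free path; the molecular diameter is an assumed air-like value, and the regime boundaries are conventional, not from the source:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def mean_free_path(p_pa, T=293.0, d=3.7e-10):
    # Kinetic-theory mean free path of a hard-sphere gas
    # (d = molecular diameter; 3.7e-10 m is roughly air-like).
    return k_B * T / (math.sqrt(2) * math.pi * d**2 * p_pa)

channel_width = 5e-6   # a few microns, as in the channels above
for p in (1e-2, 1e2, 1e5):   # pressures in Pa
    kn = mean_free_path(p) / channel_width
    regime = "free-molecular" if kn > 10 else "transition" if kn > 0.1 else "continuum"
    print(f"p = {p:g} Pa: Kn = {kn:.3g} ({regime})")
```

The flow-rate minimum reported above sits in the transition regime, Kn of order 1, between the laminar (continuum) decrease and the free-molecular plateau.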

  5. Automatic calibration method for plenoptic camera

    NASA Astrophysics Data System (ADS)

    Luan, Yinsen; He, Xing; Xu, Bing; Yang, Ping; Tang, Guomao

    2016-04-01

    An automatic calibration method is proposed for a microlens-based plenoptic camera. First, all microlens images on the white image are searched and recognized automatically based on digital morphology. Then, the center points of the microlens images are rearranged according to their relative position relationships. Consequently, the microlens images are located, i.e., the plenoptic camera is calibrated, without prior knowledge of the camera parameters. Furthermore, this method is appropriate for all types of microlens-based plenoptic cameras, including the multifocus plenoptic camera, cameras with arbitrarily arranged microlenses, and cameras with microlenses of different sizes. Finally, we verify our method on raw data from Lytro. The experiments show that our method is more automated than previously published methods.
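A minimal sketch of the localization step: threshold the white image, label each connected bright blob (one per microlens image), and take blob centroids as microlens centers. Synthetic disks stand in for real raw data, and scipy's labeling stands in for the authors' morphology toolchain (an assumption about the method's details):

```python
import numpy as np
from scipy import ndimage

# Synthetic "white image": two bright disks on a dark background.
white = np.zeros((40, 80))
yy, xx = np.mgrid[0:40, 0:80]
for cy, cx in [(20, 20), (20, 60)]:
    white[(yy - cy)**2 + (xx - cx)**2 < 64] = 1.0

mask = white > 0.5                       # binarize
labels, n = ndimage.label(mask)          # connected-component labeling
centers = ndimage.center_of_mass(mask, labels, range(1, n + 1))
print(n, centers)   # 2 blobs, centroids near (20, 20) and (20, 60)
```

On real data the centroid grid would then be fitted to the microlens lattice geometry, which is the "relative position relationships" step the abstract describes.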

  6. Characterization of the Series 1000 Camera System

    SciTech Connect

    Kimbrough, J; Moody, J; Bell, P; Landen, O

    2004-04-07

    The National Ignition Facility requires a compact, network-addressable, scientific-grade CCD camera for use in diagnostics ranging from streak cameras to gated x-ray imaging cameras. Due to the limited space inside the diagnostic, an analog and digital input/output option in the camera controller permits control of both the camera and the diagnostic by a single Ethernet link. The system consists of a Spectral Instruments Series 1000 camera, a PC104+ controller, and a power supply. The 4k by 4k CCD camera has a dynamic range of 70 dB with less than 14 electrons of read noise at a 1 MHz readout rate. The PC104+ controller includes 16 analog inputs, 4 analog outputs and 16 digital input/output lines for interfacing to diagnostic instrumentation. A description of the system and performance characterization is reported.
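A back-of-envelope check of the figures quoted above, under the common CCD convention that dynamic range in dB is 20*log10(full well / read noise); the implied full-well value is an inference, not stated in the source:

```python
import math

read_noise_e = 14        # electrons, from the abstract
dynamic_range_db = 70    # dB, from the abstract

# Invert DR(dB) = 20*log10(full_well / read_noise) for the full-well capacity.
full_well = read_noise_e * 10 ** (dynamic_range_db / 20)
print(f"implied full-well capacity ~ {full_well:.0f} electrons")
```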

  7. The Spacelab Wide Angle Telescope (SWAT)

    NASA Technical Reports Server (NTRS)

    West, R. M.; Gull, T. R.; Henize, K. G.; Bertola, F.

    1979-01-01

    A fast wide angle telescope that will be capable of imaging to the darker sky limit and in the ultraviolet wavelength region available above the atmosphere is described. The telescope (SWAT) has a resolution comparable to that of the large ground-based Schmidt telescope and a field of at least five degrees. A number of astrophysically important investigations can only be accomplished with such a telescope, e.g., detection of hidden, hot objects like hot white dwarfs and subdwarfs in stellar binary systems, and energetic regions in globular clusters and galaxy nuclei. It permits unique studies of the UV-morphology of extended objects and allows discovery of very faint extensions, halos, jets, and filaments in galaxies. It can contribute to the investigation of dust in the Milky Way and in other galaxies and, with an objective prism, spectra of very faint objects can be obtained. The SWAT will localize objects for further study with the narrow-field Space Telescope.

  8. On the absolute calibration of SO2 cameras

    NASA Astrophysics Data System (ADS)

    Lübcke, P.; Bobrowski, N.; Illing, S.; Kern, C.; Alvarez Nieves, J. M.; Vogel, L.; Zielcke, J.; Delgado Granados, H.; Platt, U.

    2013-03-01

    Sulphur dioxide emission rate measurements are an important tool for volcanic monitoring and eruption risk assessment. The SO2 camera technique remotely measures volcanic emissions by analysing the ultraviolet absorption of SO2 in a narrow spectral window between 300 and 320 nm using solar radiation scattered in the atmosphere. The SO2 absorption is selectively detected by mounting band-pass interference filters in front of a two-dimensional, UV-sensitive CCD detector. One important step for correct SO2 emission rate measurements that can be compared with other measurement techniques is a correct calibration. This requires conversion from the measured optical density to the desired SO2 column density (CD). The conversion factor is most commonly determined by inserting quartz cells (cuvettes) with known amounts of SO2 into the light path. Another calibration method uses an additional narrow field-of-view Differential Optical Absorption Spectroscopy system (NFOV-DOAS), which measures the column density simultaneously in a small area of the camera's field-of-view. This procedure combines the very good spatial and temporal resolution of the SO2 camera technique with the more accurate column densities obtainable from DOAS measurements. This work investigates the uncertainty of results gained through the two commonly used, but quite different, calibration methods (DOAS and calibration cells). Measurements with three different instruments, an SO2 camera, a NFOV-DOAS system and an Imaging DOAS (I-DOAS), are presented. We compare the calibration-cell approach with the calibration from the NFOV-DOAS system. The respective results are compared with measurements from an I-DOAS to verify the calibration curve over the spatial extent of the image. The results show that calibration cells, while working fine in some cases, can lead to an overestimation of the SO2 CD by up to 60% compared with CDs from the DOAS measurements. 
Besides these errors of calibration, radiative transfer
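The NFOV-DOAS calibration described above amounts to a linear fit of DOAS column density against the camera's optical density in the co-located patch of the field of view, which is then applied to every pixel. The numbers below are hypothetical, for illustration only:

```python
import numpy as np

# Hypothetical calibration pairs: camera optical density tau in the DOAS
# patch, and the simultaneous SO2 column density from the NFOV-DOAS system.
tau = np.array([0.02, 0.05, 0.11, 0.16, 0.22])                # optical density
cd_doas = np.array([0.5e17, 1.3e17, 2.8e17, 4.1e17, 5.6e17])  # molec/cm^2

slope, intercept = np.polyfit(tau, cd_doas, 1)   # linear calibration curve

# Applying the calibration converts a whole camera frame to column densities.
image_tau = np.full((2, 2), 0.10)                # a toy camera image
cd_image = slope * image_tau + intercept
print(f"calibration slope: {slope:.3g} molec/cm^2 per unit optical density")
```

A calibration-cell approach would instead fix the slope from cuvettes of known SO2 content, which is exactly where the up-to-60% discrepancies discussed above can enter.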

  9. Variable angle correlation spectroscopy

    SciTech Connect

    Lee, Y K

    1994-05-01

    In this dissertation, a novel nuclear magnetic resonance (NMR) technique, variable angle correlation spectroscopy (VACSY) is described and demonstrated with {sup 13}C nuclei in rapidly rotating samples. These experiments focus on one of the basic problems in solid state NMR: how to extract the wealth of information contained in the anisotropic component of the NMR signal while still maintaining spectral resolution. Analysis of the anisotropic spectral patterns from poly-crystalline systems reveal information concerning molecular structure and dynamics, yet in all but the simplest of systems, the overlap of spectral patterns from chemically distinct sites renders the spectral analysis difficult if not impossible. One solution to this problem is to perform multi-dimensional experiments where the high-resolution, isotropic spectrum in one dimension is correlated with the anisotropic spectral patterns in the other dimensions. The VACSY technique incorporates the angle between the spinner axis and the static magnetic field as an experimental parameter that may be incremented during the course of the experiment to help correlate the isotropic and anisotropic components of the spectrum. The two-dimensional version of the VACSY experiments is used to extract the chemical shift anisotropy tensor values from multi-site organic molecules, study molecular dynamics in the intermediate time regime, and to examine the ordering properties of partially oriented samples. The VACSY technique is then extended to three-dimensional experiments to study slow molecular reorientations in a multi-site polymer system.

  10. Measures on mixing angles

    SciTech Connect

    Gibbons, Gary W.; Gielen, Steffen; Pope, C. N.; Turok, Neil

    2009-01-01

    We address the problem of the apparently very small magnitude of CP violation in the standard model, measured by the Jarlskog invariant J. In order to make statements about probabilities for certain values of J, we seek to find a natural measure on the space of Kobayashi-Maskawa matrices, the double quotient U(1){sup 2}/SU(3)/U(1){sup 2}. We review several possible, geometrically motivated choices of the measure, and compute expectation values for powers of J for these measures. We find that different choices of the measure generically make the observed magnitude of CP violation appear finely tuned. Since the quark masses and the mixing angles are determined by the same set of Yukawa couplings, we then do a second calculation in which we take the known quark mass hierarchy into account. We construct the simplest measure on the space of 3x3 Hermitian matrices which reproduces this known hierarchy. Calculating expectation values for powers of J in this second approach, we find that values of J close to the observed value are now rather likely, and there does not seem to be any fine-tuning. Our results suggest that the choice of Kobayashi-Maskawa angles is closely linked to the observed mass hierarchy. We close by discussing the corresponding case of neutrinos.
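The Jarlskog invariant discussed above is J = Im(V_us V_cb V_ub* V_cs*) for the CKM matrix. A sketch in the standard parametrization; the mixing angles and CP phase below are approximate textbook-style values used only to show the scale of J:

```python
import numpy as np

# Approximate standard-parametrization angles (radians) and CP phase.
s12, s23, s13 = np.sin(0.2275), np.sin(0.0415), np.sin(0.0036)
c12, c23, c13 = np.cos(0.2275), np.cos(0.0415), np.cos(0.0036)
d = np.exp(1j * 1.20)   # CP-violating phase delta

# CKM matrix in the standard parametrization.
V = np.array([
    [c12 * c13,                         s12 * c13,                        s13 * np.conj(d)],
    [-s12 * c23 - c12 * s23 * s13 * d,  c12 * c23 - s12 * s23 * s13 * d,  s23 * c13],
    [s12 * s23 - c12 * c23 * s13 * d,  -c12 * s23 - s12 * c23 * s13 * d,  c23 * c13],
])

J = float(np.imag(V[0, 1] * V[1, 2] * np.conj(V[0, 2]) * np.conj(V[1, 1])))
print(f"J ~ {J:.2e}")   # of order 3e-5: the 'apparently very small' value
```

The smallness of J despite an O(1) phase delta comes from the product of small mixing sines, which is the fine-tuning puzzle the abstract addresses.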

  11. Advanced camera image data acquisition system for Pi-of-the-Sky

    NASA Astrophysics Data System (ADS)

    Kwiatkowski, Maciej; Kasprowicz, Grzegorz; Pozniak, Krzysztof; Romaniuk, Ryszard; Wrochna, Grzegorz

    2008-11-01

    The paper describes a new generation of high performance, remote control, CCD cameras designed for astronomical applications. A completely new camera PCB was designed, manufactured, tested and commissioned. The CCD chip was positioned in a different way than previously resulting in better performance of the astronomical video data acquisition system. The camera was built using a low-noise, 4Mpixel CCD circuit by STA. The electronic circuit of the camera is highly parameterized and reconfigurable, as well as modular in comparison with the solution of first generation, due to application of open software solutions and FPGA circuit, Altera Cyclone EP1C6. New algorithms were implemented into the FPGA chip. There were used the following advanced electronic circuit in the camera system: microcontroller CY7C68013a (core 8051) by Cypress, image processor AD9826 by Analog Devices, GigEth interface RTL8169s by Realtec, memory SDRAM AT45DB642 by Atmel, CPU typr microprocessor ARM926EJ-S AT91SAM9260 by ARM and Atmel. Software solutions for the camera and its remote control, as well as image data acquisition are based only on the open source platform. There were used the following image interfaces ISI and API V4L2, data bus AMBA, AHB, INDI protocol. The camera will be replicated in 20 pieces and is designed for continuous on-line, wide angle observations of the sky in the research program Pi-of-the-Sky.

  12. Rover mast calibration, exact camera pointing, and camera handoff for visual target tracking

    NASA Technical Reports Server (NTRS)

    Kim, Won S.; Ansar, Adnan I.; Steele, Robert D.

    2005-01-01

    This paper presents three technical elements that we have developed to improve the accuracy of visual target tracking for single-sol approach-and-instrument placement in future Mars rover missions. An accurate, straightforward method of rover mast calibration is achieved by using a total station, a camera calibration target, and four prism targets mounted on the rover. The method was applied to the Rocky8 rover mast calibration and yielded a 1.1-pixel rms residual error. Camera pointing requires inverse kinematic solutions for mast pan and tilt angles such that the target image appears right at the center of the camera image. Two issues were raised: mast camera frames are in general not parallel to the masthead base frame, and the optical axis of the camera model in general does not pass through the center of the image. Despite these issues, we managed to derive non-iterative closed-form exact solutions, which were verified with Matlab routines. Actual camera pointing experiments over 50 random target image points yielded less than 1.3-pixel rms pointing error. Finally, a purely geometric method for camera handoff using stereo views of the target has been developed. Experimental test runs show less than 2.5 pixels error on the high-resolution Navcam for Pancam-to-Navcam handoff, and less than 4 pixels error on the lower-resolution Hazcam for Navcam-to-Hazcam handoff.
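For the idealized case the paper generalizes, pan and tilt that point a camera at a 3D target have a simple closed form. This sketch assumes the boresight passes through the pan/tilt intersection and the frames are aligned; the paper's actual solutions additionally handle non-parallel frames and off-center optical axes:

```python
import math

def pan_tilt(target):
    """Idealized pan/tilt (degrees) aiming the boresight at target (x, y, z)
    expressed in the masthead base frame."""
    x, y, z = target
    pan = math.atan2(y, x)                    # rotate about the vertical axis
    tilt = math.atan2(z, math.hypot(x, y))    # then elevate toward the target
    return math.degrees(pan), math.degrees(tilt)

print(pan_tilt((1.0, 1.0, -0.5)))   # target below the horizon, 45 deg to the left
```

The residual pointing error quoted above (under 1.3 pixels rms) comes precisely from modeling the departures from this ideal geometry.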

  13. Photometric Calibration and Image Stitching for a Large Field of View Multi-Camera System.

    PubMed

    Lu, Yu; Wang, Keyi; Fan, Gongshu

    2016-01-01

    A new compact large field of view (FOV) multi-camera system is introduced. The camera is based on seven tiny complementary metal-oxide-semiconductor sensor modules covering over 160° × 160° FOV. Although image stitching has been studied extensively, sensor and lens differences have not been considered in previous multi-camera devices. In this study, we have calibrated the photometric characteristics of the multi-camera device. Lenses were not mounted on the sensor in the process of radiometric response calibration to eliminate the influence of the focusing effect of uniform light from an integrating sphere. Linearity range of the radiometric response, non-linearity response characteristics, sensitivity, and dark current of the camera response function are presented. The R, G, and B channels have different responses for the same illuminance. Vignetting artifact patterns have been tested. The actual luminance of the object is retrieved from the sensor calibration results and is used when blending images, so that panoramas reflect the objective luminance more faithfully. This compensates for the limitation of stitching approaches that make images look realistic only through smoothing. The dynamic range limitation of a single image sensor with a wide-angle lens can be resolved by using multiple cameras that cover a large field of view. The dynamic range is expanded by 48-fold in this system. We can obtain seven images in one shot with this multi-camera system, at 13 frames per second. PMID:27077857
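Vignetting patterns like those characterized above are typically compensated by flat-field division: each raw frame is divided by a normalized white-field image so the fall-off toward the lens edges is removed. A minimal sketch with synthetic arrays (the actual correction would use the measured patterns):

```python
import numpy as np

# Synthetic flat field: brightness falls off quadratically toward the edges,
# mimicking vignetting; peak sensitivity is at the image center (x = 50).
flat = np.fromfunction(lambda y, x: 1.0 - 0.5 * ((x - 50) / 50.0) ** 2, (100, 101))

raw = 0.8 * flat                        # a uniform scene, dimmed by vignetting
corrected = raw / (flat / flat.max())   # normalize flat to its peak, then divide
print(corrected.min(), corrected.max()) # uniform ~0.8 after correction
```

The same per-pixel gain map is what lets overlapping frames from the seven sensors be blended without visible seams at the panorama boundaries.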

  14. Effects of red light cameras on violations and crashes: a review of the international literature.

    PubMed

    Retting, Richard A; Ferguson, Susan A; Hakkert, A Shalom

    2003-03-01

    Red light running is a frequent cause of motor vehicle crashes and injuries. A primary countermeasure for red light running crashes is police traffic enforcement. In recent years, many police agencies have begun using automated red light cameras as a supplement to conventional enforcement methods. The present study reviewed and evaluated available evidence in the international literature regarding the effectiveness of cameras to reduce both red light violations and crashes. Camera enforcement generally reduces violations by an estimated 40-50%. In terms of crash effects, most studies contain methodological flaws that, to varying degrees, either overestimate (failure to adjust for regression to the mean) or underestimate (comparison with nearby signalized intersections affected by cameras) crash effects. Mindful of these limitations, the research generally indicates that camera enforcement can significantly reduce injury crashes at signalized intersections, in particular right-angle injury crashes. Most studies reported increases in rear-end crashes following camera installation. Taken together the studies indicate that, overall, injury crashes, including rear-end collisions, were reduced by 25-30% as a result of camera enforcement. PMID:14522657

  15. Photometric Calibration and Image Stitching for a Large Field of View Multi-Camera System

    PubMed Central

    Lu, Yu; Wang, Keyi; Fan, Gongshu

    2016-01-01

    A new compact large field of view (FOV) multi-camera system is introduced. The camera is based on seven tiny complementary metal-oxide-semiconductor sensor modules covering over 160° × 160° FOV. Although image stitching has been studied extensively, sensor and lens differences have not been considered in previous multi-camera devices. In this study, we have calibrated the photometric characteristics of the multi-camera device. Lenses were not mounted on the sensor in the process of radiometric response calibration to eliminate the influence of the focusing effect of uniform light from an integrating sphere. Linearity range of the radiometric response, non-linearity response characteristics, sensitivity, and dark current of the camera response function are presented. The R, G, and B channels have different responses for the same illuminance. Vignetting artifact patterns have been tested. The actual luminance of the object is retrieved from the sensor calibration results and is used when blending images, so that panoramas reflect the objective luminance more faithfully. This compensates for the limitation of stitching approaches that make images look realistic only through smoothing. The dynamic range limitation of a single image sensor with a wide-angle lens can be resolved by using multiple cameras that cover a large field of view. The dynamic range is expanded by 48-fold in this system. We can obtain seven images in one shot with this multi-camera system, at 13 frames per second. PMID:27077857

  16. On the absolute calibration of SO2 cameras

    NASA Astrophysics Data System (ADS)

    Lübcke, P.; Bobrowski, N.; Illing, S.; Kern, C.; Alvarez Nieves, J. M.; Vogel, L.; Zielcke, J.; Delgado Granados, H.; Platt, U.

    2012-09-01

    Sulphur dioxide emission flux measurements are an important tool for volcanic monitoring and eruption risk assessment. The SO2 camera technique remotely measures volcanic emissions by analysing the ultraviolet absorption of SO2 in a narrow spectral window between 305 nm and 320 nm using solar radiation scattered in the atmosphere. The SO2 absorption is selectively detected by mounting band-pass interference filters in front of a two-dimensional, UV-sensitive CCD detector. While this approach is simple and delivers valuable insights into the two-dimensional SO2 distribution, absolute calibration has proven to be difficult. An accurate calibration of the SO2 camera (i.e., conversion from optical density to SO2 column density, CD) is crucial to obtain correct SO2 CDs and flux measurements that are comparable to other measurement techniques and can be used for volcanological applications. The most common approach for calibrating SO2 camera measurements is based on inserting quartz cells (cuvettes) containing known amounts of SO2 into the light path. It has been found, however, that reflections from the windows of the calibration cell can considerably affect the signal measured by the camera. Another possibility for calibration relies on performing simultaneous measurements in a small area of the camera's field-of-view (FOV) by a narrow-field-of-view Differential Optical Absorption Spectroscopy (NFOV-DOAS) system. This procedure combines the very good spatial and temporal resolution of the SO2 camera technique with the more accurate column densities obtainable from DOAS measurements. This work investigates the uncertainty of results gained through the two commonly used, but quite different calibration methods (DOAS and calibration cells). Measurements with three different instruments, an SO2 camera, a NFOV-DOAS system and an Imaging DOAS (IDOAS), are presented. We compare the calibration-cell approach with the calibration from the NFOV-DOAS system. The respective

  17. NARROW-K-BAND OBSERVATIONS OF THE GJ 1214 SYSTEM

    SciTech Connect

    Colón, Knicole D.; Gaidos, Eric

    2013-10-10

    GJ 1214 is a nearby M dwarf star that hosts a transiting super-Earth-size planet, making this system an excellent target for atmospheric studies. Most studies find that the transmission spectrum of GJ 1214b is flat, which favors either a high mean molecular weight or cloudy/hazy hydrogen (H) rich atmosphere model. Photometry at short wavelengths (<0.7 μm) and in the K band can discriminate the most between these different atmosphere models for GJ 1214b, but current observations do not have sufficiently high precision. We present photometry of seven transits of GJ 1214b through a narrow K-band (2.141 μm) filter with the Wide Field Camera on the 3.8 m United Kingdom Infrared Telescope. Our photometric precision is typically 1.7 × 10⁻³ (for a single transit), comparable with other ground-based observations of GJ 1214b. We measure a planet-star radius ratio of 0.1158 ± 0.0013, which, along with other studies, also supports a flat transmission spectrum for GJ 1214b. Since this does not exclude a scenario where GJ 1214b has an H-rich envelope with heavy elements that are sequestered below a cloud/haze layer, we compare K-band observations with models of H₂ collision-induced absorption in an atmosphere for a range of temperatures. While we find no evidence for deviation from a flat spectrum (slope s = 0.0016 ± 0.0038), an H₂-dominated upper atmosphere (<60 mbar) cannot be excluded. More precise observations at <0.7 μm and in the K band, as well as a uniform analysis of all published data, would be useful for establishing more robust limits on atmosphere models for GJ 1214b.
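
    The quoted radius ratio translates directly into a transit depth, since the fractional flux drop is (Rp/Rs)². A quick sketch using the measured value from the abstract, with first-order error propagation:

    ```python
    # Transit depth from the planet-star radius ratio: depth = (Rp/Rs)^2.
    ratio = 0.1158       # measured planet-star radius ratio
    ratio_err = 0.0013   # quoted 1-sigma uncertainty

    depth = ratio**2                   # fractional flux drop, about 1.34%
    depth_err = 2 * ratio * ratio_err  # first-order error propagation

    print(f"depth = {depth:.5f} +/- {depth_err:.5f}")
    ```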

  18. A Note on Angle Construction

    ERIC Educational Resources Information Center

    Francis, Richard L.

    1978-01-01

    The author investigates the construction of angles (using Euclidean tools) through a numerical approach. He calls attention to the surprising impossibility of constructing the conventional units of angle measure--the degree, minute, second, radian, and mil. (MN)
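
    The impossibility result for the degree can be sketched with a standard field-theory argument (not taken from the article itself): an angle θ is constructible with Euclidean tools exactly when cos θ lies in a tower of quadratic extensions of the rationals. A constructible 1° angle could be copied twenty times to produce 20°, but the triple-angle identity rules 20° out:

    ```latex
    % cos(3\theta) = 4\cos^3\theta - 3\cos\theta, evaluated at \theta = 20^\circ:
    \cos 60^\circ = \tfrac{1}{2} = 4\cos^3 20^\circ - 3\cos 20^\circ
    \quad\Longrightarrow\quad
    8x^3 - 6x - 1 = 0, \qquad x = \cos 20^\circ .
    ```

    This cubic has no rational roots and is therefore irreducible over the rationals, so [Q(cos 20°) : Q] = 3, which is not a power of 2. Hence 20°, and with it the degree, is not constructible.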

  19. Optical Enhancement of Exoskeleton-Based Estimation of Glenohumeral Angles

    PubMed Central

    Cortés, Camilo; Unzueta, Luis; de los Reyes-Guzmán, Ana; Ruiz, Oscar E.; Flórez, Julián

    2016-01-01

    In Robot-Assisted Rehabilitation (RAR) the accurate estimation of the patient limb joint angles is critical for assessing therapy efficacy. In RAR, the use of classic motion capture systems (MOCAPs) (e.g., optical and electromagnetic) to estimate the Glenohumeral (GH) joint angles is hindered by the exoskeleton body, which causes occlusions and magnetic disturbances. Moreover, the exoskeleton posture does not accurately reflect limb posture, as their kinematic models differ. To address these limitations in posture estimation, we propose installing the cameras of an optical marker-based MOCAP in the rehabilitation exoskeleton. Then, the GH joint angles are estimated by combining the estimated marker poses and exoskeleton Forward Kinematics. Such a hybrid system prevents problems related to marker occlusions, reduced camera detection volume, and imprecise joint angle estimation due to the kinematic mismatch of the patient and exoskeleton models. This paper presents the formulation, simulation, and accuracy quantification of the proposed method with simulated human movements. In addition, a sensitivity analysis of the method accuracy to marker position estimation errors, due to system calibration errors and marker drifts, has been carried out. The results show that, even with significant errors in the marker position estimation, method accuracy is adequate for RAR. PMID:27403044
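
    The core step of combining marker poses with exoskeleton Forward Kinematics is a chain of homogeneous transforms. A minimal sketch with invented poses: `T_world_cam` stands in for the camera pose delivered by the exoskeleton FK (the camera being rigidly mounted on the exoskeleton), and `T_cam_marker` for the marker pose the camera measures; for simplicity a single GH rotation about z is then read off the combined rotation.

    ```python
    import numpy as np

    def transform(R, t):
        """Build a 4x4 homogeneous transform from rotation R and translation t."""
        T = np.eye(4)
        T[:3, :3] = R
        T[:3, 3] = t
        return T

    def rot_z(a):
        """Rotation matrix about the z axis by angle a (radians)."""
        c, s = np.cos(a), np.sin(a)
        return np.array([[c, -s, 0.0],
                         [s,  c, 0.0],
                         [0.0, 0.0, 1.0]])

    # Invented example poses (real values would come from FK and the camera).
    T_world_cam = transform(rot_z(np.deg2rad(30.0)), np.array([0.1, 0.0, 0.5]))
    T_cam_marker = transform(rot_z(np.deg2rad(15.0)), np.array([0.0, 0.2, 0.3]))

    # Chain the transforms to express the marker (limb) pose in the world frame.
    T_world_marker = T_world_cam @ T_cam_marker

    # With both rotations about z, the combined GH angle is their sum (45 deg).
    angle = np.arctan2(T_world_marker[1, 0], T_world_marker[0, 0])
    ```

    A full implementation would extract all three GH angles from the 3-D rotation (e.g., via an Euler-angle or swing-twist decomposition) rather than a single planar angle.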
