Science.gov

Sample records for camera narrow angle

  1. Thermal Design of the Cassini Narrow Angle Camera

    NASA Technical Reports Server (NTRS)

    Hoffman, Pamela

    1994-01-01

    The Narrow Angle Camera (NAC) is one of two cameras in the Imaging Science Subsystem (ISS) on the Cassini spacecraft; the second is a Voyager-inherited Wide Angle Camera (WAC). Cassini is currently planned to launch in October 1997 and to arrive at Saturn in June 2004 for a four-year tour. The narrow angle optics are of Ritchey-Chretien type with a focal length of 2000 mm, a relative aperture of f/10.5, a spectral range of 200 to 1100 nm, 24 filters, a pixel field of view of 6.0 microradian/pixel, and a full field of view of 0.35 x 0.35 degrees. The sensor is a charge-coupled device (CCD) with 1024 x 1024 pixels of 12 x 12 um, a full well greater than 50,000 e-, on-chip processing of up to 800,000 e- pixel summation, a dark current of less than 0.1 e-/pixel/sec at operating temperature, and a charge transfer efficiency of 0.99999 at operating temperature...

  2. Flight Calibration of the LROC Narrow Angle Camera

    NASA Astrophysics Data System (ADS)

    Humm, D. C.; Tschimmel, M.; Brylow, S. M.; Mahanti, P.; Tran, T. N.; Braden, S. E.; Wiseman, S.; Danton, J.; Eliason, E. M.; Robinson, M. S.

    2015-09-01

    Characterization and calibration are vital for instrument commanding and image interpretation in remote sensing. The Lunar Reconnaissance Orbiter Camera Narrow Angle Camera (LROC NAC) takes 500 Mpixel greyscale images of lunar scenes at 0.5 meters/pixel. It uses two nominally identical line scan cameras for a larger crosstrack field of view. Stray light, spatial crosstalk, and nonlinearity were characterized using flight images of the Earth and the lunar limb. These are important for imaging shadowed craters, studying ˜1 meter size objects, and photometry respectively. Background, nonlinearity, and flatfield corrections have been implemented in the calibration pipeline. An eight-column pattern in the background is corrected. The detector is linear for DN = 600-2000 but a signal-dependent additive correction is required and applied for DN<600. A predictive model of detector temperature and dark level was developed to command dark level offset. This avoids images with a cutoff at DN=0 and minimizes quantization error in companding. Absolute radiometric calibration is derived from comparison of NAC images with ground-based images taken with the Robotic Lunar Observatory (ROLO) at much lower spatial resolution but with the same photometric angles.
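
    The interplay between the commanded dark offset and companding can be illustrated with a toy square-root encoding. The square-root law and the 60 DN offset below are placeholders for illustration only, not the actual LROC NAC companding tables or flight offsets.

        import numpy as np

        def compand_sqrt(dn12):
            """Map 12-bit DN (0..4095) to 8-bit (0..255) with a square-root law."""
            dn12 = np.clip(np.asarray(dn12, dtype=float), 0, 4095)
            return np.round(255.0 * np.sqrt(dn12 / 4095.0)).astype(np.uint8)

        def expand_sqrt(dn8):
            """Approximate ground-side inverse of compand_sqrt."""
            return (np.asarray(dn8, dtype=float) / 255.0) ** 2 * 4095.0

        # A commanded dark-level offset keeps the dark floor at a positive DN,
        # so the encoder never truncates the low tail of the signal at DN = 0.
        signal = np.array([5.0, 80.0, 600.0, 3000.0]) + 60.0   # 60 DN offset (example)
        residual = expand_sqrt(compand_sqrt(signal)) - signal
        print(residual)   # quantization error stays small at low DN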

  3. Flight Calibration of the LROC Narrow Angle Camera

    NASA Astrophysics Data System (ADS)

    Humm, D. C.; Tschimmel, M.; Brylow, S. M.; Mahanti, P.; Tran, T. N.; Braden, S. E.; Wiseman, S.; Danton, J.; Eliason, E. M.; Robinson, M. S.

    2016-04-01

    Characterization and calibration are vital for instrument commanding and image interpretation in remote sensing. The Lunar Reconnaissance Orbiter Camera Narrow Angle Camera (LROC NAC) takes 500 Mpixel greyscale images of lunar scenes at 0.5 meters/pixel. It uses two nominally identical line scan cameras for a larger crosstrack field of view. Stray light, spatial crosstalk, and nonlinearity were characterized using flight images of the Earth and the lunar limb. These are important for imaging shadowed craters, studying ˜1 meter size objects, and photometry respectively. Background, nonlinearity, and flatfield corrections have been implemented in the calibration pipeline. An eight-column pattern in the background is corrected. The detector is linear for DN = 600--2000 but a signal-dependent additive correction is required and applied for DN<600. A predictive model of detector temperature and dark level was developed to command dark level offset. This avoids images with a cutoff at DN=0 and minimizes quantization error in companding. Absolute radiometric calibration is derived from comparison of NAC images with ground-based images taken with the Robotic Lunar Observatory (ROLO) at much lower spatial resolution but with the same photometric angles.

  4. Lunar Reconnaissance Orbiter Camera Narrow Angle Cameras: Laboratory and Initial Flight Calibration

    NASA Astrophysics Data System (ADS)

    Humm, D. C.; Tschimmel, M.; Denevi, B. W.; Lawrence, S.; Mahanti, P.; Tran, T. N.; Thomas, P. C.; Eliason, E.; Robinson, M. S.

    2009-12-01

    The Lunar Reconnaissance Orbiter Camera (LROC) has two identical Narrow Angle Cameras (NACs). Each NAC is a monochrome pushbroom scanner, providing images with a pixel scale of 50 cm from a 50-km orbit. A single NAC image has a swath width of 2.5 km and a length of up to 26 km. The NACs are mounted to acquire side-by-side imaging for a combined swath width of 5 km. The NAC is designed to fully characterize future human and robotic landing sites in terms of scientific and resource merit, trafficability, and hazards. The North and South poles will be mapped at 1-meter-scale poleward of 85.5 degrees latitude. Stereo coverage is achieved by pointing the NACs off-nadir, which requires planning in advance. Read noise is 91 and 93 e- and the full well capacity is 334,000 and 352,000 e- for NAC-L and NAC-R respectively. Signal-to-noise ranges from 42 for low-reflectance material with 70 degree illumination to 230 for high-reflectance material with 0 degree illumination. Longer exposure times and 2x binning are available to further increase signal-to-noise with loss of spatial resolution. Lossy data compression from 12 bits to 8 bits uses a companding table selected from a set optimized for different signal levels. A model of focal plane temperatures based on flight data is used to command dark levels for individual images, optimizing the performance of the companding tables and providing good matching of the NAC-L and NAC-R images even before calibration. The preliminary NAC calibration pipeline includes a correction for nonlinearity at low signal levels with an offset applied for DN>600 and a logistic function for DN<600. Flight images taken on the limb of the Moon provide a measure of stray light performance. Averages over many lines of images provide a measure of flat field performance in flight. These are comparable with laboratory data taken with a diffusely reflecting uniform panel.
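
    A hedged sketch of the kind of low-signal nonlinearity correction described above: the two regimes (constant offset above roughly DN 600, logistic roll-off below) are unified here into one smooth curve for brevity, and the coefficients a, b, c are illustrative placeholders, not the flight calibration values.

        import numpy as np

        def linearize(dn, a=8.0, b=0.02, c=300.0):
            """Subtract an additive correction that tends to a constant offset (a)
            at high DN and rolls off logistically at low DN (illustrative values)."""
            dn = np.asarray(dn, dtype=float)
            correction = a / (1.0 + np.exp(-b * (dn - c)))
            return dn - correction

        print(linearize([100, 400, 600, 1500]))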

  5. Two Years of Digital Terrain Model Production Using the Lunar Reconnaissance Orbiter Narrow Angle Camera

    NASA Astrophysics Data System (ADS)

    Burns, K.; Robinson, M. S.; Speyerer, E.; LROC Science Team

    2011-12-01

    One of the primary objectives of the Lunar Reconnaissance Orbiter Camera (LROC) is to gather stereo observations with the Narrow Angle Camera (NAC). These stereo observations are used to generate digital terrain models (DTMs). The NAC has a pixel scale of 0.5 to 2.0 meters but was not designed for stereo observations and thus requires the spacecraft to roll off-nadir to acquire these images. Slews interfere with the data collection of the other instruments, so opportunities are currently limited to four per day. Arizona State University has produced DTMs from 95 stereo pairs for 11 Constellation Project (CxP) sites (Aristarchus, Copernicus crater, Gruithuisen domes, Hortensius domes, Ina D-caldera, Lichtenberg crater, Mare Ingenii, Marius hills, Reiner Gamma, South Pole-Aitken Rim, Sulpicius Gallus) as well as 30 other regions of scientific interest (including: Bhabha crater, highest and lowest elevation points, Highland Ponds, Kugler Anuchin, Linne Crater, Planck Crater, Slipher crater, Sears Crater, Mandel'shtam Crater, Virtanen Graben, Compton/Belkovich, Rumker Domes, King Crater, Luna 16/20/23/24 landing sites, Ranger 6 landing site, Wiener F Crater, Apollo 11/14/15/17, fresh craters, impact melt flows, Larmor Q crater, Mare Tranquillitatis pit, Hansteen Alpha, Moore F Crater, and Lassell Massif). To generate DTMs, the USGS ISIS software and SOCET SET from BAE Systems are used. To increase the absolute accuracy of the DTMs, data obtained from the Lunar Orbiter Laser Altimeter (LOLA) are used to coregister the NAC images and define the geodetic reference frame. NAC DTMs have been used in the examination of several sites, e.g. Compton-Belkovich, Marius Hills and Ina D-caldera [1-3]. LROC will continue to acquire high-resolution stereo images throughout the science phase of the mission and any extended mission opportunities, thus providing a vital dataset for scientific research as well as future human and robotic exploration. [1] B.L. Jolliff (2011) Nature Geoscience, in press. [2] Lawrence et al. (2011) LPSC XLII, Abst 2228. [3] Garry et al. (2011) LPSC XLII, Abst 2605.

  6. Narrow Angle movie

    NASA Technical Reports Server (NTRS)

    1999-01-01

    This brief three-frame movie of the Moon was made from three Cassini narrow-angle images as the spacecraft passed by the Moon on the way to its closest approach to Earth on August 17, 1999. The purpose of this particular set of images was to calibrate the spectral response of the narrow-angle camera and to test its 'on-chip summing mode' data compression technique in flight. From left to right, they show the Moon in the green, blue and ultraviolet regions of the spectrum in 40, 60 and 80 millisecond exposures, respectively. All three images have been scaled so that the brightness of Crisium basin, the dark circular region in the upper right, is the same in each image. The spatial scale in the blue and ultraviolet images is 1.4 miles per pixel (2.3 kilometers). The original scale in the green image (which was captured in the usual manner and then reduced in size by 2x2 pixel summing within the camera system) was 2.8 miles per pixel (4.6 kilometers). It has been enlarged for display to the same scale as the other two. The imaging data were processed and released by the Cassini Imaging Central Laboratory for Operations (CICLOPS) at the University of Arizona's Lunar and Planetary Laboratory, Tucson, AZ.

    Photo Credit: NASA/JPL/Cassini Imaging Team/University of Arizona

    Cassini, launched in 1997, is a joint mission of NASA, the European Space Agency and the Italian Space Agency. The mission is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Space Science, Washington DC. JPL is a division of the California Institute of Technology, Pasadena, CA.

  7. NFC - Narrow Field Camera

    NASA Astrophysics Data System (ADS)

    Koukal, J.; Srba, J.; Gorková, S.

    2015-01-01

    We introduce a low-cost CCTV video system for faint meteor monitoring and describe the first results from 5 months of two-station operation. The system, called NFC (Narrow Field Camera), has a meteor limiting magnitude of around +6.5 mag and allows research on the trajectories of less massive meteoroids within individual parent meteor showers and the sporadic background. At present, 4 stations (2 pairs with coordinated fields of view) of the NFC system are operated in the frame of CEMeNt (Central European Meteor Network). The heart of each NFC station is a sensitive Watec 902 H2 CCTV camera with a fast Meopta Meostigmat 1/50 - 52.5 mm cinematographic lens (50 mm focal length, fixed aperture f/1.0). In this paper we present the first results based on 1595 individual meteors, 368 of which were recorded from two stations simultaneously. This data set allows the first empirical verification of theoretical assumptions about NFC system capabilities (stellar and meteor magnitude limits, meteor apparent brightness distribution, and accuracy of single-station measurements) and the first low-mass meteoroid trajectory calculations. The experimental data clearly demonstrate the capability of the proposed system for low-mass meteor registration and show that calculations based on NFC data lead to a significant refinement of the orbital elements of low-mass meteoroids.

  8. High-resolution topomapping of candidate MER landing sites with Mars Orbiter Camera narrow-angle images

    USGS Publications Warehouse

    Kirk, R.L.; Howington-Kraus, E.; Redding, B.; Galuszka, D.; Hare, T.M.; Archinal, B.A.; Soderblom, L.A.; Barrett, J.M.

    2003-01-01

    We analyzed narrow-angle Mars Orbiter Camera (MOC-NA) images to produce high-resolution digital elevation models (DEMs) in order to provide topographic and slope information needed to assess the safety of candidate landing sites for the Mars Exploration Rovers (MER) and to assess the accuracy of our results by a variety of tests. The mapping techniques developed also support geoscientific studies and can be used with all present and planned Mars-orbiting scanner cameras. Photogrammetric analysis of MOC stereopairs yields DEMs with 3-pixel (typically 10 m) horizontal resolution, vertical precision consistent with ±0.22 pixel matching errors (typically a few meters), and slope errors of 1-3°. These DEMs are controlled to the Mars Orbiter Laser Altimeter (MOLA) global data set and consistent with it at the limits of resolution. Photoclinometry yields DEMs with single-pixel (typically ~3 m) horizontal resolution and submeter vertical precision. Where the surface albedo is uniform, the dominant error is 10-20% relative uncertainty in the amplitude of topography and slopes after "calibrating" photoclinometry against a stereo DEM to account for the influence of atmospheric haze. We mapped portions of seven candidate MER sites and the Mars Pathfinder site. Safety of the final four sites (Elysium, Gusev, Isidis, and Meridiani) was assessed by mission engineers by simulating landings on our DEMs of "hazard units" mapped in the sites, with results weighted by the probability of landing on those units; summary slope statistics show that most hazard units are smooth, with only small areas of etched terrain in Gusev crater posing a slope hazard.
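
    The quoted vertical precision can be related to the matching error with the usual rule of thumb for stereo DEMs, sketched below. The parallax-to-height ratio used here is an assumed example value, not that of a particular MOC stereopair.

        def stereo_vertical_precision(matching_error_px, gsd_m, parallax_height_ratio):
            """Expected vertical precision ~ matching error (px) * GSD / (p/h)."""
            return matching_error_px * gsd_m / parallax_height_ratio

        # 0.22-pixel matching error, 3 m/pixel images, assumed p/h = 0.2
        print(stereo_vertical_precision(0.22, 3.0, 0.2))   # -> 3.3 m, i.e. "a few meters"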

  9. Wide angle pinhole camera

    NASA Technical Reports Server (NTRS)

    Franke, J. M.

    1978-01-01

    Hemispherical refracting element gives pinhole camera 180 degree field-of-view without compromising its simplicity and depth-of-field. Refracting element, located just behind pinhole, bends light coming in from sides so that it falls within image area of film. In contrast to earlier pinhole cameras that used water or other transparent fluids to widen field, this model is not subject to leakage and is easily loaded and unloaded with film. Moreover, by selecting glass with different indices of refraction, field at film plane can be widened or reduced.

  10. Photogrammetric measurement of 3D freeform millimetre-sized objects with micro features: an experimental validation of the close-range camera calibration model for narrow angles of view

    NASA Astrophysics Data System (ADS)

    Percoco, Gianluca; Sánchez Salmerón, Antonio J.

    2015-09-01

    The measurement of millimetre and micro-scale features is usually performed by high-cost systems based on technologies with narrow working ranges that accurately control the position of the sensors. Photogrammetry would lower the costs of 3D inspection of micro-features and would also be applicable to the inspection of non-removable micro parts of large objects. Unfortunately, the behaviour of photogrammetry when applied to micro-features is not well known. In this paper, the authors address these issues towards the application of digital close-range photogrammetry (DCRP) at the micro-scale, taking into account that research papers in the literature state that an angle of view (AOV) of around 10° is the lower limit for application of the traditional pinhole close-range calibration model (CRCM), which is the basis of DCRP. First, a general calibration procedure is introduced, with the aid of an open-source software library, to calibrate narrow-AOV cameras with the CRCM. Subsequently the procedure is validated using a reflex camera with a 60 mm macro lens, equipped with extension tubes (20 and 32 mm) achieving magnification of up to approximately 2x, to verify the literature findings with experimental photogrammetric 3D measurements of millimetre-sized objects with micro-features. The limitation of the laser printing technology used to produce the two-dimensional pattern on common paper has been overcome using an accurate pattern manufactured with a photolithographic process. The results of the experimental activity prove that the CRCM is valid for AOVs down to 3.4° and that DCRP results are comparable with those of existing, more expensive commercial techniques.
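
    A thin-lens sketch of why the angle of view shrinks as extension tubes raise the magnification, which is the regime tested above. The sensor width, focal length and magnifications are illustrative assumptions, and real macro lenses deviate from the thin-lens model (pupil positions, focus breathing).

        import math

        def angle_of_view_deg(sensor_width_mm, focal_mm, magnification):
            """Horizontal AOV for a thin lens focused at the given magnification."""
            image_distance_mm = focal_mm * (1.0 + magnification)
            return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * image_distance_mm)))

        print(angle_of_view_deg(23.6, 60.0, 0.0))   # ~22 deg focused at infinity
        print(angle_of_view_deg(23.6, 60.0, 2.0))   # ~7.5 deg at 2x magnification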

  11. Argon laser trabeculoplasty in narrow angle glaucoma.

    PubMed

    Wishart, P K; Nagasubramanian, S; Hitchings, R A

    1987-01-01

    A prospective trial of argon laser trabeculoplasty (ALT) in narrow angle glaucoma (NAG) was undertaken. In eyes with NAG the mechanism of the glaucoma could be a combination of pupil block with subsequent irido-trabecular adhesion and trabecular damage with an increase in outflow resistance. To achieve relief of pupil block, eyes were randomly assigned to treatment with short pulsed laser iridotomy (LI) with the YAG or Dye lasers, or surgical peripheral iridectomy (PI). Alternatively, argon laser iridoplasty (IP) was performed to widen the anterior chamber angle sufficiently to permit ALT. Fifty-two eyes were treated and follow-up was from 12 to 22 months. A high rate of failure to control IOP with topical medication and progression of visual field loss occurred in all treatment groups. Iridoplasty followed by ALT was particularly unsuccessful as, in 50 per cent of cases, progressive synechial closure of the anterior chamber angle occurred following treatment. In eyes treated with PI/LI and ALT, the IOP control was improved in 12 per cent, unchanged in 30 per cent and remained uncontrolled in 58 per cent. By 15 months follow-up, a satisfactory outcome (IOP less than 21 mmHg on topical medication, visual field and acuity stable) was obtained in 24 per cent of the 33 eyes treated with PI/LI and ALT. Thirty-one of these eyes showed visual field loss. Of the 10 eyes that did not receive ALT following PI or LI, 90 per cent had a satisfactory outcome. Eight of these eyes showed little or no visual field loss. The authors conclude that iridoplasty followed by ALT is an unsuitable treatment for eyes with NAG. We further conclude that ALT is unlikely to be of benefit in eyes with NAG and visual field loss, even after pupil block has been relieved. Relief of pupil block alone may help eyes with early NAG without visual field loss. PMID:3446535

  12. Inflight Calibration of the Lunar Reconnaissance Orbiter Camera Wide Angle Camera

    NASA Astrophysics Data System (ADS)

    Mahanti, P.; Humm, D. C.; Robinson, M. S.; Boyd, A. K.; Stelling, R.; Sato, H.; Denevi, B. W.; Braden, S. E.; Bowman-Cisneros, E.; Brylow, S. M.; Tschimmel, M.

    2016-04-01

    The Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) has acquired more than 250,000 images of the illuminated lunar surface and over 190,000 observations of space and non-illuminated Moon since 1 January 2010. These images, along with images from the Narrow Angle Camera (NAC) and other Lunar Reconnaissance Orbiter instrument datasets are enabling new discoveries about the morphology, composition, and geologic/geochemical evolution of the Moon. Characterizing the inflight WAC system performance is crucial to scientific and exploration results. Pre-launch calibration of the WAC provided a baseline characterization that was critical for early targeting and analysis. Here we present an analysis of WAC performance from the inflight data. In the course of our analysis we compare and contrast with the pre-launch performance wherever possible and quantify the uncertainty related to various components of the calibration process. We document the absolute and relative radiometric calibration, point spread function, and scattered light sources and provide estimates of sources of uncertainty for spectral reflectance measurements of the Moon across a range of imaging conditions.
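
    A generic sketch of the radiometric calibration chain whose components (background subtraction, flat-field division, absolute scaling) are characterized above. The arrays and coefficients are placeholders for illustration, not LROC WAC calibration values.

        import numpy as np

        def calibrate(dn, exposure_s, dark, flat, abs_gain):
            """((DN - dark) / exposure) / flat * K, a generic radiometric chain."""
            rate = (np.asarray(dn, dtype=float) - dark) / exposure_s
            return abs_gain * rate / flat

        dn   = np.array([[820.0, 900.0], [760.0, 845.0]])
        flat = np.array([[0.98, 1.02], [0.95, 1.01]])
        print(calibrate(dn, exposure_s=0.04, dark=60.0, flat=flat, abs_gain=1.0e-3))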

  13. Inflight Calibration of the Lunar Reconnaissance Orbiter Camera Wide Angle Camera

    NASA Astrophysics Data System (ADS)

    Mahanti, P.; Humm, D. C.; Robinson, M. S.; Boyd, A. K.; Stelling, R.; Sato, H.; Denevi, B. W.; Braden, S. E.; Bowman-Cisneros, E.; Brylow, S. M.; Tschimmel, M.

    2015-09-01

    The Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) has acquired more than 250,000 images of the illuminated lunar surface and over 190,000 observations of space and non-illuminated Moon since 1 January 2010. These images, along with images from the Narrow Angle Camera (NAC) and other Lunar Reconnaissance Orbiter instrument datasets are enabling new discoveries about the morphology, composition, and geologic/geochemical evolution of the Moon. Characterizing the inflight WAC system performance is crucial to scientific and exploration results. Pre-launch calibration of the WAC provided a baseline characterization that was critical for early targeting and analysis. Here we present an analysis of WAC performance from the inflight data. In the course of our analysis we compare and contrast with the pre-launch performance wherever possible and quantify the uncertainty related to various components of the calibration process. We document the absolute and relative radiometric calibration, point spread function, and scattered light sources and provide estimates of sources of uncertainty for spectral reflectance measurements of the Moon across a range of imaging conditions.

  14. SCDU (Spectral Calibration Development Unit) Testbed Narrow Angle Astrometric Performance

    NASA Technical Reports Server (NTRS)

    Wang, Xu; Goullioud, Renaud; Nemati, Bijan; Shao, Michael; Wehmeier, Udo J.; Weilert, Mark A.; Werne, Thomas A.; Wu, Janet P.; Zhai, Chengxing

    2010-01-01

    The most stringent astrometric performance requirements on NASA's SIM(Space Interferometer Mission)-Lite mission will come from the so-called Narrow-Angle (NA) observing scenario, aimed at finding Earth-like exoplanets, where the interferometer chops between the target star and several nearby reference stars multiple times over the course of a single visit. Previously, about 20 pm NA error with various shifts was reported. Since then, investigation has been under way to understand the mechanisms that give rise to these shifts. In this paper we report our findings, the adopted mitigation strategies, and the resulting testbed performance.

  15. WIDE-ANGLE, NARROW-ANGLE, AND IMAGING BASELINES OF OPTICAL LONG-BASELINE INTERFEROMETERS

    SciTech Connect

    Woillez, J.; Lacour, S. E-mail: sylvestre.lacour@obspm.fr

    2013-02-10

    For optical interferometers, the baseline is typically defined as the vector joining two perfectly identical telescopes. However, when the telescopes are naturally different or when the requirements on the baseline vector challenge the telescope perfection, the baseline definition depends on how the interferometer is used. This is where the notions of wide-angle, narrow-angle, and imaging baselines come into play. This article explores this variety of baselines, with the purpose of presenting a coherent set of definitions, describing how they relate to each other, and suggesting baseline metrology requirements. Ultimately, this work aims at supporting upcoming long-baseline optical interferometers with narrow-angle astrometry and phase-referenced imaging capabilities at the microarcsecond level.

  16. Associations between Narrow Angle and Adult Anthropometry: The Liwan Eye Study

    PubMed Central

    Jiang, Yuzhen; He, Mingguang; Friedman, David S.; Khawaja, Anthony P.; Lee, Pak Sang; Nolan, Winifred P.; Yin, Qiuxia; Foster, Paul J.

    2015-01-01

    Purpose To assess the associations between narrow angle and adult anthropometry. Methods Chinese adults aged 50 years and older were recruited from a population-based survey in the Liwan District of Guangzhou, China. Narrow angle was defined as the posterior trabecular meshwork not visible under static gonioscopy in at least three quadrants (i.e. a circumference of at least 270°). Logistic regression models were used to examine the associations between narrow angle and anthropometric measures (height, weight and body mass index, BMI). Results Among the 912 participants, lower weight, shorter height, and lower BMI were significantly associated with narrower angle width (tests for trend: mean angle width in degrees vs weight p<0.001; vs height p<0.001; vs BMI p = 0.012). In univariate analyses, shorter height, lower weight and lower BMI were all significantly associated with greater odds of narrow angle. The crude association between height and narrow angle was largely attributable to a stronger association with age and sex. Lower BMI and weight remained significantly associated with narrow angle after adjustment for height, age, sex, axial ocular biometric measures and education. In analyses stratified by sex, the association between BMI and narrow angle was only observed in women. Conclusion Lower BMI and weight were associated with significantly greater odds of narrow angle after adjusting for age, education, axial ocular biometric measures and height. The odds of narrow angle increased 7% per 1 unit decrease in BMI. This association was most evident in women. PMID:24707840

  17. Improved wide-angle, fisheye and omnidirectional camera calibration

    NASA Astrophysics Data System (ADS)

    Urban, Steffen; Leitloff, Jens; Hinz, Stefan

    2015-10-01

    In this paper an improved method for calibrating wide-angle, fisheye and omnidirectional imaging systems is presented. We extend the calibration procedure proposed by Scaramuzza et al. by replacing the residual function and jointly refining all parameters. In doing so, we achieve a more stable, robust and accurate calibration (by up to a factor of 7) and can reduce the number of necessary calibration steps from five to three. After introducing the camera model and highlighting the differences from the current calibration procedure, we perform a comprehensive performance evaluation using several data sets and show the impact of the proposed calibration procedure on the calibration results.
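
    The calibration family extended here builds on a Scaramuzza-style polynomial omnidirectional model, in which a pixel maps to a viewing ray whose axial component is a polynomial in the radial distance from the distortion center. The back-projection below is a minimal sketch with made-up coefficients, and it ignores the affine sensor-misalignment terms of the full model.

        import numpy as np

        def pixel_to_ray(u_px, v_px, cx, cy, poly):
            """Back-project a pixel to a unit viewing ray (ideal sensor alignment)."""
            u, v = u_px - cx, v_px - cy              # coordinates about the distortion center
            rho = np.hypot(u, v)
            z = np.polyval(poly[::-1], rho)          # poly = [a0, a1, a2, ...]
            ray = np.array([u, v, z])
            return ray / np.linalg.norm(ray)

        poly = [-180.0, 0.0, 2.0e-3, 0.0, 1.0e-9]    # illustrative a0..a4 only
        print(pixel_to_ray(400.0, 300.0, cx=512.0, cy=384.0, poly=poly))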

  18. Spectral data of specular reflectance, narrow-angle transmittance and angle-resolved surface scattering of materials for solar concentrators.

    PubMed

    Good, Philipp; Cooper, Thomas; Querci, Marco; Wiik, Nicolay; Ambrosetti, Gianluca; Steinfeld, Aldo

    2016-03-01

    The spectral specular reflectance of conventional and novel reflective materials for solar concentrators is measured with an acceptance angle of 17.5 mrad over the wavelength range 300-2500 nm at incidence angles 15-60° using a spectroscopic goniometry system. The same experimental setup is used to determine the spectral narrow-angle transmittance of semi-transparent materials for solar collector covers at incidence angles 0-60°. In addition, the angle-resolved surface scattering of reflective materials is recorded by an area-scan CCD detector over the spectral range 350-1050 nm. A comprehensive summary, discussion, and interpretation of the results are included in the associated research article "Spectral reflectance, transmittance, and angular scattering of materials for solar concentrators" in Solar Energy Materials and Solar Cells. PMID:26862556

  19. Spectral data of specular reflectance, narrow-angle transmittance and angle-resolved surface scattering of materials for solar concentrators

    PubMed Central

    Good, Philipp; Cooper, Thomas; Querci, Marco; Wiik, Nicolay; Ambrosetti, Gianluca; Steinfeld, Aldo

    2015-01-01

    The spectral specular reflectance of conventional and novel reflective materials for solar concentrators is measured with an acceptance angle of 17.5 mrad over the wavelength range 300−2500 nm at incidence angles 15–60° using a spectroscopic goniometry system. The same experimental setup is used to determine the spectral narrow-angle transmittance of semi-transparent materials for solar collector covers at incidence angles 0–60°. In addition, the angle-resolved surface scattering of reflective materials is recorded by an area-scan CCD detector over the spectral range 350–1050 nm. A comprehensive summary, discussion, and interpretation of the results are included in the associated research article “Spectral reflectance, transmittance, and angular scattering of materials for solar concentrators” in Solar Energy Materials and Solar Cells. PMID:26862556

  20. Narrow Field-of-View Visual Odometry Based on a Focused Plenoptic Camera

    NASA Astrophysics Data System (ADS)

    Zeller, N.; Quint, F.; Stilla, U.

    2015-03-01

    In this article we present a new method for visual odometry based on a focused plenoptic camera. The method fuses the depth data gained by a monocular Simultaneous Localization and Mapping (SLAM) algorithm with that received from a focused plenoptic camera. Our algorithm uses the depth data and the totally focused images supplied by the plenoptic camera to run a real-time semi-dense direct SLAM algorithm. Based on this combined approach, the scale ambiguity of a monocular SLAM system can be overcome. Furthermore, the additional light-field information greatly improves the tracking capabilities of the algorithm, so that visual odometry becomes possible even for narrow field of view (FOV) cameras. We show that tracking is not the only component that profits from the additional light-field information: by accumulating the depth information over multiple tracked images, the depth accuracy of the focused plenoptic camera can also be greatly improved. This novel approach reduces the depth error by one order of magnitude compared to that obtained from a single light-field image.
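
    A minimal sketch of why accumulating depth over many tracked frames helps: per-frame inverse-depth measurements for a pixel can be fused as a variance-weighted average, so the fused variance shrinks roughly as 1/N. This is a generic filter for illustration, not the authors' exact update equations.

        import numpy as np

        def fuse_inverse_depth(measurements, variances):
            """Variance-weighted fusion of inverse-depth estimates for one pixel."""
            w = 1.0 / np.asarray(variances, dtype=float)
            fused = np.sum(w * np.asarray(measurements, dtype=float)) / np.sum(w)
            fused_var = 1.0 / np.sum(w)
            return fused, fused_var

        meas = [0.101, 0.098, 0.103, 0.099]      # inverse depth (1/m) from 4 frames
        var  = [1e-4] * 4
        print(fuse_inverse_depth(meas, var))     # fused estimate, ~4x smaller variance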

  1. Improved iris localization by using wide and narrow field of view cameras for iris recognition

    NASA Astrophysics Data System (ADS)

    Kim, Yeong Gon; Shin, Kwang Yong; Park, Kang Ryoung

    2013-10-01

    Biometrics is a method of identifying individuals by their physiological or behavioral characteristics. Among other biometric identifiers, iris recognition has been widely used for various applications that require a high level of security. When a conventional iris recognition camera is used, the size and position of the iris region in a captured image vary according to the X, Y positions of a user's eye and the Z distance between the user and the camera. Therefore, the search area of the iris detection algorithm is increased, which inevitably decreases both detection speed and accuracy. To solve these problems, we propose a new method of iris localization that uses wide field of view (WFOV) and narrow field of view (NFOV) cameras. Our study differs from previous studies in the following four ways. First, the device used in our research acquires three images, one each of the face and both irises, using one WFOV and two NFOV cameras simultaneously. The relation between the WFOV and NFOV cameras is determined by a simple geometric transformation without complex calibration. Second, the Z distance (between a user's eye and the iris camera) is estimated based on the iris size in the WFOV image and anthropometric data on the size of the human iris. Third, the accuracy of the geometric transformation between the WFOV and NFOV cameras is enhanced by using multiple transformation matrices according to the Z distance. Fourth, the search region for iris localization in the NFOV image is significantly reduced based on the detected iris region in the WFOV image and the transformation matrix corresponding to the estimated Z distance. Experimental results showed that the performance of the proposed iris localization method is better than that of conventional methods in terms of accuracy and processing time.
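
    A pinhole-geometry sketch of the second point above, estimating the user-to-camera distance Z from the apparent iris size in the WFOV image. The focal length, pixel pitch and nominal iris diameter are assumed example values, not the parameters of the system described.

        def estimate_Z_mm(iris_width_px, focal_mm=8.0, pixel_pitch_mm=0.0042,
                          iris_diameter_mm=12.0):
            """Z ~ f * D_real / d_image (pinhole approximation, assumed constants)."""
            d_image_mm = iris_width_px * pixel_pitch_mm
            return focal_mm * iris_diameter_mm / d_image_mm

        print(estimate_Z_mm(60))    # ~381 mm for a 60-pixel-wide iris in the WFOV image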

  2. Integral three-dimensional capture system with enhanced viewing angle by using camera array

    NASA Astrophysics Data System (ADS)

    Miura, Masato; Okaichi, Naoto; Arai, Jun; Mishina, Tomoyuki

    2015-03-01

    A three-dimensional (3D) capture system based on integral imaging with an enhanced viewing zone was developed by using a camera array. The viewing angle of the 3D image can be enlarged depending on the number of cameras comprising the array. The 3D image was captured using seven high-definition cameras and converted for display on a 3D display system with a 4K LCD panel, and it was confirmed that the viewing angle of the 3D image is enlarged by a factor of 2.5 compared with that of a single camera.

  3. 12. 22'X34' original blueprint, VariableAngle Launcher, 'GENERAL SIDE VIEW CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    12. 22'X34' original blueprint, Variable-Angle Launcher, 'GENERAL SIDE VIEW CAMERA STATIONS' drawn at 1/2'=1'-0'. (BUORD Sketch # 209111). - Variable Angle Launcher Complex, Camera Stations, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  4. 11. 22'X34' original blueprint, VariableAngle Launcher, 'SIDE VIEW CAMERA WOOD ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    11. 22'X34' original blueprint, Variable-Angle Launcher, 'SIDE VIEW CAMERA WOOD FRAME SUPERSTRUCTURE' drawn at 1/2'=1'-0'. (BOURD Sketch # 209125). - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  5. 12. 22'X34' original vellum, VariableAngle Launcher, 'SIDE VIEW CAMERA TRACK ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    12. 22'X34' original vellum, Variable-Angle Launcher, 'SIDE VIEW CAMERA TRACK H-20 BRIDGE MODIFICATIONS' drawn at 3/16'=1'-0' and 1/2'1'-0'. (BUORD Sketch # 208784, PAPW 907). - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  6. 13. 22'X34' original vellum, VariableAngle Launcher, 'SIDEVIEW CAMERA CAR TRACK ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    13. 22'X34' original vellum, Variable-Angle Launcher, 'SIDEVIEW CAMERA CAR TRACK DETAILS' drawn at 1/4'=1'-0' (BUORD Sketch # 208078, PAPW 908). - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  7. 10. 22'X34' original blueprint, VariableAngle Launcher, 'SIDE VIEW CAMERA CARSTEEL ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    10. 22'X34' original blueprint, Variable-Angle Launcher, 'SIDE VIEW CAMERA CAR-STEEL FRAME AND AXLES' drawn at 1/2'=1'-0'. (BOURD Sketch # 209124). - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  8. The design and fabricate of wide angle 905nm narrow band filter

    NASA Astrophysics Data System (ADS)

    Shi, Baohua; Li, Zaijin; Li, Hongyu; Qu, Yi

    2014-12-01

    All-dielectric thin-film narrow band filters are widely used in laser systems owing to their excellent optical performance, manufacturability and environmental adaptability. However, 905 nm infrared semiconductor laser systems have a large beam divergence angle, so we designed a 905 nm narrow band filter for a wide entrance light cone. The center wavelength shift caused by the entrance cone angle seriously degrades the filter's spectral selectivity. To reduce this impact, a non-regular dielectric-film narrowband filter is designed. Changes in the transmission characteristics under oblique incidence of a Gaussian beam with non-uniform illumination are analyzed, and the relationship between the angle of incidence and the central wavelength shift is quantified. A 905 nm narrowband filter accepting incidence angles of ±30° was fabricated. Between 880 nm and 950 nm the average transmittance is above 90%, and in the cut-off band the average transmittance is below 1%.
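
    The angle-dependent blue shift discussed above is commonly estimated with the standard thin-film relation lambda(theta) = lambda0 * sqrt(1 - (sin(theta)/n_eff)^2). The effective index below is an assumed example, not the fabricated filter's value.

        import math

        def shifted_center_nm(lambda0_nm, theta_deg, n_eff):
            """Center wavelength of a thin-film filter at oblique incidence."""
            s = math.sin(math.radians(theta_deg)) / n_eff
            return lambda0_nm * math.sqrt(1.0 - s * s)

        for theta in (0, 15, 30):
            print(theta, round(shifted_center_nm(905.0, theta, n_eff=2.0), 1))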

  9. 93. 22'X34' original blueprint, VariableAngle Launcher, 'OVERHEAD CAMERA SUSPENSION SYSTEM, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    93. 22'X34' original blueprint, Variable-Angle Launcher, 'OVERHEAD CAMERA SUSPENSION SYSTEM, TOWER STAY CABLES' drawn at 3/4'=1'-0'. (BUORD Sketch # 208783). - Variable Angle Launcher Complex, Variable Angle Launcher, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  10. 92. 22'X34' original blueprint, VariableAngle Launcher, 'CAMERA CABLE TOWER PLAN ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    92. 22'X34' original blueprint, Variable-Angle Launcher, 'CAMERA CABLE TOWER PLAN AND ELEVATION' drawn at 3/8'=1'0' (BUORD Sketch # 208580). - Variable Angle Launcher Complex, Variable Angle Launcher, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  11. Wide-Angle, Reflective Strip-Imaging Camera

    NASA Technical Reports Server (NTRS)

    Vaughan, Arthur H.

    1992-01-01

    Proposed camera images thin, striplike portion of field of view of 180 degrees wide. Hemispherical concave reflector forms image onto optical fibers, which transfers it to strip of photodetectors or spectrograph. Advantages include little geometric distortion, achromatism, and ease of athermalization. Uses include surveillance of clouds, coarse mapping of terrain, measurements of bidirectional reflectance distribution functions of aerosols, imaging spectrometry, oceanography, and exploration of planets.

  12. Narrow-angle tail radio sources and the distribution of galaxy orbits in Abell clusters

    NASA Technical Reports Server (NTRS)

    O'Dea, Christopher P.; Sarazin, Craig L.; Owen, Frazer N.

    1987-01-01

    The present data on the orientations of the tails with respect to the cluster centers of a sample of 70 narrow-angle-tail (NAT) radio sources in Abell clusters show the distribution of tail angles to be inconsistent with purely radial or circular orbits in all the samples, while being consistent with isotropic orbits in (1) the whole sample, (2) the sample of NATs far from the cluster center, and (3) the samples of morphologically regular Abell clusters. Evidence for very radial orbits is found, however, in the sample of NATs near the cluster center. If these results can be generalized to all cluster galaxies, then the presence of radial orbits near the center of Abell clusters suggests that violent relaxation may not have been fully effective even within the cores of the regular clusters.

  13. On-Orbit Geometric Calibration of the Lunar Reconnaissance Orbiter Wide Angle Camera

    NASA Astrophysics Data System (ADS)

    Speyerer, E. J.; Wagner, R.; Robinson, M. S.

    2013-12-01

    Lunar Reconnaissance Orbiter (LRO) is equipped with a single Wide Angle Camera (WAC) [1] designed to collect monochromatic and multispectral observations of the lunar surface. Cartographically accurate image mosaics and stereo image based terrain models require that the position of each pixel in a given image be known relative to a corresponding point on the lunar surface with a high degree of accuracy and precision. The Lunar Reconnaissance Orbiter Camera (LROC) team initially characterized the WAC geometry prior to launch at the Malin Space Science Systems calibration facility. After lunar orbit insertion, the LROC team recognized spatially varying geometric offsets between color bands. These misregistrations made analysis of the color data problematic and showed that refinements to the pre-launch geometric analysis were necessary. The geometric parameters that define the WAC optical system were characterized from statistics gathered from co-registering over 84,000 image pairs. For each pair, we registered all five visible WAC bands to a precisely rectified Narrow Angle Camera (NAC) image (accuracy <15 m) [2] to compute key geometric parameters. In total, we registered 2,896 monochrome and 1,079 color WAC observations to nearly 34,000 NAC observations and collected over 13.7 million data points across the visible portion of the WAC CCD. Using the collected statistics, we refined the relative pointing (yaw, pitch and roll), effective focal length, principal point coordinates, and radial distortion coefficients. This large dataset also revealed spatial offsets between bands after orthorectification due to chromatic aberrations in the optical system. As white light enters the optical system, the light is bent by different amounts as a function of wavelength, causing a single incident ray to disperse into a spectral spread of color [3,4]. This lateral chromatic aberration effect, also known as 'chromatic difference in magnification' [5], introduces variation in the effective focal length for each WAC band. Secondly, tangential distortions caused by minor decentering in the optical system altered the derived exterior orientation parameters for each 14-line WAC band. We computed the geometric parameter sets separately for each band to characterize the lateral chromatic aberrations and the decentering components in the WAC optical system. With this approach, we removed the need for additional tangential terms in the distortion model, thus reducing the number of computations during image orthorectification and expediting the process. We undertook a similar process to refine the geometry for the UV bands (321 and 360 nm), except that we registered each UV band to orthorectified visible bands of the same WAC observation (the visible bands have resolutions 4 times greater than the UV). The resulting 7-band camera model with refined geometric parameters enables map projection with sub-pixel accuracy. References: [1] Robinson et al. (2010) Space Sci. Rev. 150, 81-124 [2] Wagner et al. (2013) Lunar Sci Forum [3] Mahajan, V.N. (1998) Optical Imaging and Aberrations [4] Fiete, R.D. (2013), Manual of Photogrammetry, pp. 359-450 [5] Brown, D.C. (1966) Photogrammetric Eng. 32, 444-462.
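
    For reference, the radial and decentering (tangential) terms discussed above are usually written in the Brown form sketched below. The coefficients are placeholders for illustration only, not the WAC solution.

        import numpy as np

        def distort(xu, yu, k1, k2, p1, p2):
            """Apply radial (k1, k2) + decentering (p1, p2) distortion to ideal coords."""
            r2 = xu * xu + yu * yu
            radial = 1.0 + k1 * r2 + k2 * r2 * r2
            xd = xu * radial + 2.0 * p1 * xu * yu + p2 * (r2 + 2.0 * xu * xu)
            yd = yu * radial + p1 * (r2 + 2.0 * yu * yu) + 2.0 * p2 * xu * yu
            return xd, yd

        print(distort(0.3, -0.2, k1=-5e-2, k2=1e-3, p1=2e-4, p2=-1e-4))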

  14. Narrow-angle tail radio sources and evidence for radial orbits in Abell clusters

    NASA Technical Reports Server (NTRS)

    O'Dea, Christopher P.; Owen, Frazer N.; Sarazin, Craig L.

    1986-01-01

    Published observational data on the tail orientations (TOs) of 60 narrow-angle-tail (NAT) radio sources in Abell clusters of galaxies are analyzed statistically using a maximum-likelihood approach. The results are presented in a table, and it is found that the observed TO distributions in the whole sample and in subsamples of morphologically regular NATs and NATs with pericentric distances d greater than 500 kpc are consistent with isotropic orbits, whereas the TOs for NATs with d less than 500 kpc are consistent with highly radial orbits. If radial orbits were observed near the centers of other types of cluster galaxies as well, it could be inferred that violent relaxation during cluster formation was incomplete, and that clusters form by spherical collapse and secondary infall, as proposed by Gunn (1977).

  15. Narrow Angle Wide Spectral Range Radiometer Design FEANICS/REEFS Radiometer Design Report

    NASA Technical Reports Server (NTRS)

    Camperchioli, William

    2005-01-01

    A critical measurement for the Radiative Enhancement Effects on Flame Spread (REEFS) microgravity combustion experiment is the net radiative flux emitted from the gases and from the solid fuel bed. These quantities are measured using a set of narrow angle, wide spectral range radiometers. The radiometers are required to have an angular field of view of 1.2 degrees and measure over the spectral range of 0.6 to 30 microns, which presents a challenging design effort. This report details the design of this radiometer system including field of view, radiometer response, radiometric calculations, temperature effects, error sources, baffling and amplifiers. This report presents some radiometer specific data but does not present any REEFS experiment data.

  16. Calibration of a trinocular system formed with wide angle lens cameras.

    PubMed

    Ricolfe-Viala, Carlos; Sanchez-Salmeron, Antonio-Jose; Valera, Angel

    2012-12-01

    To obtain 3D information over large areas, wide angle lens cameras are used to reduce the number of cameras as much as possible. However, since the images are highly distorted, errors in point correspondences increase and the 3D information can be erroneous. To increase the amount of data extracted from the images and to improve the 3D information, trinocular sensors are used. In this paper a calibration method for a trinocular sensor formed with wide angle lens cameras is proposed. First, pixel locations in the images are corrected using a set of constraints which define the image formation in a trinocular system. Once the pixel locations are corrected, the lens distortion and the trifocal tensor are computed. PMID:23262716

  17. A two camera video imaging system with application to parafoil angle of attack measurements

    NASA Technical Reports Server (NTRS)

    Meyn, Larry A.; Bennett, Mark S.

    1991-01-01

    This paper describes the development of a two-camera, video imaging system for the determination of three-dimensional spatial coordinates from stereo images. This system successfully measured angle of attack at several span-wise locations for large-scale parafoils tested in the NASA Ames 80- by 120-Foot Wind Tunnel. Measurement uncertainty for angle of attack was less than 0.6 deg. The stereo ranging system was the primary source for angle of attack measurements since inclinometers sewn into the fabric ribs of the parafoils had unknown angle offsets acquired during installation. This paper includes discussions of the basic theory and operation of the stereo ranging system, system measurement uncertainty, experimental set-up, calibration results, and test results. Planned improvements and enhancements to the system are also discussed.
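
    A minimal rectified-stereo sketch of how two calibrated cameras yield 3D coordinates, from which a chord-line angle such as angle of attack can be formed. The baseline, focal length and pixel coordinates below are illustrative assumptions, not the wind-tunnel system's actual geometry.

        import math

        def triangulate(xl_px, xr_px, y_px, focal_px, baseline_m):
            """Rectified stereo: depth from disparity, then X, Y by similar triangles.
            Pixel coordinates are measured from the principal point."""
            disparity = xl_px - xr_px
            Z = focal_px * baseline_m / disparity
            X = xl_px * Z / focal_px
            Y = y_px * Z / focal_px
            return X, Y, Z

        # angle between two chord-wise points (leading/trailing edge), example values
        le = triangulate(300.0, 200.0, 40.0, focal_px=1500.0, baseline_m=1.0)
        te = triangulate(520.0, 425.0, 10.0, focal_px=1500.0, baseline_m=1.0)
        aoa_deg = math.degrees(math.atan2(le[1] - te[1], te[0] - le[0]))
        print(round(aoa_deg, 2))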

  18. Development of ultrawide-angle compact camera using free-form optics

    NASA Astrophysics Data System (ADS)

    Takahashi, Koichi

    2011-01-01

    Digital imaging enables us to easily obtain, store, process and display images as digital data, and it is used not only for digital cameras but also for surveillance and in-vehicle cameras, measuring devices, and so on. On the basis of our studies, a free-form optic is not suitable for a high-definition zooming capability; however, it makes optical devices smaller, thinner and lighter. Therefore, it is worth considering the applications of free-form optics other than camera modules for cellular phones. We have investigated the possibilities of miniaturization, which is the most significant feature of free-form optics, and developed a practical application. In this paper, we describe the results of design, prototyping, and evaluation of our ultrawide-angle compact imaging system using free-form optics.

  19. Topview stereo: combining vehicle-mounted wide-angle cameras to a distance sensor array

    NASA Astrophysics Data System (ADS)

    Houben, Sebastian

    2015-03-01

    The variety of vehicle-mounted sensors required to fulfill a growing number of driver assistance tasks has become a substantial factor in automobile manufacturing cost. We present a stereo distance method exploiting the overlapping fields of view of a multi-camera fisheye surround view system, as used for near-range vehicle surveillance tasks, e.g. in parking maneuvers. Hence, we aim at creating a new input signal from sensors that are already installed. Particular properties of wide-angle cameras (e.g. strongly varying resolution across the image) demand an adaptation of the image processing pipeline to several problems that do not arise in classical stereo vision performed with cameras carefully designed for this purpose. We introduce the algorithms for rectification, correspondence analysis, and regularization of the disparity image, discuss reasons for and avoidance of the caveats shown, and present first results on a prototype topview setup.

  20. Narrow Angle Diversity using ACTS Ka-band Signal with Two USAT Ground Stations

    NASA Technical Reports Server (NTRS)

    Kalu, A.; Emrich, C.; Ventre, J.; Wilson, W.; Acosta, R.

    1998-01-01

    Two ultra small aperture terminal (USAT) ground stations, separated by 1.2 km in a narrow angle diversity configuration, received a continuous Ka-band tone sent from Cleveland Link Evaluation Terminal (LET). The signal was transmitted to the USAT ground stations via NASA's Advanced Communications Technology Satellite (ACTS) steerable beam. Received signal power at the two sites was measured and analyzed. A dedicated datalogger at each site recorded time-of-tip data from tipping bucket rain gauges, providing rain amount and instantaneous rain rate. WSR-88D data was also obtained for the collection period. Eleven events with ground-to-satellite slant-path precipitation and resultant signal attenuation were observed during the data collection period. Fade magnitude and duration were compared at the two sites and diversity gain was calculated. These results exceeded standard diversity gain model predictions by several decibels. Rain statistics from tipping bucket data and from radar data were also compared to signal attenuation. The nature of Florida's subtropical rainfall, specifically its impact on signal attenuation at the sites, was addressed.

  1. The first demonstration of the concept of "narrow-FOV Si/CdTe semiconductor Compton camera"

    NASA Astrophysics Data System (ADS)

    Ichinohe, Yuto; Uchida, Yuusuke; Watanabe, Shin; Edahiro, Ikumi; Hayashi, Katsuhiro; Kawano, Takafumi; Ohno, Masanori; Ohta, Masayuki; Takeda, Shin`ichiro; Fukazawa, Yasushi; Katsuragawa, Miho; Nakazawa, Kazuhiro; Odaka, Hirokazu; Tajima, Hiroyasu; Takahashi, Hiromitsu; Takahashi, Tadayuki; Yuasa, Takayuki

    2016-01-01

    The Soft Gamma-ray Detector (SGD), to be deployed on board the ASTRO-H satellite, has been developed to provide the highest-sensitivity observations of celestial sources in the energy band of 60-600 keV by employing a detector concept which uses a Compton camera whose field-of-view is restricted by a BGO shield to a few degrees (narrow-FOV Compton camera). In this concept, the background from outside the FOV can be heavily suppressed by constraining the incident direction of the gamma ray reconstructed by the Compton camera to be consistent with the narrow FOV. We demonstrate, for the first time, the validity of the concept using background data taken during the thermal vacuum test and the low-temperature environment test of the flight model of SGD on the ground. We show that the measured background level is suppressed to less than 10% by combining event rejection using the anti-coincidence trigger of the active BGO shield with Compton event reconstruction techniques. More than 75% of the signals from the field-of-view are retained after background rejection, which clearly demonstrates the improvement in signal-to-noise ratio. The estimated effective area of 22.8 cm2 meets the mission requirement even though not all of the operational parameters of the instrument have been fully optimized yet.
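
    The event reconstruction relies on standard Compton kinematics: the scattering angle follows from the energies deposited in the scatterer (E1) and the absorber (E2), and events whose reconstructed cone is inconsistent with the narrow FOV can then be rejected. The sketch below uses example energies, not SGD data.

        import math

        M_E_C2_KEV = 511.0   # electron rest energy in keV

        def compton_angle_deg(e1_kev, e2_kev):
            """Scattering angle from deposited energies; None if kinematically forbidden."""
            cos_theta = 1.0 - M_E_C2_KEV * (1.0 / e2_kev - 1.0 / (e1_kev + e2_kev))
            if abs(cos_theta) > 1.0:
                return None              # inconsistent event -> reject
            return math.degrees(math.acos(cos_theta))

        print(compton_angle_deg(40.0, 160.0))   # e.g. a 200 keV photon, ~69 deg scatter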

  2. Synthesizing wide-angle and arbitrary view-point images from a circular camera array

    NASA Astrophysics Data System (ADS)

    Fukushima, Norishige; Yendo, Tomohiro; Fujii, Toshiaki; Tanimoto, Masayuki

    2006-02-01

    We propose an image-based rendering (IBR) technique using a circular camera array. By recording the scene from cameras that surround it, we can synthesize more dynamic arbitrary-viewpoint images as well as wide-angle, panorama-like images. The method is based on Ray-Space, an image-based rendering representation similar to the Light Field. Ray-Space describes each ray by the position (x, y) and direction (θ, φ) at which it crosses a reference plane. In this space, when the cameras are arranged on a circle, the trajectory of a point, which forms a straight line in an Epipolar Plane Image (EPI) for a linear camera arrangement, becomes a sinusoidal curve. Although this description is clear, rendering becomes complicated because determining which pixel of which camera contributes to each output ray requires extra work. We therefore re-describe the space in terms of camera position and pixel position, analogous to the (s, t)-(u, v) parameterization of the Light Field, with the camera position expressed in polar coordinates (r, θ) to stay close to the Ray-Space description. The trajectory of a point then becomes a more complicated periodic function with period 2π, but rendering becomes easy. From this space, as in the linear arrangement, arbitrary-viewpoint images are synthesized using only the geometric relationship between cameras. Moreover, taking advantage of the fact that rays converge at each point of the circle, we propose a technique for generating wide-angle, panorama-like images: because rays in all directions at the same position are recorded redundantly, such a synthesis is possible. The discussion so far assumes that the cameras are arranged densely enough to satisfy plenoptic sampling; we then consider the discrete case, in which the sampling condition is not met. When cameras are arranged along a straight line and an image is synthesized, a focus-like effect appears despite the assumption of a pinhole camera model; this effect, peculiar to an undersampled Light Field, is called a synthetic aperture. We have synthesized all-in-focus images with a processing step called an "adaptive filter", which builds a fully viewpoint-dependent disparity map centered on the viewpoint to be generated. The same phenomenon occurs for the circular arrangement, so we extend the adaptive filter to circular camera arrangements and synthesize all-in-focus images there as well. Although problems arise, such as the epipolar lines no longer being parallel, we show that the extension can be derived from geometric information alone. With this method, we succeeded in synthesizing wide-angle and arbitrary-viewpoint images both from fully sampled and from discrete (undersampled) camera arrangements.
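
    A simplified 2-D sketch of the circular re-parameterization idea: a camera on a circle of radius r at azimuth alpha, with a pixel viewing offset beta from the inward-pointing optical axis, maps to a crossing point on a reference line and a global ray direction. This is a geometric illustration under simplifying assumptions, not the authors' full Ray-Space implementation.

        import math

        def ray_space_coords(r, alpha_deg, beta_deg):
            """Map (camera azimuth, pixel offset) to (crossing point on y = 0, ray angle)."""
            a = math.radians(alpha_deg)
            cx, cy = r * math.cos(a), r * math.sin(a)        # camera position on the circle
            gamma = a + math.pi + math.radians(beta_deg)     # global ray direction (inward)
            t = -cy / math.sin(gamma)                        # intersect reference line y = 0
            x = cx + t * math.cos(gamma)                     # (assumes ray not parallel to it)
            return x, math.degrees(gamma) % 360.0

        print(ray_space_coords(r=2.0, alpha_deg=60.0, beta_deg=10.0))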

  3. Optical design of the wide angle camera for the Rosetta mission.

    PubMed

    Naletto, Giampiero; Da Deppo, Vania; Pelizzo, Maria Guglielmina; Ragazzoni, Roberto; Marchetti, Enrico

    2002-03-01

    The final optical design of the Wide Angle Camera for the Rosetta mission to the P/Wirtanen comet is described. This camera is an F/5.6 telescope with a rather large 12 degrees x 12 degrees field of view. To satisfy the scientific requirements for spatial resolution, contrast capability, and spectral coverage, a two-mirror, off-axis, and unobstructed optical design, believed to be novel, has been adopted. This configuration has been simulated with a ray-tracing code, showing that theoretically more than 80% of the collimated beam energy falls within a single pixel (20" x 20") over the whole camera field of view and that the possible contrast ratio is smaller than 1/1000. Moreover, this novel optical design is rather simple from a mechanical point of view and is compact and relatively easy to align. All these characteristics make this type of camera rather flexible and also suitable for other space missions with similar performance requirements. PMID:11900025

  4. A New Approach to Micro-arcsecond Astrometry with SIM Allowing Early Mission Narrow Angle Measurements of Compelling Astronomical Targets

    NASA Technical Reports Server (NTRS)

    Shaklan, Stuart; Pan, Xiaopei

    2004-01-01

    The Space Interferometry Mission (SIM) is capable of detecting and measuring the mass of terrestrial planets around stars other than our own. It can measure the mass of black holes and the visual orbits of radio and x-ray binary sources. SIM makes possible a new level of understanding of complex astrophysical processes. SIM achieves its high precision in the so-called narrow-angle regime. This is defined by a 1 degree diameter field in which the position of a target star is measured with respect to a set of reference stars. The observation is performed in two parts: first, SIM observes a grid of stars that spans the full sky. After a few years, repeated observations of the grid allow one to determine the orientation of the interferometer baseline. Second, throughout the mission, SIM periodically observes in the narrow-angle mode. Every narrow-angle observation is linked to the grid to determine the precise attitude and length of the baseline. The narrow angle process demands patience. It is not until five years after launch that SIM achieves its ultimate accuracy of 1 microarcsecond. The accuracy is degraded by a factor of approx. 2 at mid-mission. Our work proposes a technique for narrow angle astrometry that does not rely on the measurement of grid stars. This technique, called Gridless Narrow Angle Astrometry (GNAA) can obtain microarcsecond accuracy and can detect extra-solar planets and other exciting objects with a few days of observation. It can be applied as early as during the first six months of in-orbit calibration (IOC). The motivations for doing this are strong. First, and obviously, it is an insurance policy against a catastrophic mid-mission failure. Second, at the start of the mission, with several space-based interferometers in the planning or implementation phase, NASA will be eager to capture the public's imagination with interferometric science. Third, early results and a technique that can duplicate those results throughout the mission will give the analysts important experience in the proper use and calibration of SIM.

  5. Mesosphere light scattering depolarization during the Perseids activity epoch by wide-angle polarization camera measurements

    NASA Astrophysics Data System (ADS)

    Ugolnikov, Oleg S.; Maslov, Igor A.

    2014-03-01

    The paper describes a study of the scattered radiation field in the mesosphere based on wide-angle polarization camera (WAPC) measurements of the twilight sky background and a single-scattering separation procedure. Mid-August observations in 2012 and 2013 show a decrease of the single-scattering polarization, probably related to Perseid meteor dust in the upper mesosphere. The effect correlates with the activity of the tiny-particle fraction of the Perseid shower. Analysis of the polarization and temperature allows the altitude of the dust layer and the characteristic polarization of dust scattering to be estimated.

  6. Development of soft x-ray large solid angle camera onboard WF-MAXI

    NASA Astrophysics Data System (ADS)

    Kimura, Masashi; Tomida, Hiroshi; Ueno, Shiro; Kawai, Nobuyuki; Yatsu, Yoichi; Arimoto, Makoto; Mihara, Tatehiro; Serino, Motoko; Tsunemi, Hiroshi; Yoshida, Atsumasa; Sakamoto, Takanori; Kohmura, Takayoshi; Negoro, Hitoshi

    2014-07-01

    Wide-Field MAXI (WF-MAXI) is planned to be installed on the Exposed Facility of the Japanese Experiment Module "Kibo" of the International Space Station (ISS). WF-MAXI consists of two types of cameras, the Soft X-ray Large Solid Angle Camera (SLC) and the Hard X-ray Monitor (HXM). HXM is a multi-channel array of CsI scintillators coupled with avalanche photodiodes (APDs), covering the energy range 20 - 200 keV. SLC is an array of CCDs and is an evolved version of MAXI/SSC. Instead of the slit and collimator used in SSC, SLC is equipped with a coded mask, which gives it a field of view covering 20% of the sky at any given time and a source location accuracy of a few arcminutes. In order to achieve a larger effective area, both the number of CCD chips and the size of each chip will be larger than in SSC. We are planning to use 59 x 31 mm2 CCD chips provided by Hamamatsu Photonics. Each camera will be equipped with 16 CCDs, and a total of 4 cameras will be installed in WF-MAXI. Since SLC utilizes X-ray CCDs, it must include an active cooling system for the CCDs. Instead of a Peltier cooler, we use mechanical coolers of the type also employed in Astro-H; in this way the CCDs can be cooled down to -100C. The ISS orbits the Earth in 90 minutes; therefore a point source moves 4 arcminutes per second. To achieve the required location accuracy, we need fast readout from the CCDs. The pulse heights are stacked into a single row along the vertical direction: charge is transferred continuously, so the spatial information along the vertical direction is lost and replaced with precise arrival-time information. Currently we are building an experimental model of the camera body, including the CCDs and their electronics. In this paper, we show the development status of SLC.
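
    The 4 arcmin/s figure follows directly from the orbital period; a quick check (simple arithmetic, not from the paper) is shown below.

```python
# Apparent drift rate of a fixed source for an instrument on the ISS
# (90-minute orbit): 360 degrees per orbit, expressed in arcminutes per second.
orbit_period_s = 90 * 60
drift_arcmin_per_s = 360.0 * 60.0 / orbit_period_s
print(drift_arcmin_per_s)   # 4.0 arcmin/s, matching the value quoted above
```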

  7. Limitations of the narrow-angle convergent pair. [of Viking Orbiter photographs for triangulation and topographic mapping

    NASA Technical Reports Server (NTRS)

    Arthur, D. W. G.

    1977-01-01

    Spatial triangulations and topographies of the Martian surface derived from Viking Orbiter pictures depend on the use of symmetric narrow-angle convergent pairs. The overlap in each pair is close to 100 percent and the ground principal points virtually coincide. The analysis of this paper reveals a high degree of indeterminacy in such pairs and at least in part explains the rather disappointing precision of the associated spatial triangulations.

  8. Small-angle approximation to the transfer of narrow laser beams in anisotropic scattering media

    NASA Technical Reports Server (NTRS)

    Box, M. A.; Deepak, A.

    1981-01-01

    The broadening of, and the detected signal power from, a laser beam traversing an anisotropic scattering medium were examined using the small-angle approximation to the radiative transfer equation, in which photons suffering large-angle deflections are neglected. To obtain tractable answers, simple Gaussian and non-Gaussian forms are assumed for the scattering phase function. Two other approximate approaches used in the field to further simplify the small-angle-approximation solutions are described, and the results obtained with one of them are compared with those obtained using the small-angle approximation. An exact method for obtaining the contribution of each higher order of scattering to the radiance field is examined, but no results are presented.
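
    For concreteness, a commonly used small-angle Gaussian phase function has the form p(θ) ∝ exp(-θ²/2σ²). The snippet below (an illustrative assumption, not the specific forms used in the paper) checks numerically that this form is normalized in the small-angle limit, where the solid-angle element is 2πθ dθ.

```python
import numpy as np

# Illustrative small-angle Gaussian phase function; with this normalization,
# the integral of p(theta) * 2*pi*theta d(theta) over small angles is ~1.
sigma = 0.05                                   # rms scattering angle in radians (assumed)
theta = np.linspace(0.0, 10 * sigma, 10000)
p = np.exp(-theta**2 / (2 * sigma**2)) / (2 * np.pi * sigma**2)
norm = np.trapz(p * 2 * np.pi * theta, theta)
print(norm)   # ~1.0
```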

  9. Empirical Photometric Normalization for the Seven Band UV-VIS Lunar Reconnaissance Orbiter Wide Angle Camera

    NASA Astrophysics Data System (ADS)

    Boyd, A. K.; Robinson, M. S.; Nuno, R. G.; Sato, H.

    2014-12-01

    We present results of a near-global (80°S to 80°N) seven-color Wide Angle Camera (WAC) photometric normalization and color analysis. Over 100,000 WAC color observations were calibrated to reflectance (radiance factor, I/F), and the photometric angles (i, e, g), latitude, and longitude were calculated and stored for each WAC pixel. Photometric angles were calculated using the WAC GLD100 [1], and the six-dimensional data set (3 spatial and 3 photometric dimensions) was reduced to three by photometrically normalizing the I/F with a global, wavelength-dependent, 3rd-order multivariate polynomial. The multispectral mosaic was normalized to a standard viewing geometry (incidence angle = 30°, emission angle = 0°, phase angle = 30°). The WAC has a 60° cross-track field of view in color mode, which allows the acquisition of a near-global data set each month; however, the phase angle can change by as much as 60° across each image. These large changes in viewing geometry present challenges to the required photometric normalization. In the ratio of the 321 nm and 689 nm wavelengths, the Moon has a standard deviation of less than 3% in the highlands and 7% globally; thus, to allow confident identification of true color differences, the photometric normalization must be precise. Pyroclastic deposits in Marius Hills, Sinus Aestuum, and Mare Serenitatis are among the least reflective materials, with 643 nm normalized reflectance values less than 0.036. Low-reflectance deposits are generally concentrated close to the equator on the nearside, whereas high-reflectance materials are dispersed globally. The highest-reflectance materials occur at Giordano Bruno and Virtanen craters and are attributed to exposure of immature materials. Immature ejecta has a shallower spectral slope (321 nm to 689 nm) compared to the mean highlands spectrum, and UV weathering characteristics can be seen when comparing Copernican ejecta of different ages [2]. Copernican ejecta is found to have 643 nm reflectance values greater than 0.36 in some areas, so the range of reflectance on the Moon is about 10x from the least to the most reflective materials. The new empirical normalized reflectance presented here correlates with an independent Hapke-model-based normalization [3] with an R-squared value of 0.985. [1] Scholten et al. LPSC XVII (2011) [2] Denevi et al. JGR Planets (2014) [3] Sato et al. JGR Planets (2014)
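
    A minimal sketch of the kind of empirical normalization described above: fit a 3rd-order multivariate polynomial in the photometric angles (i, e, g) to the observed I/F for one band, then ratio each observation to the model prediction at the standard geometry (30°, 0°, 30°). The term selection, per-band handling, and data filtering used for the actual WAC product are not specified here; this is only an illustration of the approach.

```python
import numpy as np
from itertools import combinations_with_replacement

def design_matrix(inc, emi, pha, order=3):
    """All monomials of (i, e, g) up to the given total order."""
    angles = [np.asarray(inc, float), np.asarray(emi, float), np.asarray(pha, float)]
    cols = [np.ones_like(angles[0])]
    for n in range(1, order + 1):
        for combo in combinations_with_replacement(range(3), n):
            term = np.ones_like(angles[0])
            for k in combo:
                term = term * angles[k]
            cols.append(term)
    return np.column_stack(cols)

def fit_and_normalize(iof, inc, emi, pha, std=(30.0, 0.0, 30.0)):
    """Fit the polynomial photometric model and return I/F normalized
    to the standard geometry (sketch, one band at a time)."""
    A = design_matrix(inc, emi, pha)
    coeffs, *_ = np.linalg.lstsq(A, iof, rcond=None)
    model_obs = A @ coeffs
    A_std = design_matrix(*[np.full_like(np.asarray(inc, float), s) for s in std])
    return iof * (A_std @ coeffs) / model_obs
```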

  10. Automatic screening of narrow anterior chamber angle and angle-closure glaucoma based on slit-lamp image analysis by using support vector machine.

    PubMed

    Theeraworn, C; Kongprawechnon, W; Kondo, T; Bunnun, P; Nishihara, A; Manassakorn, A

    2013-01-01

    At present, Van Herick's method is the standard technique used to screen for a Narrow Anterior Chamber Angle (NACA) and Angle-Closure Glaucoma (ACG). It can identify a patient who suffers from NACA or ACG by considering the width of the peripheral anterior chamber depth (PACD) and the corneal thickness. However, the screening result of this method often varies among ophthalmologists. Therefore, an automatic screening of NACA and ACG based on slit-lamp image analysis using a Support Vector Machine (SVM) is proposed. The SVM automatically generates the classification model, which is used to classify each case as angle-closure likely or angle-closure unlikely, and is shown to improve the accuracy of the screening result. To develop the classification model, the widths of the PACD and the corneal thickness measured at many positions are selected as features. A statistical analysis is also used in the PACD and corneal thickness estimation in order to reduce the error from reflections on the cornea. The generated models are evaluated using 5-fold cross validation and are found to give better results than classification by Van Herick's method. PMID:24111078
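
    A minimal sketch of this kind of classifier: PACD-width and corneal-thickness features per eye, an RBF-kernel SVM, and 5-fold cross validation as reported in the abstract. The feature layout and file names are hypothetical, and the statistical reflection-removal step is not reproduced here.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

# Hypothetical inputs: one row per eye, columns = PACD widths and corneal
# thicknesses at several positions; labels 1 = angle-closure likely, 0 = unlikely.
X = np.load("pacd_corneal_features.npy")   # hypothetical file
y = np.load("angle_closure_labels.npy")    # hypothetical file

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
scores = cross_val_score(clf, X, y, cv=5)  # 5-fold cross validation
print("mean accuracy: %.3f +/- %.3f" % (scores.mean(), scores.std()))
```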

  11. Measurement of the Atmospheric Limit to Narrow Angle Interferometric Astrometry Using the Mark-Iii Stellar Interferometer

    NASA Astrophysics Data System (ADS)

    Colavita, M. M.

    1994-03-01

    Measurements were made with the Mark III stellar interferometer in order to verify predictions for the accuracy of very-narrow-angle interferometric astrometry. The Mark III was modified to observe simultaneously on its 12-m baseline the phase of the fringe packets of the primary and secondary of the long-period visual binary star α Gem. The residuals of the phase difference between primary and secondary were analyzed for 6 data segments taken over two nights. Examination of the Allan variances of the data out to a measurement limit of 8 min indicates that the error is white, as predicted. The mean fluctuations of the residuals correspond to an astrometric accuracy of 21 μas/√h, which is in good agreement with the predictions of atmospheric models. An accurate separation for α Gem was also determined: 3".2811±0".01 at position angle 73°.23±0°.15 for epoch B1992.9589.
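
    Because the residual error is white, the quoted 21 μas/√h figure scales with integration time as 1/√t; the quick check below (simple arithmetic, not from the paper) shows the implied accuracy for a few integration times.

```python
# White-noise scaling of narrow-angle astrometric accuracy: sigma(t) = 21 uas / sqrt(t [h]).
sigma_1h = 21.0   # microarcseconds after one hour of integration
for t_hours in (0.25, 1.0, 4.0, 9.0):
    print(f"{t_hours:4.2f} h -> {sigma_1h / t_hours ** 0.5:5.1f} uas")
```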

  12. The Mars observer camera

    NASA Technical Reports Server (NTRS)

    Malin, M. C.; Danielson, G. E.; Ingersoll, A. P.; Masursky, H.; Veverka, J.; Soulanille, T.; Ravine, M.

    1987-01-01

    A camera designed to operate under the extreme constraints of the Mars Observer Mission was selected by NASA in April, 1986. Contingent upon final confirmation in mid-November, the Mars Observer Camera (MOC) will begin acquiring images of the surface and atmosphere of Mars in September-October 1991. The MOC incorporates both a wide angle system for low resolution global monitoring and intermediate resolution regional targeting, and a narrow angle system for high resolution selective surveys. Camera electronics provide control of image clocking and on-board, internal editing and buffering to match whatever spacecraft data system capabilities are allocated to the experiment. The objectives of the MOC experiment follow.

  13. Design of high-power and narrow divergence angle photonic crystal surface emitting lasers

    NASA Astrophysics Data System (ADS)

    Guo, Xiaojie; Wang, Yufei; Qi, Aiyi; Liu, Lei; Qi, Fan; Zheng, W. H.

    2014-11-01

    We design photonic crystal (PC) array surface-emitting lasers with large-area coherence. The structure has six-fold rotational symmetry. Using the finite-difference time-domain method, we investigate the far-field characteristics of the individual element and of the array. We demonstrate theoretically that the coherent PC array has lower far-field divergence angles and higher power than the individual elements. The PC array exhibits strong leaky coupling, which provides high mode stability and high intermodal discrimination. Thus, the coherent PC array shows great potential for high-power, low-divergence, in-phase surface-emitting lasers.

  14. Utilizing Angled O-Ball Narrow-Diameter Implants to Solve the Restorative Challenge Posed by Alveolar Resorption: A Case Report.

    PubMed

    Patel, Paresh B

    2015-09-01

    An angled narrow-diameter implant has been introduced for use in cases where the atrophic edentulous ridge is wide enough to accommodate narrow-diameter implants but the necessary implant angulations would make it impossible to fabricate an esthetically acceptable overdenture. A case is described in which such implants were placed and restored. PMID:26355445

  15. A triple axis double crystal multiple reflection camera for ultra small angle X-ray scattering

    NASA Astrophysics Data System (ADS)

    Lambard, Jacques; Lesieur, Pierre; Zemb, Thomas

    1992-06-01

    Extending the domain of small-angle X-ray scattering requires multiple-reflection crystals to collimate the beam. A double-crystal, triple-axis X-ray camera using multiple-reflection channel-cut crystals is described. Procedures for measuring the desmeared scattering cross-section on an absolute scale are described, as well as measurements from several typical samples: fibrils of collagen, 0.3 μm diameter silica spheres, 0.16 μm diameter interacting latex spheres, porous lignite coal, liquid crystals in a surfactant-water system, and a colloidal crystal of 0.32 μm diameter silica spheres.

  16. Numerical simulations of the bending of narrow-angle-tail radio jets by ram pressure or pressure gradients

    NASA Technical Reports Server (NTRS)

    Soker, Noam; Sarazin, Craig L.; O'Dea, Christopher P.

    1988-01-01

    Three-dimensional numerical hydrodynamic simulations are used to study the bending of radio jets. The simulations are compared with observations of jets in narrow-angle-tail radio sources. Two mechanisms for the observed bending are considered: direct bending of quasi-continuous jets by ram pressure from intergalactic gas and bending by pressure gradients in the interstellar gas of the host galaxy, the pressure gradients themselves being the result of ram pressure by intergalactic gas. It is shown that the pressure gradients are much less effective in bending jets, implying that the jets have roughly 30 times lower momentum fluxes if they are bent by this mechanism. Ram-pressure bending produces jets with 'kidney-shaped' cross sections; when observed from the side, these jets appear to have diffuse extensions on the downstream side. On the other hand, pressure-gradient bending causes the jets to be densest near their upstream side.

  17. Numerical simulations of the bending of narrow-angle-tail radio jets by ram pressure or pressure gradients

    SciTech Connect

    Soker, N.; Sarazin, C.L.; O'Dea, C.P.

    1988-04-01

    Three-dimensional numerical hydrodynamic simulations are used to study the bending of radio jets. The simulations are compared with observations of jets in narrow-angle-tail radio sources. Two mechanisms for the observed bending are considered: direct bending of quasi-continuous jets by ram pressure from intergalactic gas and bending by pressure gradients in the interstellar gas of the host galaxy, the pressure gradients themselves being the result of ram pressure by intergalactic gas. It is shown that the pressure gradients are much less effective in bending jets, implying that the jets have roughly 30 times lower momentum fluxes if they are bent by this mechanism. Ram-pressure bending produces jets with kidney-shaped cross sections; when observed from the side, these jets appear to have diffuse extensions on the downstream side. On the other hand, pressure-gradient bending causes the jets to be densest near their upstream side. 31 references.

  18. A New Lunar Atlas: Mapping the Moon with the Wide Angle Camera

    NASA Astrophysics Data System (ADS)

    Speyerer, E.; Robinson, M. S.; Boyd, A.; Sato, H.

    2012-12-01

    The Lunar Reconnaissance Orbiter (LRO) spacecraft launched in June 2009 and began systematically mapping the lunar surface, providing a priceless dataset for the planetary science community and future mission planners. From 20 September 2009 to 11 December 2011, the spacecraft was in a nominal 50 km polar orbit, except for two month-long periods when a series of spacecraft maneuvers enabled low-altitude flyovers (as low as 22 km) of key exploration and scientifically interesting targets. One of the instruments, the Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) [1], captured nearly continuous synoptic views of the illuminated lunar surface. The WAC is a 7-band (321, 360, 415, 566, 604, 643, 689 nm) push-frame imager with a field of view of 60° in color mode and 90° in monochrome mode. This broad field of view enables the WAC to reimage nearly 50% (at the equator, where the orbit tracks are spaced furthest apart) of the terrain it imaged in the previous orbit. The visible bands of map-projected WAC images have a pixel scale of 100 m, while the UV bands have a pixel scale of 400 m due to 4x4 pixel on-chip binning that increases signal-to-noise. The nearly circular polar orbit and short (two-hour) orbital periods enable seamless mosaics of broad areas of the surface with uniform lighting and resolution. In March 2011, the LROC team released the first version of the global monochrome (643 nm) morphologic map [2], which was composed of 15,000 WAC images collected over three periods. With the over 130,000 WAC images collected while the spacecraft was in the 50 km orbit, a new set of mosaics is being produced by the LROC team and will be released to the Planetary Data System. These new maps include an updated morphologic map built from an improved set of images (limiting illumination variations and gores due to off-nadir observations by other instruments) and a new photometric correction derived from the LROC WAC dataset. In addition, a higher-Sun (lower incidence angle) mosaic will also be released; this map has minimal shadows and highlights albedo differences. Seamless regional WAC mosaics acquired under multiple lighting geometries (sunlight coming from the east, overhead, and west) will also be produced for key areas of interest. These new maps use the latest terrain model (LROC WAC GLD100) [3], updated spacecraft ephemerides provided by the LOLA team [4], and an improved WAC distortion model [5] to provide accurate placement of each WAC pixel on the lunar surface. References: [1] Robinson et al. (2010) Space Sci. Rev. [2] Speyerer et al. (2011) LPSC, #2387. [3] Scholten et al. (2012) JGR. [4] Mazarico et al. (2012) J. of Geodesy [5] Speyerer et al. (2012) ISPRS Congress.

  19. Erratum: The Wide Angle Camera of the ROSETTA Mission [Mem.SAIt 74, 434-435 (2003)

    NASA Astrophysics Data System (ADS)

    Barbieri, C.; Fornasier, S.; Verani, S.; Bertini, I.; Lazzarin, M.; Rampazzi, F.; Cremonese, G.; Ragazzoni, R.; Marzari, F.; Angrilli, F.; Bianchini, G. A.; Debei, S.; Dececco, M.; Guizzo, G.; Parzianello, G.; Ramous, P.; Saggin, B.; Zaccariotto, M.; da Deppo, V.; Naletto, G.; Nicolosi, G.; Pelizzo, M. G.; Tondello, G.; Brunello, P.; Peron, F.

    The authors acknowledge that the paper fails to convey the correct information about the respective contributions and roles of the partners of the OSIRIS consortium. In particular, the hardware contributions to the Wide Angle Camera of the Max-Planck-Institut für Sonnensystemforschung, MPS (Katlenburg-Lindau, Germany, formerly MPAe), of the Instituto de Astrofisica de Andalucia (Granada, Spain), of the Department of Astronomy and Space Physics of Uppsala University (DASP), and of the ESA Research and Scientific Support Department (ESA/RSSD) have either not been mentioned or have been incorrectly expounded. The overall responsibility (PI-ship) of MPS (MPAe) for OSIRIS, and hence for the Wide Angle Camera, is not correctly mentioned either. The correct information is given in the paper by Keller et al. (2006, Space Science Review, in press). The authors take this opportunity to acknowledge that the activity of the Italian team has been partly supported by the Italian Space Agency ASI through a contract to CISAS.

  20. Early direct-injection, low-temperature combustion of diesel fuel in an optical engine utilizing a 15-hole, dual-row, narrow-included-angle nozzle.

    SciTech Connect

    Gehrke, Christopher R.; Radovanovic, Michael S.; Milam, David M.; Martin, Glen C.; Mueller, Charles J.

    2008-04-01

    Low-temperature combustion of diesel fuel was studied in a heavy-duty, single-cylinder optical engine employing a 15-hole, dual-row, narrow-included-angle nozzle (10 holes x 70° and 5 holes x 35°) with 103-μm-diameter orifices. This nozzle configuration provided the spray targeting necessary to contain the direct-injected diesel fuel within the piston bowl for injection timings as early as 70° before top dead center. Spray-visualization movies, acquired using a high-speed camera, show that impingement of liquid fuel on the piston surface can result when the in-cylinder temperature and density at the time of injection are sufficiently low. Seven single- and two-parameter sweeps around a 4.82-bar gross indicated mean effective pressure load point were performed to map the sensitivity of the combustion and emissions to variations in injection timing, injection pressure, equivalence ratio, simulated exhaust-gas recirculation, intake temperature, intake boost pressure, and load. High-speed movies of natural luminosity were acquired by viewing through a window in the cylinder wall and through a window in the piston to provide quasi-3D information about the combustion process. These movies revealed that advanced combustion phasing resulted in intense pool fires within the piston bowl, after the end of significant heat release. These pool fires are a result of fuel films created when the injected fuel impinged on the piston surface. The emissions results showed a strong correlation with pool-fire activity. Smoke and NOx emissions rose steadily as pool-fire intensity increased, whereas HC and CO showed a dramatic increase with near-zero pool-fire activity.

  1. Miniature Wide-Angle Lens for Small-Pixel Electronic Camera

    NASA Technical Reports Server (NTRS)

    Mouroulils, Pantazis; Blazejewski, Edward

    2009-01-01

    A proposed wide-angle lens is described that would be especially well suited for an electronic camera in which the focal plane is occupied by an image sensor that has small pixels. The design of the lens is intended to satisfy requirements for compactness, high image quality, and reasonably low cost, while addressing issues peculiar to the operation of small-pixel image sensors. Hence, this design is expected to enable the development of a new generation of compact, high-performance electronic cameras. The lens example shown has a 60 degree field of view and a relative aperture (f-number) of 3.2. The main issues affecting the design are also discussed.

  2. A Wide-Angle Camera for the Mobile Asteroid Surface Scout (MASCOT) on Hayabusa-2

    NASA Astrophysics Data System (ADS)

    Schmitz, N.; Koncz, A.; Jaumann, R.; Hoffmann, H.; Jobs, D.; Kachlicki, J.; Michaelis, H.; Mottola, S.; Pforte, B.; Schroeder, S.; Terzer, R.; Trauthan, F.; Tschentscher, M.; Weisse, S.; Ho, T.-M.; Biele, J.; Ulamec, S.; Broll, B.; Kruselburger, A.; Perez-Prieto, L.

    2014-04-01

    JAXA's Hayabusa-2 mission, an asteroid sample return mission, is scheduled for launch in December 2014, for a rendezvous with the C-type asteroid 1999 JU3 in 2018. MASCOT, the Mobile Asteroid Surface Scout [1], is a small lander, designed to deliver ground truth for the orbiter remote measurements, support the selection of sampling sites, and provide context for the returned samples. MASCOT's main objective is to investigate the landing site's geomorphology, the internal structure, texture and composition of the regolith (dust, soil and rocks), and the thermal, mechanical, and magnetic properties of the surface. MASCOT comprises a payload of four scientific instruments: camera, radiometer, magnetometer and hyper-spectral microscope. The camera (MASCOT CAM) was designed and built by DLR's Institute of Planetary Research, together with Airbus DS Germany.

  3. On an assessment of surface roughness estimates from lunar laser altimetry pulse-widths for the Moon from LOLA using LROC narrow-angle stereo DTMs.

    NASA Astrophysics Data System (ADS)

    Muller, Jan-Peter; Poole, William

    2013-04-01

    Neumann et al. [1] proposed that laser altimetry pulse-widths could be employed to derive "within-footprint" surface roughness, as opposed to surface roughness estimated between laser altimetry pierce-points, such as the example for Mars [2] and, more recently, from the 4-pointed star-shaped LOLA (Lunar Reconnaissance Orbiter Laser Altimeter) onboard NASA's LRO [3]. Since 2009, LOLA has been collecting extensive global laser altimetry data with a 5 m footprint and ~25 m between the 5 points in a star shape. In order to assess how well surface roughness (defined as simple RMS after slope correction) derived from LROC matches surface roughness derived from LOLA footprints, publicly released LROC-NA (LRO Camera Narrow Angle) 1 m Digital Terrain Models (DTMs) were employed to measure the surface roughness directly within each 5 m footprint. A set of 20 LROC-NA DTMs were examined. Initially, the match-up between the LOLA and LROC-NA orthorectified images (ORIs) is assessed visually to ensure that the co-registration is better than the LOLA footprint resolution. For each LOLA footprint, the pulse-width geolocation is retrieved and used to "cookie-cut" the surface roughness and slopes derived from the LROC-NA DTMs. The investigation, which includes data from a variety of different landforms, shows little, if any, correlation between surface roughness estimated from the DTMs and LOLA pulse-widths at the sub-footprint scale. In fact, a perceptible correlation between LOLA and the LROC DTMs appears only at baselines of 40-60 m for surface roughness and 20 m for slopes. [1] Neumann et al. Mars Orbiter Laser Altimeter pulse width measurements and footprint-scale roughness. Geophysical Research Letters (2003) vol. 30 (11), paper 1561. DOI: 10.1029/2003GL017048 [2] Kreslavsky and Head. Kilometer-scale roughness of Mars: results from MOLA data analysis. J Geophys Res (2000) vol. 105 (E11) pp. 26695-26711. [3] Rosenburg et al. Global surface slopes and roughness of the Moon from the Lunar Orbiter Laser Altimeter. Journal of Geophysical Research (2011) vol. 116, paper E02001. DOI: 10.1029/2010JE003716 [4] Chin et al. Lunar Reconnaissance Orbiter Overview: The Instrument Suite and Mission. Space Science Reviews (2007) vol. 129 (4) pp. 391-419
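
    A minimal sketch of the "simple RMS after slope correction" roughness measure applied to DTM samples inside a single footprint. The detrending is assumed here to be a best-fit plane; the paper's exact cookie-cutting and co-registration steps are not reproduced.

```python
import numpy as np

def footprint_roughness(x, y, z):
    """RMS roughness of DTM heights z at positions (x, y) within one LOLA
    footprint, after removing a best-fit plane (slope correction)."""
    x, y, z = (np.asarray(a, float) for a in (x, y, z))
    A = np.column_stack([x, y, np.ones_like(x)])
    coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)   # plane z = a*x + b*y + c
    residual = z - A @ coeffs
    return np.sqrt(np.mean(residual ** 2))
```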

  4. Fovea-stereographic: a projection function for ultra-wide-angle cameras

    NASA Astrophysics Data System (ADS)

    Samy, Ahmed Mahmoud; Gao, Zhishan

    2015-04-01

    A new ultra-wide-angle projection function called fovea-stereographic is described and characterized by the relative relationship between the radial distortion level and the object field-of-view (FOV) angle, creating a high-resolution wide foveal image and adequate peripheral information to be processed within a limited computational time. The paper also provides the design results of an innovative fast fovea-stereographic fisheye lens system with a 170 deg of FOV that shows a more than 58.8% (100 deg) high-resolution central foveal image and at least 15% more peripheral information than any other light projection. Our lens distortion curve, in addition to its modulation transfer function, produces a high-resolution projection for real-time tracking and image transmission applications.
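
    For context, the snippet below compares two standard fisheye projection functions (equidistant and stereographic) over the quoted 170° field of view; the paper's fovea-stereographic mapping itself is not given in the abstract, so it is only described qualitatively in the comments.

```python
import numpy as np

f = 1.0                                          # normalized focal length
theta = np.radians(np.linspace(0.0, 85.0, 18))   # field angles out to 85 deg (half of 170 deg FOV)

r_equidistant = f * theta                        # r = f * theta
r_stereographic = 2.0 * f * np.tan(theta / 2)    # r = 2f * tan(theta / 2)

# The fovea-stereographic function proposed in the paper is not specified in the
# abstract; qualitatively, it assigns a larger share of the image radius (hence
# resolution) to the central ~100 deg than either of these standard projections.
for t, re, rs in zip(np.degrees(theta), r_equidistant, r_stereographic):
    print(f"{t:5.1f} deg  equidistant {re:5.3f}  stereographic {rs:5.3f}")
```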

  5. Matching the best viewing angle in depth cameras for biomass estimation based on poplar seedling geometry.

    PubMed

    Andújar, Dionisio; Fernández-Quintanilla, César; Dorado, José

    2015-01-01

    In energy crops for biomass production a proper plant structure is important to optimize wood yields. A precise crop characterization in early stages may contribute to the choice of proper cropping techniques. This study assesses the potential of the Microsoft Kinect for Windows v.1 sensor to determine the best viewing angle of the sensor to estimate the plant biomass based on poplar seedling geometry. Kinect Fusion algorithms were used to generate a 3D point cloud from the depth video stream. The sensor was mounted in different positions facing the tree in order to obtain depth (RGB-D) images from different angles. Individuals of two different ages, e.g., one month and one year old, were scanned. Four different viewing angles were compared: top view (0°), 45° downwards view, front view (90°) and ground upwards view (-45°). The ground-truth used to validate the sensor readings consisted of a destructive sampling in which the height, leaf area and biomass (dry weight basis) were measured in each individual plant. The depth image models agreed well with 45°, 90° and -45° measurements in one-year poplar trees. Good correlations (0.88 to 0.92) between dry biomass and the area measured with the Kinect were found. In addition, plant height was accurately estimated with a few centimeters error. The comparison between different viewing angles revealed that top views showed poorer results due to the fact the top leaves occluded the rest of the tree. However, the other views led to good results. Conversely, small poplars showed better correlations with actual parameters from the top view (0°). Therefore, although the Microsoft Kinect for Windows v.1 sensor provides good opportunities for biomass estimation, the viewing angle must be chosen taking into account the developmental stage of the crop and the desired parameters. The results of this study indicate that Kinect is a promising tool for a rapid canopy characterization, i.e., for estimating crop biomass production, with several important advantages: low cost, low power needs and a high frame rate (frames per second) when dynamic measurements are required. PMID:26053748

  6. Matching the Best Viewing Angle in Depth Cameras for Biomass Estimation Based on Poplar Seedling Geometry

    PubMed Central

    Andújar, Dionisio; Fernández-Quintanilla, César; Dorado, José

    2015-01-01

    In energy crops for biomass production a proper plant structure is important to optimize wood yields. A precise crop characterization in early stages may contribute to the choice of proper cropping techniques. This study assesses the potential of the Microsoft Kinect for Windows v.1 sensor to determine the best viewing angle of the sensor to estimate the plant biomass based on poplar seedling geometry. Kinect Fusion algorithms were used to generate a 3D point cloud from the depth video stream. The sensor was mounted in different positions facing the tree in order to obtain depth (RGB-D) images from different angles. Individuals of two different ages, e.g., one month and one year old, were scanned. Four different viewing angles were compared: top view (0°), 45° downwards view, front view (90°) and ground upwards view (−45°). The ground-truth used to validate the sensor readings consisted of a destructive sampling in which the height, leaf area and biomass (dry weight basis) were measured in each individual plant. The depth image models agreed well with 45°, 90° and −45° measurements in one-year poplar trees. Good correlations (0.88 to 0.92) between dry biomass and the area measured with the Kinect were found. In addition, plant height was accurately estimated with a few centimeters error. The comparison between different viewing angles revealed that top views showed poorer results due to the fact the top leaves occluded the rest of the tree. However, the other views led to good results. Conversely, small poplars showed better correlations with actual parameters from the top view (0°). Therefore, although the Microsoft Kinect for Windows v.1 sensor provides good opportunities for biomass estimation, the viewing angle must be chosen taking into account the developmental stage of the crop and the desired parameters. The results of this study indicate that Kinect is a promising tool for a rapid canopy characterization, i.e., for estimating crop biomass production, with several important advantages: low cost, low power needs and a high frame rate (frames per second) when dynamic measurements are required. PMID:26053748

  7. Lunar Reconnaissance Orbiter Camera (LROC) instrument overview

    USGS Publications Warehouse

    Robinson, M.S.; Brylow, S.M.; Tschimmel, M.; Humm, D.; Lawrence, S.J.; Thomas, P.C.; Denevi, B.W.; Bowman-Cisneros, E.; Zerr, J.; Ravine, M.A.; Caplinger, M.A.; Ghaemi, F.T.; Schaffner, J.A.; Malin, M.C.; Mahanti, P.; Bartels, A.; Anderson, J.; Tran, T.N.; Eliason, E.M.; McEwen, A.S.; Turtle, E.; Jolliff, B.L.; Hiesinger, H.

    2010-01-01

    The Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) and Narrow Angle Cameras (NACs) are on the NASA Lunar Reconnaissance Orbiter (LRO). The WAC is a 7-color push-frame camera (100 and 400 m/pixel visible and UV, respectively), while the two NACs are monochrome narrow-angle linescan imagers (0.5 m/pixel). The primary mission of LRO is to obtain measurements of the Moon that will enable future lunar human exploration. The overarching goals of the LROC investigation include landing site identification and certification, mapping of permanently polar shadowed and sunlit regions, meter-scale mapping of polar regions, global multispectral imaging, a global morphology base map, characterization of regolith properties, and determination of current impact hazards.

  8. Mars Observer camera

    NASA Technical Reports Server (NTRS)

    Malin, M. C.; Danielson, G. E.; Ingersoll, A. P.; Masursky, H.; Veverka, J.; Ravine, M. A.; Soulanille, T. A.

    1992-01-01

    The Mars Observer camera (MOC) is a three-component system (one narrow-angle and two wide-angle cameras) designed to take high spatial resolution pictures of the surface of Mars and to obtain lower spatial resolution, synoptic coverage of the planet's surface and atmosphere. The cameras are based on the 'push broom' technique; that is, they do not take 'frames' but rather build pictures, one line at a time, as the spacecraft moves around the planet in its orbit. MOC is primarily a telescope for taking extremely high resolution pictures of selected locations on Mars. Using the narrow-angle camera, areas ranging from 2.8 km x 2.8 km to 2.8 km x 25.2 km (depending on available internal digital buffer memory) can be photographed at about 1.4 m/pixel. Additionally, lower-resolution pictures (to a lowest resolution of about 11 m/pixel) can be acquired by pixel averaging; these images can be much longer, ranging up to 2.8 x 500 km at 11 m/pixel. High-resolution data will be used to study sediments and sedimentary processes, polar processes and deposits, volcanism, and other geologic/geomorphic processes.
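
    The image sizes quoted above are set by the fixed cross-track swath and the available buffer memory; the quick check below (simple arithmetic under the stated numbers, not an analysis from the paper) shows why pixel averaging allows much longer images to be stored.

```python
# Buffer-use check for the MOC narrow-angle image sizes quoted in the abstract.
swath_m = 2.8e3
native_scale = 1.4      # m/pixel at full resolution
coarse_scale = 11.0     # m/pixel after pixel averaging

native_px_per_line = swath_m / native_scale      # ~2000 pixels across the swath
lines_full_res = 25.2e3 / native_scale           # ~18,000 lines for a 25.2 km image
lines_averaged = 500e3 / coarse_scale            # ~45,000 lines for a 500 km image

full_res_pixels = native_px_per_line * lines_full_res
averaged_pixels = (swath_m / coarse_scale) * lines_averaged
print(f"full-res image:  {full_res_pixels / 1e6:.1f} Mpixels over 25.2 km")
print(f"averaged image:  {averaged_pixels / 1e6:.1f} Mpixels over 500 km of ground track")
```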

  9. The wavelength dependence of the lunar phase curve as seen by the Lunar Reconnaissance Orbiter wide-angle camera

    NASA Astrophysics Data System (ADS)

    Hapke, Bruce; Denevi, Brett; Sato, Hiroyuki; Braden, Sarah; Robinson, Mark

    2012-03-01

    The Lunar Reconnaissance Orbiter wide-angle camera measured the bidirectional reflectances of two areas on the Moon at seven wavelengths between 321 and 689 nm and at phase angles between 0° and 120°. It is not possible to account for the phase curves unless both coherent backscatter and shadow hiding contribute to the opposition effect. For the analyzed highlands area, coherent backscatter contributes nearly 40% in the UV, increasing to over 60% in the red. This conclusion is supported by laboratory measurements of the circular polarization ratios of Apollo regolith samples, which also indicate that the Moon's opposition effect contains a large component of coherent backscatter. The angular width of the lunar opposition effect is almost independent of wavelength, contrary to theories of the coherent backscatter which, for the Moon, predict that the width should be proportional to the square of the wavelength. When added to the large body of other experimental evidence, this lack of wavelength dependence reinforces the argument that our current understanding of the coherent backscatter opposition effect is incomplete or perhaps incorrect. It is shown that phase reddening is caused by the increased contribution of interparticle multiple scattering as the wavelength and albedo increase. Hence, multiple scattering cannot be neglected in lunar photometric analyses. A simplified semiempirical bidirectional reflectance function is proposed for the Moon that contains four free parameters and that is mathematically simple and straightforward to invert. This function should be valid everywhere on the Moon for phase angles less than about 120°, except at large viewing and incidence angles close to the limb, terminator, and poles.

  10. HAWC+: A Detector, Polarimetry, and Narrow-Band Imaging Upgrade to SOFIA's Far-Infrared Facility Camera

    NASA Astrophysics Data System (ADS)

    Dowell, C. D.; Staguhn, J.; Harper, D. A.; Ames, T. J.; Benford, D. J.; Berthoud, M.; Chapman, N. L.; Chuss, D. T.; Dotson, J. L.; Irwin, K. D.; Jhabvala, C. A.; Kovacs, A.; Looney, L.; Novak, G.; Stacey, G. J.; Vaillancourt, J. E.; HAWC+ Science Collaboration

    2013-01-01

    HAWC, the High-resolution Airborne Widebandwidth Camera, is the facility far-infrared camera for SOFIA, providing continuum imaging from 50 to 250 microns wavelength. As a result of NASA selection as a SOFIA Second Generation Instruments upgrade investigation, HAWC will be upgraded with enhanced capability for addressing current problems in star formation and interstellar medium physics prior to commissioning in early 2015. We describe the capabilities of the upgraded HAWC+, as well as our initial science program. The mapping speed of HAWC is increased by a factor of 9, accomplished by using NASA/Goddard's Backshort-Under-Grid bolometer detectors in a 64x40 format. Two arrays are used in a dual-beam polarimeter format, and the full complement of 5120 transition-edge detectors is read using NIST SQUID multiplexers and U.B.C. Multi-Channel Electronics. A multi-band polarimeter is added to the HAWC opto-mechanical system, at the cryogenic pupil image, employing rotating quartz half-wave plates. Six new filters are added to HAWC+, bringing the full set to 53, 63, 89, 155, and 216 microns at R = 5 resolution and 52, 63, 88, 158, and 205 microns at R = 300 resolution. The latter filters are fixed-tuned to key fine-structure emission lines from [OIII], [OI], [CII], and [NII]. Polarimetry can be performed in any of the filter bands. The first-light science program with HAWC+ emphasizes polarimetry for the purpose of mapping magnetic fields in Galactic clouds. The strength and character of magnetic fields in molecular clouds before, during, and after the star formation phase are largely unknown, despite pioneering efforts on the KAO and ground-based telescopes. SOFIA and HAWC+ provide significant new capability: sensitivity to extended dust emission (to A_V ~ 1) which is unmatched, ~10 arcsec angular resolution combined with wide-field mapping which allows statistical estimates of magnetic field strength, and wavelength coverage spanning the peak of the far-infrared spectrum of star-forming clouds. Our initial targets include nearby quiescent clouds, active sites of high- and low-mass star formation, remnants of dispersing clouds, and the Galactic center.

  11. Erratum: First Results from the Wide Angle Camera of the ROSETTA Mission [Mem.SAIt Suppl. 6, 28-33 (2005)

    NASA Astrophysics Data System (ADS)

    Barbieri, C.; Fornasier, S.; Bertini, I.; Angrilli, F.; Bianchini, G. A.; Debei, S.; de Cecco, M.; Parzianello, G.; Zaccariotto, M.; da Deppo, V.; Naletto, G.

    The authors acknowledge that the paper fails to convey the correct information about the respective contributions and roles of the partners of the OSIRIS consortium. In particular, the hardware contributions to the Wide Angle Camera of the Max-Planck-Institut für Sonnensystemforschung, MPS (Katlenburg-Lindau, Germany, formerly MPAe), of the Instituto de Astrofisica de Andalucia (Granada, Spain), of the Department of Astronomy and Space Physics of Uppsala University (DASP), and of the ESA Research and Scientific Support Department (ESA/RSSD) have either not been mentioned or have been incorrectly expounded. The overall responsibility (PI-ship) of MPS (MPAe) for OSIRIS, and hence for the Wide Angle Camera, is not correctly mentioned either. The correct information is given in the paper by Keller et al. (2006, Space Science Review, in press). The authors take this opportunity to acknowledge that the activity of the Italian team has been partly supported by the Italian Space Agency ASI through a contract to CISAS.

  12. Space Telescope Imaging Spectrograph Long-Slit Spectroscopy of the Narrow-Line Region of NGC 4151. II. Physical Conditions along Position Angle 221 deg

    NASA Astrophysics Data System (ADS)

    Kraemer, S. B.; Crenshaw, D. M.; Hutchings, J. B.; Gull, T. R.; Kaiser, M. E.; Nelson, C. H.; Weistrop, D.

    2000-03-01

    We have examined the physical conditions in the narrow-line region of the well-studied Seyfert galaxy NGC 4151, using long-slit spectra obtained with the Hubble Space Telescope Space Telescope Imaging Spectrograph. The data were taken along a position angle of 221 deg, centered on the optical nucleus. We have generated photoionization models for a contiguous set of radial zones, out to 2.3" in projected position to the southwest of the nucleus and 2.7" to the northeast. Given the uncertainties in the reddening correction, the calculated line ratios successfully matched nearly all the dereddened ratios. We find that the narrow-line region consists of dusty atomic gas photoionized by a power-law continuum that has been modified by transmission through a mix of low- and high-ionization gas, specifically, UV-absorbing and X-ray-absorbing components. The physical characteristics of the absorbers resemble those observed along our line of sight to the nucleus, although the column density of the X-ray absorber is a factor of 10 less than observed. The large inferred covering factor of the absorbing gas is in agreement with the results of our previous study of UV absorption in Seyfert 1 galaxies. We find evidence, specifically the suppression of Lyα, that we are observing the back end of dusty ionized clouds in the region southwest of the nucleus. Since these clouds are blueshifted, this supports the interpretation of the cloud kinematics as being due to radial outflow from the nucleus. We find that the narrow-line gas at each radial position is inhomogeneous and can be modeled as consisting of a radiation-bounded component and a more tenuous, matter-bounded component. The density of the narrow-line gas drops with increasing radial distance, which confirms our earlier results and may be a result of the expansion of radially outflowing emission-line clouds. Based on observations with the NASA/ESA Hubble Space Telescope, obtained at the Space Telescope Science Institute, which is operated by the Association of Universities for Research in Astronomy, Inc., under NASA contract NAS 5-26555.

  13. Seismic waveform variations seen via a vector narrow-angle one-way propagator for three-dimensional anisotropic inhomogeneous media.

    NASA Astrophysics Data System (ADS)

    Angus, D.; Thomson, C.; Pratt, R.

    2003-04-01

    Improvements in data quantity and quality from engineering to global scales need to be matched by improved waveform modelling tools based on physically-motivated approximations. Such approximations should relate directly to the local material properties and yet describe the frequency-dependent effects of wave propagation. The narrow-angle one-way seismic wave equation for three-dimensional anisotropic inhomogeneous media derived by Thomson (1999) is one such approach that produces finite-frequency waveforms. Although backscattering is neglected, the finite-difference implementation of this propagator should provide a sufficiently accurate, efficient and robust simulation of primary wave(s) passing through regions of smoothly-varying weak and/or strong anisotropy. We present characteristic waveform effects associated with conical points (acoustic axes) for rock elasticities representative of mantle, crustal and basin-scale applications. The effects of frequency-dependent wavetype coupling and rapidly-rotating polarization eigenvectors are expressed by: merging/splitting pulses; wavefront `tearing'; gaps/polarity-reversals; and incipient Hilbert-transform like first-motions. These examples have been computed only for homogeneous media to facilitate comparison with a separation-of-variables `exact' reference solution. The particular form of the one-way equation implemented for these homogeneous examples is part of a hierarchy of approximations which are easily implemented in Cartesian coordinates. For heterogeneous media where steeply-dipping and turning waves occur a curvilinear formulation is more appropriate and has been implemented for the narrow-angle limit. In this approach, the computational grid (i.e. curvilinear reference frame) attempts to track the true wavefronts and phases via ray tracing in a suitably-chosen reference medium. Our starting models have simple gradual transitions representing curved `interfaces' so we can explore the effects of wavetype coupling and focussing. In the longer term we hope to explore models as complicated as those from refraction/wide-angle-reflection profiling of the lithosphere. All the calculations are carried out on a single 1.5GHz desktop PC with 1GB RAM and a typical run takes 10mins.

  14. The rate and causes of lunar space weathering: Insights from Lunar Reconnaissance Orbiter Wide Angle Camera ultraviolet observations

    NASA Astrophysics Data System (ADS)

    Denevi, B. W.; Robinson, M. S.; Sato, H.; Hapke, B. W.; McEwen, A. S.; Hawke, B. R.

    2011-12-01

    Lunar Reconnaissance Orbiter Wide Angle Camera global ultraviolet and visible imaging provides a unique opportunity to examine the rate and causes of space weathering on the Moon. Silicates typically have a strong decrease in reflectance toward UV wavelengths (<~450 nm) due to strong bands at 250 nm and in the far UV. Metallic iron is relatively spectrally neutral, and laboratory spectra suggest that its addition to mature soils in the form of submicroscopic iron (also known as nanophase iron) flattens silicate spectra, significantly reducing spectral slope in the ultraviolet. Reflectance at ultraviolet wavelengths may be especially sensitive to the surface coatings that form due to exposure to space weathering because scattering from the surfaces of grains contributes a larger fraction to the reflectance spectrum at short wavelengths. We find that the UV slope (as measured by the 320/415 nm ratio) is a more sensitive measure of maturity than indexes based on visible and near-infrared wavelengths. Only the youngest features (less than ~100 Ma) retain a UV slope that is distinct from mature soils of the same composition. No craters >20 km have UV slopes that approach those observed in laboratory spectra of fresh lunar materials (powdered lunar rocks). While the 320/415 nm ratio increases by ~18% from powdered rocks to mature soils in laboratory samples, Giordano Bruno, the freshest large crater, only shows a 3% difference between fresh and mature materials. At the resolution of our UV data (400 m/pixel), we observe some small (<5 km) craters that show a ~14% difference in 320/415 nm ratio from their mature surroundings. UV observations show that Reiner Gamma has had significantly lower levels of space weathering than any of the Copernican craters we examined, and was the only region we found with a UV slope that approached laboratory values for fresh powdered rock samples. This is consistent with the hypothesis that its high albedo is due to magnetic shielding from solar wind sputtering effects. Furthermore the observation that all Copernican craters we examined show some degree of space weathering and the extreme immaturity of Reiner Gamma materials show that space weathering of the surface and the resultant modification of UV spectra proceeds at a fast rate and is dominated by solar wind sputtering. Comparisons of the UV trends on other airless bodies (i.e., asteroids and Mercury) may prove fruitful for understanding the relative rates and causes of space weathering across the inner solar system.

  15. 2. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH CAMERA STATION ABOVE LOOKING WEST TAKEN FROM RESERVOIR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  16. 7. VAL CAMERA CAR, DETAIL OF 'FLARE' OR TRAJECTORY CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. VAL CAMERA CAR, DETAIL OF 'FLARE' OR TRAJECTORY CAMERA INSIDE CAMERA CAR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  17. 6. VAL CAMERA CAR, DETAIL OF COMMUNICATION EQUIPMENT INSIDE CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    6. VAL CAMERA CAR, DETAIL OF COMMUNICATION EQUIPMENT INSIDE CAMERA CAR WITH CAMERA MOUNT IN FOREGROUND. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  18. Why do I sometimes see bright speckles in an image of the Terrain product, particularly at the oblique camera angles?

    Atmospheric Science Data Center

    2014-12-08

    MISR Level 1B2 data products use various high data values to signify fill, and one of the fill values (16377) in the 14 MSB's of the scaled radiances signifies that this location on the SOM grid was obscured from the camera's view by...

  19. Reliability of sagittal plane hip, knee, and ankle joint angles from a single frame of video data using the GAITRite camera system.

    PubMed

    Ross, Sandy A; Rice, Clinton; Von Behren, Kristyn; Meyer, April; Alexander, Rachel; Murfin, Scott

    2015-01-01

    The purpose of this study was to establish intra-rater, intra-session, and inter-rater reliability of sagittal plane hip, knee, and ankle angles with and without reflective markers using the GAITRite walkway and a single video camera, between student physical therapists and an experienced physical therapist. This study included thirty-two healthy participants age 20-59, stratified by age and gender. Participants performed three successful walks with and without markers applied to anatomical landmarks. GAITRite software was used to digitize sagittal hip, knee, and ankle angles at two phases of gait: (1) initial contact; and (2) mid-stance. Intra-rater reliability was more consistent for the experienced physical therapist, regardless of joint or phase of gait. Intra-session reliability was variable: the experienced physical therapist showed moderate to high reliability (intra-class correlation coefficient (ICC) = 0.50-0.89) and the student physical therapist showed very poor to high reliability (ICC = 0.07-0.85). Inter-rater reliability was highest during mid-stance at the knee with markers (ICC = 0.86) and lowest during mid-stance at the hip without markers (ICC = 0.25). Reliability of a single camera system, especially at the knee joint, shows promise. Depending on the specific type of reliability, error can be attributed to the testers (e.g. lack of digitization practice and marker placement), participants (e.g. loose fitting clothing) and camera systems (e.g. frame rate and resolution). However, until the camera technology can be upgraded to a higher frame rate and resolution, and the software can be linked to the GAITRite walkway, the clinical utility for pre/post measures is limited. PMID:25230893
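
    A minimal sketch of the single-frame digitization the study relies on: the sagittal-plane joint angle at the knee computed from three digitized 2-D points (hip, knee, ankle). The coordinate convention and the exact angle definition used by the GAITRite software are assumptions here, and the example point coordinates are hypothetical.

```python
import numpy as np

def sagittal_joint_angle(proximal, joint, distal):
    """Included angle (degrees) at `joint`, e.g. the knee angle from digitized
    hip, knee, and ankle points in a single sagittal-plane video frame."""
    u = np.asarray(proximal, float) - np.asarray(joint, float)
    v = np.asarray(distal, float) - np.asarray(joint, float)
    cos_ang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cos_ang, -1.0, 1.0)))

# Example: hip, knee, ankle pixel coordinates from one frame (hypothetical values).
print(sagittal_joint_angle((320, 180), (340, 320), (330, 460)))   # ~168 deg, near-straight leg
```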

  20. The measurement of in vivo joint angles during a squat using a single camera markerless motion capture system as compared to a marker based system.

    PubMed

    Schmitz, Anne; Ye, Mao; Boggess, Grant; Shapiro, Robert; Yang, Ruigang; Noehren, Brian

    2015-02-01

    Markerless motion capture may have the potential to make motion capture technology widely clinically practical. However, the ability of a single markerless camera system to quantify clinically relevant, lower extremity joint angles has not been studied in vivo. Therefore, the goal of this study was to compare in vivo joint angles calculated using a marker-based motion capture system and a Microsoft Kinect during a squat. Fifteen individuals participated in the study: 8 male, 7 female, height 1.702 ± 0.089 m, mass 67.9 ± 10.4 kg, age 24 ± 4 years, BMI 23.4 ± 2.2 kg/m². Marker trajectories and Kinect depth-map data of the leg were collected while each subject performed a slow squat motion. Custom code was used to export virtual marker trajectories for the Kinect data. Each set of marker trajectories was utilized to calculate Cardan knee and hip angles. The patterns of motion were similar between systems, with average absolute differences of <5 deg. Peak joint angles showed high between-trial reliability, with ICC>0.9 for both systems. The peak angles calculated by the marker-based and Kinect systems were largely correlated (r>0.55). These results suggest the Kinect data can be post-processed in a way that makes it a feasible markerless motion capture system for use in the clinic. PMID:25708833

  1. 3. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH THE VAL TO THE RIGHT, LOOKING NORTHEAST. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  2. 7. VAL CAMERA STATION, INTERIOR VIEW OF CAMERA MOUNT, COMMUNICATION ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. VAL CAMERA STATION, INTERIOR VIEW OF CAMERA MOUNT, COMMUNICATION EQUIPMENT AND STORAGE CABINET. - Variable Angle Launcher Complex, Camera Stations, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  3. 1. VARIABLEANGLE LAUNCHER CAMERA CAR, VIEW OF CAMERA CAR AND ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    1. VARIABLE-ANGLE LAUNCHER CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH CAMERA STATION ABOVE LOOKING NORTH TAKEN FROM RESERVOIR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  4. Large-Grazing-Angle, Multi-Image Kirkpatrick-Baez Microscope as the Front End to a High-Resolution Streak Camera for OMEGA

    SciTech Connect

    Gotchev, O.V.; Hayes, L.J.; Jaanimagi, P.A.; Knauer, J.P.; Marshall, F.J.; Meyerhofer, D. D.

    2003-11-25

    A new, high-resolution x-ray microscope with a large grazing angle has been developed, characterized, and fielded at the Laboratory for Laser Energetics. It increases the sensitivity and spatial resolution in planar direct-drive hydrodynamic stability experiments relevant to inertial confinement fusion (ICF) research. It has been designed to work as the optical front end of the PJX, a high-current, high-dynamic-range x-ray streak camera. Optical design optimization, results from numerical ray tracing, mirror-coating choice, and characterization have been described previously [O. V. Gotchev, et al., Rev. Sci. Instrum. 74, 2178 (2003)]. This work highlights the optics' unique mechanical design and flexibility and considers certain applications that benefit from it. Characterization of the microscope's resolution in terms of its modulation transfer function (MTF) over the field of view is shown. Recent results from hydrodynamic stability experiments, diagnosed with the optic and the PJX, are provided to confirm the microscope's advantages as a high-resolution, high-throughput x-ray optical front end for streaked imaging.

  5. 9. VIEW OF CAMERA STATIONS UNDER CONSTRUCTION INCLUDING CAMERA CAR ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    9. VIEW OF CAMERA STATIONS UNDER CONSTRUCTION INCLUDING CAMERA CAR ON RAILROAD TRACK AND FIXED CAMERA STATION 1400 (BUILDING NO. 42021) ABOVE, ADJACENT TO STATE HIGHWAY 39, LOOKING WEST, March 23, 1948. (Original photograph in possession of Dave Willis, San Diego, California.) - Variable Angle Launcher Complex, Camera Stations, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  6. Wide Angle Movie

    NASA Technical Reports Server (NTRS)

    1999-01-01

    This brief movie illustrates the passage of the Moon through the Saturn-bound Cassini spacecraft's wide-angle camera field of view as the spacecraft passed by the Moon on the way to its closest approach with Earth on August 17, 1999. From beginning to end of the sequence, 25 wide-angle images (with a spatial image scale of about 14 miles (about 23 kilometers) per pixel) were taken over the course of 7 and 1/2 minutes through a series of narrow and broadband spectral filters and polarizers, ranging from the violet to the near-infrared regions of the spectrum, to calibrate the spectral response of the wide-angle camera. The exposure times range from 5 milliseconds to 1.5 seconds. Two of the exposures were smeared and have been discarded and replaced with nearby images to make a smooth movie sequence. All images were scaled so that the brightness of Crisium basin, the dark circular region in the upper right, is approximately the same in every image. The imaging data were processed and released by the Cassini Imaging Central Laboratory for Operations (CICLOPS) at the University of Arizona's Lunar and Planetary Laboratory, Tucson, AZ.
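    The scaling step described in the last sentences amounts to normalizing each frame so that a fixed reference region has the same mean brightness throughout the sequence. The sketch below illustrates that operation on synthetic data; the region coordinates are hypothetical stand-ins for Mare Crisium, and this is not the CICLOPS processing code.

```python
import numpy as np

def normalize_sequence(frames, region, target=None):
    """Scale each 2-D frame so the mean of the reference region matches `target`.

    frames: list of 2-D arrays; region: (row_slice, col_slice) of the reference area.
    """
    rs, cs = region
    means = [float(f[rs, cs].mean()) for f in frames]
    if target is None:
        target = means[0]                     # match everything to the first frame
    return [f * (target / m) for f, m in zip(frames, means)]

# Tiny usage example with synthetic constant frames of random brightness.
rng = np.random.default_rng(0)
frames = [rng.uniform(0.5, 1.5) * np.ones((64, 64)) for _ in range(5)]
scaled = normalize_sequence(frames, (slice(10, 20), slice(40, 50)))
print([round(f[12, 45], 3) for f in scaled])   # identical values after scaling
```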

    Photo Credit: NASA/JPL/Cassini Imaging Team/University of Arizona

    Cassini, launched in 1997, is a joint mission of NASA, the European Space Agency and Italian Space Agency. The mission is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Space Science, Washington DC. JPL is a division of the California Institute of Technology, Pasadena, CA.

  7. Is Perceptual Narrowing Too Narrow?

    ERIC Educational Resources Information Center

    Cashon, Cara H.; Denicola, Christopher A.

    2011-01-01

    There is a growing list of examples illustrating that infants are transitioning from having earlier abilities that appear more "universal," "broadly tuned," or "unconstrained" to having later abilities that appear more "specialized," "narrowly tuned," or "constrained." Perceptual narrowing, a well-known phenomenon related to face, speech, and…

  8. Determining iconometric parameters of imaging devices using a wide-angle collimator. [calibration of satellite-borne television and photographic cameras

    NASA Technical Reports Server (NTRS)

    Ziman, Y. L.

    1974-01-01

    The problem of determining the iconometric parameters of the imaging device can be solved if the camera being calibrated is used to obtain the image of a group of reference points, the directions to which are known. In order to specify the imaging device coordinate system, it is sufficient in principle to obtain on the picture the images of three reference points which do not lie on a single straight line. Many more such points are required in order to determine the distortion corrections, and they must be distributed uniformly over the entire field of view of the camera being calibrated. Experimental studies were made using this technique to calibrate photographic and phototelevision systems. Evaluation of the results of these experiments permits recommending collimators for calibrating television and phototelevision imaging systems, and also short-focus small-format photographic cameras.
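    The record above rests on a classical geometric fact: three or more non-collinear reference directions, known in an external frame and also measured in the camera frame from their image coordinates, determine the camera orientation. The sketch below illustrates that idea only (it is not the original calibration procedure), recovering the rotation by a least-squares fit via SVD (the Kabsch solution); the direction vectors are made up for the example.

```python
import numpy as np

def fit_rotation(cam_dirs, world_dirs):
    """Least-squares rotation R such that each camera direction ~= R @ world direction.

    Both inputs are N x 3 arrays whose rows are unit vectors.
    """
    H = world_dirs.T @ cam_dirs
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflections
    return Vt.T @ D @ U.T

def unit(v):
    v = np.asarray(v, dtype=float)
    return v / np.linalg.norm(v)

# Synthetic check: recover a known rotation from three exact direction pairs.
world = np.array([unit([1.0, 0.0, 0.2]), unit([0.0, 1.0, 0.1]), unit([0.3, 0.2, 1.0])])
angle = np.radians(15.0)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0, 0.0, 1.0]])
cam = world @ R_true.T
print(np.allclose(fit_rotation(cam, world), R_true, atol=1e-8))   # True
```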

  9. Pre-flight and On-orbit Geometric Calibration of the Lunar Reconnaissance Orbiter Camera

    NASA Astrophysics Data System (ADS)

    Speyerer, E. J.; Wagner, R. V.; Robinson, M. S.; Licht, A.; Thomas, P. C.; Becker, K.; Anderson, J.; Brylow, S. M.; Humm, D. C.; Tschimmel, M.

    2016-04-01

    The Lunar Reconnaissance Orbiter Camera (LROC) consists of two imaging systems that provide multispectral and high resolution imaging of the lunar surface. The Wide Angle Camera (WAC) is a seven color push-frame imager with a 90∘ field of view in monochrome mode and 60∘ field of view in color mode. From the nominal 50 km polar orbit, the WAC acquires images with a nadir ground sampling distance of 75 m for each of the five visible bands and 384 m for the two ultraviolet bands. The Narrow Angle Camera (NAC) consists of two identical cameras capable of acquiring images with a ground sampling distance of 0.5 m from an altitude of 50 km. The LROC team geometrically calibrated each camera before launch at Malin Space Science Systems in San Diego, California and the resulting measurements enabled the generation of a detailed camera model for all three cameras. The cameras were mounted and subsequently launched on the Lunar Reconnaissance Orbiter (LRO) on 18 June 2009. Using a subset of the over 793000 NAC and 207000 WAC images of illuminated terrain collected between 30 June 2009 and 15 December 2013, we improved the interior and exterior orientation parameters for each camera, including the addition of a wavelength dependent radial distortion model for the multispectral WAC. These geometric refinements, along with refined ephemeris, enable seamless projections of NAC image pairs with a geodetic accuracy better than 20 meters and sub-pixel precision and accuracy when orthorectifying WAC images.
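    As a rough illustration of the wavelength-dependent radial distortion idea mentioned for the multispectral WAC, the toy model below lets the first radial coefficient vary linearly with filter wavelength. The functional form, the coefficient values, and the band wavelengths used here are placeholders for illustration only, not the published LROC camera model.

```python
import numpy as np

def undistort(x, y, k1, k2=0.0):
    """Approximate inversion of x_d = x_u * (1 + k1*r^2 + k2*r^4),
    evaluating the radial term at the distorted radius (first-order approximation)."""
    r2 = x**2 + y**2
    scale = 1.0 + k1 * r2 + k2 * r2**2
    return x / scale, y / scale

def k1_for_wavelength(wavelength_nm, k1_ref=-2.0e-2, slope=1.0e-5, ref_nm=600.0):
    """Toy linear model: the first radial coefficient varies with filter wavelength."""
    return k1_ref + slope * (wavelength_nm - ref_nm)

# Correct the same distorted (normalized) coordinate for three hypothetical bands.
x_d, y_d = 0.31, -0.12
for band_nm in (415.0, 566.0, 680.0):
    print(band_nm, undistort(x_d, y_d, k1_for_wavelength(band_nm)))
```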

  10. Critical Heat Flux in Inclined Rectangular Narrow Gaps

    SciTech Connect

    Jeong J. Kim; Yong H. Kim; Seong J. Kim; Sang W. Noh; Kune Y. Suh; Joy L. Rempe; Fan-Bill Cheung; Sang B. Kim

    2004-06-01

    In light of the TMI-2 accident, in which the reactor vessel lower head survived the attack by molten core material, the in-vessel retention strategy was suggested to benefit from cooling the debris through a gap between the lower head and the core material. The GAMMA 1D (Gap Apparatus Mitigating Melt Attack One Dimensional) tests were conducted to investigate the critical heat flux (CHF) in narrow gaps with varying surface orientations. The CHF in an inclined gap, especially in the case of a downward-facing narrow gap, is dictated by bubble behavior because the departing bubbles are squeezed. The orientation angle affects the bubble layer and the escape of the bubbles from the narrow gap. The test parameters include gap sizes of 1, 2, 5 and 10 mm with an open periphery, and orientation angles ranging from fully downward-facing (180°) to vertical (90°). The 15 × 35 mm copper test section was electrically heated by a thin film resistor on the back. The heater assembly was installed at the tip of a rotating arm in a heated water pool at atmospheric pressure. The bubble behavior was photographed using a high-speed camera through the Pyrex glass spacer. It was observed that the CHF decreased as the surface inclination angle increased and as the gap size decreased in most of the cases. However, opposing results were obtained at certain surface orientations and gap sizes. Transition angles, at which the CHF changed with a rapid slope, were also detected, which is consistent with the existing literature. A semi-empirical CHF correlation was developed for the inclined narrow rectangular channels through dimensional analysis. The correlation provides best-estimate CHF values for realistically assessing the thermal margin to failure of the lower head during a severe accident involving relocation of the core material.
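    For readers unfamiliar with the final step, a semi-empirical correlation of this kind is typically obtained by fitting a parametric form in the reduced variables to the measured CHF data. The sketch below shows only the mechanics of such a fit; the functional form, the coefficients, and the data values are hypothetical and are not the GAMMA 1D results or correlation.

```python
import numpy as np
from scipy.optimize import curve_fit

def chf_model(X, c0, a, b):
    """Hypothetical correlation: CHF increases with gap size, decreases with inclination."""
    gap_mm, angle_deg = X
    return c0 * gap_mm**a * np.exp(-b * angle_deg / 180.0)

# Made-up data points (kW/m^2) for gap sizes of 1-10 mm and 90/180 degree orientations.
gaps   = np.array([1, 1, 2, 2, 5, 5, 10, 10], dtype=float)
angles = np.array([90, 180, 90, 180, 90, 180, 90, 180], dtype=float)
chf_kw = np.array([180, 110, 240, 145, 350, 210, 460, 280], dtype=float)

popt, _ = curve_fit(chf_model, (gaps, angles), chf_kw, p0=(100.0, 0.5, 1.0))
print(popt)   # fitted coefficients c0, a, b for the toy data
```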

  11. HIGH SPEED CAMERA

    DOEpatents

    Rogers, B.T. Jr.; Davis, W.C.

    1957-12-17

    This patent relates to high speed cameras having resolution times of less than one-tenth microsecond, suitable for filming distinct sequences of a very fast event such as an explosion. This camera consists of a rotating mirror with reflecting surfaces on both sides, a narrow mirror acting as a slit in a focal plane shutter, various other mirror and lens systems, as well as an image recording surface. The combination of the rotating mirrors and the slit mirror causes discrete, narrow, separate pictures to fall upon the film plane, thereby forming a moving image increment of the photographed event. Placing a reflecting surface on each side of the rotating mirror cancels the image velocity that one side of the rotating mirror would impart, so that a camera having this short a resolution time is possible.

  12. 8. VAL CAMERA CAR, CLOSEUP VIEW OF 'FLARE' OR TRAJECTORY ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    8. VAL CAMERA CAR, CLOSE-UP VIEW OF 'FLARE' OR TRAJECTORY CAMERA ON SLIDING MOUNT. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  13. 6. VAL CAMERA STATION, VIEW FROM INTERIOR OUT OF WINDOW ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    6. VAL CAMERA STATION, VIEW FROM INTERIOR OUT OF WINDOW OPENING TOWARD VAL FIRING RANGE LOOKING SOUTHEAST WITH CAMERA MOUNT IN FOREGROUND. - Variable Angle Launcher Complex, Camera Stations, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  14. The DSLR Camera

    NASA Astrophysics Data System (ADS)

    Berkó, Ernő; Argyle, R. W.

    Cameras have developed significantly in the past decade; in particular, digital Single-Lens Reflex Cameras (DSLR) have appeared. As a consequence we can buy cameras of higher and higher pixel number, and mass production has resulted in a great reduction of prices. CMOS sensors used for imaging are increasingly sensitive, and the electronics in the cameras allows images to be taken with much less noise. The software background is developing in a similar way: intelligent programs are created for post-processing and other supplementary work. Nowadays we can find a digital camera in almost every household, and most of these cameras are DSLR ones. These can be used very well for astronomical imaging, which is nicely demonstrated by the amount and quality of the spectacular astrophotos appearing in different publications. These examples also show how much post-processing software contributes to the rise in the standard of the pictures. To sum up, the DSLR camera serves as a cheap alternative to the CCD camera, with somewhat weaker technical characteristics. In the following, I will introduce how we can measure the main parameters (position angle and separation) of double stars, based on the methods, software and equipment I use. Others can easily apply these for their own circumstances.
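    The reduction described in the last sentences can be illustrated in a few lines: given the pixel positions of the two components and the plate scale of the camera and telescope combination, separation and position angle follow directly. The plate scale value and the assumption that +x points east and +y points north on the frame are illustrative only, not taken from the text.

```python
import math

def double_star(primary_xy, secondary_xy, plate_scale):
    """Separation (arcsec) and position angle (deg, north through east).

    plate_scale is in arcsec/pixel; the frame is assumed oriented with +x = east, +y = north.
    """
    de = secondary_xy[0] - primary_xy[0]            # offset toward east (pixels)
    dn = secondary_xy[1] - primary_xy[1]            # offset toward north (pixels)
    separation = math.hypot(de, dn) * plate_scale
    position_angle = math.degrees(math.atan2(de, dn)) % 360.0
    return separation, position_angle

# Example: components at hypothetical pixel positions, 0.85 arcsec/pixel plate scale.
print(double_star((512.0, 384.0), (540.0, 401.0), 0.85))
```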

  15. 84. VIEW FROM CAMERA TOWER LOOKING SOUTHWEST SHOWING VAL FIRING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    84. VIEW FROM CAMERA TOWER LOOKING SOUTHWEST SHOWING VAL FIRING RANGE WITH OVERHEAD CAMERA AND CABLES, Date unknown, circa 1949. - Variable Angle Launcher Complex, Variable Angle Launcher, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  16. Automatic inference of geometric camera parameters and inter-camera topology in uncalibrated disjoint surveillance cameras

    NASA Astrophysics Data System (ADS)

    den Hollander, Richard J. M.; Bouma, Henri; Baan, Jan; Eendebak, Pieter T.; van Rest, Jeroen H. C.

    2015-10-01

    Person tracking across non-overlapping cameras and other types of video analytics benefit from spatial calibration information that allows an estimation of the distance between cameras and a relation between pixel coordinates and world coordinates within a camera. In a large environment with many cameras, or for frequent ad-hoc deployments of cameras, the cost of this calibration is high. This creates a barrier for the use of video analytics. Automating the calibration allows for a short configuration time, and the use of video analytics in a wider range of scenarios, including ad-hoc crisis situations and large scale surveillance systems. We show an autocalibration method entirely based on pedestrian detections in surveillance video in multiple non-overlapping cameras. In this paper, we show the two main components of automatic calibration. The first shows the intra-camera geometry estimation that leads to an estimate of the tilt angle, focal length and camera height, which is important for the conversion from pixels to meters and vice versa. The second component shows the inter-camera topology inference that leads to an estimate of the distance between cameras, which is important for spatio-temporal analysis of multi-camera tracking. This paper describes each of these methods and provides results on realistic video data.

  17. 1. VARIABLEANGLE LAUNCHER (VAL) CONCRETE 'A' FRAME STRUCTURE SHOWING CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    1. VARIABLE-ANGLE LAUNCHER (VAL) CONCRETE 'A' FRAME STRUCTURE SHOWING CAMERA TOWER STRUCTURE LOOKING SOUTH AND ARCHED OPENING FOR ROADWAY. - Variable Angle Launcher Complex, Variable Angle Launcher, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  18. Camera Calibration using the Damped Bundle Adjustment Toolbox

    NASA Astrophysics Data System (ADS)

    Börlin, N.; Grussenmeyer, P.

    2014-05-01

    Camera calibration is one of the fundamental photogrammetric tasks. The standard procedure is to apply an iterative adjustment to measurements of known control points. The iterative adjustment needs initial values of internal and external parameters. In this paper we investigate a procedure where only one parameter, the focal length, is given a specific initial value. The procedure is validated using the freely available Damped Bundle Adjustment Toolbox on five calibration data sets using varying narrow- and wide-angle lenses. The results show that the Gauss-Newton-Armijo and Levenberg-Marquardt-Powell bundle adjustment methods implemented in the toolbox converge even if the initial values of the focal length are between 1/2 and 32 times the true focal length, and even if the parameters are highly correlated. Standard statistical analysis methods in the toolbox enable manual selection of the lens distortion parameters to estimate, something not available in other camera calibration toolboxes. A standardised camera calibration procedure that does not require any information about the camera sensor or focal length is suggested based on the convergence results. The toolbox source and data sets used in this paper are available from the authors.
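    The reported tolerance to poor initial focal length values comes from the damping in the adjustment. The generic sketch below is not DBAT code; it only shows a damped Gauss-Newton (Levenberg-Marquardt style) step on a toy one-parameter problem, with a scale factor standing in for the focal length and starting eight times too large.

```python
import numpy as np

def lm_step(residual, jacobian, params, lam):
    """One damped Gauss-Newton step: (J'J + lam*I) dp = J'r, then params - dp."""
    r = residual(params)
    J = jacobian(params)
    A = J.T @ J + lam * np.eye(len(params))
    return params - np.linalg.solve(A, J.T @ r)

# Toy 1-D "calibration": recover a scale factor from observations y = f_true * x.
x = np.linspace(0.1, 1.0, 20)
f_true = 35.0
y = f_true * x
residual = lambda p: p[0] * x - y
jacobian = lambda p: x.reshape(-1, 1)

p = np.array([280.0])            # start 8x too large
for _ in range(20):
    p = lm_step(residual, jacobian, p, lam=1.0)
print(p)                         # converges toward [35.]
```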

  19. Space Camera

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Nikon's F3 35mm camera was specially modified for use by Space Shuttle astronauts. The modification work produced a spinoff lubricant. Because lubricants in space have a tendency to migrate within the camera, Nikon conducted extensive development to produce nonmigratory lubricants; variations of these lubricants are used in the commercial F3, giving it better performance than conventional lubricants. Another spinoff is the coreless motor which allows the F3 to shoot 140 rolls of film on one set of batteries.

  20. 1. VARIABLEANGLE LAUNCHER CAMERA STATION 1400 (BUILDING NO. 42021), VIEW ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    1. VARIABLE-ANGLE LAUNCHER CAMERA STATION 1400 (BUILDING NO. 42021), VIEW OF EXTERIOR LOOKING NORTHEAST WITH CAMERA STATION IN 1100 (BUILDING NO. 42020) BACKGROUND. - Variable Angle Launcher Complex, Camera Stations, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  1. Faint Object Camera imaging and spectroscopy of NGC 4151

    NASA Technical Reports Server (NTRS)

    Boksenberg, A.; Catchpole, R. M.; Macchetto, F.; Albrecht, R.; Barbieri, C.; Blades, J. C.; Crane, P.; Deharveng, J. M.; Disney, M. J.; Jakobsen, P.

    1995-01-01

    We describe ultraviolet and optical imaging and spectroscopy within the central few arcseconds of the Seyfert galaxy NGC 4151, obtained with the Faint Object Camera on the Hubble Space Telescope. A narrowband image including (O III) lambda(5007) shows a bright nucleus centered on a complex biconical structure having apparent opening angle approximately 65 deg and axis at a position angle along 65 deg-245 deg; images in bands including Lyman-alpha and C IV lambda(1550) and in the optical continuum near 5500 A, show only the bright nucleus. In an off-nuclear optical long-slit spectrum we find a high and a low radial velocity component within the narrow emission lines. We identify the low-velocity component with the bright, extended, knotty structure within the cones, and the high-velocity component with more confined diffuse emission. Also present are strong continuum emission and broad Balmer emission line components, which we attribute to the extended point spread function arising from the intense nuclear emission. Adopting the geometry pointed out by Pedlar et al. (1993) to explain the observed misalignment of the radio jets and the main optical structure we model an ionizing radiation bicone, originating within a galactic disk, with apex at the active nucleus and axis centered on the extended radio jets. We confirm that through density bounding the gross spatial structure of the emission line region can be reproduced with a wide opening angle that includes the line of sight, consistent with the presence of a simple opaque torus allowing direct view of the nucleus. In particular, our modelling reproduces the observed decrease in position angle with distance from the nucleus, progressing initially from the direction of the extended radio jet, through our optical structure, and on to the extended narrow-line region. We explore the kinematics of the narrow-line low- and high-velocity components on the basis of our spectroscopy and adopted model structure.

  2. Mapping the Apollo 17 landing site area based on Lunar Reconnaissance Orbiter Camera images and Apollo surface photography

    NASA Astrophysics Data System (ADS)

    Haase, I.; Oberst, J.; Scholten, F.; Wählisch, M.; Gläser, P.; Karachevtseva, I.; Robinson, M. S.

    2012-05-01

    Newly acquired high resolution Lunar Reconnaissance Orbiter Camera (LROC) images allow accurate determination of the coordinates of Apollo hardware, sampling stations, and photographic viewpoints. In particular, the positions from where the Apollo 17 astronauts recorded panoramic image series, at the so-called “traverse stations”, were precisely determined for traverse path reconstruction. We analyzed observations made in Apollo surface photography as well as orthorectified orbital images (0.5 m/pixel) and Digital Terrain Models (DTMs) (1.5 m/pixel and 100 m/pixel) derived from LROC Narrow Angle Camera (NAC) and Wide Angle Camera (WAC) images. Key features captured in the Apollo panoramic sequences were identified in LROC NAC orthoimages. Angular directions of these features were measured in the panoramic images and fitted to the NAC orthoimage by applying least squares techniques. As a result, we obtained the surface panoramic camera positions to within 50 cm. At the same time, the camera orientations, North azimuth angles and distances to nearby features of interest were also determined. Here, initial results are shown for traverse station 1 (northwest of Steno Crater) as well as the Apollo Lunar Surface Experiment Package (ALSEP) area.
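    The least-squares fitting of panorama azimuths to orthoimage features can be illustrated as follows: the observer position and a bearing offset are adjusted so that the predicted directions to features with known map coordinates match the azimuths measured in the panorama. The coordinates, starting values, and use of scipy below are illustrative assumptions, not the authors' software.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical map coordinates (meters, east/north) of features identified in the orthoimage.
features = np.array([[ 120.0,   35.0],
                     [ -60.0,   90.0],
                     [  40.0, -110.0],
                     [-150.0,  -20.0]])

# Synthetic "measured" azimuths generated from a known camera position and bearing offset.
true_pos, true_offset = np.array([12.0, -7.0]), np.radians(3.0)
measured = np.arctan2(features[:, 0] - true_pos[0],
                      features[:, 1] - true_pos[1]) + true_offset

def residuals(p):
    x, y, offset = p
    predicted = np.arctan2(features[:, 0] - x, features[:, 1] - y) + offset
    d = predicted - measured
    return np.arctan2(np.sin(d), np.cos(d))      # wrap angle differences to [-pi, pi]

solution = least_squares(residuals, x0=[0.0, 0.0, 0.0])
print(solution.x)   # close to [12, -7, 0.052]
```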

  3. Development of broad-view camera unit for laparoscopic surgery.

    PubMed

    Kawahara, Tomohiro; Takaki, Takeshi; Ishii, Idaku; Okajima, Masazumi

    2009-01-01

    A disadvantage of laparoscopic surgery is the narrow operative field provided by the endoscope camera. This paper describes a newly developed broad-view camera unit for use with the Broad-View Camera System, which is capable of providing a wider view of the internal organs during laparoscopic surgery. The developed camera unit is composed of a miniature color CMOS camera, an indwelling needle, and an extra-thin connector. The specific design of the camera unit and the method for positioning it are shown. The performance of the camera unit has been confirmed through basic and animal experiments. PMID:19963983

  4. Infrared Camera

    NASA Technical Reports Server (NTRS)

    1997-01-01

    A sensitive infrared camera that observes the blazing plumes from the Space Shuttle or expendable rocket lift-offs is capable of scanning for fires, monitoring the environment and providing medical imaging. The hand-held camera uses highly sensitive arrays in infrared photodetectors known as quantum well infrared photo detectors (QWIPS). QWIPS were developed by the Jet Propulsion Laboratory's Center for Space Microelectronics Technology in partnership with Amber, a Raytheon company. In October 1996, QWIP detectors pointed out hot spots of the destructive fires speeding through Malibu, California. Night vision, early warning systems, navigation, flight control systems, weather monitoring, security and surveillance are among the duties for which the camera is suited. Medical applications are also expected.

  5. Electronic Still Camera

    NASA Technical Reports Server (NTRS)

    Holland, S. Douglas (Inventor)

    1992-01-01

    A handheld, programmable, digital camera is disclosed that supports a variety of sensors and has program control over the system components to provide versatility. The camera uses a high performance design which produces near film quality images from an electronic system. The optical system of the camera incorporates a conventional camera body that was slightly modified, thus permitting the use of conventional camera accessories, such as telephoto lenses, wide-angle lenses, auto-focusing circuitry, auto-exposure circuitry, flash units, and the like. An image sensor, such as a charge coupled device ('CCD') collects the photons that pass through the camera aperture when the shutter is opened, and produces an analog electrical signal indicative of the image. The analog image signal is read out of the CCD and is processed by preamplifier circuitry, a correlated double sampler, and a sample and hold circuit before it is converted to a digital signal. The analog-to-digital converter has an accuracy of eight bits to insure accuracy during the conversion. Two types of data ports are included for two different data transfer needs. One data port comprises a general purpose industrial standard port and the other a high speed/high performance application specific port. The system uses removable hard disks as its permanent storage media. The hard disk receives the digital image signal from the memory buffer and correlates the image signal with other sensed parameters, such as longitudinal or other information. When the storage capacity of the hard disk has been filled, the disk can be replaced with a new disk.

  6. 5. VAL CAMERA CAR, DETAIL OF HOIST AT SIDE OF ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    5. VAL CAMERA CAR, DETAIL OF HOIST AT SIDE OF BRIDGE AND ENGINE CAR ON TRACKS, LOOKING NORTHEAST. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  7. 3. VAL CAMERA STATION, DETAIL OF ROOF OVERHANG AND EXPOSED ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. VAL CAMERA STATION, DETAIL OF ROOF OVERHANG AND EXPOSED CONCRETE SURFACES. - Variable Angle Launcher Complex, Camera Stations, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  8. Nikon Camera

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The Nikon FM compact has a simplification feature derived from cameras designed for easy, yet accurate use in a weightless environment. The innovation is a plastic-cushioned advance lever which advances the film and simultaneously switches on a built-in light meter. With a turn of the lens aperture ring, a glowing signal in the viewfinder confirms correct exposure.

  9. CCD Camera

    DOEpatents

    Roth, Roger R. (Minnetonka, MN)

    1983-01-01

    A CCD camera capable of observing a moving object which has varying intensities of radiation emanating therefrom and which may move at varying speeds is shown, wherein there is substantially no overlapping of successive images and wherein the exposure times and scan times may be varied independently of each other.

  10. Narrow Bandwidth Telecommunications.

    ERIC Educational Resources Information Center

    Kessler, William J.; Wilhelm, Michael J.

    The basic principles of narrow bandwidth telecommunications are treated in a manner understandable to the non-engineer. The comparative characteristics of the various narrow bandwidth communications circuits are examined. Currently available graphics transmission and reception equipment are described and their capabilities and limitations…

  11. Cassini Camera Contamination Anomaly: Experiences and Lessons Learned

    NASA Technical Reports Server (NTRS)

    Haemmerle, Vance R.; Gerhard, James H.

    2006-01-01

    We discuss the contamination 'Haze' anomaly for the Cassini Narrow Angle Camera (NAC), one of two optical telescopes that comprise the Imaging Science Subsystem (ISS). Cassini is a Saturn orbiter with a 4-year nominal mission. The incident occurred in 2001, five months after the Jupiter encounter during the Cruise phase and, ironically, at the resumption of planned maintenance decontamination cycles. The degraded optical performance was first identified by the Instrument Operations Team with the first ISS Saturn imaging six weeks later. A distinct haze of varying size from image to image marred the images of Saturn. A photometric star calibration of the Pleiades, 4 days after the incident, showed stars with halos. Analysis showed that while the halo's intensity was only 1 - 2% of the intensity of the central peak of a star, the halo contained 30 - 70% of its integrated flux. This condition would impact science return. In a review of our experiences, we examine the contamination control plan, discuss the analysis of the limited data available, and describe the one-year campaign to remove the haze from the camera. After several long conservative heating activities and interim analysis of their results, the contamination problem as measured by the camera's point spread function was essentially back to pre-anomaly size and at a point where there would be more risk to continue. We stress the importance of the flexibility of operations and instrument design, the need to do early in-flight instrument calibration, and continual monitoring of instrument performance.

  12. Initial Results of 3D Topographic Mapping Using Lunar Reconnaissance Orbiter Camera (LROC) Stereo Imagery

    NASA Astrophysics Data System (ADS)

    Li, R.; Oberst, J.; McEwen, A. S.; Archinal, B. A.; Beyer, R. A.; Thomas, P. C.; Chen, Y.; Hwangbo, J.; Lawver, J. D.; Scholten, F.; Mattson, S. S.; Howington-Kraus, A. E.; Robinson, M. S.

    2009-12-01

    The Lunar Reconnaissance Orbiter (LRO), launched June 18, 2009, carries the Lunar Reconnaissance Orbiter Camera (LROC) as one of seven remote sensing instruments on board. The camera system is equipped with a Wide Angle Camera (WAC) and two Narrow Angle Cameras (NAC) for systematic lunar surface mapping and detailed site characterization for potential landing site selection and resource identification. The LROC WAC is a pushframe camera with five 14-line by 704-sample framelets for visible light bands and two 16-line by 512-sample (summed 4x to 4 by 128) UV bands. The WAC can also acquire monochrome images with a 14-line by 1024-sample format. At the nominal 50-km orbit the visible bands ground scale is 75-m/pixel and the UV 383-m/pixel. Overlapping WAC images from adjacent orbits can be used to map topography at a scale of a few hundred meters. The two panchromatic NAC cameras are pushbroom imaging sensors each with a Cassegrain telescope of a 700-mm focal length. The two NAC cameras are aligned with a small overlap in the cross-track direction so that they cover a 5-km swath with a combined field-of-view (FOV) of 5.6°. At an altitude of 50-km, the NAC can provide panchromatic images from its 5,000-pixel linear CCD at a ground scale of 0.5-m/pixel. Calibration of the cameras was performed by using precision collimator measurements to determine the camera principal points and radial lens distortion. The orientation of the two NAC cameras is estimated by a boresight calibration using double and triple overlapping NAC images of the lunar surface. The resulting calibration results are incorporated into a photogrammetric bundle adjustment (BA), which models the LROC camera imaging geometry, in order to refine the exterior orientation (EO) parameters initially retrieved from the SPICE kernels. Consequently, the improved EO parameters can significantly enhance the quality of topographic products derived from LROC NAC imagery. In addition, an analysis of the spacecraft jitter effect is performed by measuring lunar surface features in the NAC CCD overlapping strip in the image space and object space. Topographic and cartographic data processing results and products derived from LROC NAC and WAC stereo imagery using different software systems from several participating institutions of the LROC team will be presented, including results of calibration, bundle adjustment, jitter analysis, DEM, orthophoto, and cartographic maps.

  13. 7. VAL CONCRETE 'A' FRAME STRUCTURE SHOWING CAMERA TOWER ARCHED ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. VAL CONCRETE 'A' FRAME STRUCTURE SHOWING CAMERA TOWER ARCHED OPENING FOR ROADWAY AND COUNTERWEIGHT SLOPE TAKEN FROM RESERVOIR LOOKING WEST. - Variable Angle Launcher Complex, Variable Angle Launcher, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  14. 11. VAL, DETAIL OF CAMERA TOWER AND THE TOP OF ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    11. VAL, DETAIL OF CAMERA TOWER AND THE TOP OF CONCRETE 'A' FRAME STRUCTURE LOOKING NORTH. - Variable Angle Launcher Complex, Variable Angle Launcher, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  15. 2. VAL CONCRETE 'A' FRAME STRUCTURE SHOWING CAMERA TOWER, PROJECTILE ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. VAL CONCRETE 'A' FRAME STRUCTURE SHOWING CAMERA TOWER, PROJECTILE LOADING DECK AND BREECH END OF LAUNCHER BRIDGE LOOKING SOUTH. - Variable Angle Launcher Complex, Variable Angle Launcher, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  16. Calibration and Epipolar Geometry of Generic Heterogenous Camera Systems

    NASA Astrophysics Data System (ADS)

    Luber, A.; Rueß, D.; Manthey, K.; Reulke, R.

    2012-07-01

    The application of perspective camera systems in photogrammetry and computer vision is state of the art. In recent years non-perspective and especially omnidirectional camera systems were increasingly used in close-range photogrammetry tasks. In general, the perspective camera model, i.e. the pinhole model, cannot be applied when using non-perspective camera systems. However, several camera models for different omnidirectional camera systems are proposed in the literature. Using different types of cameras in a heterogeneous camera system may lead to an advantageous combination. The advantages of different camera systems, e.g. field of view and resolution, result in a new enhanced camera system. If these different kinds of cameras can be modeled using a unified camera model, the total calibration process can be simplified. Sometimes it is not possible to give the specific camera model in advance. In these cases a generic approach is helpful. Furthermore, a simple stereo reconstruction becomes possible using a fisheye and a perspective camera, for example. In this paper camera models for perspective, wide-angle and omnidirectional camera systems are evaluated. The crucial initialization of the model's parameters is conducted using a generic method that is independent of the particular camera system. The accuracy of this generic camera calibration approach is evaluated by calibration of a dozen real camera systems. It will be shown that a unified method of modeling, parameter approximation and calibration of interior and exterior orientation can be applied to derive 3D object data.

  17. Calibration of the Lunar Reconnaissance Orbiter Camera

    NASA Astrophysics Data System (ADS)

    Tschimmel, M.; Robinson, M. S.; Humm, D. C.; Denevi, B. W.; Lawrence, S. J.; Brylow, S.; Ravine, M.; Ghaemi, T.

    2008-12-01

    The Lunar Reconnaissance Orbiter Camera (LROC) onboard the NASA Lunar Reconnaissance Orbiter (LRO) spacecraft consists of three cameras: the Wide-Angle Camera (WAC) and two identical Narrow Angle Cameras (NAC-L, NAC-R). The WAC is a push-frame imager with 5 visible wavelength filters (415 to 680 nm) at a spatial resolution of 100 m/pixel and 2 UV filters (315 and 360 nm) with a resolution of 400 m/pixel. In addition to the multicolor imaging the WAC can operate in monochrome mode to provide a global large-incidence-angle basemap and a time-lapse movie of the illumination conditions at both poles. The WAC has a highly linear response, a read noise of 72 e- and a full well capacity of 47,200 e-. The signal-to-noise ratio in each band is 140 in the worst case. There are no out-of-band leaks and the spectral response of each filter is well characterized. Each NAC is a monochrome pushbroom scanner, providing images with a resolution of 50 cm/pixel from a 50-km orbit. A single NAC image has a swath width of 2.5 km and a length of up to 26 km. The NACs are mounted to acquire side-by-side imaging for a combined swath width of 5 km. The NAC is designed to fully characterize future human and robotic landing sites in terms of topography and hazard risks. The North and South poles will be mapped on a 1-meter scale poleward of 85.5° latitude. Stereo coverage can be provided by pointing the NACs off-nadir. The NACs are also highly linear. Read noise is 71 e- for NAC-L and 74 e- for NAC-R and the full well capacity is 248,500 e- for NAC-L and 262,500 e- for NAC-R. The focal lengths are 699.6 mm for NAC-L and 701.6 mm for NAC-R; the system MTF is 28% for NAC-L and 26% for NAC-R. The signal-to-noise ratio is at least 46 (terminator scene) and can be higher than 200 (high sun scene). Both NACs exhibit a straylight feature, which is caused by out-of-field sources and is of a magnitude of 1-3%. However, as this feature is well understood it can be greatly reduced during ground processing. All three cameras were calibrated in the laboratory under ambient conditions. Future thermal vacuum tests will characterize critical behaviors across the full range of lunar operating temperatures. In-flight tests will check for changes in response after launch and provide key data for meeting the requirements of 1% relative and 10% absolute radiometric calibration.
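    Using the detector figures quoted above, a quick back-of-the-envelope check of the signal-to-noise numbers can be made with the usual shot-noise-plus-read-noise estimate, SNR = S / sqrt(S + Nr^2). The signal levels chosen below are assumptions for illustration, and dark current is neglected.

```python
import math

def snr(signal_e, read_noise_e):
    """Shot-noise- and read-noise-limited SNR for a CCD measurement (dark current ignored)."""
    return signal_e / math.sqrt(signal_e + read_noise_e**2)

# NAC-L style read noise (~71 e-); a faint terminator-like signal and a bright scene.
for signal in (4.5e3, 1.0e5):
    print(f"signal {signal:8.0f} e-  ->  SNR ~ {snr(signal, 71.0):5.0f}")
```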

  18. The DRAGO gamma camera

    NASA Astrophysics Data System (ADS)

    Fiorini, C.; Gola, A.; Peloso, R.; Longoni, A.; Lechner, P.; Soltau, H.; Strüder, L.; Ottobrini, L.; Martelli, C.; Lui, R.; Madaschi, L.; Belloli, S.

    2010-04-01

    In this work, we present the results of the experimental characterization of the DRAGO (DRift detector Array-based Gamma camera for Oncology), a detection system developed for high-spatial resolution gamma-ray imaging. This camera is based on a monolithic array of 77 silicon drift detectors (SDDs), with a total active area of 6.7 cm2, coupled to a single 5-mm-thick CsI(Tl) scintillator crystal. The use of an array of SDDs provides a high quantum efficiency for the detection of the scintillation light together with a very low electronics noise. A very compact detection module based on the use of integrated readout circuits was developed. The performances achieved in gamma-ray imaging using this camera are reported here. When imaging a 0.2 mm collimated 57Co source (122 keV) over different points of the active area, a spatial resolution ranging from 0.25 to 0.5 mm was measured. The depth-of-interaction capability of the detector, thanks to the use of a Maximum Likelihood reconstruction algorithm, was also investigated by imaging a collimated beam tilted to an angle of 45° with respect to the scintillator surface. Finally, the imager was characterized with in vivo measurements on mice, in a real preclinical environment.
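    The paper reconstructs the interaction position with a Maximum Likelihood algorithm. As a much simpler stand-in that conveys the idea of estimating position from the light shared across the detector array, the sketch below computes an energy-weighted centroid on a hypothetical 5 x 5 pixel grid; it is not the DRAGO reconstruction.

```python
import numpy as np

def centroid_position(signals, pixel_x, pixel_y):
    """Energy-weighted centroid of per-pixel signals at known pixel coordinates."""
    w = signals / signals.sum()
    return float(w @ pixel_x), float(w @ pixel_y)

# Hypothetical 5x5 array with 1 cm pitch; a Gaussian light spot near (0.3, -0.2) cm.
xs, ys = np.meshgrid(np.arange(-2.0, 3.0), np.arange(-2.0, 3.0))
spot = np.exp(-((xs - 0.3)**2 + (ys + 0.2)**2) / (2 * 0.8**2))
print(centroid_position(spot.ravel(), xs.ravel(), ys.ravel()))  # close to the true spot position
```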

  19. Anisotropic de Gennes Narrowing in Confined Fluids

    NASA Astrophysics Data System (ADS)

    Nygård, Kim; Buitenhuis, Johan; Kagias, Matias; Jefimovs, Konstantins; Zontone, Federico; Chushkin, Yuriy

    2016-04-01

    The collective diffusion of dense fluids in spatial confinement is studied by combining high-energy (21 keV) x-ray photon correlation spectroscopy and small-angle x-ray scattering from colloid-filled microfluidic channels. We find the structural relaxation in confinement to be slower compared to the bulk. The collective dynamics is wave vector dependent, akin to the de Gennes narrowing typically observed in bulk fluids. However, in stark contrast to the bulk, the structure factor and de Gennes narrowing in confinement are anisotropic. These experimental observations are essential in order to develop a microscopic theoretical description of collective diffusion of dense fluids in confined geometries.

  1. Pinhole Cameras: For Science, Art, and Fun!

    ERIC Educational Resources Information Center

    Button, Clare

    2007-01-01

    A pinhole camera is a camera without a lens. A tiny hole replaces the lens, and light is allowed to come in for short amount of time by means of a hand-operated shutter. The pinhole allows only a very narrow beam of light to enter, which reduces confusion due to scattered light on the film. This results in an image that is focused, reversed, and…

  2. Bacterial motion in narrow capillaries

    PubMed Central

    Ping, Liyan; Wasnik, Vaibhav; Emberly, Eldon

    2014-01-01

    Motile bacteria often have to pass through small tortuous pores in soil or tissue of higher organisms. However, their motion in this prevalent type of niche is not fully understood. Here, we modeled it with narrow glass capillaries and identified a critical radius (Rc) for bacterial motion. Near the surface of capillaries narrower than that, the swimming trajectories are helices. In larger capillaries, they swim in distorted circles. Under non-slip condition, the peritrichous Escherichia coli swam in left-handed helices with an Rc of ∼10 μm near glass surface. However, slipping could occur in the fast monotrichous Pseudomonas fluorescens, when a speed threshold was exceeded, and thus both left-handed and right-handed helices were executed in glass capillaries. In the natural non-cylindrical pores, the near-surface trajectories would be spirals and twisted loops. Engaging in such motions reduces the bacterial migration rate. With a given pore size, the run length and the tumbling angle of the bacterium determine the probability and duration of their near-surface motion. Shear flow and chemotaxis potentially enhance it. Based on this observation, the puzzling previous observations on bacterial migration in porous environments can be interpreted. PMID:25764548

  3. The High Resolution Stereo Camera (HRSC) for Mars 96: Results of Outdoor Tests

    NASA Astrophysics Data System (ADS)

    Hauber, E.; Oberst, J.; Flohrer, J.; Sebastian, I.; Zhang, W.; Robinson, C. A.; Jaumann, R.; Neukum, G.

    1996-03-01

    The High Resolution Stereo Camera (HRSC) is one of the principal orbiter payload instruments for the Russian Mars 96 mission to be launched in November this year. The pushbroom scanner is equipped with a single 175 mm lens and 9 linear CCD arrays (5 panchromatic and 4 narrow-band color filters) mounted in parallel, providing nadir, forward, and backward looking viewing conditions for each line, respectively. In orbit, images will be acquired line by line as the spacecraft moves. The goal is to obtain large-scale high-resolution (10-15 m/pixel) multispectral stereo images at different phase angles. In spring 1995, the flight hardware was tested at the prime manufacturer's facility near Lake Constance (Germany) in order to verify the geometric and radiometric performance of the camera as well as the software developed for HRSC ground data processing. It was demonstrated that the instrument and processing software met or exceeded their design goals.

  4. 11. COMPLETED FIXED CAMERA STATION 1700 (BUILDING NO. 42022) LOOKING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    11. COMPLETED FIXED CAMERA STATION 1700 (BUILDING NO. 42022) LOOKING WEST SHOWING WINDOW OPENING FOR CAMERA, March 31, 1948. (Original photograph in possession of Dave Willis, San Diego, California.) - Variable Angle Launcher Complex, Camera Stations, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  5. 5. VAL CAMERA STATION, VIEW FROM INTERIOR OUT OF WINDOW ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    5. VAL CAMERA STATION, VIEW FROM INTERIOR OUT OF WINDOW OPENING TOWARD VAL FIRING RANGE LOOKING EAST WITH VARIABLE ANGLE LAUNCHER IN BACKGROUND. - Variable Angle Launcher Complex, Camera Stations, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  6. Narrowness and Liberality

    ERIC Educational Resources Information Center

    Agresto, John

    2003-01-01

    John Agresto, whose task has been to rebuild the war-ravaged infrastructure of a Middle-Eastern university system, is discouraged to see that narrow expertise is the only goal of education there, to the utter exclusion of intellectual breadth. He comments that, although it is not that bad in the U.S., he feels that doctoral programs as currently…

  7. Omnidirectional underwater camera design and calibration.

    PubMed

    Bosch, Josep; Gracias, Nuno; Ridao, Pere; Ribas, David

    2015-01-01

    This paper presents the development of an underwater omnidirectional multi-camera system (OMS) based on a commercially available six-camera system, originally designed for land applications. A full calibration method is presented for the estimation of both the intrinsic and extrinsic parameters, which is able to cope with wide-angle lenses and non-overlapping cameras simultaneously. This method is valid for any OMS in both land or water applications. For underwater use, a customized housing is required, which often leads to strong image distortion due to refraction among the different media. This phenomenon makes the basic pinhole camera model invalid for underwater cameras, especially when using wide-angle lenses, and requires the explicit modeling of the individual optical rays. To address this problem, a ray tracing approach has been adopted to create a field-of-view (FOV) simulator for underwater cameras. The simulator allows for the testing of different housing geometries and optics for the cameras to ensure complete hemisphere coverage in underwater operation. This paper describes the design and testing of a compact custom housing for a commercial off-the-shelf OMS camera (Ladybug 3) and presents the first results of its use. A proposed three-stage calibration process allows for the estimation of all of the relevant camera parameters. Experimental results are presented, which illustrate the performance of the calibration method and validate the approach. PMID:25774707

  8. Omnidirectional Underwater Camera Design and Calibration

    PubMed Central

    Bosch, Josep; Gracias, Nuno; Ridao, Pere; Ribas, David

    2015-01-01

    This paper presents the development of an underwater omnidirectional multi-camera system (OMS) based on a commercially available six-camera system, originally designed for land applications. A full calibration method is presented for the estimation of both the intrinsic and extrinsic parameters, which is able to cope with wide-angle lenses and non-overlapping cameras simultaneously. This method is valid for any OMS in both land or water applications. For underwater use, a customized housing is required, which often leads to strong image distortion due to refraction among the different media. This phenomenon makes the basic pinhole camera model invalid for underwater cameras, especially when using wide-angle lenses, and requires the explicit modeling of the individual optical rays. To address this problem, a ray tracing approach has been adopted to create a field-of-view (FOV) simulator for underwater cameras. The simulator allows for the testing of different housing geometries and optics for the cameras to ensure complete hemisphere coverage in underwater operation. This paper describes the design and testing of a compact custom housing for a commercial off-the-shelf OMS camera (Ladybug 3) and presents the first results of its use. A proposed three-stage calibration process allows for the estimation of all of the relevant camera parameters. Experimental results are presented, which illustrate the performance of the calibration method and validate the approach. PMID:25774707
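    The refraction problem both records describe can be seen with a direct application of Snell's law at a flat port: the effective viewing angle in water differs from the in-air angle, which is why a plain pinhole model breaks down underwater. The refractive indices and angles below are example values only, not parameters from the paper.

```python
import math

def refract_angle(theta_in_rad, n_in, n_out):
    """Snell's law at a flat interface: n_in * sin(theta_in) = n_out * sin(theta_out)."""
    s = n_in * math.sin(theta_in_rad) / n_out
    if abs(s) > 1.0:
        raise ValueError("total internal reflection")
    return math.asin(s)

n_air, n_glass, n_water = 1.000, 1.52, 1.333
for deg in (10, 25, 40):
    t0 = math.radians(deg)
    t1 = refract_angle(t0, n_air, n_glass)     # air -> port glass
    t2 = refract_angle(t1, n_glass, n_water)   # port glass -> water
    print(f"{deg:2d} deg in air  ->  {math.degrees(t2):5.2f} deg in water")
```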

  9. Experience with duplex bearings in narrow angle oscillating applications

    NASA Technical Reports Server (NTRS)

    Phinney, D. D.; Pollard, C. L.; Hinricks, J. T.

    1988-01-01

    Duplex ball bearings are matched pairs on which the abutting faces of the rings have been accurately ground so that when the rings are clamped together, a controlled amount of interference (preload) exists across the balls. These bearings are vulnerable to radial temperature gradients, blocking in oscillation and increased sensitivity to contamination. These conditions decrease the service life of these bearings. It was decided that an accelerated thermal vacuum life test should be conducted. The test apparatus and results are described and the rationale is presented for reducing a multiyear life test on oil lubricated bearings to less than a year.

  10. Laboratory calibration and characterization of video cameras

    NASA Technical Reports Server (NTRS)

    Burner, A. W.; Snow, W. L.; Shortis, M. R.; Goad, W. K.

    1990-01-01

    Some techniques for laboratory calibration and characterization of video cameras used with frame grabber boards are presented. A laser-illuminated displaced reticle technique (with camera lens removed) is used to determine the camera/grabber effective horizontal and vertical pixel spacing as well as the angle of nonperpendicularity of the axes. The principal point of autocollimation and point of symmetry are found by illuminating the camera with an unexpanded laser beam, either aligned with the sensor or lens. Lens distortion and the principal distance are determined from images of a calibration plate suitably aligned with the camera. Calibration and characterization results for several video cameras are presented. Differences between these laboratory techniques and test range and plumb line calibration are noted.

  11. Laboratory Calibration and Characterization of Video Cameras

    NASA Technical Reports Server (NTRS)

    Burner, A. W.; Snow, W. L.; Shortis, M. R.; Goad, W. K.

    1989-01-01

    Some techniques for laboratory calibration and characterization of video cameras used with frame grabber boards are presented. A laser-illuminated displaced reticle technique (with camera lens removed) is used to determine the camera/grabber effective horizontal and vertical pixel spacing as well as the angle of non-perpendicularity of the axes. The principal point of autocollimation and point of symmetry are found by illuminating the camera with an unexpanded laser beam, either aligned with the sensor or lens. Lens distortion and the principal distance are determined from images of a calibration plate suitably aligned with the camera. Calibration and characterization results for several video cameras are presented. Differences between these laboratory techniques and test range and plumb line calibration are noted.

  12. Caught on Camera.

    ERIC Educational Resources Information Center

    Milshtein, Amy

    2002-01-01

    Describes the benefits of and rules to be followed when using surveillance cameras for school security. Discusses various camera models, including indoor and outdoor fixed position cameras, pan-tilt zoom cameras, and pinhole-lens cameras for covert surveillance. (EV)

  13. Automatic commanding of the Mars Observer Camera

    NASA Technical Reports Server (NTRS)

    Caplinger, Michael

    1994-01-01

    Mars Observer, launched in September 1992, was intended to be a 'survey-type' mission that acquired global coverage of Mars from a low, circular, near-polar orbit during an entire Martian year. As such, most of its instruments had fixed data rates, wide fields of view, and relatively low resolution, with fairly limited requirements for commanding. An exception is the Mars Observer Camera, or MOC. The MOC consists of a two-color Wide Angle (WA) system that can acquire both global images at low resolution (7.5 km/pixel) and regional images at commandable resolutions up to 250 m/pixel. Complementing the WA is the Narrow Angle (NA) system, that can acquire images at 8 resolutions from 12 m/pixel to 1.5 m/pixel, with a maximum crosstrack dimension of 3 km. The MOC also provides various forms of data compression (both lossless and lossy), and is designed to work at data rates from 700 bits per second (bps) to over 80k bps. Because of this flexibility, developing MOC command sequences is much more difficult than the routine mode-changing that characterizes other instrument operations. Although the MOC cannot be pointed (the spacecraft is fixed nadir-pointing and has no scan platform), the timing, downlink stream allocation, compression type and parameters, and image dimensions of each image must be commanded from the ground, subject to the constraints inherent in the MOC and the spacecraft. To minimize the need for a large operations staff, the entire command generation process has been automated within the MOC Ground Data System. Following the loss of the Mars Observer spacecraft in August 1993, NASA intends to launch a new spacecraft, Mars Global Surveyor (MGS), in late 1996. This spacecraft will carry the MOC flight spare (MOC 2). The MOC 2 operations plan will be largely identical to that developed for MOC, and all of the algorithms described here are applicable to it.

  14. Readout electronics of physics of accelerating universe camera

    NASA Astrophysics Data System (ADS)

    de Vicente, Juan; Castilla, Javier; Jiménez, Jorge; Cardiel-Sas, L.; Illa, José M.

    2014-08-01

    The Physics of Accelerating Universe Camera (PAUCam) is a new camera for dark energy studies that will be installed in the William Herschel telescope. The main characteristic of the camera is the capacity for high precision photometric redshift measurement. The camera is composed of eighteen Hamamatsu Photonics CCDs providing a wide field of view covering a diameter of one degree. Unlike the common five optical filters of other similar surveys, PAUCam has forty optical narrow band filters which will provide higher resolution in photometric redshifts. In this paper a general description of the electronics of the camera and its status is presented.

  15. System Synchronizes Recordings from Separated Video Cameras

    NASA Technical Reports Server (NTRS)

    Nail, William; Nail, William L.; Nail, Jasper M.; Le, Doung T.

    2009-01-01

    A system of electronic hardware and software for synchronizing recordings from multiple, physically separated video cameras is being developed, primarily for use in multiple-look-angle video production. The system, the time code used in the system, and the underlying method of synchronization upon which the design of the system is based are denoted generally by the term "Geo-TimeCode™." The system is embodied mostly in compact, lightweight, portable units (see figure) denoted video time-code units (VTUs) - one VTU for each video camera. The system is scalable in that any number of camera recordings can be synchronized. The estimated retail price per unit would be about $350 (in 2006 dollars). The need for this or another synchronization system external to video cameras arises because most video cameras do not include internal means for maintaining synchronization with other video cameras. Unlike prior video-camera-synchronization systems, this system does not depend on continuous cable or radio links between cameras (however, it does depend on occasional cable links lasting a few seconds). Also, whereas the time codes used in prior video-camera-synchronization systems typically repeat after 24 hours, the time code used in this system does not repeat for slightly more than 136 years; hence, this system is much better suited for long-term deployment of multiple cameras.
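    The "slightly more than 136 years" figure quoted above is consistent with a 32-bit seconds counter; the quick check below assumes such a counter purely for illustration of the arithmetic.

```python
# A 32-bit unsigned seconds counter rolls over after 2**32 seconds.
SECONDS_PER_YEAR = 365.25 * 24 * 3600
print(2**32 / SECONDS_PER_YEAR)   # ~136.1 years, versus 24 h for a typical SMPTE time code
```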

  16. Angle performance on optima MDxt

    SciTech Connect

    David, Jonathan; Kamenitsa, Dennis

    2012-11-06

    Angle control on medium current implanters is important due to the high angle-sensitivity of typical medium current implants, such as halo implants. On the Optima MDxt, beam-to-wafer angles are controlled in both the horizontal and vertical directions. In the horizontal direction, the beam angle is measured through six narrow slits, and any angle adjustment is made by electrostatically steering the beam, while cross-wafer beam parallelism is adjusted by changing the focus of the electrostatic parallelizing lens (P-lens). In the vertical direction, the beam angle is measured through a high aspect ratio mask, and any angle adjustment is made by slightly tilting the wafer platen prior to implant. A variety of tests were run to measure the accuracy and repeatability of Optima MDxt's angle control. SIMS profiles of a high energy, channeling sensitive condition show both the cross-wafer angle uniformity, along with the small-angle resolution of the system. Angle repeatability was quantified by running a channeling sensitive implant as a regular monitor over a seven month period and measuring the sheet resistance-to-angle sensitivity. Even though crystal cut error was not controlled for in this case, when attributing all Rs variation to angle changes, the overall angle repeatability was measured as 0.16° (1σ). A separate angle repeatability test involved running a series of V-curve tests over a four month period using low crystal cut wafers selected from the same boule. The results of this test showed the angle repeatability to be <0.1° (1σ).

  17. Characterization of previously unidentified lunar pyroclastic deposits using Lunar Reconnaissance Orbiter Camera (LROC) data

    USGS Publications Warehouse

    Gustafson, J. Olaf; Bell, James F.; Gaddis, Lisa R.; Hawke, B. Ray; Giguere, Thomas A.

    2012-01-01

    We used a Lunar Reconnaissance Orbiter Camera (LROC) global monochrome Wide-angle Camera (WAC) mosaic to conduct a survey of the Moon to search for previously unidentified pyroclastic deposits. Promising locations were examined in detail using LROC multispectral WAC mosaics, high-resolution LROC Narrow Angle Camera (NAC) images, and Clementine multispectral (ultraviolet-visible or UVVIS) data. Out of 47 potential deposits chosen for closer examination, 12 were selected as probable newly identified pyroclastic deposits. Potential pyroclastic deposits were generally found in settings similar to previously identified deposits, including areas within or near mare deposits adjacent to highlands, within floor-fractured craters, and along fissures in mare deposits. However, a significant new finding is the discovery of localized pyroclastic deposits within floor-fractured craters Anderson E and F on the lunar farside, isolated from other known similar deposits. Our search confirms that most major regional and localized low-albedo pyroclastic deposits have been identified on the Moon down to ~100 m/pix resolution, and that additional newly identified deposits are likely to be either isolated small deposits or additional portions of discontinuous, patchy deposits.

  18. Pre-hibernation performances of the OSIRIS cameras onboard the Rosetta spacecraft

    NASA Astrophysics Data System (ADS)

    Magrin, S.; La Forgia, F.; Da Deppo, V.; Lazzarin, M.; Bertini, I.; Ferri, F.; Pajola, M.; Barbieri, M.; Naletto, G.; Barbieri, C.; Tubiana, C.; Küppers, M.; Fornasier, S.; Jorda, L.; Sierks, H.

    2015-02-01

    Context. The ESA cometary mission Rosetta was launched in 2004. In the past years and until the spacecraft hibernation in June 2011, the two cameras of the OSIRIS imaging system (Narrow Angle and Wide Angle Camera, NAC and WAC) observed many different sources. On 20 January 2014 the spacecraft successfully exited hibernation to start observing the primary scientific target of the mission, comet 67P/Churyumov-Gerasimenko. Aims: A study of the past performance of the cameras is now mandatory to determine whether the system has remained stable over time and to derive, if necessary, additional analysis methods for the future precise calibration of the cometary data. Methods: The instrumental responses and filter passbands were used to estimate the efficiency of the system. A comparison with acquired images of specific calibration stars was made, and a refined photometric calibration was computed, both for the absolute flux and for the reflectivity of small bodies of the solar system. Results: We found the instrumental performance to be stable within ±1.5% from 2007 to 2010, with no evidence of an aging effect on the optics or detectors. The efficiency of the instrumentation is found to be as expected in the visible range, but lower than expected in the UV and IR range. A photometric calibration implementation was discussed for the two cameras. Conclusions: The calibration derived from pre-hibernation phases of the mission will be checked as soon as possible after the awakening of OSIRIS and will be continuously monitored until the end of the mission in December 2015. A list of additional calibration sources has been determined that are to be observed during the forthcoming phases of the mission to ensure a better coverage across the wavelength range of the cameras and to study the possible dust contamination of the optics.

  19. Multispectral calibration to enhance the metrology performance of C-mount camera systems

    NASA Astrophysics Data System (ADS)

    Robson, S.; MacDonald, L.; Kyle, S. A.; Shortis, M. R.

    2014-06-01

    Low cost monochrome camera systems based on CMOS sensors and C-mount lenses have been successfully applied to a wide variety of metrology tasks. For high accuracy work such cameras are typically equipped with ring lights to image retro-reflective targets as high contrast image features. Whilst algorithms for target image measurement and lens modelling are highly advanced, including separate RGB channel lens distortion correction, target image circularity compensation and a wide variety of detection and centroiding approaches, less effort has been directed towards optimising physical target image quality by considering optical performance in narrow wavelength bands. This paper describes an initial investigation to assess the effect of wavelength on camera calibration parameters for two different camera bodies and the same `C-mount' wide angle lens. Results demonstrate the expected strong influence on principal distance, radial and tangential distortion, and also highlight possible trends in principal point, orthogonality and affinity parameters which are close to the parameter estimation noise level from the strong convergent self-calibrating image networks.
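
    The parameters investigated (principal distance, principal point, radial and tangential distortion, affinity/orthogonality) map onto the familiar Brown-style lens model; the sketch below shows how a per-wavelength-band distortion set could be applied. Parameter names are ours rather than the authors', and principal distance and affinity would enter the projection step, which is not shown:

```python
import numpy as np

def apply_distortion(xy_mm, band_params):
    """Brown-style radial (k1, k2, k3) and tangential (p1, p2) distortion about the
    principal point (x0, y0); band_params could be calibrated separately for each
    narrow wavelength band, which is the effect the study quantifies."""
    x0, y0 = band_params["x0"], band_params["y0"]
    k1, k2, k3 = band_params["k1"], band_params["k2"], band_params["k3"]
    p1, p2 = band_params["p1"], band_params["p2"]
    x, y = xy_mm[:, 0] - x0, xy_mm[:, 1] - y0
    r2 = x**2 + y**2
    radial = k1 * r2 + k2 * r2**2 + k3 * r2**3
    dx = x * radial + p1 * (r2 + 2 * x**2) + 2 * p2 * x * y
    dy = y * radial + p2 * (r2 + 2 * y**2) + 2 * p1 * x * y
    return np.column_stack([x + dx + x0, y + dy + y0])
```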

  20. Tunable compound eye cameras

    NASA Astrophysics Data System (ADS)

    Pätz, Daniel; Leopold, Steffen; Knöbber, Fabian; Sinzinger, Stefan; Hoffmann, Martin; Ambacher, Oliver

    2010-05-01

    We present design and realization concepts for thin compound eye cameras with enhanced optical functionality. The systems are based on facets with individually tunable focus lengths and viewing angles for scanning of the object space. The active lens elements are made of aluminum nitride (AlN)/nanocrystalline diamond (NCD) membranes. This material system allows slow thermally actuated elements with a large deformation range as well as fast piezoelectric elements with a smaller deformation range. Due to the extreme mechanical stability of these materials, we are able to realize microoptical components with optimum surface qualities as well as an excellent long-term stability. We use facets of microlenses with 1 mm in diameter and a tunable focusing power to compensate for the focus shift for different viewing angles during the scanning procedure. The beam deflection for scanning is realized either by laterally shifting spherical elements or by a tunable microprism with reduced aberrations. For both actuators we present a design, fabrication concept and first experimental results.

  1. Determining Camera Gain in Room Temperature Cameras

    SciTech Connect

    Joshua Cogliati

    2010-12-01

    James R. Janesick provides a method for determining the amplification of a CCD or CMOS camera when only access to the raw images is provided. However, the equation that is provided ignores the contribution of dark current. For CCD or CMOS cameras that are cooled well below room temperature, this is not a problem, however, the technique needs adjustment for use with room temperature cameras. This article describes the adjustment made to the equation, and a test of this method.
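
    As a sketch of the kind of adjustment described, the photon-transfer gain estimate can be corrected by differencing matched dark frames so that the dark-current signal and its shot noise drop out of both the mean and the variance; this illustrates the principle and is not the article's exact procedure:

```python
import numpy as np

def camera_gain(flat1, flat2, dark1, dark2):
    """Photon-transfer gain estimate (e-/DN) corrected for dark current.

    flat1, flat2: two identically exposed, uniformly illuminated frames (DN, float)
    dark1, dark2: two frames with the same exposure time but no light
    Differencing each pair removes fixed-pattern noise; subtracting the dark-pair
    statistics removes the dark-current contribution that matters at room temperature.
    """
    signal = 0.5 * (flat1 + flat2).mean() - 0.5 * (dark1 + dark2).mean()
    var_flat = np.var(flat1 - flat2) / 2.0   # temporal variance per illuminated frame
    var_dark = np.var(dark1 - dark2) / 2.0   # read noise + dark shot noise
    shot_var = var_flat - var_dark           # photon shot noise only
    return signal / shot_var                 # gain in electrons per DN
```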

  2. Anisotropic de Gennes Narrowing in Confined Fluids.

    PubMed

    Nygård, Kim; Buitenhuis, Johan; Kagias, Matias; Jefimovs, Konstantins; Zontone, Federico; Chushkin, Yuriy

    2016-04-22

    The collective diffusion of dense fluids in spatial confinement is studied by combining high-energy (21 keV) x-ray photon correlation spectroscopy and small-angle x-ray scattering from colloid-filled microfluidic channels. We find the structural relaxation in confinement to be slower compared to the bulk. The collective dynamics is wave vector dependent, akin to the de Gennes narrowing typically observed in bulk fluids. However, in stark contrast to the bulk, the structure factor and de Gennes narrowing in confinement are anisotropic. These experimental observations are essential in order to develop a microscopic theoretical description of collective diffusion of dense fluids in confined geometries. PMID:27152823
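
    In its simplest form (omitting the hydrodynamic factor a full treatment would include), de Gennes narrowing ties the collective relaxation rate probed by XPCS to the structure factor, so an anisotropic S(q) in confinement directly implies an anisotropic relaxation rate:

```latex
% Simplest (hydrodynamics-free) statement of de Gennes narrowing:
% relaxation is slowest where the structure factor peaks.
\Gamma(\mathbf{q}) \;\simeq\; \frac{D_0\, q^{2}}{S(\mathbf{q})},
\qquad
f(\mathbf{q},t) \;=\; e^{-\Gamma(\mathbf{q})\,t}.
% In confinement S(\mathbf{q}) depends on the direction of q, not only on |q|,
% so \Gamma(\mathbf{q}) inherits the anisotropy reported above.
```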

  3. Two-Camera Acquisition and Tracking of a Flying Target

    NASA Technical Reports Server (NTRS)

    Biswas, Abhijit; Assad, Christopher; Kovalik, Joseph M.; Pain, Bedabrata; Wrigley, Chris J.; Twiss, Peter

    2008-01-01

    A method and apparatus have been developed to solve the problem of automated acquisition and tracking, from a location on the ground, of a luminous moving target in the sky. The method involves the use of two electronic cameras: (1) a stationary camera having a wide field of view, positioned and oriented to image the entire sky; and (2) a camera that has a much narrower field of view (a few degrees wide) and is mounted on a two-axis gimbal. The wide-field-of-view stationary camera is used to initially identify the target against the background sky. So that the approximate position of the target can be determined, pixel locations on the image-detector plane in the stationary camera are calibrated with respect to azimuth and elevation. The approximate target position is used to initially aim the gimballed narrow-field-of-view camera in the approximate direction of the target. Next, the narrow-field-of-view camera locks onto the target image, and thereafter the gimbals are actuated as needed to maintain lock and thereby track the target with precision greater than that attainable by use of the stationary camera.
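
    A minimal sketch of the acquisition logic: the wide-field frame yields an approximate az/el through a calibration table, the gimbal is slewed there, and pixel errors from the narrow-field camera are then turned into pan/tilt corrections. Array names, threshold, and gains are illustrative, not taken from the brief:

```python
import numpy as np

def pixel_to_azel(u, v, cal):
    """Approximate azimuth/elevation (deg) of a wide-field pixel, using calibration
    tables cal["az"][v, u] and cal["el"][v, u] built as described above."""
    return cal["az"][v, u], cal["el"][v, u]

def acquire(frame, cal, threshold=200):
    """Locate the brightest above-threshold pixel (the luminous target) and return
    the az/el at which to initially aim the gimballed narrow-field camera."""
    v, u = np.unravel_index(np.argmax(frame), frame.shape)
    return None if frame[v, u] < threshold else pixel_to_azel(u, v, cal)

def track_step(err_u, err_v, deg_per_pixel, gain=0.5):
    """Convert the target's pixel offset from the narrow-field image centre into
    small pan/tilt corrections that maintain lock."""
    return gain * err_u * deg_per_pixel, gain * err_v * deg_per_pixel
```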

  4. Full Stokes polarization imaging camera

    NASA Astrophysics Data System (ADS)

    Vedel, M.; Breugnot, S.; Lechocinski, N.

    2011-10-01

    Objective and background: We present a new version of Bossa Nova Technologies' passive polarization imaging camera. The previous version performed live measurement of the Linear Stokes parameters (S0, S1, S2) and their derived quantities. This new version presented in this paper performs live measurement of Full Stokes parameters, i.e. including the fourth parameter S3 related to the amount of circular polarization. Dedicated software was developed to provide live images of any Stokes related parameters such as the Degree Of Linear Polarization (DOLP), the Degree Of Circular Polarization (DOCP), the Angle Of Polarization (AOP). Results: We first give a brief description of the camera and its technology. It is a Division Of Time Polarimeter using a custom ferroelectric liquid crystal cell. A description of the method used to calculate the Data Reduction Matrix (DRM) [5,9] linking intensity measurements and the Stokes parameters is given. The calibration was developed in order to maximize the condition number of the DRM. It also allows very efficient post processing of the images acquired. Complete evaluation of the precision of standard polarization parameters is described. We further present the standard features of the dedicated software that was developed to operate the camera. It provides live images of the Stokes vector components and the usual associated parameters. Finally some tests already conducted are presented. These include indoor laboratory and outdoor measurements. This new camera will be a useful tool for many applications such as biomedical, remote sensing, metrology, material studies, and others.
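
    The DRM step reduces to a pseudo-inverse of the calibrated instrument matrix, and the polarization parameters follow from their standard definitions; a sketch under those assumptions (the matrix A must come from the camera's own calibration):

```python
import numpy as np

def stokes_from_intensities(I, A):
    """Recover the Stokes vector from N >= 4 intensity measurements.
    A is the N x 4 instrument matrix whose rows are the analyzer states of the
    ferroelectric liquid-crystal polarimeter; pinv(A) plays the role of the DRM."""
    return np.linalg.pinv(A) @ np.asarray(I)      # S = (S0, S1, S2, S3)

def polarization_parameters(S):
    S0, S1, S2, S3 = S
    dolp = np.hypot(S1, S2) / S0                  # degree of linear polarization
    docp = abs(S3) / S0                           # degree of circular polarization
    aop = 0.5 * np.degrees(np.arctan2(S2, S1))    # angle of polarization (deg)
    return dolp, docp, aop
```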

  5. Synchronized Mid-Infrared Beam Characterization of Narrow Gap Semiconductors

    NASA Astrophysics Data System (ADS)

    Olafsen, L. J.; Eaves, I. K.; Olafsen, J. S.

    2011-12-01

    The near- and mid-infrared output from the idler of an optical parametric oscillator (OPO) and from antimonide-based narrow gap semiconductors is imaged using an infrared camera that yields 30 Hz (interlaced) and 60 Hz (deinterlaced) images. These images are collected in free-running, synchronized, and slow phase slip modes utilizing hardware and software platforms for synchronization of the camera with the 10 Hz, 4 ns output from the OPO. This method is useful for analyzing mid-infrared semiconductor output as well as correlating that output with characteristics of the optical pump used to stimulate mid-infrared emission.

  6. Evaluating intensified camera systems

    SciTech Connect

    S. A. Baker

    2000-06-30

    This paper describes image evaluation techniques used to standardize camera system characterizations. The author's group is involved with building and fielding several types of camera systems. Camera types include gated intensified cameras, multi-frame cameras, and streak cameras. Applications range from X-ray radiography to visible and infrared imaging. Key areas of performance include sensitivity, noise, and resolution. This team has developed an analysis tool, in the form of image processing software, to aid an experimenter in measuring a set of performance metrics for their camera system. These performance parameters are used to identify a camera system's capabilities and limitations while establishing a means for camera system comparisons. The analysis tool is used to evaluate digital images normally recorded with CCD cameras. Electro-optical components provide fast shuttering and/or optical gain to camera systems. Camera systems incorporate a variety of electro-optical components such as microchannel plate (MCP) or proximity focused diode (PFD) image intensifiers; electro-static image tubes; or electron-bombarded (EB) CCDs. It is often valuable to evaluate the performance of an intensified camera in order to determine if a particular system meets experimental requirements.

  7. Vacuum Camera Cooler

    NASA Technical Reports Server (NTRS)

    Laugen, Geoffrey A.

    2011-01-01

    Acquiring cheap, moving video was impossible in a vacuum environment, due to camera overheating. This overheating is brought on by the lack of cooling media in vacuum. A water-jacketed camera cooler enclosure machined and assembled from copper plate and tube has been developed. The camera cooler (see figure) is cup-shaped and cooled by circulating water or nitrogen gas through copper tubing. The camera, a store-bought "spy type," is not designed to work in a vacuum. With some modifications the unit can be thermally connected when mounted in the cup portion of the camera cooler. The thermal conductivity is provided by copper tape between parts of the camera and the cooled enclosure. During initial testing of the demonstration unit, the camera cooler kept the CPU (central processing unit) of this video camera at operating temperature. This development allowed video recording of an in-progress test, within a vacuum environment.

  8. Constrained space camera assembly

    DOEpatents

    Heckendorn, Frank M.; Anderson, Erin K.; Robinson, Casandra W.; Haynes, Harriet B.

    1999-01-01

    A constrained space camera assembly which is intended to be lowered through a hole into a tank, a borehole or another cavity. The assembly includes a generally cylindrical chamber comprising a head and a body and a wiring-carrying conduit extending from the chamber. Means are included in the chamber for rotating the body about the head without breaking an airtight seal formed therebetween. The assembly may be pressurized and accompanied with a pressure sensing means for sensing if a breach has occurred in the assembly. In one embodiment, two cameras, separated from their respective lenses, are installed on a mounting apparatus disposed in the chamber. The mounting apparatus includes means allowing both longitudinal and lateral movement of the cameras. Moving the cameras longitudinally focuses the cameras, and moving the cameras laterally away from one another effectively converges the cameras so that close objects can be viewed. The assembly further includes means for moving lenses of different magnification forward of the cameras.

  9. Novel fundus camera design

    NASA Astrophysics Data System (ADS)

    Dehoog, Edward A.

    A fundus camera is a complex optical system that makes use of the principle of reflex-free indirect ophthalmoscopy to image the retina. Despite being in existence since the early 1900s, little has changed in the design of a fundus camera and there is minimal information about the design principles utilized. Parameters and specifications involved in the design of a fundus camera are determined and their effect on system performance is discussed. Fundus cameras incorporating different design methods are modeled and a performance evaluation based on design parameters is used to determine the effectiveness of each design strategy. By determining the design principles involved in the fundus camera, new cameras can be designed to include specific imaging modalities such as optical coherence tomography, imaging spectroscopy and imaging polarimetry to gather additional information about properties and structure of the retina. Design principles utilized to incorporate such modalities into fundus camera systems are discussed. Design, implementation and testing of a snapshot polarimeter fundus camera are demonstrated.

  10. Making Ceramic Cameras

    ERIC Educational Resources Information Center

    Squibb, Matt

    2009-01-01

    This article describes how to make a clay camera. This idea of creating functional cameras from clay allows students to experience ceramics, photography, and painting all in one unit. (Contains 1 resource and 3 online resources.)

  11. Prediction of Viking lander camera image quality

    NASA Technical Reports Server (NTRS)

    Huck, F. O.; Burcher, E. E.; Jobson, D. J.; Wall, S. D.

    1976-01-01

    Formulations are presented that permit prediction of image quality as a function of camera performance, surface radiance properties, and lighting and viewing geometry. Predictions made for a wide range of surface radiance properties reveal that image quality depends strongly on proper camera dynamic range command and on favorable lighting and viewing geometry. Proper camera dynamic range commands depend mostly on the surface albedo that will be encountered. Favorable lighting and viewing geometries depend mostly on lander orientation with respect to the diurnal sun path over the landing site, and tend to be independent of surface albedo and illumination scattering function. Side lighting with low sun elevation angles (10 to 30 deg) is generally favorable for imaging spatial details and slopes, whereas high sun elevation angles are favorable for measuring spectral reflectances.

  12. Digital Pinhole Camera

    ERIC Educational Resources Information Center

    Lancor, Rachael; Lancor, Brian

    2014-01-01

    In this article we describe how the classic pinhole camera demonstration can be adapted for use with digital cameras. Students can easily explore the effects of the size of the pinhole and its distance from the sensor on exposure time, magnification, and image quality. Instructions for constructing a digital pinhole camera and our method for…
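
    For reference, the quantities the demonstration explores follow from simple pinhole relations; the "optimal" diameter constant used below (1.9) is one common choice, with published criteria ranging from roughly 1.5 to 2:

```python
import math

def pinhole_numbers(d_mm, f_mm, wavelength_nm=550):
    """Working f-number and a diffraction-balanced 'optimal' pinhole diameter for a
    pinhole of diameter d at distance f from the sensor (classroom approximation)."""
    lam_mm = wavelength_nm * 1e-6
    f_number = f_mm / d_mm                       # exposure time scales roughly with f_number**2
    d_optimal = 1.9 * math.sqrt(lam_mm * f_mm)   # smaller: diffraction blur; larger: geometric blur
    return f_number, d_optimal

def magnification(f_mm, subject_distance_mm):
    """Image size / object size for pinhole projection."""
    return f_mm / subject_distance_mm
```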

  13. Evaluating intensified camera systems

    SciTech Connect

    S. A. Baker

    2000-07-01

    This paper describes image evaluation techniques used to standardize camera system characterizations. Key areas of performance include resolution, noise, and sensitivity. This team has developed a set of analysis tools, in the form of image processing software used to evaluate camera calibration data, to aid an experimenter in measuring a set of camera performance metrics. These performance metrics identify capabilities and limitations of the camera system, while establishing a means for comparing camera systems. Analysis software is used to evaluate digital camera images recorded with charge-coupled device (CCD) cameras. Several types of intensified camera systems are used in the high-speed imaging field. Electro-optical components are used to provide precise shuttering or optical gain for a camera system. These components including microchannel plate or proximity focused diode image intensifiers, electro-static image tubes, or electron-bombarded CCDs affect system performance. It is important to quantify camera system performance in order to qualify a system as meeting experimental requirements. The camera evaluation tool is designed to provide side-by-side camera comparison and system modeling information.

  15. 9. COMPLETED ROLLING CAMERA CAR ON RAILROAD TRACK AND BRIDGE ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    9. COMPLETED ROLLING CAMERA CAR ON RAILROAD TRACK AND BRIDGE LOOKING WEST, APRIL 26, 1948. (ORIGINAL PHOTOGRAPH IN POSSESSION OF DAVE WILLIS, SAN DIEGO, CALIFORNIA.) - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  16. 10. CONSTRUCTION OF FIXED CAMERA STATION 1100 (BUILDING NO. 42020) ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    10. CONSTRUCTION OF FIXED CAMERA STATION 1100 (BUILDING NO. 42020) LOOKING NORTHEAST SHOWING CONCRETE FOUNDATION, WOOD FORMWORK AND STEEL REINFORCING, March 26, 1948. (Original photograph in possession of Dave Willis, San Diego, California.) - Variable Angle Launcher Complex, Camera Stations, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  17. Contrail study with ground-based cameras

    NASA Astrophysics Data System (ADS)

    Schumann, U.; Hempel, R.; Flentje, H.; Garhammer, M.; Graf, K.; Kox, S.; Lösslein, H.; Mayer, B.

    2013-12-01

    Photogrammetric methods and analysis results for contrails observed with wide-angle cameras are described. Four cameras of two different types (view angle < 90° or whole-sky imager) at the ground at various positions are used to track contrails and to derive their altitude, width, and horizontal speed. Camera models for both types are described to derive the observation angles for given image coordinates and their inverse. The models are calibrated with sightings of the Sun, the Moon and a few bright stars. The methods are applied and tested in a case study. Four persistent contrails crossing each other, together with a short-lived one, are observed with the cameras. Vertical and horizontal positions of the contrails are determined from the camera images to an accuracy of better than 230 m and horizontal speed to 0.2 m s-1. With this information, the aircraft causing the contrails are identified by comparison to traffic waypoint data. The observations are compared with synthetic camera pictures of contrails simulated with the contrail prediction model CoCiP, a Lagrangian model using air traffic movement data and numerical weather prediction (NWP) data as input. The results provide tests for the NWP and contrail models. The cameras show spreading and thickening contrails, suggesting ice-supersaturation in the ambient air. The ice-supersaturated layer is found thicker and more humid in this case than predicted by the NWP model used. The simulated and observed contrail positions agree up to differences caused by uncertain wind data. The contrail widths, which depend on wake vortex spreading, ambient shear and turbulence, were partly wider than simulated.
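
    The geometric core of the altitude retrieval is triangulating the same contrail point from two calibrated cameras. The sketch below covers only that core (local east/north/up coordinates, ideal azimuth/elevation rays) and omits the camera models and Sun/Moon/star calibration described above:

```python
import numpy as np

def ray(az_deg, el_deg):
    """Unit viewing vector (east, north, up) for an azimuth/elevation observation."""
    az, el = np.radians([az_deg, el_deg])
    return np.array([np.cos(el) * np.sin(az), np.cos(el) * np.cos(az), np.sin(el)])

def triangulate(p1, azel1, p2, azel2):
    """Midpoint of the shortest segment between the two viewing rays from camera
    positions p1 and p2 (metres, ENU); its third component is the contrail altitude."""
    d1, d2 = ray(*azel1), ray(*azel2)
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    w = p1 - p2
    denom = a * c - b * b                      # ~0 only if the two rays are parallel
    t1 = (b * (d2 @ w) - c * (d1 @ w)) / denom
    t2 = (a * (d2 @ w) - b * (d1 @ w)) / denom
    return 0.5 * ((p1 + t1 * d1) + (p2 + t2 * d2))
```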

  18. The nucleus of comet 67P through the eyes of the OSIRIS cameras

    NASA Astrophysics Data System (ADS)

    Guettler, Carsten; Sierks, Holger; Barbieri, Cesare; Lamy, Philippe; Rodrigo, Rafael; Koschny, Detlef; Rickman, Hans; OSIRIS Team; Capaccioni, Fabrizio; Filacchione, Gianrico; Ciarniello, Mauro; Erard, Stephane; Rinaldi, Giovanna; Tosi, Federico

    2015-11-01

    The Rosetta spacecraft has been studying comet 67P/Churyumov-Gerasimenko from a close distance since August 2014. Onboard the spacecraft, the two scientific cameras, the OSIRIS narrow- and the wide-angle camera, are observing the cometary nucleus, its activity, as well as the dust and gas environment. This overview paper will cover OSIRIS science from the early arrival and mapping phase, the PHILAE landing, and the escort phase including the two close fly-bys. With a first characterization of global physical parameters of the nucleus, the OSIRIS cameras also provided the data to reconstruct a 3D shape model of the comet and a division into morphologic sub-units. From observations of near-surface activity, jet-like features can be projected onto the surface and active sources can be correlated with surface features like cliffs, pits, or flat planes. The increase of activity during and after perihelion in August 2015 showed several outbursts, which were seen as strong, collimated jets originating from the southern hemisphere. A comparison of results between different Rosetta instruments will give further insight into the physics of the comet's nucleus and its coma. The OSIRIS and VIRTIS instruments are particularly well suited to support and complement each other. With an overlap in spectral range, one instrument can provide the best spatial resolution while the other is strong in the spectral resolution. A summary of collaborative efforts will be given.

  19. Compact stereo endoscopic camera using microprism arrays.

    PubMed

    Yang, Sung-Pyo; Kim, Jae-Jun; Jang, Kyung-Won; Song, Weon-Kook; Jeong, Ki-Hun

    2016-03-15

    This work reports a microprism array (MPA) based compact stereo endoscopic camera with a single image sensor. The MPAs were monolithically fabricated by using two-step photolithography and geometry-guided resist reflow to form an appropriate prism angle for stereo image pair formation. The fabricated MPAs were transferred onto a glass substrate with a UV curable resin replica by using polydimethylsiloxane (PDMS) replica molding and then successfully integrated in front of a single camera module. The stereo endoscopic camera with MPA splits an image into two stereo images and successfully demonstrates the binocular disparities between the stereo image pairs for objects with different distances. This stereo endoscopic camera can serve as a compact and 3D imaging platform for medical, industrial, or military uses. PMID:26977690
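
    The distance dependence of the disparity follows the generic stereo relation, with the baseline here set by the microprism geometry (numerical values are not given in the abstract):

```latex
% Generic stereo relations: focal length f (pixels), baseline B from the microprism
% pair, disparity d between the two half-images of the same object point.
Z = \frac{f\,B}{d},
\qquad
\Delta Z \approx \frac{Z^{2}}{f\,B}\,\Delta d .
```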

  20. Classroom multispectral imaging using inexpensive digital cameras.

    NASA Astrophysics Data System (ADS)

    Fortes, A. D.

    2007-12-01

    The proliferation of increasingly cheap digital cameras in recent years means that it has become easier to exploit the broad wavelength sensitivity of their CCDs (360 - 1100 nm) for classroom-based teaching. With the right tools, it is possible to open children's eyes to the invisible world of UVA and near-IR radiation either side of our narrow visual band. The camera-filter combinations I describe can be used to explore the world of animal vision, looking for invisible markings on flowers, or in bird plumage, for example. In combination with a basic spectroscope (such as the Project-STAR handheld plastic spectrometer, $25), it is possible to investigate the range of human vision and camera sensitivity, and to explore the atomic and molecular absorption lines from the solar and terrestrial atmospheres. My principal use of the cameras has been to teach multispectral imaging of the kind used to determine remotely the composition of planetary surfaces. A range of camera options, from $50 circuit-board mounted CCDs up to $900 semi-pro infrared camera kits (including mobile phones along the way), and various UV-vis-IR filter options will be presented. Examples of multispectral images taken with these systems are used to illustrate the range of classroom topics that can be covered. Particular attention is given to learning about spectral reflectance curves and comparing images from Earth and Mars taken using the same filter combination that is used on the Mars Rovers.

  1. Rethinking camera user interfaces

    NASA Astrophysics Data System (ADS)

    Brewster, Stephen; McAdam, Christopher; McDonald, James; Maciver, James

    2012-01-01

    Digital cameras and camera phones are now very widely used but there are some issues that affect their use and the quality of the images captured. Many of these issues are due to problems of interaction or feedback from the camera. Modern smartphones have a wide range of sensors, rich feedback mechanisms and lots of processing power. We have developed and evaluated a range of new interaction techniques for cameras and camera phones that improve the picture taking process and allow people to take better pictures first time.

  2. Tower Camera Handbook

    SciTech Connect

    Moudry, D

    2005-01-01

    The tower camera in Barrow provides hourly images of the ground surrounding the tower. These images may be used to determine fractional snow cover as winter arrives, for comparison with the albedo that can be calculated from downward-looking radiometers, as well as to give some indication of present weather. Similarly, during springtime, the camera images show the changes in the ground albedo as the snow melts. The tower images are saved at hourly intervals. In addition, two other cameras, the skydeck camera in Barrow and the piling camera in Atqasuk, show the current conditions at those sites.

  3. Efficient spectral narrowing of a XeCl TEA laser

    NASA Astrophysics Data System (ADS)

    Buffa, R.; Burlamacchi, P.; Salimbeni, R.; Matera, M.

    1983-07-01

    Spectral narrowing of a XeCl TEA laser has been efficiently accomplished by using a grating set at grazing incidence angle in the optical resonator. It has been demonstrated that a sufficiently long-lasting gain allows an intense oscillation to build up with output energy of 50 mJ and linewidth of 0.1 Å.

  4. Automatic camera tracking for remote manipulators

    SciTech Connect

    Stoughton, R.S.; Martin, H.L.; Bentz, R.R.

    1984-04-01

    The problem of automatic camera tracking of mobile objects is addressed with specific reference to remote manipulators and using either fixed or mobile cameras. The technique uses a kinematic approach employing 4 x 4 coordinate transformation matrices to solve for the needed camera PAN and TILT angles. No vision feedback systems are used, as the required input data are obtained entirely from position sensors from the manipulator and the camera-positioning system. All hardware requirements are generally satisfied by currently available remote manipulator systems with a supervisory computer. The system discussed here implements linear plus on/off (bang-bang) closed-loop control with a ±2° deadband. The deadband area is desirable to avoid operator seasickness caused by continuous camera movement. Programming considerations for camera control, including operator interface options, are discussed. The example problem presented is based on an actual implementation using a PDP 11/34 computer, a TeleOperator Systems SM-229 manipulator, and an Oak Ridge National Laboratory (ORNL) camera-positioning system. 3 references, 6 figures, 2 tables.
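
    A minimal sketch of the kinematic idea: the target position, known from the manipulator's position sensors, is transformed into the camera-mount frame with a 4 x 4 homogeneous matrix, pan and tilt follow in closed form, and a deadband suppresses the continuous small motions mentioned above. Frame conventions and rates are assumptions, not taken from the report:

```python
import numpy as np

def pan_tilt_to_target(T_world_cam, target_world):
    """Pan/tilt (deg) that point the camera at a world-frame target, given the 4x4
    homogeneous transform of the camera mount (assumed convention: x forward, z up)."""
    p = np.linalg.inv(T_world_cam) @ np.append(target_world, 1.0)
    x, y, z = p[:3]
    pan = np.degrees(np.arctan2(y, x))
    tilt = np.degrees(np.arctan2(z, np.hypot(x, y)))
    return pan, tilt

def drive(error_deg, deadband=2.0, max_rate=5.0):
    """Linear command with saturation and a +-2 deg deadband so the camera does not
    chase every small error (the 'seasickness' problem noted above). Returns deg/s."""
    if abs(error_deg) <= deadband:
        return 0.0
    return float(np.clip(error_deg, -max_rate, max_rate))
```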

  5. Automatic camera tracking for remote manipulators

    SciTech Connect

    Stoughton, R.S.; Martin, H.L.; Bentz, R.R.

    1984-07-01

    The problem of automatic camera tracking of mobile objects is addressed with specific reference to remote manipulators and using either fixed or mobile cameras. The technique uses a kinematic approach employing 4 x 4 coordinate transformation matrices to solve for the needed camera PAN and TILT angles. No vision feedback systems are used, as the required input data are obtained entirely from position sensors from the manipulator and the camera-positioning system. All hardware requirements are generally satisfied by currently available remote manipulator systems with a supervisory computer. The system discussed here implements linear plus on/off (bang-bang) closed-loop control with a +-2-deg deadband. The deadband area is desirable to avoid operator seasickness caused by continuous camera movement. Programming considerations for camera control, including operator interface options, are discussed. The example problem presented is based on an actual implementation using a PDP 11/34 computer, a TeleOperator Systems SM-229 manipulator, and an Oak Ridge National Laboratory (ORNL) camera-positioning system. 3 references, 6 figures, 2 tables.

  6. Automated Camera Calibration

    NASA Technical Reports Server (NTRS)

    Chen, Siqi; Cheng, Yang; Willson, Reg

    2006-01-01

    Automated Camera Calibration (ACAL) is a computer program that automates the generation of calibration data for camera models used in machine vision systems. Machine vision camera models describe the mapping between points in three-dimensional (3D) space in front of the camera and the corresponding points in two-dimensional (2D) space in the camera's image. Calibrating a camera model requires a set of calibration data containing known 3D-to-2D point correspondences for the given camera system. Generating calibration data typically involves taking images of a calibration target where the 3D locations of the target's fiducial marks are known, and then measuring the 2D locations of the fiducial marks in the images. ACAL automates the analysis of calibration target images and greatly speeds the overall calibration process.
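
    Once ACAL has produced 3D-to-2D correspondences, fitting a camera model is routine. The sketch below uses OpenCV's generic pinhole calibration purely as an illustration; it is not the camera-model family that ACAL itself targets:

```python
import numpy as np
import cv2

def fit_pinhole_model(object_points, image_points, image_size):
    """Fit a pinhole + distortion model to ACAL-style calibration data.

    object_points: list (one per calibration image) of Nx3 arrays of known 3-D
                   fiducial locations; image_points: matching Nx2 pixel measurements.
    Returns the reprojection RMS, intrinsic matrix, and distortion coefficients."""
    rms, K, dist, _rvecs, _tvecs = cv2.calibrateCamera(
        [np.asarray(o, np.float32) for o in object_points],
        [np.asarray(i, np.float32) for i in image_points],
        image_size, None, None)
    return rms, K, dist
```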

  7. Ultra-fast framing camera tube

    DOEpatents

    Kalibjian, Ralph

    1981-01-01

    An electronic framing camera tube features focal plane image dissection and synchronized restoration of the dissected electron line images to form two-dimensional framed images. Ultra-fast framing is performed by first streaking a two-dimensional electron image across a narrow slit, thereby dissecting the two-dimensional electron image into sequential electron line images. The dissected electron line images are then restored into a framed image by a restorer deflector operated synchronously with the dissector deflector. The number of framed images on the tube's viewing screen is equal to the number of dissecting slits in the tube. The distinguishing features of this ultra-fast framing camera tube are the focal plane dissecting slits, and the synchronously-operated restorer deflector which restores the dissected electron line images into a two-dimensional framed image. The framing camera tube can produce image frames having high spatial resolution of optical events in the sub-100 picosecond range.

  8. Single-Camera Panoramic-Imaging Systems

    NASA Technical Reports Server (NTRS)

    Lindner, Jeffrey L.; Gilbert, John

    2007-01-01

    Panoramic detection systems (PDSs) are developmental video monitoring and image-data processing systems that, as their name indicates, acquire panoramic views. More specifically, a PDS acquires images from an approximately cylindrical field of view that surrounds an observation platform. The main subsystems and components of a basic PDS are a charge-coupled- device (CCD) video camera and lens, transfer optics, a panoramic imaging optic, a mounting cylinder, and an image-data-processing computer. The panoramic imaging optic is what makes it possible for the single video camera to image the complete cylindrical field of view; in order to image the same scene without the benefit of the panoramic imaging optic, it would be necessary to use multiple conventional video cameras, which have relatively narrow fields of view.
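
    Conceptually, the image-data-processing computer has to unwrap the annular image formed by the panoramic optic into an azimuth/elevation panorama. A minimal nearest-neighbour sketch, assuming the simplest radius-proportional-to-elevation mapping (a real PDS would substitute the optic's calibrated projection):

```python
import numpy as np

def unwrap_annulus(img, cx, cy, r_inner, r_outer, out_width=2048):
    """Resample the annulus centred at (cx, cy) into a rectangular panorama whose
    columns are azimuth and whose rows run from the outer ring (top) to the inner."""
    out_height = int(r_outer - r_inner)
    theta = np.linspace(0.0, 2.0 * np.pi, out_width, endpoint=False)
    radius = np.linspace(r_outer, r_inner, out_height)
    rr, tt = np.meshgrid(radius, theta, indexing="ij")
    x = np.clip(np.round(cx + rr * np.cos(tt)).astype(int), 0, img.shape[1] - 1)
    y = np.clip(np.round(cy + rr * np.sin(tt)).astype(int), 0, img.shape[0] - 1)
    return img[y, x]
```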

  9. Effects of incidence angle on observations of equilibrium crater diameter

    NASA Astrophysics Data System (ADS)

    Ostrach, L. R.; Denevi, B. W.; Hastings, A.; Koeber, S.; Robinson, M. S.; Thomas, P. C.; Tran, T. N.

    2009-12-01

    Determining the equilibrium crater diameter for a crater population is important in lunar regolith depth estimates as the equilibrium diameter represents the steady-state between the formation of new craters and the removal of older craters [1]. [2] hypothesized that the number of craters identified in an image is dependent on the incidence angle and showed that for three different young mare regions, fewer craters are visible at lower incidence angles, affecting reliable estimates of the equilibrium diameter of the counted crater population. [3] disputed this hypothesis and the presence of an equilibrium crater population in the data from [2]. Testing the hypothesis from [2], we chose four Apollo Metric images of the same area with different incidence angles to examine the effects of resolution on apparent equilibrium diameter estimates. We selected a 100 km2 area centered at 27.3°N, 18.2°W in Mare Imbrium east of Lambert crater with data at 87°, 82°, 71°, and 50° incidence angles, and scan resolutions of 6.6 to 7.6 m/pixel. To compare the craters visible at different illuminations, we resampled the images to 10 m/pixel and employed three individuals to count craters. The cumulative histograms for the four Apollo Metric frames exhibit the effects of different incidence angles on reliably counting craters. Current results show that the crater counts for the 82° incidence angle image are the most consistent between different observers, finding a production function slope of -4.1 and an apparent equilibrium diameter of 200 m. Deviation from the small crater trends (equilibrium population?) and the production function slope observed at 82° incidence is found at the higher (87°) and lower (71°, 50°) incidence angles. We attribute some of this deviation to the effects of incidence angle on crater detection; at crater diameters >~300 m, we find similar production functions, an observation consistent with our identification of these large craters in all four illuminations. However, the small crater trends vary significantly among observations at different illuminations. An important question is whether the small crater slope and rollover we observe are representative of the equilibrium crater population or whether these observations are due to resolution limits of the images, a too-small count area, or shadow effects (e.g., loss of small craters in the shadows of larger craters). To test if the observed rollover in the cumulative histograms is due to resolution effects or to the observation of the equilibrium crater population, we will use substantially higher resolution images. Images from the Lunar Reconnaissance Orbiter Camera Narrow Angle Camera (resolution increasing during the nominal mission from ~1.5 to ~0.5 m/pixel) at incidence angles ranging from 50° to 87°, focusing on higher incidences (70° to 87°), will be used to maximize the identification of small craters. [1] L. A. Soderblom (1970) JGR, 75, 2655. [2] B. B. Wilcox et al. (2005) Meteoritics & Plan. Sci., 40, 695. [3] V. R. Oberbeck (2008) Meteoritics & Plan. Sci., 43, 815.
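
    The quantities being compared, cumulative size-frequency distributions and production-function slopes, can be computed as below; the fitting range and slope convention are illustrative choices, not the authors' pipeline:

```python
import numpy as np

def cumulative_sfd(diameters_m, area_km2):
    """Cumulative crater size-frequency distribution: N(>=D) per km^2 at each diameter."""
    d = np.sort(np.asarray(diameters_m))[::-1]
    n_cum = np.arange(1, d.size + 1) / area_km2
    return d, n_cum

def cumulative_slope(d, n_cum, d_min, d_max):
    """Log-log least-squares slope over [d_min, d_max]; a steep (~ -4) branch at large
    diameters with a shallower roll-over below the equilibrium diameter is the kind
    of behaviour the counts above are used to detect."""
    m = (d >= d_min) & (d <= d_max)
    slope, _intercept = np.polyfit(np.log10(d[m]), np.log10(n_cum[m]), 1)
    return slope
```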

  10. Microchannel plate streak camera

    DOEpatents

    Wang, C.L.

    1984-09-28

    An improved streak camera in which a microchannel plate electron multiplier is used in place of or in combination with the photocathode used in prior streak cameras. The improved streak camera is far more sensitive to photons (uv to gamma-rays) than the conventional x-ray streak camera which uses a photocathode. The improved streak camera offers gamma-ray detection with high temporal resolution. It also offers low-energy x-ray detection without attenuation inside the cathode. Using the microchannel plate in the improved camera has resulted in a time resolution of about 150 ps, and has provided a sensitivity sufficient for 1000 keV x-rays.

  11. Microchannel plate streak camera

    DOEpatents

    Wang, C.L.

    1989-03-21

    An improved streak camera in which a microchannel plate electron multiplier is used in place of or in combination with the photocathode used in prior streak cameras is disclosed. The improved streak camera is far more sensitive to photons (UV to gamma-rays) than the conventional x-ray streak camera which uses a photocathode. The improved streak camera offers gamma-ray detection with high temporal resolution. It also offers low-energy x-ray detection without attenuation inside the cathode. Using the microchannel plate in the improved camera has resulted in a time resolution of about 150 ps, and has provided a sensitivity sufficient for 1,000 KeV x-rays. 3 figs.

  12. Microchannel plate streak camera

    DOEpatents

    Wang, Ching L.

    1989-01-01

    An improved streak camera in which a microchannel plate electron multiplier is used in place of or in combination with the photocathode used in prior streak cameras. The improved streak camera is far more sensitive to photons (UV to gamma-rays) than the conventional x-ray streak camera which uses a photocathode. The improved streak camera offers gamma-ray detection with high temporal resolution. It also offers low-energy x-ray detection without attenuation inside the cathode. Using the microchannel plate in the improved camera has resulted in a time resolution of about 150 ps, and has provided a sensitivity sufficient for 1000 KeV x-rays.

  13. Analytical multicollimator camera calibration

    USGS Publications Warehouse

    Tayman, W.P.

    1978-01-01

    Calibration with the U.S. Geological Survey multicollimator determines the calibrated focal length, the point of symmetry, the radial distortion referred to the point of symmetry, and the asymmetric characteristics of the camera lens. For this project, two cameras were calibrated, a Zeiss RMK A 15/23 and a Wild RC 8. Four test exposures were made with each camera. Results are tabulated for each exposure and averaged for each set. Copies of the standard USGS calibration reports are included. © 1978.

  14. Intrinsic camera calibration equipped with Scheimpflug optical device

    NASA Astrophysics Data System (ADS)

    Fasogbon, Peter; Duvieubourg, Luc; Lacaze, Pierre-Antoine; Macaire, Ludovic

    2015-04-01

    We present the problem of setting up an intrinsic camera calibration under the Scheimpflug condition for an industrial application. We aim to calibrate the Scheimpflug camera using a roughly hand-positioned calibration pattern with a bundle adjustment technique. The assumptions used by classical calibration methodologies are no longer valid for cameras under the Scheimpflug condition. Therefore, we slightly modify the pin-hole model to estimate the Scheimpflug angles. The results are tested on real data sets captured from cameras limited by various industrial constraints, and in the presence of large distortions.

  15. Digital Electronic Still Camera

    NASA Technical Reports Server (NTRS)

    Holland, Samuel D.; Yeates, Herbert D.

    1993-01-01

    Digital electronic still camera part of electronic recording, processing, transmitting, and displaying system. Removable hard-disk drive in camera serves as digital electronic equivalent of photographic film. Images viewed, analyzed, or transmitted quickly. Camera takes images of nearly photographic quality and stores them in digital form. Portable, hand-held, battery-powered unit designed for scientific use. Camera used in conjunction with playback unit also serving as transmitting unit if images sent to remote station. Remote station equipped to store, process, and display images. Digital image data encoded with error-correcting code at playback/transmitting unit for error-free transmission to remote station.

  16. LSST Camera Optics Design

    SciTech Connect

    Riot, V J; Olivier, S; Bauman, B; Pratuch, S; Seppala, L; Gilmore, D; Ku, J; Nordby, M; Foss, M; Antilogus, P; Morgado, N

    2012-05-24

    The Large Synoptic Survey Telescope (LSST) uses a novel, three-mirror, telescope design feeding a camera system that includes a set of broad-band filters and three refractive corrector lenses to produce a flat field at the focal plane with a wide field of view. Optical design of the camera lenses and filters is integrated in with the optical design of telescope mirrors to optimize performance. We discuss the rationale for the LSST camera optics design, describe the methodology for fabricating, coating, mounting and testing the lenses and filters, and present the results of detailed analyses demonstrating that the camera optics will meet their performance goals.

  17. LSST camera optics design

    NASA Astrophysics Data System (ADS)

    Olivier, Scot S.; Riot, Vincent J.; Gilmore, David K.; Bauman, Brian; Pratuch, Steve; Seppala, Lynn; Ku, John; Nordby, Martin; Foss, Mike; Antilogus, Pierre; Morgado, Nazario; Sassolas, Benoit; Flaminio, Raffaele; Michel, Christophe

    2012-09-01

    The Large Synoptic Survey Telescope (LSST) uses a novel, three-mirror, telescope design feeding a camera system that includes a set of broad-band filters and three refractive corrector lenses to produce a flat field at the focal plane with a wide field of view. Optical design of the camera lenses and filters is integrated in with the optical design of telescope mirrors to optimize performance. We discuss the rationale for the LSST camera optics design, describe the methodology for fabricating, coating, mounting and testing the lenses and filters, and present the results of detailed analyses demonstrating that the camera optics will meet their performance goals.

  18. Ringfield lithographic camera

    DOEpatents

    Sweatt, William C.

    1998-01-01

    A projection lithography camera is presented with a wide ringfield optimized so as to make efficient use of extreme ultraviolet radiation from a large area radiation source (e.g., D_source ≈ 0.5 mm). The camera comprises four aspheric mirrors optically arranged on a common axis of symmetry with an increased etendue for the camera system. The camera includes an aperture stop that is accessible through a plurality of partial aperture stops to synthesize the theoretical aperture stop. Radiation from a mask is focused to form a reduced image on a wafer, relative to the mask, by reflection from the four aspheric mirrors.

  19. 71. COMPLETED 'A' FRAME STRUCTURE LOOKING SOUTH SHOWING CAMERA TOWER, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    71. COMPLETED 'A' FRAME STRUCTURE LOOKING SOUTH SHOWING CAMERA TOWER, DRIVE GEARS, COUNTERWEIGHT CAR AND CANTILEVERED WALKWAYS, July 28, 1948. (Original photograph in possession of Dave Willis, San Diego, California.) - Variable Angle Launcher Complex, Variable Angle Launcher, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  20. New developments to improve SO2 cameras

    NASA Astrophysics Data System (ADS)

    Luebcke, P.; Bobrowski, N.; Hoermann, C.; Kern, C.; Klein, A.; Kuhn, J.; Vogel, L.; Platt, U.

    2012-12-01

    The SO2 camera is a remote sensing instrument that measures the two-dimensional distribution of SO2 (column densities) in volcanic plumes using scattered solar radiation as a light source. From these data SO2-fluxes can be derived. The high time resolution of the order of 1 Hz allows correlating SO2 flux measurements with other traditional volcanological measurement techniques, e.g., seismology. In recent years the application of SO2 cameras has increased; however, there is still potential to improve the instrumentation. First of all, the influence of aerosols and ash in the volcanic plume can lead to large errors in the calculated SO2 flux, if not accounted for. We present two different concepts to deal with the influence of ash and aerosols. The first approach uses a co-axial DOAS system that was added to a two-filter SO2 camera. The camera used Filter A (peak transmission centred around 315 nm) to measure the optical density of SO2 and Filter B (centred around 330 nm) to correct for the influence of ash and aerosol. The DOAS system simultaneously performs spectroscopic measurements in a small area of the camera's field of view and gives additional information to correct for these effects. Comparing the optical densities for the two filters with the SO2 column density from the DOAS allows not only a much more precise calibration but also conclusions to be drawn about the influence of ash and aerosol scattering. Measurement examples from Popocatépetl, Mexico in 2011 are shown and interpreted. Another approach combines the SO2 camera measurement principle with the extremely narrow and periodic transmission of a Fabry-Pérot interferometer. The narrow transmission window allows individual SO2 absorption bands (or series of bands) to be selected as a substitute for Filter A. Measurements are therefore more selective to SO2. Instead of Filter B, as in classical SO2 cameras, the correction for aerosol can be performed by shifting the transmission window of the Fabry-Pérot interferometer towards the SO2 absorption cross section minima. A correction of ash and aerosol influences with this technique can decrease deviation from the true column by more than 60%, since the wavelength difference between the two measurement channels is much smaller than in classical SO2 cameras. While the implementation of this approach for a 2D camera encompasses many challenges, it offers the possibility of building a relatively simple and robust scanning instrument for volcanic SO2 distributions. A second problem of the SO2 camera technique is the relatively high price, which prevents its use in many volcano observatories in developing countries. Most SO2 cameras use CCDs that were originally designed for astronomical purposes. The large pixel size and low noise of these detectors compensate for the low intensity of solar radiation in the UV and the low quantum efficiency of the detector in this spectral range. However, the detectors used cost several thousand US dollars. We present results from test measurements using a consumer DSLR camera as the detector of an SO2 camera. Since the camera is not sensitive in the UV, the incoming radiation is first imaged onto a screen that is covered with a suitable fluorescent dye converting the UV radiation to visible light.
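
    The two-filter principle reduces to differencing apparent optical densities so that the broadband ash/aerosol term cancels to first order. A minimal sketch under that description (the calibration factor must come from the co-located DOAS or from calibration cells; variable names are ours):

```python
import numpy as np

def so2_column(I_A, I0_A, I_B, I0_B, calib_ppmm_per_tau):
    """Two-filter SO2-camera evaluation.

    I_A, I_B   : plume images through Filter A (~315 nm) and Filter B (~330 nm)
    I0_A, I0_B : corresponding clear-sky background images
    calib_ppmm_per_tau : ppm*m per unit optical density, from DOAS or gas cells
    Filter B sees essentially no SO2 absorption, so subtracting its optical density
    removes the broadband ash/aerosol contribution to first order."""
    tau_A = -np.log(I_A / I0_A)
    tau_B = -np.log(I_B / I0_B)
    return calib_ppmm_per_tau * (tau_A - tau_B)
```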

  1. Integrated mobile radar-camera system in airport perimeter security

    NASA Astrophysics Data System (ADS)

    Zyczkowski, M.; Szustakowski, M.; Ciurapinski, W.; Dulski, R.; Kastek, M.; Trzaskawka, P.

    2011-11-01

    The paper presents the test results of a mobile system for the protection of large-area objects, which consists of a radar and thermal and visual cameras. Radar is used for early detection and localization of an intruder and the cameras with narrow field of view are used for identification and tracking of a moving object. The range evaluation of the integrated system is presented, as well as the probability of human detection as a function of the distance from the radar-camera unit.

  2. Improved Tracking of Targets by Cameras on a Mars Rover

    NASA Technical Reports Server (NTRS)

    Kim, Won; Ansar, Adnan; Steele, Robert

    2007-01-01

    A paper describes a method devised to increase the robustness and accuracy of tracking of targets by means of three stereoscopic pairs of video cameras on a Mars-rover-type exploratory robotic vehicle. Two of the camera pairs are mounted on a mast that can be adjusted in pan and tilt; the third camera pair is mounted on the main vehicle body. Elements of the method include a mast calibration, a camera-pointing algorithm, and a purely geometric technique for handing off tracking between different camera pairs at critical distances as the rover approaches a target of interest. The mast calibration is an extension of camera calibration in which the camera images of calibration targets at known positions are collected at various pan and tilt angles. In the camera-pointing algorithm, pan and tilt angles are computed by a closed-form, non-iterative solution of inverse kinematics of the mast combined with mathematical models of the cameras. The purely geometric camera-handoff technique involves the use of stereoscopic views of a target of interest in conjunction with the mast calibration.
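
    A hedged sketch of the geometry behind the handoff: a rectified stereo pair fixes the target in 3D, after which the mast calibration tells the tracker where the next camera pair must point. These are generic pinhole relations; the flight cameras use their own calibrated models:

```python
import numpy as np

def stereo_target_point(u_left, v_left, disparity, fx, fy, cx, cy, baseline_m):
    """Target position in the left-camera frame from a rectified stereo observation."""
    Z = fx * baseline_m / disparity
    X = (u_left - cx) * Z / fx
    Y = (v_left - cy) * Z / fy
    return np.array([X, Y, Z])

def handoff_point(T_next_from_left, target_left):
    """Express the triangulated target in the next camera pair's frame (the 4x4
    transform comes from the mast calibration), ready for its pointing solution."""
    return (T_next_from_left @ np.append(target_left, 1.0))[:3]
```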

  3. What are the benefits of having multiple camera angles?

    Atmospheric Science Data Center

    2014-12-08

    ... different directions is really useful because geophysical media (the atmosphere, including the clouds and aerosols, the ocean, and ... water surfaces, thereby enabling observations even when traditional sensors are hampered by the very high reflectance of these ...

  4. Constrained space camera assembly

    DOEpatents

    Heckendorn, F.M.; Anderson, E.K.; Robinson, C.W.; Haynes, H.B.

    1999-05-11

    A constrained space camera assembly which is intended to be lowered through a hole into a tank, a borehole or another cavity is disclosed. The assembly includes a generally cylindrical chamber comprising a head and a body and a wiring-carrying conduit extending from the chamber. Means are included in the chamber for rotating the body about the head without breaking an airtight seal formed therebetween. The assembly may be pressurized and accompanied with a pressure sensing means for sensing if a breach has occurred in the assembly. In one embodiment, two cameras, separated from their respective lenses, are installed on a mounting apparatus disposed in the chamber. The mounting apparatus includes means allowing both longitudinal and lateral movement of the cameras. Moving the cameras longitudinally focuses the cameras, and moving the cameras laterally away from one another effectively converges the cameras so that close objects can be viewed. The assembly further includes means for moving lenses of different magnification forward of the cameras. 17 figs.

  5. Camera Operator and Videographer

    ERIC Educational Resources Information Center

    Moore, Pam

    2007-01-01

    Television, video, and motion picture camera operators produce images that tell a story, inform or entertain an audience, or record an event. They use various cameras to shoot a wide range of material, including television series, news and sporting events, music videos, motion pictures, documentaries, and training sessions. Those who film or

  6. The Camera Cook Book.

    ERIC Educational Resources Information Center

    Education Development Center, Inc., Newton, MA.

    Intended for use with the photographic materials available from the Workshop for Learning Things, Inc., this "camera cookbook" describes procedures that have been tried in classrooms and workshops and proven to be the most functional and inexpensive. Explicit starting off instructions--directions for exploring and loading the camera and for taking…

  8. Cameras in mobile phones

    NASA Astrophysics Data System (ADS)

    Nummela, Ville; Viinikanoja, Jarkko; Alakarhu, Juha

    2006-04-01

    Camera phones are one of the fastest-growing consumer markets today. During the past few years total volumes have grown rapidly, and today millions of mobile phones with cameras are sold. At the same time the resolution and functionality of the cameras have been growing from CIF towards DSC level. From the camera point of view the mobile world is an extremely challenging field. Cameras should have good image quality but in a small size. They also need to be reliable and their construction should be suitable for mass manufacturing. All components of the imaging chain should be well optimized in this environment. Image quality and usability are the most important parameters to the user. The current trend of adding more megapixels to cameras and at the same time using smaller pixels is affecting both. On the other hand, reliability and miniaturization are key drivers for product development, as is cost. In an optimized solution all parameters are in balance, but the process of finding the right trade-offs is not an easy task. In this paper trade-offs related to optics and their effects on image quality and usability of cameras are discussed. Key development areas from the mobile phone camera point of view are also listed.

  9. Camera Trajectory from Wide Baseline Images

    NASA Astrophysics Data System (ADS)

    Havlena, M.; Torii, A.; Pajdla, T.

    2008-09-01

    Camera trajectory estimation, which is closely related to the structure from motion computation, is one of the fundamental tasks in computer vision. Reliable camera trajectory estimation plays an important role in 3D reconstruction, self localization, and object recognition. There are essential issues for a reliable camera trajectory estimation, for instance, choice of the camera and its geometric projection model, camera calibration, image feature detection and description, and robust 3D structure computation. Most of approaches rely on classical perspective cameras because of the simplicity of their projection models and ease of their calibration. However, classical perspective cameras offer only a limited field of view, and thus occlusions and sharp camera turns may cause that consecutive frames look completely different when the baseline becomes longer. This makes the image feature matching very difficult (or impossible) and the camera trajectory estimation fails under such conditions. These problems can be avoided if omnidirectional cameras, e.g. a fish-eye lens convertor, are used. The hardware which we are using in practice is a combination of Nikon FC-E9 mounted via a mechanical adaptor onto a Kyocera Finecam M410R digital camera. Nikon FC-E9 is a megapixel omnidirectional addon convertor with 180° view angle which provides images of photographic quality. Kyocera Finecam M410R delivers 2272×1704 images at 3 frames per second. The resulting combination yields a circular view of diameter 1600 pixels in the image. Since consecutive frames of the omnidirectional camera often share a common region in 3D space, the image feature matching is often feasible. On the other hand, the calibration of these cameras is non-trivial and is crucial for the accuracy of the resulting 3D reconstruction. We calibrate omnidirectional cameras off-line using the state-of-the-art technique and Mičušík's two-parameter model, that links the radius of the image point r to the angle θ of its corresponding rays w.r.t. the optical axis as θ = ar/(1 + br²). After a successful calibration, we know the correspondence of the image points to the 3D optical rays in the coordinate system of the camera. The following steps aim at finding the transformation between the camera and the world coordinate systems, i.e. the pose of the camera in the 3D world, using 2D image matches. For computing 3D structure, we construct a set of tentative matches detecting different affine covariant feature regions including MSER, Harris Affine, and Hessian Affine in acquired images. These features are alternative to popular SIFT features and work comparably in our situation. Parameters of the detectors are chosen to limit the number of regions to 1-2 thousands per image. The detected regions are assigned local affine frames (LAF) and transformed into standard positions w.r.t. their LAFs. Discrete Cosine Descriptors are computed for each region in the standard position. Finally, mutual distances of all regions in one image and all regions in the other image are computed as the Euclidean distances of their descriptors and tentative matches are constructed by selecting the mutually closest pairs. Opposed to the methods using short baseline images, simpler image features which are not affine covariant cannot be used because the view point can change a lot between consecutive frames.
Furthermore, feature matching has to be performed on the whole frame because no assumptions about the proximity of consecutive projections can be made for wide baseline images. This makes feature detection, description, and matching much more time-consuming than for short baseline images and limits real-time operation to low frame rate sequences. Robust 3D structure can be computed by RANSAC, which searches for the largest subset of the set of tentative matches that is, within a predefined threshold ε, consistent with an epipolar geometry. We use ordered sampling as suggested in to draw 5-tuples from the list of tentative matches ordered ascendingly by the distance of their descriptors, which may help to reduce the number of samples in RANSAC. From each 5-tuple, relative orientation is computed by solving the 5-point minimal relative orientation problem for calibrated cameras. Often, there are several models which are supported by a large number of matches. Thus the chance that the correct model, even if it has the largest support, will be found by running a single RANSAC is small. Work suggested generating models by randomized sampling as in RANSAC but using soft (kernel) voting for a parameter instead of looking for the maximal support. The best model is then selected as the one with the parameter closest to the maximum in the accumulator space. In our case, we vote in a two-dimensional accumulator for the estimated camera motion direction. However, unlike in, we do not cast votes directly from each sampled epipolar geometry but from the best epipolar geometries recovered by ordered sampling in RANSAC. With our technique, we could go up to 98.5% contamination by mismatches with effort comparable to what simple RANSAC needs for 84% contamination. The relative camera orientation with the motion direction closest to the maximum in the voting space is finally selected. As already mentioned in the first paragraph, the use of camera trajectory estimates is quite wide. In we have introduced a technique for measuring the size of camera translation relative to the observed scene which uses the dominant apical angle computed at the reconstructed scene points and is robust against mismatches. The experiments demonstrated that the measure can be used to improve the robustness of camera path computation and of object recognition for methods which use a geometric constraint, e.g. the ground plane, such as does for the detection of pedestrians. Using the camera trajectories, perspective cutouts with stabilized horizon are constructed, and an arbitrary object recognition routine designed to work with images acquired by perspective cameras can be used without any further modifications.
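
    A hedged sketch of the baseline step underneath the above: a plain 5-point RANSAC relative orientation for calibrated views, using OpenCV as a stand-in. The ordered sampling and the soft (kernel) voting over motion directions described in the text are refinements on top of this and are not shown; synthetic normalized correspondences replace real tentative matches so the snippet runs on its own.

```python
# Hedged sketch: 5-point RANSAC relative orientation for calibrated cameras.
# Synthetic, already-normalized correspondences stand in for real matches.
import numpy as np
import cv2

rng = np.random.default_rng(0)

# Synthetic scene: 3D points in front of the first camera.
X = rng.uniform([-1, -1, 4], [1, 1, 8], size=(200, 3))

# Ground-truth relative motion of the second camera.
R_true, _ = cv2.Rodrigues(np.array([[0.0], [0.2], [0.0]]))
t_true = np.array([[0.5], [0.0], [0.1]])

# Project to normalized image coordinates (intrinsics = identity).
x1 = X[:, :2] / X[:, 2:3]
X2 = (R_true @ X.T + t_true).T
x2 = X2[:, :2] / X2[:, 2:3]

K = np.eye(3)
E, mask = cv2.findEssentialMat(x1, x2, cameraMatrix=K,
                               method=cv2.RANSAC, prob=0.999, threshold=1e-3)
n_inliers, R, t, _ = cv2.recoverPose(E, x1, x2, cameraMatrix=K, mask=mask)

# The estimated translation direction is the quantity voted on in the accumulator.
print("inliers:", n_inliers, "motion direction:", (t / np.linalg.norm(t)).ravel())
```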

  10. Dry imaging cameras.

    PubMed

    Indrajit, Ik; Alam, Aftab; Sahni, Hirdesh; Bhatia, Mukul; Sahu, Samaresh

    2011-04-01

    Dry imaging cameras are important hard copy devices in radiology. Using a dry imaging camera, multiformat images of digital modalities in radiology are created from a sealed unit of unexposed films. The functioning of a modern dry camera involves a blend of concurrent processes drawn from diverse fields such as computing, mechanics, thermodynamics, optics, electricity and radiography. Broadly, hard copy devices are classified as laser-based or non-laser-based technology. When compared with the working knowledge and technical awareness of different modalities in radiology, the understanding of a dry imaging camera is often superficial and neglected. To fill this void, this article outlines the key features of a modern dry camera and the important issues that impact radiology workflow. PMID:21799589

  11. The New Light Weight, High Performance Reconnaissance Camera KRb 8/24 F

    NASA Astrophysics Data System (ADS)

    Uhl, Berndt

    1990-02-01

    As sensor payload for the CL-289 drone system, Carl Zeiss developed a new compact reconnaissance camera system. The high performance camera, with 143 degree wide angle ground coverage, is a pulse operated sequential frame camera. It features true-angle forward motion compensation across the entire format and direct stereoscopic viewing. Small size and low weight permit easy installation in remotely piloted vehicles, pods and aircraft.

  12. Early Experience & Multisensory Perceptual Narrowing

    PubMed Central

    Lewkowicz, David J.

    2014-01-01

    Perceptual narrowing is a reflection of early experience and contributes in key ways to perceptual and cognitive development. In general, findings have shown that unisensory perceptual sensitivity in early infancy is broadly tuned such that young infants respond to, and discriminate, native as well as non-native sensory inputs, whereas older infants only respond to native inputs. Recently, my colleagues and I discovered that perceptual narrowing occurs at the multisensory processing level as well. The present article reviews this new evidence and puts it in the larger context of multisensory perceptual development and the role that perceptual experience plays in it. Together, the evidence on unisensory and multisensory narrowing shows that early experience shapes the emergence of perceptual specialization and expertise. PMID:24435505

  13. Wide-Angle Quasar Feedback

    NASA Astrophysics Data System (ADS)

    Chartas, George

    2015-08-01

    I will present results from the detection of near-relativistic winds launched near the innermost stable circular orbits of SMBHs. A recent detection of a powerful wind in the X-ray bright narrow absorption line (NAL) quasar HS 0810 strengthens the case that quasars play a significant role in feedback. In both deep Chandra and XMM-Newton observations of HS 0810 we detected blueshifted absorption lines implying outflowing velocities ranging between 0.1c and 0.5c. The presence of both an emission line at 6.8 keV and an absorption line at 7.8 keV in the spectral line profile of HS 0810 is a characteristic feature of a P-Cygni profile, supporting the presence of an expanding, outflowing, highly ionized Fe absorber in this object. A hard excess component is detected in the XMM-Newton observation of HS 0810, possibly originating from reflection off the disk. Modeling of the XMM-Newton spectrum constrains the inclination angle to be about 30 degrees. The presence of relativistic winds in low inclination angle NAL quasars as well as in high inclination angle BAL quasars implies that the solid angle of quasar winds may be quite large. The larger solid angle of quasar winds would also indicate that their contribution to the regulation of the host galaxy may be even more important than previously thought.

  14. Seasonal and vertical changes in leaf angle distribution for selected deciduous broadleaf tree species common to Europe

    NASA Astrophysics Data System (ADS)

    Raabe, Kairi; Pisek, Jan; Sonnentag, Oliver; Annuk, Kalju

    2014-05-01

    Leaf inclination angle distribution is a key parameter in determining the transmission and reflection of radiation by vegetation canopies. It has been previously observed that leaf inclination angle might change gradually from more vertical in the upper canopy and in high light habitats to more horizontal in the lower canopy and in low light habitats [1]. Despite its importance, relatively few measurements of actual leaf angle distributions have been reported for different tree species. An even smaller number of studies have dealt with the possible seasonal changes in leaf angle distribution [2]. In this study the variation of leaf inclination angle distributions was examined both temporally throughout the growing season and vertically at different heights of trees. We report on leaf inclination angle distributions for five deciduous broadleaf species found commonly in several parts of Europe: grey alder (Alnus incana), silver birch (Betula pendula Roth), chestnut (Castanea), Norway maple (Acer platanoides), and aspen (Populus tremula). The angles were measured using the leveled camera method [3], with the data collected at several separate heights and four times during the period of May-September 2013. The results generally indicate the greatest change in leaf inclination angles for spring, with the changes usually being the most pronounced at the top of the canopy. It should also be noted, however, that whereas the temporal variation proved to be rather consistent for different species, the vertical variation differed more between species. The leveled camera method was additionally tested in terms of sensitivity to different users. Ten people were asked to measure the leaf angles for four different species. The results indicate the method is quite robust in providing coinciding distributions irrespective of the user and level of previous experience with the method. However, certain caution must be exercised when measuring long narrow leaves. References [1] G.G. McMillen, and J.H. McClendon, "Leaf angle: an adaptive feature of sun and shade leaves," Botanical Gazette, vol. 140, pp. 437-442, 1979. [2] J. Pisek, O. Sonnentag, A.D. Richardson, and M. Mõttus, "Is the spherical leaf inclination angle distribution a valid assumption for temperate and boreal broadleaf tree species?" Agricultural and Forest Meteorology, vol. 169, pp. 186-194, 2013. [3] Y. Ryu, O. Sonnentag, T. Nilson, R. Vargas, H. Kobayashi, R. Wenk, and D. Baldocchi, "How to quantify tree leaf area index in a heterogenous savanna ecosystem: a multi-instrument and multimodel approach," Agricultural and Forest Meteorology, vol. 150, pp. 63-76, 2010.

  15. Six-year operation of the Venus Monitoring Camera (Venus Express): spatial and temporal variations of the properties of particles in upper clouds of Venus from the phase dependence of the near-IR brightness

    NASA Astrophysics Data System (ADS)

    Shalygina, O. S.; Petrova, E. V.; Markiewicz, W. J.

    2015-10-01

    Since May 2006, the Venus Monitoring Camera (VMC) [1] has been imaging Venus in four narrow spectral channels centered at the wavelengths of 0.365 μm (UV), 0.513 μm (VIS), 0.965 μm (NIR1), and 1.010 μm (NIR2). It has taken around 300,000 images in the four channels, covering almost all latitudes and including both the night and day sides. We analyze the whole set of VMC data processed to October 2012, i.e. the data from orbits 60-2352, obtained in the phase angle range

  16. Replacing 16 mm film cameras with high definition digital cameras

    SciTech Connect

    Balch, K.S.

    1995-12-31

    For many years 16 mm film cameras have been used in severe environments. These film cameras are used on Hy-G automotive sleds, as airborne gun cameras, for range tracking, and in other hazardous environments. The companies and government agencies using these cameras are in need of replacing them with a more cost effective solution. Film-based cameras still produce the best resolving capability; however, film development time, chemical disposal, recurring media cost, and faster digital analysis are factors influencing the desire for a 16 mm film camera replacement. This paper will describe a new camera from Kodak that has been designed to replace 16 mm high speed film cameras.

  17. Streak camera time calibration procedures

    NASA Technical Reports Server (NTRS)

    Long, J.; Jackson, I.

    1978-01-01

    Time calibration procedures for streak cameras utilizing a modulated laser beam are described. The time calibration determines a writing rate accuracy of 0.15% with a rotating mirror camera and 0.3% with an image converter camera.

  18. Camera calibration by linear decomposition

    NASA Astrophysics Data System (ADS)

    Skarbek, Wladyslaw; Tomaszewski, Michal; Nowakowski, Artur

    2006-03-01

    This paper presents an algorithm for camera calibration that uses digital images to calculate camera parameters, position and orientation. A linear decomposition technique is proposed to solve the nonlinear pixel equations in which the camera parameters are involved.
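
    For context, a hedged illustration of the general idea of solving the pixel equations linearly and then decomposing the result. This is the classic Direct Linear Transform, not necessarily the authors' specific technique, and the calibration correspondences below are synthetic placeholders.

```python
# Hedged illustration: classic DLT estimate of the 3x4 projection matrix from
# 3D-2D correspondences, followed by an RQ-based decomposition into intrinsics,
# orientation and camera position. Synthetic data only.
import numpy as np
import cv2

def dlt_projection_matrix(X, x):
    """X: Nx3 world points, x: Nx2 pixel points, N >= 6. Returns 3x4 P (up to scale)."""
    A = []
    for (Xw, Yw, Zw), (u, v) in zip(X, x):
        A.append([Xw, Yw, Zw, 1, 0, 0, 0, 0, -u * Xw, -u * Yw, -u * Zw, -u])
        A.append([0, 0, 0, 0, Xw, Yw, Zw, 1, -v * Xw, -v * Yw, -v * Zw, -v])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 4)                      # right null-space vector

# Synthetic calibration data: known intrinsics, identity pose, 20 random 3D points.
rng = np.random.default_rng(1)
K_true = np.array([[800.0, 0, 320], [0, 800, 240], [0, 0, 1]])
X_world = rng.uniform(-1, 1, size=(20, 3)) + [0, 0, 5.0]
x_h = (K_true @ X_world.T).T
x_pixel = x_h[:, :2] / x_h[:, 2:3]

P = dlt_projection_matrix(X_world, x_pixel)
K, R, C = cv2.decomposeProjectionMatrix(P)[:3]       # RQ-based decomposition
print(np.round(K / K[2, 2], 2))                      # recovers the intrinsics
```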

  19. Kitt Peak speckle camera

    NASA Technical Reports Server (NTRS)

    Breckinridge, J. B.; Mcalister, H. A.; Robinson, W. G.

    1979-01-01

    The speckle camera in regular use at Kitt Peak National Observatory since 1974 is described in detail. The design of the atmospheric dispersion compensation prisms, the use of film as a recording medium, the accuracy of double star measurements, and the next generation speckle camera are discussed. Photographs of double star speckle patterns with separations from 1.4 sec of arc to 4.7 sec of arc are shown to illustrate the quality of image formation with this camera, the effects of seeing on the patterns, and to illustrate the isoplanatic patch of the atmosphere.

  20. Night Vision Camera

    NASA Technical Reports Server (NTRS)

    1996-01-01

    PixelVision, Inc. developed the Night Video NV652 Back-illuminated CCD Camera, based on the expertise of a former Jet Propulsion Laboratory employee and a former employee of Scientific Imaging Technologies, Inc. The camera operates without an image intensifier, using back-illuminated and thinned CCD technology to achieve extremely low light level imaging performance. The advantages of PixelVision's system over conventional cameras include greater resolution and better target identification under low light conditions, lower cost and a longer lifetime. It is used commercially for research and aviation.

  1. Ringfield lithographic camera

    DOEpatents

    Sweatt, W.C.

    1998-09-08

    A projection lithography camera is presented with a wide ringfield optimized so as to make efficient use of extreme ultraviolet radiation from a large area radiation source (e.g., D_source ≈ 0.5 mm). The camera comprises four aspheric mirrors optically arranged on a common axis of symmetry. The camera includes an aperture stop that is accessible through a plurality of partial aperture stops to synthesize the theoretical aperture stop. Radiation from a mask is focused to form a reduced image on a wafer, relative to the mask, by reflection from the four aspheric mirrors. 11 figs.

  2. IRNG-camera family

    NASA Astrophysics Data System (ADS)

    Dupiech, Michael; Marche, Pierre M.

    1996-06-01

    Further to the development of the SYNERGI set of modules dedicated to new generation multipurpose high performance thermal cameras based on the SOFRADIR 288 by 4 element IRCCD detector, THOMSON-CSF OPTRONIQUE have decided to extend the family of second generation cameras with the development of SOPHIE. SOPHIE is a handheld infra-red camera, also organized around the 288 by 4 element detector, corresponding to a different cost/performance trade-off. It is an ultralow-cost, ultralight, medium range imager designed for passive observation and surveillance. It exhibits growth potential such as low cost infra-red sights for light armored vehicles firing posts.

  3. Structured light camera calibration

    NASA Astrophysics Data System (ADS)

    Garbat, P.; Skarbek, W.; Tomaszewski, M.

    2013-03-01

    The structured light camera, which is being designed with the joint effort of the Institute of Radioelectronics and the Institute of Optoelectronics (both being large units of the Warsaw University of Technology within the Faculty of Electronics and Information Technology), combines various contemporary hardware and software technologies. In hardware, it integrates a high speed stripe projector and a stripe camera together with a standard high definition video camera. In software, it is supported by sophisticated calibration techniques which enable the development of advanced applications such as a real-time 3D viewer of moving objects with a free viewpoint, or a 3D modeller for still objects.

  4. Spectral narrowing via quantum coherence

    SciTech Connect

    Mikhailov, Eugeniy E.; Rostovtsev, Yuri V.; Zhang Aihua; Welch, George R.; Sautenkov, Vladimir A.; Zubairy, M. Suhail; Scully, Marlan O.

    2006-07-15

    We have studied the transmission through an optically thick ⁸⁷Rb vapor that is illuminated by monochromatic and noise-broadened laser fields in Λ configuration. The spectral width of the beat signal between the two fields after transmission through the atomic medium is more than 1000 times narrower than the spectral width of this signal before the medium.

  5. Traffic Sign Recognition with Invariance to Lighting in Dual-Focal Active Camera System

    NASA Astrophysics Data System (ADS)

    Gu, Yanlei; Panahpour Tehrani, Mehrdad; Yendo, Tomohiro; Fujii, Toshiaki; Tanimoto, Masayuki

    In this paper, we present an automatic vision-based traffic sign recognition system which can detect and classify traffic signs at long distance under different lighting conditions. To realize this purpose, the traffic sign recognition is developed in an originally proposed dual-focal active camera system. In this system, a telephoto camera is employed to assist a wide angle camera. The telephoto camera can capture a high accuracy image of an object of interest in the view field of the wide angle camera. The image from the telephoto camera provides enough information for recognition when the traffic sign appears at too low a resolution in the wide angle camera. In the proposed system, traffic sign detection and classification are processed separately on the different images from the wide angle camera and the telephoto camera. Besides, in order to detect traffic signs against complex backgrounds in different lighting conditions, we propose a type of color transformation which is invariant to lighting changes. This color transformation is applied to highlight the pattern of traffic signs by reducing the complexity of the background. Based on the color transformation, a multi-resolution detector with cascade mode is trained and used to locate traffic signs at low resolution in the image from the wide angle camera. After detection, the system actively captures a high accuracy image of each detected traffic sign by controlling the direction and exposure time of the telephoto camera based on the information from the wide angle camera. Moreover, in classification, a hierarchical classifier is constructed and used to recognize the detected traffic signs in the high accuracy image from the telephoto camera. Finally, based on the proposed system, a set of experiments in the domain of traffic sign recognition is presented. The experimental results demonstrate that the proposed system can effectively recognize traffic signs at low resolution in different lighting conditions.
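
    The abstract does not spell out the exact color transformation, so the following is only a hedged illustration of the kind of illumination-robust step it describes: normalized RGB chromaticity, which suppresses overall brightness changes so that sign colors (e.g. red rims) can be thresholded before a cascade detector scans the image. File name and thresholds are placeholders.

```python
# Hedged illustration: normalized chromaticity as a lighting-insensitive color
# transform; not necessarily the transformation proposed in the paper.
import numpy as np
import cv2

def normalized_chromaticity(bgr):
    """Map a BGR image to (r, g) chromaticities, which are insensitive to
    uniform changes in illumination intensity."""
    rgb = bgr[:, :, ::-1].astype(np.float32)
    s = rgb.sum(axis=2, keepdims=True) + 1e-6      # avoid division by zero
    chroma = rgb / s
    return chroma[:, :, 0], chroma[:, :, 1]        # r and g channels

img = cv2.imread("road_scene.png")                 # placeholder file name
r, g = normalized_chromaticity(img)
# A simple red-dominant mask that a cascade detector could then scan.
red_mask = ((r > 0.4) & (g < 0.3)).astype(np.uint8) * 255
```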

  6. Copernican craters: Early results from the Lunar Reconnaissance Orbiter Camera

    NASA Astrophysics Data System (ADS)

    McEwen, A. S.; Hiesinger, H.; Thomas, P. C.; Robinson, M. S.; van der Bogert, C.; Ostrach, L.; Plescia, J. B.; Bray, V. J.; Tornabene, L. L.

    2009-12-01

    The youngest (Copernican) craters on the Moon provide the best examples of original crater morphology and a record of the impact flux over the last ~1 Ga in the Earth-Moon system. The LRO Narrow Angle Cameras (NAC) provide 50 cm pixels from an altitude of 50 km. With changing incidence angle, global access, and very high data rates, these cameras provide unprecedented data on lunar craters. Stereo image pairs are being acquired for detailed topographic mapping. These data allow comparisons of relative ages of the larger young craters, some of which are tied to absolute radiometric ages from Apollo-returned samples. These relative ages, the crater populations at small diameters, and details of crater morphology including ejecta and melt morphologies, allow better delineation of recent lunar history and the formation and modification of impact craters. Crater counts may also reveal differences in the formation and preservation of small diameter craters as a function of target material (e.g., unconsolidated regolith versus solid impact melt). One key question: Is the current cratering rate constant or does it fluctuate? We will constrain the very recent cratering rate (at 10-100 m diameter) by comparing LROC images with those taken by Apollo nearly 40 years ago to determine the number of new impact craters. The current cratering rate and an assumption of constant cratering rate over time may or may not correctly predict the number of craters superimposed over radiometrically-dated surfaces such as South Ray, Cone, and North Ray craters, which range from 2-50 Ma and are not saturated by 10-100 m craters. If the prediction fails with realistic consideration of errors, then the present-day cratering rate must be atypical. Secondary craters complicate this analysis, but the resolution and coverage of LROC enable improved recognition of secondary craters. Of particular interest for the youngest Copernican craters is the possibility of self-cratering. LROC is providing the image quality needed to classify small craters by state of degradation (i.e., relative age); concentrations of craters with uniform size and age indicate secondary formation. Portion of LROC image M103703826LE showing a sparsely-cratered pond of impact melt on the floor of farside Copernican crater Necho (4.95 S, 123.6 E).

  7. Miniature TV Camera

    NASA Technical Reports Server (NTRS)

    2004-01-01

    Originally devised to observe Saturn stage separation during Apollo flights, Marshall Space Flight Center's Miniature Television Camera, measuring only 4 x 3 x 1 1/2 inches, quickly made its way to the commercial telecommunications market.

  8. Cameras in the Classroom.

    ERIC Educational Resources Information Center

    Steinman, Richard C.

    1993-01-01

    Describes the following uses for a video camera in the science classroom: video presentations, microscope work, taping and/or monitoring experiments, analyzing everyday phenomena, lesson enhancement, field trip alternative, and classroom management. (PR)

  9. The MKID Camera

    NASA Astrophysics Data System (ADS)

    Maloney, P. R.; Czakon, N. G.; Day, P. K.; Duan, R.; Gao, J.; Glenn, J.; Golwala, S.; Hollister, M.; LeDuc, H. G.; Mazin, B.; Noroozian, O.; Nguyen, H. T.; Sayers, J.; Schlaerth, J.; Vaillancourt, J. E.; Vayonakis, A.; Wilson, P.; Zmuidzinas, J.

    2009-12-01

    The MKID Camera project is a collaborative effort of Caltech, JPL, the University of Colorado, and UC Santa Barbara to develop a large-format, multi-color millimeter and submillimeter-wavelength camera for astronomy using microwave kinetic inductance detectors (MKIDs). These are superconducting, micro-resonators fabricated from thin aluminum and niobium films. We couple the MKIDs to multi-slot antennas and measure the change in surface impedance produced by photon-induced breaking of Cooper pairs. The readout is almost entirely at room temperature and can be highly multiplexed; in principle hundreds or even thousands of resonators could be read out on a single feedline. The camera will have 576 spatial pixels that image simultaneously in four bands at 750, 850, 1100 and 1300 microns. It is scheduled for deployment at the Caltech Submillimeter Observatory in the summer of 2010. We present an overview of the camera design and readout and describe the current status of testing and fabrication.

  10. Calibration of action cameras for photogrammetric purposes.

    PubMed

    Balletti, Caterina; Guerra, Francesco; Tsioukas, Vassilios; Vernier, Paolo

    2014-01-01

    The use of action cameras for photogrammetry purposes is not widespread due to the fact that until recently the images provided by the sensors, using either still or video capture mode, were not big enough to perform and provide the appropriate analysis with the necessary photogrammetric accuracy. However, several manufacturers have recently produced and released new lightweight devices which are: (a) easy to handle, (b) capable of performing under extreme conditions and, more importantly, (c) able to provide both still images and video sequences of high resolution. In order to be able to use the sensor of action cameras we must apply a careful and reliable self-calibration prior to the use of any photogrammetric procedure, a relatively difficult scenario because of the short focal length of the camera and its wide angle lens that is used to obtain the maximum possible resolution of images. Special software, using functions of the OpenCV library, has been created to perform both the calibration and the production of undistorted scenes for each of the still and video image capturing modes of a novel action camera, the GoPro Hero 3, which can provide still images up to 12 Mp and video up to 8 Mp resolution. PMID:25237898
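
    A minimal sketch of the kind of OpenCV-based self-calibration and undistortion the record describes, assuming a set of chessboard images taken with the action camera. The board size, file names and the choice of the standard (non-fisheye) distortion model are illustrative assumptions; the authors' actual software may differ.

```python
# Hedged sketch: chessboard calibration and undistortion with OpenCV.
import glob
import cv2
import numpy as np

pattern = (9, 6)                                     # inner corners of the chessboard
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

obj_points, img_points, size = [], [], None
for path in glob.glob("calib_*.jpg"):                # placeholder calibration frames
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    size = gray.shape[::-1]
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_points.append(objp)
        img_points.append(corners)

ret, K, dist, rvecs, tvecs = cv2.calibrateCamera(obj_points, img_points, size, None, None)

# Undistort one frame with the recovered intrinsics and distortion coefficients.
frame = cv2.imread("scene.jpg")
undistorted = cv2.undistort(frame, K, dist)
```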

  11. Calibration of Action Cameras for Photogrammetric Purposes

    PubMed Central

    Balletti, Caterina; Guerra, Francesco; Tsioukas, Vassilios; Vernier, Paolo

    2014-01-01

    The use of action cameras for photogrammetry purposes is not widespread due to the fact that until recently the images provided by the sensors, using either still or video capture mode, were not big enough to perform and provide the appropriate analysis with the necessary photogrammetric accuracy. However, several manufacturers have recently produced and released new lightweight devices which are: (a) easy to handle, (b) capable of performing under extreme conditions and, more importantly, (c) able to provide both still images and video sequences of high resolution. In order to be able to use the sensor of action cameras we must apply a careful and reliable self-calibration prior to the use of any photogrammetric procedure, a relatively difficult scenario because of the short focal length of the camera and its wide angle lens that is used to obtain the maximum possible resolution of images. Special software, using functions of the OpenCV library, has been created to perform both the calibration and the production of undistorted scenes for each of the still and video image capturing modes of a novel action camera, the GoPro Hero 3, which can provide still images up to 12 Mp and video up to 8 Mp resolution. PMID:25237898

  12. Spacecraft camera image registration

    NASA Technical Reports Server (NTRS)

    Kamel, Ahmed A. (Inventor); Graul, Donald W. (Inventor); Chan, Fred N. T. (Inventor); Gamble, Donald W. (Inventor)

    1987-01-01

    A system for achieving spacecraft camera (1, 2) image registration comprises a portion external to the spacecraft and an image motion compensation system (IMCS) portion onboard the spacecraft. Within the IMCS, a computer (38) calculates an image registration compensation signal (60) which is sent to the scan control loops (84, 88, 94, 98) of the onboard cameras (1, 2). At the location external to the spacecraft, the long-term orbital and attitude perturbations on the spacecraft are modeled. Coefficients (K, A) from this model are periodically sent to the onboard computer (38) by means of a command unit (39). The coefficients (K, A) take into account observations of stars and landmarks made by the spacecraft cameras (1, 2) themselves. The computer (38) takes as inputs the updated coefficients (K, A) plus synchronization information indicating the mirror position (AZ, EL) of each of the spacecraft cameras (1, 2), operating mode, and starting and stopping status of the scan lines generated by these cameras (1, 2), and generates in response thereto the image registration compensation signal (60). The sources of periodic thermal errors on the spacecraft are discussed. The system is checked by calculating measurement residuals, the difference between the landmark and star locations predicted at the external location and the landmark and star locations as measured by the spacecraft cameras (1, 2).

  13. Neutron cameras for ITER

    SciTech Connect

    Johnson, L.C.; Barnes, C.W.; Batistoni, P.

    1998-12-31

    Neutron cameras with horizontal and vertical views have been designed for ITER, based on systems used on JET and TFTR. The cameras consist of fan-shaped arrays of collimated flight tubes, with suitably chosen detectors situated outside the biological shield. The sight lines view the ITER plasma through slots in the shield blanket and penetrate the vacuum vessel, cryostat, and biological shield through stainless steel windows. This paper analyzes the expected performance of several neutron camera arrangements for ITER. In addition to the reference designs, the authors examine proposed compact cameras, in which neutron fluxes are inferred from ¹⁶N decay gammas in dedicated flowing water loops, and conventional cameras with fewer sight lines and more limited fields of view than in the reference designs. It is shown that the spatial sampling provided by the reference designs is sufficient to satisfy target measurement requirements and that some reduction in field of view may be permissible. The accuracy of measurements with ¹⁶N-based compact cameras is not yet established, and they fail to satisfy requirements for parameter range and time resolution by large margins.

  14. What convention is used for the illumination and view angles?

    Atmospheric Science Data Center

    2014-12-08

    ... Azimuth angles are measured clockwise from the direction of travel to local north. For both the Sun and cameras, azimuth describes the ... to the equator, because of its morning equator crossing time. Additionally, the difference in view and solar azimuth angle will be near ...

  15. Aircraft Altitude Estimation Using Un-calibrated Onboard Cameras

    NASA Astrophysics Data System (ADS)

    Naidu, V. P. S.; Mukherjee, J.

    2012-10-01

    In the present study, aircraft altitude estimation using an un-calibrated onboard camera is implemented and studied. A camera model has been implemented to simulate the test data. From the results, it was observed that the rounding of pixel coordinates creates fluctuations around the true vanishing point (VP) angle and height values. These fluctuations were smoothened using a Kalman filter based state estimator. The effects of camera tilt and focal length on the VP angle and height computations were also studied. It is concluded that the camera should be perpendicular to the runway for the focal length to have no effect on the height computation. It is planned to apply this algorithm to real-time imaging data along with Integrated Enhanced Synthetic Vision (IESVS) on the HANSA aircraft.
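
    A hedged sketch of the smoothing step mentioned above: a minimal scalar Kalman filter of the kind that could damp the pixel-quantization fluctuations in the per-frame height (or VP angle) estimates. The process and measurement noise values, and the synthetic measurements, are illustrative assumptions, not the paper's tuning.

```python
# Hedged sketch: scalar Kalman filter for smoothing noisy per-frame estimates.
import numpy as np

def kalman_filter(measurements, q=1e-4, r=1.0):
    """Constant-state Kalman filter: state = true altitude, z = noisy estimate."""
    x, p = measurements[0], 1.0          # initial state and variance
    out = []
    for z in measurements:
        p += q                           # predict (random-walk process noise)
        k = p / (p + r)                  # Kalman gain
        x += k * (z - x)                 # update with the new measurement
        p *= (1.0 - k)
        out.append(x)
    return np.array(out)

# Example: noisy per-frame height estimates (metres), smoothed for display.
rng = np.random.default_rng(2)
raw = 120.0 + rng.normal(0, 2.0, size=50)     # synthetic fluctuating estimates
smoothed = kalman_filter(raw)
```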

  16. Calibration Procedures in Mid Format Camera Setups

    NASA Astrophysics Data System (ADS)

    Pivnicka, F.; Kemper, G.; Geissler, S.

    2012-07-01

    A growing number of mid-format cameras are used for aerial surveying projects. To achieve a reliable and geometrically precise result also in the photogrammetric workflow, awareness of the sensitive parts is important. The use of direct referencing systems (GPS/IMU), the mounting on a stabilizing camera platform and the specific characteristics of the mid format camera make a professional setup with various calibration and misalignment operations necessary. An important part is to have a proper camera calibration. Using aerial images over a well designed test field with 3D structures and/or different flight altitudes enables the determination of calibration values in the Bingo software. It will be demonstrated how such a calibration can be performed. The direct referencing device must be mounted in a solid and reliable way to the camera. Besides the mechanical work, especially in mounting the IMU beside the camera, two lever arms have to be measured to mm accuracy: the lever arm from the GPS antenna to the IMU's calibrated centre, and the lever arm from the IMU centre to the camera projection centre. In fact, the measurement with a total station is not a difficult task, but the definition of the right centres and the need for using rotation matrices can cause serious accuracy problems. The benefit of small and medium format cameras is that smaller aircraft can also be used. In that case, a gyro-based stabilized platform is recommended. This means that the IMU must be mounted beside the camera on the stabilizer. The advantage is that the IMU can be used to control the platform; the problem is that the IMU to GPS antenna lever arm is floating. In fact, we have to deal with an additional data stream, the values of the stabilizer movement, to correct the floating lever arm distances. If the post-processing of the GPS-IMU data, taking the floating lever arms into account, delivers the expected result, the lever arm between the IMU and the camera can be applied. However, there is a misalignment (boresight angle) that must be evaluated by a photogrammetric process using advanced tools, e.g. in Bingo. Once all these parameters have been determined, the system is capable of handling projects without, or with only a few, ground control points. But what effect does directly applying the achieved direct orientation values have on the photogrammetric process, compared with an aerial triangulation based on proper tie-point matching? The paper aims to show the steps to be taken by potential users and gives an estimate of the importance and quality influence of the various calibration and adjustment steps.
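
    A hedged sketch of the lever-arm bookkeeping discussed above: the body-frame lever arms are rotated into the mapping frame with the IMU attitude and added to the GPS antenna position to obtain the camera projection centre. The frame conventions, rotation order, and all numeric values below are illustrative assumptions only, not values from the paper.

```python
# Hedged sketch: applying IMU attitude (rotation matrix) to measured lever arms.
import numpy as np

def attitude_matrix(roll, pitch, yaw):
    """Body-to-navigation rotation from roll, pitch, yaw in radians (ZYX order assumed)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return Rz @ Ry @ Rx

antenna_pos = np.array([500123.42, 5401210.88, 812.35])   # GPS antenna, map frame (placeholder)
lever_ant_to_imu = np.array([-0.312, 0.085, -0.441])      # measured in body frame, metres
lever_imu_to_cam = np.array([0.055, -0.120, -0.210])

R_nb = attitude_matrix(np.radians(1.2), np.radians(-0.8), np.radians(93.5))
camera_centre = antenna_pos + R_nb @ (lever_ant_to_imu + lever_imu_to_cam)
print("camera projection centre:", camera_centre)
```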

  17. Deployable Wireless Camera Penetrators

    NASA Technical Reports Server (NTRS)

    Badescu, Mircea; Jones, Jack; Sherrit, Stewart; Wu, Jiunn Jeng

    2008-01-01

    A lightweight, low-power camera dart has been designed and tested for context imaging of sampling sites and ground surveys from an aerobot or an orbiting spacecraft in a microgravity environment. The camera penetrators also can be used to image any line-of-sight surface, such as cliff walls, that is difficult to access. Tethered cameras to inspect the surfaces of planetary bodies use both power and signal transmission lines to operate. A tether adds the possibility of inadvertently anchoring the aerobot, and requires some form of station-keeping capability of the aerobot if extended examination time is required. The new camera penetrators are deployed without a tether, weigh less than 30 grams, and are disposable. They are designed to drop from any altitude with the boost in transmitting power currently demonstrated at approximately 100-m line-of-sight. The penetrators also can be deployed to monitor lander or rover operations from a distance, and can be used for surface surveys or for context information gathering from a touch-and-go sampling site. Thanks to wireless operation, the complexity of the sampling or survey mechanisms may be reduced. The penetrators may be battery powered for short-duration missions, or have solar panels for longer or intermittent duration missions. The imaging device is embedded in the penetrator, which is dropped or projected at the surface of a study site at 90° to the surface. Mirrors can be used in the design to image the ground or the horizon. Some of the camera features were tested using commercial "nanny" or "spy" camera components with the charge-coupled device (CCD) looking at a direction parallel to the ground. Figure 1 shows components of one camera that weighs less than 8 g and occupies a volume of 11 cm³. This camera could transmit a standard television signal, including sound, up to 100 m. Figure 2 shows the CAD models of a version of the penetrator. A low-volume array of such penetrator cameras could be deployed from an aerobot or a spacecraft onto a comet or asteroid. A system of 20 of these penetrators could be designed and built in a 1- to 2-kg mass envelope. Possible future modifications of the camera penetrators, such as the addition of a chemical spray device, would allow the study of simple chemical reactions of reagents sprayed at the landing site and looking at the color changes. Zoom lenses also could be added for future use.

  18. Auto-converging stereo cameras for 3D robotic tele-operation

    NASA Astrophysics Data System (ADS)

    Edmondson, Richard; Aycock, Todd; Chenault, David

    2012-06-01

    Polaris Sensor Technologies has developed a Stereovision Upgrade Kit for TALON robot to provide enhanced depth perception to the operator. This kit previously required the TALON Operator Control Unit to be equipped with the optional touchscreen interface to allow for operator control of the camera convergence angle adjustment. This adjustment allowed for optimal camera convergence independent of the distance from the camera to the object being viewed. Polaris has recently improved the performance of the stereo camera by implementing an Automatic Convergence algorithm in a field programmable gate array in the camera assembly. This algorithm uses scene content to automatically adjust the camera convergence angle, freeing the operator to focus on the task rather than adjustment of the vision system. The autoconvergence capability has been demonstrated on both visible zoom cameras and longwave infrared microbolometer stereo pairs.
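
    The FPGA auto-convergence algorithm itself is not described in the abstract, so the following is only a hedged sketch of the underlying geometry: the symmetric toe-in angle that makes the two optical axes cross at the estimated object distance, given the stereo baseline. The baseline and distances are placeholders.

```python
# Hedged sketch: per-camera toe-in (convergence) angle for a given object distance.
import math

def convergence_angle_deg(baseline_m, object_distance_m):
    """Per-camera toe-in angle (degrees) so the optical axes intersect at the object."""
    return math.degrees(math.atan2(baseline_m / 2.0, object_distance_m))

baseline = 0.12          # 12 cm stereo baseline (assumed)
for d in (0.5, 1.0, 2.0, 5.0):
    print(f"distance {d:4.1f} m -> toe-in {convergence_angle_deg(baseline, d):5.2f} deg per camera")
```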

  19. Sensing driver awareness by combining fisheye camera and Kinect

    NASA Astrophysics Data System (ADS)

    Wuhe, Z.; Lei, Z.; Ning, D.

    2014-11-01

    In this paper, we propose a Driver's Awareness Catching System to sense the driver's awareness. The system consists of a fisheye camera and a Kinect. The Kinect, mounted inside the vehicle, is used to recognize and locate the driver's 3D face. The fisheye camera, mounted outside the vehicle, is used to monitor the road. The relative pose between the two cameras is calibrated via a state-of-the-art method for calibrating cameras with non-overlapping fields of view. The camera system works in this way: First, the Kinect SDK released by Microsoft is used to track the driver's face and capture the eye location together with the sight direction. Secondly, the eye location and the sight direction are transformed into the coordinate system of the fisheye camera. Thirdly, the corresponding view field is extracted from the fisheye image. As there is a small displacement between the driver's eyes and the optical center of the fisheye camera, this leads to a view angle deviation. Finally, we did a systematic analysis of the error distribution by numerical simulation and proved the feasibility of our camera system. On the other hand, we realized this camera system and achieved the desired effect in a real-world experiment.
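
    A hedged sketch of the hand-off in steps two and three above: an eye position and gaze direction measured in the Kinect frame are moved into the fisheye camera frame with the calibrated relative pose (R, t), and the gaze direction is turned into a fisheye pixel. The pose, intrinsics, gaze vector and the equidistant projection model are all illustrative placeholders, not the calibrated values of the actual system.

```python
# Hedged sketch: Kinect-frame gaze mapped into the fisheye camera's image.
import numpy as np

R = np.eye(3)                                  # fisheye <- Kinect rotation (placeholder)
t = np.array([0.10, -0.05, 0.30])              # fisheye <- Kinect translation, metres

eye_kinect = np.array([0.02, -0.10, 0.65])     # eye position from the Kinect SDK
gaze_kinect = np.array([0.0, 0.05, 1.0])       # sight direction from the Kinect SDK

eye_fish = R @ eye_kinect + t                  # points transform with R and t
gaze_fish = R @ gaze_kinect                    # directions only rotate
gaze_fish /= np.linalg.norm(gaze_fish)

# Assumed equidistant fisheye model: radius proportional to angle from optical axis.
f, cx, cy = 320.0, 640.0, 640.0                # assumed fisheye intrinsics (pixels)
theta = np.arccos(np.clip(gaze_fish[2], -1.0, 1.0))
phi = np.arctan2(gaze_fish[1], gaze_fish[0])
u = cx + f * theta * np.cos(phi)
v = cy + f * theta * np.sin(phi)
print("eye in fisheye frame:", eye_fish)
print(f"gaze maps to fisheye pixel ({u:.1f}, {v:.1f})")
```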

  20. Clementine longwave infrared camera

    NASA Astrophysics Data System (ADS)

    Priest, Robert E.; Lewis, Isabella T.; Sewall, Noel R.; Park, Hye-Sook; Shannon, Michael J.; Ledebuhr, Arno G.; Pleasance, Lyn D.; Massie, Mark A.; Metschuleit, Karen

    1995-06-01

    The Clementine mission provided the first ever complete, systematic surface mapping of the moon from the ultra-violet to the near-infrared regions. More than 1.7 million images of the moon, earth, and space were returned from this mission. The long-wave-infrared (LWIR) camera supplemented the UV/visible and near-infrared mapping cameras providing limited strip coverage of the moon, giving insight to the thermal properties of the soils. This camera provided approximately 100 m spatial resolution at 400 km periselene, and a 7 km across- track swath. This 2.1 kg camera using a 128 X 128 mercury-cadmium-telluride (MCT) FPA viewed thermal emission of the lunar surface and lunar horizon in the 8.0 to 9.5 micrometers wavelength region. A description of this lightweight, low power LWIR camera along with a summary of lessons learned is presented. Design goals and preliminary on-orbit performance estimates are addressed in terms of meeting the mission's primary objective for flight qualifying the sensors for future Department of Defense flights.

  1. Satellite camera image navigation

    NASA Technical Reports Server (NTRS)

    Kamel, Ahmed A. (Inventor); Graul, Donald W. (Inventor); Savides, John (Inventor); Hanson, Charles W. (Inventor)

    1987-01-01

    Pixels within a satellite camera (1, 2) image are precisely located in terms of latitude and longitude on a celestial body, such as the earth, being imaged. A computer (60) on the earth generates models (40, 50) of the satellite's orbit and attitude, respectively. The orbit model (40) is generated from measurements of stars and landmarks taken by the camera (1, 2), and by range data. The orbit model (40) is an expression of the satellite's latitude and longitude at the subsatellite point, and of the altitude of the satellite, as a function of time, using as coefficients (K) the six Keplerian elements at epoch. The attitude model (50) is based upon star measurements taken by each camera (1, 2). The attitude model (50) is a set of expressions for the deviations in a set of mutually orthogonal reference optical axes (x, y, z) as a function of time, for each camera (1, 2). Measured data is fit into the models (40, 50) using a walking least squares fit algorithm. A transformation computer (66 ) transforms pixel coordinates as telemetered by the camera (1, 2) into earth latitude and longitude coordinates, using the orbit and attitude models (40, 50).

  2. The Dark Energy Camera

    SciTech Connect

    Flaugher, B.

    2015-04-11

    The Dark Energy Camera is a new imager with a 2.2-degree diameter field of view mounted at the prime focus of the Victor M. Blanco 4-meter telescope on Cerro Tololo near La Serena, Chile. The camera was designed and constructed by the Dark Energy Survey Collaboration, and meets or exceeds the stringent requirements designed for the wide-field and supernova surveys for which the collaboration uses it. The camera consists of a five element optical corrector, seven filters, a shutter with a 60 cm aperture, and a CCD focal plane of 250-μm thick fully depleted CCDs cooled inside a vacuum Dewar. The 570 Mpixel focal plane comprises 62 2k x 4k CCDs for imaging and 12 2k x 2k CCDs for guiding and focus. The CCDs have 15μm x 15μm pixels with a plate scale of 0.263" per pixel. A hexapod system provides state-of-the-art focus and alignment capability. The camera is read out in 20 seconds with 6-9 electrons readout noise. This paper provides a technical description of the camera's engineering, construction, installation, and current status.

  3. The Dark Energy Camera

    NASA Astrophysics Data System (ADS)

    Flaugher, B.; Diehl, H. T.; Honscheid, K.; Abbott, T. M. C.; Alvarez, O.; Angstadt, R.; Annis, J. T.; Antonik, M.; Ballester, O.; Beaufore, L.; Bernstein, G. M.; Bernstein, R. A.; Bigelow, B.; Bonati, M.; Boprie, D.; Brooks, D.; Buckley-Geer, E. J.; Campa, J.; Cardiel-Sas, L.; Castander, F. J.; Castilla, J.; Cease, H.; Cela-Ruiz, J. M.; Chappa, S.; Chi, E.; Cooper, C.; da Costa, L. N.; Dede, E.; Derylo, G.; DePoy, D. L.; de Vicente, J.; Doel, P.; Drlica-Wagner, A.; Eiting, J.; Elliott, A. E.; Emes, J.; Estrada, J.; Fausti Neto, A.; Finley, D. A.; Flores, R.; Frieman, J.; Gerdes, D.; Gladders, M. D.; Gregory, B.; Gutierrez, G. R.; Hao, J.; Holland, S. E.; Holm, S.; Huffman, D.; Jackson, C.; James, D. J.; Jonas, M.; Karcher, A.; Karliner, I.; Kent, S.; Kessler, R.; Kozlovsky, M.; Kron, R. G.; Kubik, D.; Kuehn, K.; Kuhlmann, S.; Kuk, K.; Lahav, O.; Lathrop, A.; Lee, J.; Levi, M. E.; Lewis, P.; Li, T. S.; Mandrichenko, I.; Marshall, J. L.; Martinez, G.; Merritt, K. W.; Miquel, R.; Muñoz, F.; Neilsen, E. H.; Nichol, R. C.; Nord, B.; Ogando, R.; Olsen, J.; Palaio, N.; Patton, K.; Peoples, J.; Plazas, A. A.; Rauch, J.; Reil, K.; Rheault, J.-P.; Roe, N. A.; Rogers, H.; Roodman, A.; Sanchez, E.; Scarpine, V.; Schindler, R. H.; Schmidt, R.; Schmitt, R.; Schubnell, M.; Schultz, K.; Schurter, P.; Scott, L.; Serrano, S.; Shaw, T. M.; Smith, R. C.; Soares-Santos, M.; Stefanik, A.; Stuermer, W.; Suchyta, E.; Sypniewski, A.; Tarle, G.; Thaler, J.; Tighe, R.; Tran, C.; Tucker, D.; Walker, A. R.; Wang, G.; Watson, M.; Weaverdyck, C.; Wester, W.; Woods, R.; Yanny, B.; DES Collaboration

    2015-11-01

    The Dark Energy Camera is a new imager with a 2.°2 diameter field of view mounted at the prime focus of the Victor M. Blanco 4 m telescope on Cerro Tololo near La Serena, Chile. The camera was designed and constructed by the Dark Energy Survey Collaboration and meets or exceeds the stringent requirements designed for the wide-field and supernova surveys for which the collaboration uses it. The camera consists of a five-element optical corrector, seven filters, a shutter with a 60 cm aperture, and a charge-coupled device (CCD) focal plane of 250 μm thick fully depleted CCDs cooled inside a vacuum Dewar. The 570 megapixel focal plane comprises 62 2k × 4k CCDs for imaging and 12 2k × 2k CCDs for guiding and focus. The CCDs have 15 μm × 15 μm pixels with a plate scale of 0.″263 pixel-1. A hexapod system provides state-of-the-art focus and alignment capability. The camera is read out in 20 s with 6-9 electron readout noise. This paper provides a technical description of the camera's engineering, construction, installation, and current status.

  4. Study on the diagnostic system of scoliosis by using infrared camera.

    PubMed

    Jeong, Jin-hyoung; Park, Eun-jeong; Cho, Chang-ok; Kim, Yoon-jeong; Lee, Sang-sik

    2015-01-01

    In this study, to avoid the radiation exposure involved in the conventional diagnosis of scoliosis, a system that can diagnose scoliosis using an infrared camera and optical markers was developed. In the developed system, optical markers attached along the spinal curvature are recognized by the infrared camera, and the angle between two optical markers is measured. For the angle measurement, we used the Cobb angle method that is standard in the diagnosis of spinal scoliosis. We developed software that uses the infrared camera to diagnose spinal scoliosis and outputs the result to the screen. The software, implemented in LabVIEW, is composed of a camera output unit, an angle measurement unit, and a Cobb angle measurement unit. In the future, the system is expected to be applied to the diagnosis of other orthopedic disorders, such as kyphosis and hallux valgus, that require this kind of diagnostic system. PMID:26405878
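
    A hedged sketch of the basic Cobb-style measurement: the angle between the lines defined by an upper and a lower pair of optical markers, as it could be computed from the 2D marker positions reported by the infrared camera. The marker coordinates are illustrative placeholders, not data from the actual system.

```python
# Hedged sketch: Cobb-style angle between two marker-defined lines.
import numpy as np

def cobb_angle_deg(upper_pair, lower_pair):
    """Angle (degrees) between the line through the upper markers and the line
    through the lower markers."""
    v1 = np.subtract(upper_pair[1], upper_pair[0]).astype(float)
    v2 = np.subtract(lower_pair[1], lower_pair[0]).astype(float)
    cosang = abs(np.dot(v1, v2)) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

upper = [(102.0, 310.0), (188.0, 295.0)]   # marker pixel positions (placeholders)
lower = [(110.0, 512.0), (195.0, 540.0)]
print(f"Cobb angle: {cobb_angle_deg(upper, lower):.1f} degrees")
```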

  5. Selective-imaging camera

    NASA Astrophysics Data System (ADS)

    Szu, Harold; Hsu, Charles; Landa, Joseph; Cha, Jae H.; Krapels, Keith A.

    2015-05-01

    How can we design cameras that image selectively in Full Electro-Magnetic (FEM) spectra? Without selective imaging, we cannot use, for example, ordinary tourist cameras to see through fire, smoke, or other obscurants contributing to creating a Visually Degraded Environment (VDE). This paper addresses a possible new design of selective-imaging cameras at firmware level. The design is consistent with physics of the irreversible thermodynamics of Boltzmann's molecular entropy. It enables imaging in appropriate FEM spectra for sensing through the VDE, and displaying in color spectra for Human Visual System (HVS). We sense within the spectra the largest entropy value of obscurants such as fire, smoke, etc. Then we apply a smart firmware implementation of Blind Sources Separation (BSS) to separate all entropy sources associated with specific Kelvin temperatures. Finally, we recompose the scene using specific RGB colors constrained by the HVS, by up/down shifting Planck spectra at each pixel and time.

  6. SOPHIE: portable infrared camera

    NASA Astrophysics Data System (ADS)

    Dupiech, Michael

    1999-07-01

    After 20 years of experience in land-based IR systems, THOMSON-CSF is able to offer its customers a new product: SOPHIE. It is the fruit of an assortment of skills mustered at the heart of THOMSON-CSF OPTRONIQUE and of the Integrated Cooler Assembly developed by SOFRADIR and CRYOTECHNOLOGIES. SOPHIE, the world's first handheld IR camera/binocular, weighs only 2 kg and offers a reconnaissance range performance well beyond 2 km, comparable to the cameras of the first generation. It can run from its own self-contained power supply or from the mains, giving it further flexibility in use. It is a genuine night-or-day instrument, operating in the 8-12 μm waveband. Its leading edge technologies, together with its light weight, make it a dual-purpose product, functioning either as a camera that can be linked to a monitor, or as a conventional pair of binoculars.

  7. Solid state television camera

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The design, fabrication, and tests of a solid state television camera using a new charge-coupled imaging device are reported. An RCA charge-coupled device arranged in a 512 by 320 format and directly compatible with EIA format standards was the sensor selected. This is a three-phase, sealed surface-channel array that has 163,840 sensor elements, which employs a vertical frame transfer system for image readout. Included are test results of the complete camera system, circuit description and changes to such circuits as a result of integration and test, maintenance and operation section, recommendations to improve the camera system, and a complete set of electrical and mechanical drawing sketches.

  8. Laser range camera modeling

    NASA Astrophysics Data System (ADS)

    Storjohann, Kai

    1990-04-01

    An imaging model is described that was derived for use with a laser range camera (LRC) developed by the Advanced Intelligent Machines Division of Odetics. However, this model could be applied to any comparable imaging system. Both the derivation of the model and the determination of the LRC's intrinsic parameters are explained. For the purpose of evaluating the LRC's external orientation, a transformation of the LRC's imaging model into a standard camera's (SC) pinhole model is derived. By virtue of this transformation, the LRC's external orientation can be found by applying any SC calibration technique.

  9. Artificial human vision camera

    NASA Astrophysics Data System (ADS)

    Goudou, J.-F.; Maggio, S.; Fagno, M.

    2014-10-01

    In this paper we present a real-time vision system modeling the human vision system. Our purpose is to inspire from human vision bio-mechanics to improve robotic capabilities for tasks such as objects detection and tracking. This work describes first the bio-mechanical discrepancies between human vision and classic cameras and the retinal processing stage that takes place in the eye, before the optic nerve. The second part describes our implementation of these principles on a 3-camera optical, mechanical and software model of the human eyes and associated bio-inspired attention model.

  10. Camera Calibration Based on Perspective Geometry and Its Application in LDWS

    NASA Astrophysics Data System (ADS)

    Xu, Huarong; Wang, Xiaodong

    In this paper, we present a novel algorithm to calibrate cameras for a lane departure warning system (LDWS). The algorithm needs only a set of parallel lane markings and parallel lines perpendicular to the ground plane to determine the camera parameters, namely the roll angle, the tilt angle, the pan angle and the focal length. Then, given the camera height, the positions of objects in world space can be easily obtained from the image. We apply the proposed method to our lane departure warning system, which monitors the distance between the car and the road boundaries. Experiments show that the proposed method is easy to operate and can achieve accurate results.
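
    A hedged sketch of the last step mentioned above: once the tilt angle, focal length and camera height are known, a flat-ground model maps an image row to a longitudinal distance in front of the vehicle. The calibration values below are illustrative assumptions, not results from the paper.

```python
# Hedged sketch: flat-ground distance from image row, tilt, focal length and height.
import math

def ground_distance_m(v_pixel, cy, f_pixels, tilt_rad, cam_height_m):
    """Distance along the flat road to the point imaged at image row v_pixel."""
    depression = tilt_rad + math.atan((v_pixel - cy) / f_pixels)  # angle below horizontal
    if depression <= 0:
        raise ValueError("pixel is at or above the horizon")
    return cam_height_m / math.tan(depression)

cy, f, tilt, h = 240.0, 700.0, math.radians(4.0), 1.3   # assumed calibration values
for row in (300, 350, 420):
    print(f"image row {row}: {ground_distance_m(row, cy, f, tilt, h):6.2f} m ahead")
```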

  11. Spas color camera

    NASA Technical Reports Server (NTRS)

    Toffales, C.

    1983-01-01

    The procedures to be followed in assessing the performance of the MOS color camera are defined. Aspects considered include: horizontal and vertical resolution; value of the video signal; gray scale rendition; environmental (vibration and temperature) tests; signal to noise ratios; and white balance correction.

  12. Make a Pinhole Camera

    ERIC Educational Resources Information Center

    Fisher, Diane K.; Novati, Alexander

    2009-01-01

    On Earth, using ordinary visible light, one can create a single image of light recorded over time. Of course a movie or video is light recorded over time, but it is a series of instantaneous snapshots, rather than light and time both recorded on the same medium. A pinhole camera, which is simple to make out of ordinary materials and using ordinary…

  13. Anger Camera Firmware

    Energy Science and Technology Software Center (ESTSC)

    2010-11-19

    The firmware is responsible for the operation of the Anger Camera Electronics, calculation of position, time of flight, and digital communications. It provides a first-stage analysis of 48 signals that have been converted from analog to digital values using A/D convertors.

  14. Jack & the Video Camera

    ERIC Educational Resources Information Center

    Charlan, Nathan

    2010-01-01

    This article narrates how the use of video camera has transformed the life of Jack Williams, a 10-year-old boy from Colorado Springs, Colorado, who has autism. The way autism affected Jack was unique. For the first nine years of his life, Jack remained in his world, alone. Functionally non-verbal and with motor skill problems that affected his…

  15. Make a Pinhole Camera

    ERIC Educational Resources Information Center

    Fisher, Diane K.; Novati, Alexander

    2009-01-01

    On Earth, using ordinary visible light, one can create a single image of light recorded over time. Of course a movie or video is light recorded over time, but it is a series of instantaneous snapshots, rather than light and time both recorded on the same medium. A pinhole camera, which is simple to make out of ordinary materials and using ordinary

  16. Communities, Cameras, and Conservation

    ERIC Educational Resources Information Center

    Patterson, Barbara

    2012-01-01

    Communities, Cameras, and Conservation (CCC) is the most exciting and valuable program the author has seen in her 30 years of teaching field science courses. In this citizen science project, students and community volunteers collect data on mountain lions ("Puma concolor") at four natural areas and public parks along the Front Range of Colorado.…

  17. The LSST Camera Overview

    SciTech Connect

    Gilmore, Kirk; Kahn, Steven A.; Nordby, Martin; Burke, David; O'Connor, Paul; Oliver, John; Radeka, Veljko; Schalk, Terry; Schindler, Rafe; /SLAC

    2007-01-10

    The LSST camera is a wide-field optical (0.35-1 μm) imager designed to provide a 3.5 degree FOV with better than 0.2 arcsecond sampling. The detector format will be a circular mosaic providing approximately 3.2 Gigapixels per image. The camera includes a filter mechanism and shuttering capability. It is positioned in the middle of the telescope where cross-sectional area is constrained by optical vignetting and heat dissipation must be controlled to limit thermal gradients in the optical beam. The fast, f/1.2 beam will require tight tolerances on the focal plane mechanical assembly. The focal plane array operates at a temperature of approximately -100 C to achieve desired detector performance. The focal plane array is contained within an evacuated cryostat, which incorporates detector front-end electronics and thermal control. The cryostat lens serves as an entrance window and vacuum seal for the cryostat. Similarly, the camera body lens serves as an entrance window and gas seal for the camera housing, which is filled with a suitable gas to provide the operating environment for the shutter and filter change mechanisms. The filter carousel can accommodate 5 filters, each 75 cm in diameter, for rapid exchange without external intervention.

  18. Jack & the Video Camera

    ERIC Educational Resources Information Center

    Charlan, Nathan

    2010-01-01

    This article narrates how the use of video camera has transformed the life of Jack Williams, a 10-year-old boy from Colorado Springs, Colorado, who has autism. The way autism affected Jack was unique. For the first nine years of his life, Jack remained in his world, alone. Functionally non-verbal and with motor skill problems that affected his

  19. Photogrammetric camera calibration

    USGS Publications Warehouse

    Tayman, W.P.; Ziemann, H.

    1984-01-01

    Section 2 (Calibration) of the document "Recommended Procedures for Calibrating Photogrammetric Cameras and Related Optical Tests" from the International Archives of Photogrammetry, Vol. XIII, Part 4, is reviewed in the light of recent practical work, and suggestions for changes are made. These suggestions are intended as a basis for a further discussion. ?? 1984.

  20. Communities, Cameras, and Conservation

    ERIC Educational Resources Information Center

    Patterson, Barbara

    2012-01-01

    Communities, Cameras, and Conservation (CCC) is the most exciting and valuable program the author has seen in her 30 years of teaching field science courses. In this citizen science project, students and community volunteers collect data on mountain lions ("Puma concolor") at four natural areas and public parks along the Front Range of Colorado.

  1. The LSST Camera System

    NASA Astrophysics Data System (ADS)

    Gilmore, D. Kirk; Kahn, S.; Fouts, K.; LSST Camera Team

    2009-01-01

    The LSST camera provides a 3.2 Gigapixel focal plane array, tiled by 189 4Kx4K CCD science sensors with 10um pixels. This pixel count is a direct consequence of sampling the 9.6 deg^2 field-of-view (0.64m diameter) with 0.2 arcsec pixels (Nyquist sampling in the best expected seeing of 0.4 arcsec). The sensors are deep depleted, back-illuminated devices with a highly segmented architecture that enables the entire array to be read in 2 seconds. The detectors are grouped into 3x3 rafts, each containing its own dedicated front-end and back-end electronics boards. The rafts are mounted on a silicon carbide grid inside a vacuum cryostat with an intricate thermal control system; the cryostat entrance window is the third of the three refractive lenses in the camera. The other two lenses are mounted in an optics structure at the front of the camera body, which also contains a mechanical shutter, and a carousel assembly that holds five large optical filters (ugrizy). A sixth optical filter will also be fabricated and can replace any of the others via procedures accomplished during daylight hours. This poster will illustrate the current mechanical design of the camera, FEA and thermal analysis of the cryostat, and an overview of the data acquisition system and the performance characteristics of the filters.

  2. Behind the Camera.

    ERIC Educational Resources Information Center

    Kuhns, William; Giardino, Thomas F.

    Intended for the beginning filmmaker, this book presents basic information on major aspects of shooting a film. It covers characteristics of various cameras, films, lenses, and lighting equipment and tells how to use them. The importance of a shooting script is stressed. The mechanics of sound systems, editing, and titles, animations, and special…

  3. The Martian Atmosphere as seen by the OSIRIS camera

    NASA Astrophysics Data System (ADS)

    Moissl, R.; Pajola, M.; Määttänen, A.; Küppers, M.

    2013-09-01

    Despite the long time that has passed since the observations, only a few studies based on the data from the wide- (WAC) and narrow- (NAC) angle camera systems of OSIRIS have been published to date. In this paper we will present the results on the observations of the Martian limbs acquired by the OSIRIS [1] instrument on board the ESA mission Rosetta during its swing-by maneuver around February 25th, 2007 on the way to Comet 67P/Churyumov-Gerasimenko, during the onset of the very active dust storm season of Mars year 28 (at Ls ~190). Although OSIRIS only captured the planet during a relatively short time interval of several hours, the obtained global view and the spectral coverage, from the UV (245 nm) over the full visible range to the near IR (1000 nm), allow for a valuable global overview of the state of the Martian atmosphere. The image acquisition started on February 24 around 18:00 UTC from a distance of about 260,000 km and continued until 04:51 UTC on February 25 to a distance of 105,000 km; closest approach to the planet occurred at 01:54 UTC on February 25, at a distance of about 250 km. All images have been manually co-registered with the help of SPICE data, and vertical profiles have been extracted over the limb in intervals of ~0.5 degrees (see Figures 1 and 2). In this work we will focus on our findings about the vertical structure of the atmosphere over the Martian limbs and report on the observed altitudes and optical densities of dust and (partially detached) clouds, and put the findings in context with data from other satellites in orbit around Mars at the same time (e.g. Mars Express). Based on previous datasets (MGS/TES, MOd/THEMIS, MRO/MCS, see, e.g., [2], [3] and [4]) we can expect to observe the waning of the South polar hood and the development of the Northern one. Some remains of the aphelion cloud belt might still be visible near the equator. Detached layers have been recently observed at this season by MEx/SPICAM [5] and MRO/MCS [6].

  4. Image Sensors Enhance Camera Technologies

    NASA Technical Reports Server (NTRS)

    2010-01-01

    In the 1990s, a Jet Propulsion Laboratory team led by Eric Fossum researched ways of improving complementary metal-oxide semiconductor (CMOS) image sensors in order to miniaturize cameras on spacecraft while maintaining scientific image quality. Fossum's team founded a company to commercialize the resulting CMOS active pixel sensor. Now called the Aptina Imaging Corporation, based in San Jose, California, the company has shipped over 1 billion sensors for use in applications such as digital cameras, camera phones, Web cameras, and automotive cameras. Today, one of every three cell phone cameras on the planet features Aptina's sensor technology.

  5. Wide-angle imaging system with fiberoptic components providing angle-dependent virtual material stops

    NASA Technical Reports Server (NTRS)

    Vaughan, Arthur H. (Inventor)

    1993-01-01

    A strip imaging wide angle optical system is provided. The optical system is provided with a 'virtual' material stop to avoid aberrational effects inherent in wide angle optical systems. The optical system includes a spherical mirror section for receiving light from a 180 deg strip or arc of a target image. Light received by the spherical mirror section is reflected to a frustoconical mirror section for subsequent rereflection to a row of optical fibers. Each optical fiber transmits a portion of the received light to a detector. The optical system exploits the narrow cone of acceptance associated with optical fibers to substantially eliminate vignetting effects inherent in wide angle systems. Further, the optical system exploits the narrow cone of acceptance of the optical fibers to substantially limit spherical aberration. The optical system is ideally suited for any application wherein a 180 deg strip image need be detected, and is particularly well adapted for use in hostile environments such as in planetary exploration.

  6. The Beagle 2 stereo camera system

    NASA Astrophysics Data System (ADS)

    Griffiths, A. D.; Coates, A. J.; Josset, J.-L.; Paar, G.; Hofmann, B.; Pullan, D.; Rüffer, P.; Sims, M. R.; Pillinger, C. T.

    2005-12-01

    The stereo camera system (SCS) was designed to provide wide-angle multi-spectral stereo imaging of the Beagle 2 landing site. Based on the Space-X micro-cameras, the primary objective was to construct a digital elevation model of the area in reach of the lander's robot arm. The SCS technical specifications and scientific objectives are described; these included panoramic 3-colour imaging to characterise the landing site; multi-spectral imaging to study the mineralogy of rocks and soils beyond the reach of the arm and solar observations to measure water vapour absorption and the atmospheric dust optical density. Also envisaged were stellar observations to determine the lander location and orientation, multi-spectral observations of Phobos & Deimos and observations of the landing site to monitor temporal changes.

  7. 15. ELEVATED CAMERA STAND, SHOWING LINE OF CAMERA STANDS PARALLEL ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    15. ELEVATED CAMERA STAND, SHOWING LINE OF CAMERA STANDS PARALLEL TO SLED TRACK. Looking west southwest down Camera Road. - Edwards Air Force Base, South Base Sled Track, Edwards Air Force Base, North of Avenue B, between 100th & 140th Streets East, Lancaster, Los Angeles County, CA

  8. Narrow-Line Seyfert Galaxies

    NASA Astrophysics Data System (ADS)

    Crenshaw, D. Michael

    We propose to obtain simultaneous SWP and optical spectra of three unusual Seyfert galaxies that have most of the properties of normal Seyfert I galaxies (high-ionization lines, strong nonstellar continua), but have permitted lines that are much narrower (< 1000 km sec^-1 FWHM). We have obtained test SWP exposures of these objects, and find that total exposure times of 12 - 14 hours should be sufficient to detect many of the weak lines that are blended together in Seyferts with broad (1000 - 6000 km sec^-1 FWHM) lines. We have chosen these three high-ionization narrow-line (HINL) Seyferts because their [OIII] 5007/H-beta ratios are small, which indicates that the emission lines from the low-density region do not severely contaminate those from the high-density region that we wish to study. We wish to accomplish the following scientific objectives: 1. We will study the relation between the HINL Seyferts and Seyfert 1 and 2 galaxies by determining the L-alpha/H-beta ratios to see if there is a partially ionized zone in the high-density clouds of HINL Seyferts. We will also compare the strengths of the UV continua in these objects with those in Seyfert I and 2 galaxies. 2. We will investigate the possibility that the emission feature at 1909 A is not entirely due to C III], and may receive a contribution from Fe III. 3. We will determine the reddening of the emission lines from the He II 1640/He II 4686 ratio. 4. We will identify as many weak features as possible and measure their relative intensities. In particular, we will attempt to determine the C:N:O abundances from various emission lines.

  9. Dual cameras acquisition and display system of retina-like sensor camera and rectangular sensor camera

    NASA Astrophysics Data System (ADS)

    Cao, Nan; Cao, Fengmei; Lin, Yabin; Bai, Tingzhu; Song, Shengyu

    2015-04-01

    For a new kind of retina-like sensor camera and a traditional rectangular sensor camera, a dual-camera acquisition and display system needs to be built. We introduce the principle and the development of the retina-like sensor. Image coordinate transformation and interpolation based on sub-pixel interpolation need to be realized for the retina-like sensor's special pixel distribution. The hardware platform is composed of the retina-like sensor camera, the rectangular sensor camera, an image grabber and a PC. Combining the MIL and OpenCV libraries, the software is written in VC++ on VS 2010. Experimental results show that the system realizes acquisition and display for both cameras.

  10. Angled Layers in Super Resolution

    NASA Technical Reports Server (NTRS)

    2004-01-01

    Researchers used a special imaging technique with the panoramic camera on NASA's Mars Exploration Rover Opportunity to get as detailed a look as possible at a target region near the eastern foot of 'Burns Cliff.' The intervening terrain was too difficult for driving the rover closer. The target is the boundary between two sections of layered rock. The layers in the lower section (left) run at a marked angle to the layers in the next higher section (right).

    This view is the product of a technique called super resolution. It was generated from data acquired on sol 288 of Opportunity's mission (Nov. 14, 2004) from a position along the southeast wall of 'Endurance Crater.' Resolution slightly higher than normal for the panoramic camera was synthesized for this view by combining 17 separate images of this scene, each one 'dithered' or pointed slightly differently from the previous one. Computer manipulation of the individual images was then used to generate a new synthetic view of the scene in a process known mathematically as iterative deconvolution, but referred to informally as super resolution. Similar methods have been used to enhance the resolution of images from the Mars Pathfinder mission and the Hubble Space Telescope.
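
    The combination step described above (many slightly dithered frames merged into one finer-sampled view) can be illustrated with a much simpler shift-and-add sketch in Python; the actual Pancam pipeline used iterative deconvolution, which is not reproduced here, and the synthetic frames, upsampling factor, and function names below are illustrative assumptions.

```python
import numpy as np

def phase_shift(ref, img):
    """Integer shift (dy, dx) of img relative to ref via phase correlation."""
    F = np.fft.fft2(ref) * np.conj(np.fft.fft2(img))
    corr = np.fft.ifft2(F / (np.abs(F) + 1e-12)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # wrap shifts into the signed range
    if dy > ref.shape[0] // 2: dy -= ref.shape[0]
    if dx > ref.shape[1] // 2: dx -= ref.shape[1]
    return dy, dx

def shift_and_add(frames, factor=2):
    """Combine dithered frames onto a grid 'factor' times finer."""
    up = [np.kron(f, np.ones((factor, factor))) for f in frames]  # crude upsample
    ref = up[0]
    acc = np.zeros_like(ref)
    for img in up:
        dy, dx = phase_shift(ref, img)
        acc += np.roll(img, (dy, dx), axis=(0, 1))   # register to the reference
    return acc / len(up)

# Example with synthetic dithered frames (illustrative only)
rng = np.random.default_rng(0)
truth = rng.random((64, 64))
frames = [np.roll(truth, (rng.integers(-2, 3), rng.integers(-2, 3)), axis=(0, 1))
          for _ in range(17)]
sr = shift_and_add(frames, factor=2)
```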

  11. Streak camera receiver definition study

    NASA Technical Reports Server (NTRS)

    Johnson, C. B.; Hunkler, L. T., Sr.; Letzring, S. A.; Jaanimagi, P.

    1990-01-01

    Detailed streak camera definition studies were made as a first step toward full flight qualification of a dual channel picosecond resolution streak camera receiver for the Geoscience Laser Altimeter and Ranging System (GLRS). The streak camera receiver requirements are discussed as they pertain specifically to the GLRS system, and estimates of the characteristics of the streak camera are given, based upon existing and near-term technological capabilities. Important problem areas are highlighted, and possible corresponding solutions are discussed.

  12. Automated Camera Array Fine Calibration

    NASA Technical Reports Server (NTRS)

    Clouse, Daniel; Padgett, Curtis; Ansar, Adnan; Cheng, Yang

    2008-01-01

    Using aerial imagery, the JPL FineCalibration (JPL FineCal) software automatically tunes a set of existing CAHVOR camera models for an array of cameras. The software finds matching features in the overlap region between images from adjacent cameras, and uses these features to refine the camera models. It is not necessary to take special imagery of a known target and no surveying is required. JPL FineCal was developed for use with an aerial, persistent surveillance platform.

  13. Combustion pinhole camera system

    DOEpatents

    Witte, A.B.

    1984-02-21

    A pinhole camera system is described utilizing a sealed optical-purge assembly which provides optical access into a coal combustor or other energy conversion reactors. The camera system basically consists of a focused-purge pinhole optical port assembly, a conventional TV vidicon receiver, an external, variable density light filter which is coupled electronically to the vidicon automatic gain control (agc). The key component of this system is the focused-purge pinhole optical port assembly which utilizes a purging inert gas to keep debris from entering the port and a lens arrangement which transfers the pinhole to the outside of the port assembly. One additional feature of the port assembly is that it is not flush with the interior of the combustor. 2 figs.

  14. Combustion pinhole camera system

    DOEpatents

    Witte, Arvel B.

    1984-02-21

    A pinhole camera system utilizing a sealed optical-purge assembly which provides optical access into a coal combustor or other energy conversion reactors. The camera system basically consists of a focused-purge pinhole optical port assembly, a conventional TV vidicon receiver, an external, variable density light filter which is coupled electronically to the vidicon automatic gain control (agc). The key component of this system is the focused-purge pinhole optical port assembly which utilizes a purging inert gas to keep debris from entering the port and a lens arrangement which transfers the pinhole to the outside of the port assembly. One additional feature of the port assembly is that it is not flush with the interior of the combustor.

  15. NSTX Tangential Divertor Camera

    SciTech Connect

    A.L. Roquemore; Ted Biewer; D. Johnson; S.J. Zweben; Nobuhiro Nishino; V.A. Soukhanovskii

    2004-07-16

    Strong magnetic field shear around the divertor x-point is numerically predicted to lead to strong spatial asymmetries in turbulence driven particle fluxes. To visualize the turbulence and associated impurity line emission near the lower x-point region, a new tangential observation port has been recently installed on NSTX. A reentrant sapphire window with a moveable in-vessel mirror images the divertor region from the center stack out to R ~ 80 cm and views the x-point for most plasma configurations. A coherent fiber optic bundle transmits the image through a remotely selected filter to a fast camera, for example a 40,500 frames/sec Photron CCD camera. A gas puffer located in the lower inboard divertor will localize the turbulence in the region near the x-point. Edge fluid and turbulent codes UEDGE and BOUT will be used to interpret impurity and deuterium emission fluctuation measurements in the divertor.

  16. 91. 22'X34' original blueprint, VariableAngle Launcher, 'CONNECTING BRIDGE, REAR VIEW ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    91. 22'X34' original blueprint, Variable-Angle Launcher, 'CONNECTING BRIDGE, REAR VIEW CAMERA HOUSE ASSEMBLY' drawn at 3/8=1'-0', 3'=1'-0'. (BUORD Sketch # 209042). - Variable Angle Launcher Complex, Variable Angle Launcher, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  17. Hemispherical Laue camera

    DOEpatents

    Li, James C. M.; Chu, Sungnee G.

    1980-01-01

    A hemispherical Laue camera comprises a crystal sample mount for positioning a sample to be analyzed at the center of the sphere of a hemispherical, X-radiation sensitive film cassette, a collimator, a stationary or rotating sample mount and a set of standard spherical projection spheres. X-radiation generated from an external source is directed through the collimator to impinge onto the single crystal sample on the stationary mount. The diffracted beam is recorded on the hemispherical X-radiation sensitive film mounted inside the hemispherical film cassette in either transmission or back-reflection geometry. The distances travelled by X-radiation diffracted from the crystal to the hemispherical film are the same for all crystal planes which satisfy Bragg's Law. The recorded diffraction spots or Laue spots on the film thereby preserve both the symmetry information of the crystal structure and the relative intensities which are directly related to the relative structure factors of the crystal orientations. The diffraction pattern on the exposed film is compared with the known diffraction pattern on one of the standard spherical projection spheres for a specific crystal structure to determine the orientation of the crystal sample. By replacing the stationary sample support with a rotating sample mount, the hemispherical Laue camera can be used for crystal structure determination in a manner previously provided in conventional Debye-Scherrer cameras.

  18. Orbiter Camera Payload System

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Components for an orbiting camera payload system (OCPS) include the large format camera (LFC), a gas supply assembly, and ground test, handling, and calibration hardware. The LFC, a high resolution large format photogrammetric camera for use in the cargo bay of the space transport system, is also adaptable to use on an RB-57 aircraft or on a free flyer satellite. Carrying 4000 feet of film, the LFC is usable over the visible to near IR, at V/h rates of from 11 to 41 milliradians per second, overlap of 10, 60, 70 or 80 percent and exposure times of from 4 to 32 milliseconds. With a 12 inch focal length it produces a 9 by 18 inch format (long dimension in line of flight) with full format low contrast resolution of 88 lines per millimeter (AWAR), full format distortion of less than 14 microns and a complement of 45 Reseau marks and 12 fiducial marks. Weight of the OCPS as supplied, fully loaded is 944 pounds and power dissipation is 273 watts average when in operation, 95 watts in standby. The LFC contains an internal exposure sensor, or will respond to external command. It is able to photograph starfields for inflight calibration upon command.

  19. Orbiter Camera Payload System

    NASA Astrophysics Data System (ADS)

    1980-12-01

    Components for an orbiting camera payload system (OCPS) include the large format camera (LFC), a gas supply assembly, and ground test, handling, and calibration hardware. The LFC, a high resolution large format photogrammetric camera for use in the cargo bay of the space transport system, is also adaptable to use on an RB-57 aircraft or on a free flyer satellite. Carrying 4000 feet of film, the LFC is usable over the visible to near IR, at V/h rates of from 11 to 41 milliradians per second, overlap of 10, 60, 70 or 80 percent and exposure times of from 4 to 32 milliseconds. With a 12 inch focal length it produces a 9 by 18 inch format (long dimension in line of flight) with full format low contrast resolution of 88 lines per millimeter (AWAR), full format distortion of less than 14 microns and a complement of 45 Reseau marks and 12 fiducial marks. Weight of the OCPS as supplied, fully loaded is 944 pounds and power dissipation is 273 watts average when in operation, 95 watts in standby. The LFC contains an internal exposure sensor, or will respond to external command. It is able to photograph starfields for inflight calibration upon command.

  20. Gamma ray camera

    DOEpatents

    Perez-Mendez, V.

    1997-01-21

    A gamma ray camera is disclosed for detecting rays emanating from a radiation source such as an isotope. The gamma ray camera includes a sensor array formed of a visible light crystal for converting incident gamma rays to a plurality of corresponding visible light photons, and a photosensor array responsive to the visible light photons in order to form an electronic image of the radiation therefrom. The photosensor array is adapted to record an integrated amount of charge proportional to the incident gamma rays closest to it, and includes a transparent metallic layer and a photodiode consisting of a p-i-n structure formed on one side of the transparent metallic layer and comprising an upper p-type layer, an intermediate layer and a lower n-type layer. In the preferred mode, the scintillator crystal is composed essentially of a cesium iodide (CsI) crystal preferably doped with a predetermined amount of impurity, and the upper p-type layer, the intermediate layer and the n-type layer are essentially composed of hydrogenated amorphous silicon (a-Si:H). The gamma ray camera further includes a collimator interposed between the radiation source and the sensor array, and a readout circuit formed on one side of the photosensor array. 6 figs.

  1. Gamma ray camera

    DOEpatents

    Perez-Mendez, Victor

    1997-01-01

    A gamma ray camera for detecting rays emanating from a radiation source such as an isotope. The gamma ray camera includes a sensor array formed of a visible light crystal for converting incident gamma rays to a plurality of corresponding visible light photons, and a photosensor array responsive to the visible light photons in order to form an electronic image of the radiation therefrom. The photosensor array is adapted to record an integrated amount of charge proportional to the incident gamma rays closest to it, and includes a transparent metallic layer and a photodiode consisting of a p-i-n structure formed on one side of the transparent metallic layer and comprising an upper p-type layer, an intermediate layer and a lower n-type layer. In the preferred mode, the scintillator crystal is composed essentially of a cesium iodide (CsI) crystal preferably doped with a predetermined amount of impurity, and the upper p-type layer, the intermediate layer and the n-type layer are essentially composed of hydrogenated amorphous silicon (a-Si:H). The gamma ray camera further includes a collimator interposed between the radiation source and the sensor array, and a readout circuit formed on one side of the photosensor array.

  2. Characterization of gravity waves at Venus cloud top from the Venus Monitoring Camera images

    NASA Astrophysics Data System (ADS)

    Piccialli, A.; Titov, D.; Svedhem, H.; Markiewicz, W. J.

    2012-04-01

    Since 2006 the European mission Venus Express (VEx) has been studying the Venus atmosphere with a focus on atmospheric dynamics and circulation. Recently, several experiments on board Venus Express have detected waves in the Venus atmosphere both as oscillations in the temperature and wind fields and as patterns on the cloud layer. Waves could be playing an important role in the maintenance of the atmospheric circulation of Venus since they can transport energy and momentum. High resolution images of Venus' Northern hemisphere obtained with the Venus Monitoring Camera (VMC/VEx) show distinct wave patterns at the cloud tops (~70 km altitude) interpreted as gravity waves. The Venus Monitoring Camera (VMC) is a CCD-based camera specifically designed to take images of Venus in four narrow band filters in the UV (365 nm), visible (513 nm), and near-IR (965 and 1000 nm). A systematic visual search of waves in VMC images was performed; more than 1700 orbits were analyzed and wave patterns were observed in about 200 images. With the aim of characterizing the wave types and their possible origin, we retrieved wave properties such as location (latitude and longitude), local time, solar zenith angle, packet length and width, and orientation. A wavelet analysis was also applied to determine the wavelength and the region of dominance of each wave. Four types of waves were identified in VMC images: long, medium, short and irregular waves. The long type waves are characterized by long and narrow straight features extending more than a few hundred kilometers and with a wavelength within the range of 7 to 48 km. Medium type waves have irregular wavefronts extending more than 100 km and with wavelengths in the range 8 - 21 km. Short wave packets have a width of several tens of kilometers, extend to a few hundred kilometers and are characterized by small wavelengths (3 - 16 km). Often short wave trains are observed at the edges of long features and seem connected to them. Irregular wave fields extend beyond the field of view of VMC and appear to be the result of wave breaking or wave interference. The waves are often identified in all channels and are mostly found at high latitudes (60-80°N) in the Northern hemisphere and seem to be concentrated above Ishtar Terra, a continental-size highland that includes the highest mountain belts of the planet, thus suggesting a possible orographic origin of the waves. However, at the moment it is not possible to rule out a bias in the observations due to the spacecraft orbit, which prevents waves from being seen at lower latitudes (because of lower resolution) and on the night side of the planet.
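
    The wavelength retrieval mentioned above used a wavelet analysis; a simplified stand-in for the same idea, extracting the dominant spatial wavelength from a brightness profile cut across a wave packet with a plain FFT, is sketched below. The profile, pixel scale, and names are illustrative assumptions.

```python
import numpy as np

def dominant_wavelength(profile, pixel_scale_km):
    """Dominant spatial wavelength (km) of a 1-D brightness profile
    sampled perpendicular to the wave crests."""
    profile = profile - profile.mean()                        # remove the DC component
    power = np.abs(np.fft.rfft(profile)) ** 2
    freqs = np.fft.rfftfreq(profile.size, d=pixel_scale_km)   # cycles per km
    k = np.argmax(power[1:]) + 1                              # skip the zero-frequency bin
    return 1.0 / freqs[k]

# Synthetic profile: 12 km waves sampled at 1 km/pixel (illustrative)
x = np.arange(256)
profile = 1.0 + 0.05 * np.sin(2 * np.pi * x / 12.0)
print(dominant_wavelength(profile, pixel_scale_km=1.0))       # ~12 km
```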

  3. Cryogenic Detectors (Narrow Field Instruments)

    NASA Astrophysics Data System (ADS)

    Hoevers, H.; Verhoeve, P.

    Two cryogenic imaging spectrometer arrays are currently considered as focal plane instruments for XEUS. The narrow field imager 1 (NFI 1) will cover the energy range from 0.05 to 3 keV with an energy resolution of 2 eV, or better, at 500 eV. A second narrow field imager (NFI 2) covers the energy range from 1 to 15 keV with an energy resolution of 2 eV (at 1 keV) and 5 eV (at 7 keV), creating some overlap with part of the NFI 1 energy window. Both narrow field imagers have a 0.5 arcmin field of view. Their imaging capabilities are matched to the XEUS optics of 2 to 5 arcsec, leading to 1 arcsec pixels. The detector arrays will be cooled by a closed cycle system comprising a mechanical cooler with a base temperature of 2.5 K and either a low temperature 3He sorption pump providing the very low temperature stage and/or an Adiabatic Demagnetization Refrigerator (ADR). The ADR cooler is explicitly needed to cool the NFI 2 array. The narrow field imager 1: Currently a 48 x 48 element array of superconducting tunnel junctions (STJ) is envisaged. Its operating temperature is in the range between 30 and 350 mK. Small, single Ta STJs (20-50 μm on a side) have shown 3.5 eV (FWHM) resolution at E = 525 eV, and small arrays have been successfully demonstrated (6 x 6 pixels) or are currently being tested (10 x 12 pixels). Alternatively, a prototype Distributed Read-Out Imaging Device (DROID), consisting of a linear superconducting Ta absorber of 20 x 100 μm2, including a 20 x 20 μm STJ for readout at either end, has shown a measured energy resolution of 2.4 eV (FWHM) at E = 500 eV. Simulations involving the diffusion properties as well as loss and tunnel rates have shown that the performance can be further improved by slight modifications in the geometry, and that the size of the DROIDs can be increased to 0.5-1.0 mm without loss in energy resolution. The relatively large areas and good energy resolution compared to single STJs make DROIDs good candidates for the basic elements of the NFI 1 detector array. With a DROID-based array of 48 x 10 elements covering the NFI 1 field of view of 0.5 arcmin, the number of signal wires would already be reduced by a factor of 2.4 compared to a 48 x 48 array of single pixels. While the present prototype DROIDs are still covered with a 480 nm thick SiOx insulation layer, this layer could easily be reduced in thickness or omitted. The detection efficiency of such a device with a 500 nm thick Ta absorber would be >80% in the energy range of 100-3000 eV, without any disturbing contributions from other layers as in single STJs. Further developments involve devices of lower-Tc superconductors (e.g. Mo) for better energy resolution and faster diffusion. The narrow field imager 2: The NFI 2 will consist of an array of 32 x 32 detector pixels. Each detector is a microcalorimeter which consists of a superconducting-to-normal phase transition edge thermometer (transition edge sensor, TES) with an operating temperature of 100 mK, and an absorber which allows a detection efficiency of >90% and a filling factor of the focal plane in excess of 90%. Single pixel microcalorimeters with a Ti/Au TES have already shown an energy resolution of 3.9 eV at 5.89 keV in combination with a thermal response time of 100 μs. These results imply that the high-energy requirements for XEUS can be met, in terms of energy resolution and response time.
It has been demonstrated that bismuth can be applied as absorber material without impairing the detector performance. Bi increases the stopping power to in excess of 90% and allows for a high filling factor, since the absorber can be shaped like a mushroom, allowing the wiring to the detector and the thermal support structure to be placed under the hat of the mushroom. In order to realize the NFI 2 detector array, there are two major development areas. Firstly, there is the development of micromachined Si and SiN structures that will provide proper cooling for each of the pixels and the production of small membranes to support the detector pixels. Micromechanical prototypes of this cooling and support structure have been made and are currently being characterized. Secondly, the read-out of the array has to be developed. The current baseline for research is frequency division multiplexing (FDM), which will allow a large detector array to be read out with a minimum of low-temperature electronics (Superconducting Quantum Interference Devices) and with a minimum of wires to the detector, thus reducing the thermal load on the detector cooling. Significant progress has been achieved since a microcalorimeter has been successfully biased at a frequency of 46 kHz, showing a performance which is very similar to that under conventional dc-bias conditions, proving the FDM concept.

  4. Multispectral Photometry of the Moon and Absolute Calibration of the Clementine UV/Vis Camera

    NASA Astrophysics Data System (ADS)

    Hillier, John K.; Buratti, Bonnie J.; Hill, Kathryn

    1999-10-01

    We present a multispectral photometric study of the Moon between solar phase angles of 0 and 85°. Using Clementine images obtained between 0.4 and 1.0 μm, we produce a comprehensive study of the lunar surface containing the following results: (1) empirical photometric functions for the spectral range and viewing and illumination geometries mentioned, (2) photometric modeling that derives the physical properties of the upper regolith and includes a detailed study of the causes for the lunar opposition surge, (3) an absolute calibration of the Clementine UV/Vis camera. The calibration procedure given on the Clementine calibration web site produces reflectances relative to a halon standard, which furthermore appear significantly higher than those seen in ground-based observations. By comparing Clementine observations with prior ground-based observations of 15 sites on the Moon we have determined a good absolute calibration of the Clementine UV/Vis camera. A correction factor of 0.532 has been determined to convert the web site (www.planetary.brown.edu/clementine/calibration.html) reflectances to absolute values. From the calibrated data, we calculate empirical phase functions useful for performing photometric corrections to observations of the Moon between solar phase angles of 0 and 85° and in the spectral range 0.4 to 1.0 μm. Finally, the calibrated data are used to fit a version of Hapke's photometric model modified to incorporate a new formulation, developed in this paper, of the lunar opposition surge which includes coherent backscatter. Recent studies of the lunar opposition effect have yielded contradictory results as to the mechanism responsible: shadow hiding, coherent backscatter, or both. We find that most of the surge can be explained by shadow hiding with a halfwidth of ˜8°. However, for the brightest regions (the highlands at 0.75-1.0 μm) a small additional narrow component (halfwidth of <2°) of total amplitude ˜1/6 to 1/4 that of the shadow hiding surge is observed, which may be attributed to coherent backscatter. Interestingly, no evidence for the narrow component is seen in the maria or in the highlands at 0.415 μm. A natural explanation for this is that these regions are too dark to exhibit enough multiple scattering for the effects of coherent backscatter to be seen. Finally, because the Moon is the only celestial body for which we have "ground truth" measurements, our results provide an important test for the robustness of photometric models of remote sensing observations.
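
    As a small worked example of the absolute calibration step, the web-site (halon-relative) reflectances are simply scaled by the 0.532 factor reported above; the phase-correction function used below is a hypothetical placeholder, not the empirical phase functions derived in the paper.

```python
import numpy as np

CLEMENTINE_ABS_FACTOR = 0.532   # from comparison with ground-based observations

def to_absolute_reflectance(web_site_reflectance):
    """Convert Clementine web-site (halon-relative) reflectance to absolute."""
    return CLEMENTINE_ABS_FACTOR * np.asarray(web_site_reflectance)

def correct_to_reference_phase(reflectance, phase_deg, f_phase, ref_phase_deg=30.0):
    """Normalize to a reference phase angle with an empirical phase function
    f_phase(alpha); f_phase here is a hypothetical stand-in."""
    return reflectance * f_phase(ref_phase_deg) / f_phase(phase_deg)

# Illustrative placeholder phase function (not the paper's fit)
f = lambda alpha: np.exp(-0.02 * alpha)
print(correct_to_reference_phase(to_absolute_reflectance(0.20), 45.0, f))
```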

  5. ETR BUILDING, TRA642, INTERIOR. BASEMENT. CAMERA IS IN SAME POSITION ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ETR BUILDING, TRA-642, INTERIOR. BASEMENT. CAMERA IS IN SAME POSITION AS ID-33-G-98 BUT ANGLED TO SHOW FAR END OF CORRIDOR AND OTHER EXPERIMENTAL GEAR. CAMERA FACES WEST. INL NEGATIVE NO. HD46-30-3. Mike Crane, Photographer, 2/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  6. Visual Feedback Stabilization of Balancing Tasks with Camera Misalignment

    NASA Astrophysics Data System (ADS)

    Hirata, Kentaro; Mizuno, Takashi

    In this paper, we consider visual feedback stabilization which tolerates small camera misalignment. Specifically, a balancing task with a cart-pendulum system using camera images is examined. Such a task is known to rely heavily on the detection of the vertical direction, and the angle measurement error due to the camera misalignment could be fatal for stabilization. From a mathematical model of the measurement error, the effect of the misalignment is naturally represented by an affine perturbation to the coefficient matrix of the output equation. Motivated by this fact, a special type of robust dynamic output feedback stabilization against polytopic uncertainty is investigated. By solving the related BMI (bilinear matrix inequality), one can design a controller which tolerates the camera misalignment to some extent. The result is verified via experiments.

  7. Circuitry for Angle Measurements

    NASA Technical Reports Server (NTRS)

    Currie, J. R.; Kissel, R. R.

    1983-01-01

    Angle resolver pulsed and read under microprocessor control. Pulse generator excites resolver windings with dual slope pulse. System sequentially reads sine and cosine windings. Microprocessor determines angle through which resolver shaft turned from reference angle. Suitable applications include rate tables, antenna direction controllers, and machine tools.

  8. Optimal Number of Angle Images for Calculating Anterior Angle Volume and Iris Volume Measurements

    PubMed Central

    Blieden, Lauren S.; Chuang, Alice Z.; Baker, Laura A.; Bell, Nicholas P.; Fuller, Timothy S.; Mankiewicz, Kimberly A.; Feldman, Robert M.

    2015-01-01

    Purpose. We determined the optimal number of angle images required to obtain reliable measurements of trabecular-iris circumferential volume (TICV) and iris volume (IV) using swept-source Fourier domain anterior segment optical coherence tomography (SSFD-ASOCT) scans in narrow angle eyes. Methods. Scleral spur landmarks (SSL) were manually identified on ASOCT angle images from 128 meridians from each of 24 eyes with chronic primary angle closure (PAC) spectrum of disease. The anterior and posterior corneal curves, and the anterior and posterior iris surfaces were identified automatically by the anterior chamber analysis and interpretation (ACAI) software, then manually examined and edited by the reader if required. Trabecular-iris circumferential volume at 750 μm from SSL (TICV750) and IV were subsequently calculated using varying numbers of angle images. The threshold error was required to be less than the lower 95% confidence limit of the mean absolute percent error (MAPE) of the change in TICV or IV resulting from laser peripheral iridotomy, which would be 17% for TICV and 5% for IV, based on previous studies. The optimal number of angle images was the smallest number of images where MAPE was less than this threshold for TICV and IV. Results. A total of 32 equally-spaced angle images (16 meridians) was required to estimate TICV750 and 16 angle images (8 meridians) to estimate IV, with MAPE within 4.6% and 1.6%, respectively. Conclusions. It is possible to determine TICV and IV parameters reliably in narrow angles without evaluating all 128 meridians obtained with SSFD-ASOCT. PMID:25829412
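
    A minimal sketch of the subsampling test described above: the volume is recomputed from every n-th meridian and compared against the full 128-meridian value with the mean absolute percent error (MAPE); the per-meridian volume contributions and names are illustrative assumptions, not the ACAI software's internals.

```python
import numpy as np

def mape(estimates, reference):
    """Mean absolute percent error of estimates against reference values."""
    estimates = np.asarray(estimates, dtype=float)
    return 100.0 * np.mean(np.abs(estimates - reference) / reference)

def volume_from_subset(per_meridian_volume, step):
    """Estimate total volume from every 'step'-th of the 128 meridians,
    rescaled to the full circumference."""
    subset = per_meridian_volume[::step]
    return subset.mean() * per_meridian_volume.size

# Illustrative per-meridian volume contributions for a set of 24 eyes
rng = np.random.default_rng(1)
eyes = [rng.normal(1.0, 0.1, 128) for _ in range(24)]
full = np.array([v.sum() for v in eyes])
for step in (2, 4, 8, 16):                    # 64, 32, 16, 8 meridians
    est = [volume_from_subset(v, step) for v in eyes]
    print(f"{128 // step} meridians: MAPE = {mape(est, full):.2f} %")
```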

  9. Accurate camera calibration method specialized for virtual studios

    NASA Astrophysics Data System (ADS)

    Okubo, Hidehiko; Yamanouchi, Yuko; Mitsumine, Hideki; Fukaya, Takashi; Inoue, Seiki

    2008-02-01

    Virtual studio is a popular technology for TV programs that makes it possible to synchronize computer graphics (CG) to the real-shot image as the camera moves. Normally, high geometrical matching accuracy between CG and the real-shot image cannot be expected from a real-time system, so directions are sometimes compromised to keep the problem from showing. We therefore developed a hybrid camera calibration method and CG generating system to achieve accurate geometrical matching of CG and real shot in a virtual studio. Our calibration method is intended for a camera system on a platform and tripod with rotary encoders that can measure pan/tilt angles. To solve the camera model and initial pose, we enhanced the bundle adjustment algorithm to fit the camera model, using pan/tilt data as known parameters and optimizing all other parameters invariant against pan/tilt values. This initialization yields highly accurate camera positions and orientations consistent with any pan/tilt values. We also created a CG generator that implements the lens distortion function with GPU programming. By applying the lens distortion parameters obtained by the camera calibration process, we obtained good compositing results.
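
    The initialization idea described above (treat the encoder pan/tilt angles as known and fit only the remaining camera parameters against reference points) can be sketched as a generic least-squares problem; the parameterization, point sets, and helper names below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.optimize import least_squares

def rot_pan_tilt(pan, tilt):
    """Rotation for encoder pan (about y) then tilt (about x), in radians."""
    cp, sp, ct, st = np.cos(pan), np.sin(pan), np.cos(tilt), np.sin(tilt)
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, ct, -st], [0, st, ct]])
    return Rx @ Ry

def project(params, pan, tilt, pts3d):
    """Pinhole projection with unknown focal length, mount roll and position."""
    f, roll, tx, ty, tz = params
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cr, -sr, 0], [sr, cr, 0], [0, 0, 1]])     # fixed mount roll
    cam = (Rz @ rot_pan_tilt(pan, tilt) @ pts3d.T).T + np.array([tx, ty, tz])
    return f * cam[:, :2] / cam[:, 2:3]                       # image coordinates

def residuals(params, observations, pts3d):
    """Reprojection errors over all frames; pan/tilt come from the encoders."""
    res = []
    for pan, tilt, uv in observations:        # uv: measured 2-D points per frame
        res.append((project(params, pan, tilt, pts3d) - uv).ravel())
    return np.concatenate(res)

# observations = [(pan_rad, tilt_rad, uv_points), ...]; pts3d: known 3-D markers
# fit = least_squares(residuals, x0=[1000.0, 0.0, 0.0, 0.0, 5.0],
#                     args=(observations, pts3d))
```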

  10. Adaptive compressive sensing camera

    NASA Astrophysics Data System (ADS)

    Hsu, Charles; Hsu, Ming K.; Cha, Jae; Iwamura, Tomo; Landa, Joseph; Nguyen, Charles; Szu, Harold

    2013-05-01

    We have embedded an Adaptive Compressive Sensing (ACS) algorithm on a Charge-Coupled-Device (CCD) camera, based on the simple concept that each pixel is a charge bucket whose charge comes from the Einstein photoelectric conversion effect. Applying the manufacturing design principle, we only allow each working component to be altered by a minimum of one step. We then simulated what such a camera can do for real-world persistent surveillance, taking into account diurnal, all-weather, and seasonal variations. The data storage savings are immense, and the order of magnitude of the saving is inversely proportional to target angular speed. We designed two new components for the CCD camera. Owing to mature CMOS (Complementary metal-oxide-semiconductor) technology, the on-chip Sample and Hold (SAH) circuitry can be designed as dual Photon Detector (PD) analog circuitry for change detection that predicts skipping or going forward at a sufficient sampling frame rate. For an admitted frame, a purely random sparse matrix [Φ] is implemented at each bucket (pixel) by biasing the charge transport voltage toward neighboring buckets or not; if not, the charge goes to the ground drainage. Since the snapshot image is not a video, we could not apply the usual MPEG video compression and Huffman entropy codec, nor the powerful WaveNet Wrapper, at the sensor level. We shall compare (i) pre-processing by FFT, thresholding of significant Fourier mode components, and inverse FFT to check PSNR; and (ii) post-processing image recovery done selectively by a CDT&D adaptive version of linear programming with L1 minimization and L2 similarity. For (ii) we need to determine, in new-frame selection by the SAH circuitry, the degree of information (d.o.i.) K(t), which dictates the number of purely random linear sparse combinations of measurement data via [Φ]M,N, with M(t) = K(t) log N(t).
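
    A minimal sketch of the measurement rule and recovery step named above, assuming a random sparse ±1 pattern for [Φ] and plain iterative soft-thresholding (ISTA) in place of the full L1 linear-programming recovery; all sizes and names are illustrative.

```python
import numpy as np

def measurement_count(K, N):
    """Number of random measurements per the rule M = K * log N."""
    return int(np.ceil(K * np.log(N)))

def random_sparse_phi(M, N, density=0.1, seed=0):
    """Random sparse +/-1 measurement matrix (bucket-level charge routing)."""
    rng = np.random.default_rng(seed)
    phi = rng.choice([-1.0, 1.0], size=(M, N)) * (rng.random((M, N)) < density)
    return phi / np.sqrt(M)

def ista(y, phi, lam=0.01, n_iter=500):
    """L1 recovery of a sparse signal by iterative soft-thresholding."""
    L = np.linalg.norm(phi, 2) ** 2              # Lipschitz constant of the gradient
    x = np.zeros(phi.shape[1])
    for _ in range(n_iter):
        x = x + phi.T @ (y - phi @ x) / L        # gradient step on the data term
        x = np.sign(x) * np.maximum(np.abs(x) - lam / L, 0.0)   # soft threshold
    return x

N, K = 1024, 20                                  # pixels and degree of information
M = measurement_count(K, N)                      # ~139 measurements
rng = np.random.default_rng(1)
x_true = np.zeros(N)
x_true[rng.choice(N, K, replace=False)] = rng.normal(size=K)
phi = random_sparse_phi(M, N)
x_rec = ista(phi @ x_true, phi)
```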

  11. DEVICE CONTROLLER, CAMERA CONTROL

    Energy Science and Technology Software Center (ESTSC)

    1998-07-20

    This is a C++ application that is the server for the camera control system. Devserv drives serial devices, such as cameras and video switchers used in a videoconference, upon request from a client such as the ccint program. Devserv listens on UDP ports for clients to make network connections. After a client connects and sends a request to control a device (such as to pan, tilt, or zoom a camera, or do picture-in-picture with a video switcher), devserv formats the request into an RS232 message appropriate for the device and sends this message over the serial port to which the device is connected. Devserv then reads the reply from the device from the serial port and then formats and sends via multicast a status message. In addition, devserv periodically multicasts status or description messages so that all clients connected to the multicast channel know what devices are supported, their ranges of motion, and the current position. The software design employs a class hierarchy such that an abstract base class for devices can be subclassed into classes for various device categories (e.g. sonyevid30, cononvco4, panasonicwjmx50, etc.), which are further subclassed into classes for particular devices. The devices currently supported are the Sony EVI-D30, Canon VCC1, Canon VCC3, and Canon VCC4 cameras and the Panasonic WJ-MX50 video switcher. However, developers can extend the class hierarchy to support other devices.

  12. Neutron Imaging Camera

    NASA Technical Reports Server (NTRS)

    Hunter, Stanley; deNolfo, G. A.; Barbier, L. M.; Link, J. T.; Son, S.; Floyd, S. R.; Guardala, N.; Skopec, M.; Stark, B.

    2008-01-01

    The Neutron Imaging Camera (NIC) is based on the Three-dimensional Track Imager (3-DTI) technology developed at GSFC for gamma-ray astrophysics applications. The 3-DTI, a large volume time-projection chamber, provides accurate, approximately 0.4 mm resolution, 3-D tracking of charged particles. The incident direction of fast neutrons, En > 0.5 MeV, is reconstructed from the momenta and energies of the proton and triton fragments resulting from 3He(n,p)3H interactions in the 3-DTI volume. The performance of the NIC from laboratory and accelerator tests is presented.

  13. Anger perceptually and conceptually narrows cognitive scope.

    PubMed

    Gable, Philip A; Poole, Bryan D; Harmon-Jones, Eddie

    2015-07-01

    For the last 50 years, research investigating the effect of emotions on the scope of cognitive processing was based on models proposing that affective valence determined cognitive scope. More recently, our motivational intensity model suggests that this past work had confounded valence with motivational intensity. Research derived from this model supports the idea that motivational intensity, rather than affective valence, explains much of the variance in the effects emotions have on cognitive scope. However, the motivational intensity model is limited in that the empirical work has examined only positive affects high in approach and negative affects high in avoidance motivation. Thus, perhaps only approach-positive and avoidance-negative states narrow cognitive scope. The present research was designed to clarify these conceptual issues by examining the effect of anger, a negatively valenced approach-motivated state, on cognitive scope. Results revealed that anger narrowed attentional scope relative to a neutral state and that attentional narrowing to anger was similar to the attentional narrowing caused by high approach-motivated positive affects (Study 1). This narrowing of attention was related to trait approach motivation (Studies 2 and 3). Anger also narrowed conceptual cognitive categorization (Study 4). Narrowing of categorization was related to participants' approach motivation toward anger stimuli. Together, these results suggest that anger, an approach-motivated negative affect, narrows perceptual and conceptual cognitive scope. More broadly, these results support the conceptual model that motivational intensity per se, rather than approach-positive and avoidance-negative states, causes a narrowing of cognitive scope. PMID:26011662

  14. Mars Science Laboratory Engineering Cameras

    NASA Technical Reports Server (NTRS)

    Maki, Justin N.; Thiessen, David L.; Pourangi, Ali M.; Kobzeff, Peter A.; Lee, Steven W.; Dingizian, Arsham; Schwochert, Mark A.

    2012-01-01

    NASA's Mars Science Laboratory (MSL) Rover, which launched to Mars in 2011, is equipped with a set of 12 engineering cameras. These cameras are build-to-print copies of the Mars Exploration Rover (MER) cameras, which were sent to Mars in 2003. The engineering cameras weigh less than 300 grams each and use less than 3 W of power. Images returned from the engineering cameras are used to navigate the rover on the Martian surface, deploy the rover robotic arm, and ingest samples into the rover sample processing system. The navigation cameras (Navcams) are mounted to a pan/tilt mast and have a 45-degree square field of view (FOV) with a pixel scale of 0.82 mrad/pixel. The hazard avoidance cameras (Hazcams) are body-mounted to the rover chassis in the front and rear of the vehicle and have a 124-degree square FOV with a pixel scale of 2.1 mrad/pixel. All of the cameras utilize a frame-transfer CCD (charge-coupled device) with a 1024x1024 imaging region and red/near IR bandpass filters centered at 650 nm. The MSL engineering cameras are grouped into two sets of six: one set of cameras is connected to rover computer A and the other set is connected to rover computer B. The MSL rover carries 8 Hazcams and 4 Navcams.

  15. Narrow band 3 × 3 Mueller polarimetric endoscopy.

    PubMed

    Qi, Ji; Ye, Menglong; Singh, Mohan; Clancy, Neil T; Elson, Daniel S

    2013-01-01

    Mueller matrix polarimetric imaging has shown potential in tissue diagnosis but is challenging to implement endoscopically. In this work, a narrow band 3 × 3 Mueller matrix polarimetric endoscope was designed by rotating the endoscope to generate 0°, 45° and 90° linearly polarized illumination and positioning a rotating filter wheel in front of the camera containing three polarisers to permit polarization state analysis for backscattered light. The system was validated with a rotating linear polarizer and a diffuse reflection target. Initial measurements of 3 × 3 Mueller matrices on a rat are demonstrated, followed by matrix decomposition into the depolarization and retardance matrices for further analysis. Our work shows the feasibility of implementing polarimetric imaging in a rigid endoscope conveniently and economically in order to reveal diagnostic information. PMID:24298405
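
    The reconstruction implied above (three linear generator states times three analyzer states giving nine intensities) can be written as a small linear system for the nine elements of the 3 x 3 Mueller matrix; this is a generic textbook formulation, not necessarily the exact processing used in the paper.

```python
import numpy as np

ANGLES = np.deg2rad([0.0, 45.0, 90.0])        # polarizer angles used in the paper

def linear_stokes(theta):
    """Reduced (S0, S1, S2) Stokes vector of a linear polarizer at angle theta."""
    return np.array([1.0, np.cos(2 * theta), np.sin(2 * theta)])

def mueller_3x3(intensities):
    """Solve the 3x3 Mueller matrix from a 3x3 array of intensities,
    intensities[i, j] = measurement with analyzer angle i, generator angle j."""
    rows, rhs = [], []
    for i, phi in enumerate(ANGLES):          # analyzer states
        a = 0.5 * linear_stokes(phi)          # ideal analyzer projection
        for j, th in enumerate(ANGLES):       # generator states
            s = linear_stokes(th)
            rows.append(np.kron(a, s))        # a^T M s = kron(a, s) . vec(M)
            rhs.append(intensities[i, j])
    m = np.linalg.solve(np.array(rows), np.array(rhs))
    return m.reshape(3, 3)

# Round trip with a known depolarizing Mueller matrix (illustrative)
M_true = np.diag([1.0, 0.6, 0.4])
I = np.array([[0.5 * linear_stokes(p) @ M_true @ linear_stokes(g)
               for g in ANGLES] for p in ANGLES])
print(np.allclose(mueller_3x3(I), M_true))    # True
```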

  16. Viewing angle analysis of integral imaging

    NASA Astrophysics Data System (ADS)

    Wang, Hong-Xia; Wu, Chun-Hong; Yang, Yang; Zhang, Lan

    2007-12-01

    Integral imaging (II) is a technique capable of displaying 3D images with continuous parallax in full natural color. It is becoming the most promising technique for developing next generation three-dimensional TV (3DTV) and the visualization field due to its outstanding advantages. However, most conventional integral images are restricted by a narrow viewing angle. One reason is that the range in which a reconstructed integral image can be displayed with consistent parallax is limited. The other is that the aperture of the system is finite. So far, many methods to enhance the viewing angle of integral images have been proposed. Nevertheless, except for Ren's MVW (Maximum Viewing Width), most of these methods involve complex hardware and modifications of the optical system, which usually bring other disadvantages and make operation more difficult. At the same time, the cost of these systems is higher. In order to simplify optical systems, this paper systematically analyzes the viewing angle of traditional integral images instead of modified ones. For the sake of cost, the research was based on computer-generated integral images (CGII). With the analysis results we can know clearly how the viewing angle can be enhanced and how image overlap or image flipping can be avoided. The results also promote the development of optical instruments. Based on the theoretical analysis, preliminary calculations were done to demonstrate how the other viewing properties that are closely related to the viewing angle, such as viewing distance, viewing zone, and lens pitch, affect the viewing angle.

  17. PAU camera: detectors characterization

    NASA Astrophysics Data System (ADS)

    Casas, Ricard; Ballester, Otger; Cardiel-Sas, Laia; Castilla, Javier; Jiménez, Jorge; Maiorino, Marino; Pío, Cristóbal; Sevilla, Ignacio; de Vicente, Juan

    2012-07-01

    The PAU Camera (PAUCam) [1,2] is a wide field camera that will be mounted at the corrected prime focus of the William Herschel Telescope (Observatorio del Roque de los Muchachos, Canary Islands, Spain) in the coming months. The focal plane of PAUCam is composed of a mosaic of 18 CCD detectors of 2,048 x 4,176 pixels, each with a pixel size of 15 microns, manufactured by Hamamatsu Photonics K. K. This mosaic covers a field of view (FoV) of 60 arcmin (minutes of arc), 40 of which are unvignetted. The behaviour of these 18 devices, plus four spares, and their electronic response should be characterized and optimized for use in PAUCam. This job is being carried out in the laboratories of the ICE/IFAE and the CIEMAT. The electronic optimization of the CCD detectors is being carried out by means of an OG (Output Gate) scan, maximizing the CTE (Charge Transfer Efficiency) while minimizing the read-out noise. The device characterization itself is obtained with different tests: the photon transfer curve (PTC), which yields the electronic gain, the linearity vs. light stimulus, the full-well capacity and the cosmetic defects; and measurements of the read-out noise, the dark current, the stability vs. temperature and the light remanence.
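
    The photon transfer curve measurement mentioned above can be sketched as follows: pairs of flat-field frames at increasing exposure give a variance-versus-mean relation whose inverse slope is the gain in e-/ADU; the synthetic frames and names below are illustrative assumptions.

```python
import numpy as np

def ptc_point(flat_a, flat_b):
    """Mean signal and shot-noise variance from a pair of identical flats.
    Differencing the pair removes fixed-pattern noise; the variance of the
    difference is twice the per-frame variance."""
    mean_signal = 0.5 * (flat_a.mean() + flat_b.mean())
    var = np.var(flat_a.astype(float) - flat_b.astype(float)) / 2.0
    return mean_signal, var

def gain_from_ptc(means, variances):
    """Gain in e-/ADU from the slope of variance vs. mean (shot-noise regime)."""
    slope, _ = np.polyfit(means, variances, 1)
    return 1.0 / slope

# Synthetic flats with a true gain of 2 e-/ADU (illustrative)
rng = np.random.default_rng(2)
gain_true, pts = 2.0, []
for level_adu in (500, 2000, 8000, 20000):
    electrons = rng.poisson(level_adu * gain_true, size=(2, 512, 512))
    flats = electrons / gain_true                     # convert electrons to ADU
    pts.append(ptc_point(flats[0], flats[1]))
means, variances = zip(*pts)
print(gain_from_ptc(means, variances))                # ~2.0
```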

  18. Synthetic Doppler spectroscopy and curvilinear camera diagnostics in the ERO code

    NASA Astrophysics Data System (ADS)

    Makkonen, T.; Groth, M.; Airila, M. I.; Dux, R.; Janzer, A.; Kurki-Suonio, T.; Lunt, T.; Mueller, H. W.; Puetterich, T.; Viezzer, E.

    2013-08-01

    We present a set of new synthetic diagnostics, recently implemented in the ERO code, that were developed to facilitate direct comparisons between experiments and modeling of tokamak scrape-off-layer plasmas. The diagnostics calculate the spectroscopic Doppler shift and Doppler broadening of impurity lines of interest for any line of sight, and they also generate camera images from arbitrary viewing angles allowing for curvilinear (e.g., wide-angle or fisheye) lenses. The synthetic camera diagnostics can either replicate the distortions caused by curvilinear lenses or create a rectilinear synthetic camera image and correct the curvilinear distortions in the experimental image. Comparison to experimental data is presented.
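
    A minimal sketch of the line-of-sight integration behind a synthetic Doppler diagnostic: each plasma cell contributes a Gaussian line, Doppler-shifted by its velocity component along the line of sight and thermally broadened by its temperature; the cell data, line choice, and constants below are illustrative assumptions, not the ERO implementation.

```python
import numpy as np

C = 2.998e8          # speed of light, m/s
AMU = 1.661e-27      # atomic mass unit, kg
E = 1.602e-19        # J per eV

def synthetic_line(wavelength_grid, lam0, cells, mass_amu):
    """Sum Doppler-shifted, thermally broadened Gaussians along a line of sight.
    cells: iterable of (emissivity, v_parallel [m/s], T_ion [eV])."""
    spectrum = np.zeros_like(wavelength_grid)
    m = mass_amu * AMU
    for emissivity, v_par, t_ev in cells:
        lam_c = lam0 * (1.0 + v_par / C)                    # Doppler shift
        sigma = lam0 * np.sqrt(t_ev * E / (m * C**2))       # thermal width
        spectrum += emissivity * np.exp(-0.5 * ((wavelength_grid - lam_c) / sigma) ** 2)
    return spectrum

# Carbon impurity line near 465.0 nm from a few illustrative cells along a chord
grid = np.linspace(464.8e-9, 465.2e-9, 2000)
cells = [(1.0, 2.0e4, 10.0), (0.5, -1.0e4, 20.0), (0.8, 0.0, 5.0)]
spec = synthetic_line(grid, 465.0e-9, cells, mass_amu=12.0)
```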

  19. Object recognition through turbulence with a modified plenoptic camera

    NASA Astrophysics Data System (ADS)

    Wu, Chensheng; Ko, Jonathan; Davis, Christopher

    2015-03-01

    Atmospheric turbulence adds accumulated distortion to images obtained by cameras and surveillance systems. When the turbulence grows stronger or when the object is further away from the observer, increasing the recording device resolution helps little to improve the quality of the image. Many sophisticated methods to correct the distorted images have been invented, such as using a known feature on or near the target object to perform a deconvolution process, or use of adaptive optics. However, most of the methods depend heavily on the object's location, and optical ray propagation through the turbulence is not directly considered. Alternatively, selecting a lucky image over many frames provides a feasible solution, but at the cost of time. In our work, we propose an innovative approach to improving image quality through turbulence by making use of a modified plenoptic camera. This type of camera adds a micro-lens array to a traditional high-resolution camera to form a semi-camera array that records duplicate copies of the object as well as "superimposed" turbulence at slightly different angles. By performing several steps of image reconstruction, turbulence effects will be suppressed to reveal more details of the object independently (without finding references near the object). Meanwhile, the redundant information obtained by the plenoptic camera raises the possibility of performing lucky image algorithmic analysis with fewer frames, which is more efficient. In our work, the details of our modified plenoptic cameras and image processing algorithms will be introduced. The proposed method can be applied to coherently illuminated object as well as incoherently illuminated objects. Our result shows that the turbulence effect can be effectively suppressed by the plenoptic camera in the hardware layer and a reconstructed "lucky image" can help the viewer identify the object even when a "lucky image" by ordinary cameras is not achievable.

  20. 30. TACOMA NARROWS BRIDGE, LOOKING EAST THROUGH TOLL LANES, 29 ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    30. TACOMA NARROWS BRIDGE, LOOKING EAST THROUGH TOLL LANES, 29 AUGUST 1940. (ELDRIDGE, CLARK H. TACOMA NARROWS BRIDGE, TACOMA, WASHINGTON, FINAL REPORT ON DESIGN AND CONSTRUCTION, 1941) - Tacoma Narrows Bridge, Spanning Narrows at State Route 16, Tacoma, Pierce County, WA

  1. 31. TACOMA NARROWS BRIDGE, LOOKING WEST ACROSS TOLL PLAZA, 29 ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    31. TACOMA NARROWS BRIDGE, LOOKING WEST ACROSS TOLL PLAZA, 29 AUGUST 1940. (ELDRIDGE, CLARK M. TACOMA NARROWS BRIDGE, TACOMA, WASHINGTON, FINAL REPORT ON DESIGN AND CONSTRUCTION, 1941) - Tacoma Narrows Bridge, Spanning Narrows at State Route 16, Tacoma, Pierce County, WA

  2. Performance of the LSST Camera

    NASA Astrophysics Data System (ADS)

    Gilmore, D. Kirk

    2011-01-01

    The LSST camera will be the largest digital camera ever built. As such, its design presents a number of challenges. The field of view will be 3.5 degrees in diameter and will be sampled by a 3.2 billion pixel array of sensors. The entire array will be read-out in under 2 seconds, which all lead to demanding constraints on the sensor architecture and the read-out electronics. In addition, given the fast, optical beam (f/1.2), the camera tolerances on the assembly and alignment of the focal plane and optics are tight. The camera also incorporates three large refractive lenses, an array of five, wide-band large filters mounted on a carrousel, and a mechanical shutter. We present an overview of the baseline camera design, with an emphasis on the requirements and expected performance of the design that will allow the camera to meet its scientific objectives.

  3. Narrow-band radiation wavelength measurement by processing digital photographs in RAW format

    SciTech Connect

    Kraiskii, A V; Mironova, T V; Sultanov, T T

    2012-12-31

    The technique of measuring the mean wavelength of narrow-band radiation in the 455 - 625-nm range using the image of the emitting surface is presented. The data from the camera array unprocessed by the built-in processor (RAW format) are used. The method is applied for determining the parameters of response of holographic sensors. Depending on the wavelength and brightness of the image fragment, the mean square deviation of the wavelength amounts to 0.3 - 3 nm. (experimental techniques)
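
    One plausible way to implement the idea, assuming the sensor's RGB spectral response curves are known from a separate calibration (the response curves below are placeholders, not data from the paper): for narrow-band light the red/green/blue ratios depend only on wavelength, so measured ratios can be inverted by a nearest-neighbour lookup against a precomputed table.

```python
import numpy as np

# Placeholder calibration: spectral responses sampled on a wavelength grid.
wavelengths = np.linspace(455.0, 625.0, 1701)                  # nm, 0.1 nm steps
resp_r = np.exp(-0.5 * ((wavelengths - 600.0) / 40.0) ** 2)    # assumed curves
resp_g = np.exp(-0.5 * ((wavelengths - 535.0) / 40.0) ** 2)
resp_b = np.exp(-0.5 * ((wavelengths - 465.0) / 40.0) ** 2)

def estimate_wavelength(raw_r, raw_g, raw_b):
    """Estimate the mean wavelength of narrow-band light from RAW channel sums."""
    meas = np.array([raw_r, raw_g, raw_b], dtype=float)
    meas /= meas.sum()
    table = np.stack([resp_r, resp_g, resp_b], axis=1)
    table /= table.sum(axis=1, keepdims=True)   # normalise per wavelength
    idx = np.argmin(np.linalg.norm(table - meas, axis=1))
    return wavelengths[idx]
```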

  4. Critical Heat Flux In Inclined Rectangular Narrow Long Channel

    SciTech Connect

    J. L. Rempe; S. W. Noh; Y. H. Kim; K. Y. Suh; F. B. Cheung; S. B. Kim

    2005-05-01

    In the TMI-2 accident, the lower part of the reactor pressure vessel was overheated and then rather rapidly cooled down, as was later identified in a vessel investigation project. This pointed to the feasibility of gap cooling. For this reason, several investigations were performed to determine the critical heat flux (CHF) from the standpoint of in-vessel retention. The experiments were conducted to investigate the general boiling phenomena and the triggering mechanism for the CHF in a narrow gap, using a 5 x 105 mm² crevice-type heater assembly and de-mineralized water. The test parameters include a gap size of 5 mm and surface orientation angles from the downward facing position (180°) to the vertical position (90°). The orientation angle affects the bubble layer and the bubble escape from the narrow gap. The CHF is less than that in a shorter channel, compared with previous experiments having a heated length of 35 mm in the copper test section.

  5. LRO Camera Imaging of Potential Landing Sites in the South Pole-Aitken Basin

    NASA Astrophysics Data System (ADS)

    Jolliff, B. L.; Wiseman, S. M.; Gibson, K. E.; Lauber, C.; Robinson, M.; Gaddis, L. R.; Scholten, F.; Oberst, J.; LROC Science; Operations Team

    2010-12-01

    We show results of WAC (Wide Angle Camera) and NAC (Narrow Angle Camera) imaging of candidate landing sites within the South Pole-Aitken (SPA) basin of the Moon obtained by the Lunar Reconnaissance Orbiter during the first full year of operation. These images enable a greatly improved delineation of geologic units, determination of unit thicknesses and stratigraphy, and detailed surface characterization that has not been possible with previous data. WAC imaging encompasses the entire SPA basin, located within an area ranging from ~ 130-250 degrees east longitude and ~15 degrees south latitude to the South Pole, at different incidence angles, with the specific range of incidence dependent on latitude. The WAC images show morphology and surface detail at better than 100 m per pixel, with spatial coverage and quality unmatched by previous data sets. NAC images reveal details at the sub-meter pixel scale that enable new ways to evaluate the origins and stratigraphy of deposits. Key among new results is the capability to discern extents of ancient volcanic deposits that are covered by later crater ejecta (cryptomare) [see Petro et al., this conference] using new, complementary color data from Kaguya and Chandrayaan-1. Digital topographic models derived from WAC and NAC geometric stereo coverage show broad intercrater-plains areas where slopes are acceptably low for high-probability safe landing [see Archinal et al., this conference]. NAC images allow mapping and measurement of small, fresh craters that excavated boulders and thus provide information on surface roughness and depth to bedrock beneath regolith and plains deposits. We use these data to estimate deposit thickness in areas of interest for landing and potential sample collection to better understand the possible provenance of samples. Also, small regions marked by fresh impact craters and their associated boulder fields are readily identified by their bright ejecta patterns and marked as lander keep-out zones. We will show examples of LROC data including those for Constellation sites on the SPA rim and interior, a site between Bose and Alder Craters, sites east of Bhabha Crater, and sites on and near the “Mafic Mound” [see Pieters et al., this conference]. Together the LROC data and complementary products provide essential information for ensuring identification of safe landing and sampling sites within SPA basin that has never before been available for a planetary mission.

  6. Do narrow {Sigma}-hypernuclear states exist?

    SciTech Connect

    Chrien, R.E.

    1995-12-31

    Reports of narrow states in {Sigma}-hypernucleus production have appeared from time to time. The present experiment is a repeat of the first and seemingly most definitive such experiment, that on a target of {sup 9}Be, but with much better statistics. No narrow states were observed.

  7. Infants Experience Perceptual Narrowing for Nonprimate Faces

    ERIC Educational Resources Information Center

    Simpson, Elizabeth A.; Varga, Krisztina; Frick, Janet E.; Fragaszy, Dorothy

    2011-01-01

    Perceptual narrowing--a phenomenon in which perception is broad from birth, but narrows as a function of experience--has previously been tested with primate faces. In the first 6 months of life, infants can discriminate among individual human and monkey faces. Though the ability to discriminate monkey faces is lost after about 9 months, infants…

  9. Transmission electron microscope CCD camera

    DOEpatents

    Downing, Kenneth H.

    1999-01-01

    In order to improve the performance of a CCD camera on a high voltage electron microscope, an electron decelerator is inserted between the microscope column and the CCD. This arrangement optimizes the interaction of the electron beam with the scintillator of the CCD camera while retaining optimization of the microscope optics and of the interaction of the beam with the specimen. Changing the electron beam energy between the specimen and camera allows both to be optimized.

  10. Narrow band gap amorphous silicon semiconductors

    DOEpatents

    Madan, A.; Mahan, A.H.

    1985-01-10

    Disclosed is a narrow band gap amorphous silicon semiconductor comprising an alloy of amorphous silicon and a band gap narrowing element selected from the group consisting of Sn, Ge, and Pb, with an electron donor dopant selected from the group consisting of P, As, Sb, Bi and N. The process for producing the narrow band gap amorphous silicon semiconductor comprises the steps of forming an alloy comprising amorphous silicon and at least one of the aforesaid band gap narrowing elements in amount sufficient to narrow the band gap of the silicon semiconductor alloy below that of amorphous silicon, and also utilizing sufficient amounts of the aforesaid electron donor dopant to maintain the amorphous silicon alloy as an n-type semiconductor.

  11. Reflectance characteristics of the Viking lander camera reference test charts

    NASA Technical Reports Server (NTRS)

    Wall, S. D.; Burcher, E. E.; Jabson, D. J.

    1975-01-01

    Reference test charts provide radiometric, colorimetric, and spatial resolution references for the Viking lander cameras on Mars. Reflectance measurements of these references are described, including the absolute bidirectional reflectance of the radiometric references and the relative spectral reflectance of both radiometric and colorimetric references. Results show that the bidirectional reflectance of the radiometric references is Lambertian to within ±7% for incidence angles between 20° and 60°, and that their spectral reflectance is constant with wavelength to within ±5% over the spectral range of the cameras. Estimated accuracy of the measurements is ±0.05 in relative spectral reflectance.

  12. Neutron Imaging Camera

    NASA Technical Reports Server (NTRS)

    Hunter, Stanley D.; DeNolfo, Georgia; Floyd, Sam; Krizmanic, John; Link, Jason; Son, Seunghee; Guardala, Noel; Skopec, Marlene; Stark, Robert

    2008-01-01

    We describe the Neutron Imaging Camera (NIC) being developed for DTRA applications by NASA/GSFC and NSWC/Carderock. The NIC is based on the Three-dimensional Track Imager (3-DTI) technology developed at GSFC for gamma-ray astrophysics applications. The 3-DTI, a large volume time-projection chamber, provides accurate, approximately 0.4 mm resolution, 3-D tracking of charged particles. The incident directions of fast neutrons, E_N > 0.5 MeV, are reconstructed from the momenta and energies of the proton and triton fragments resulting from 3He(n,p)3H interactions in the 3-DTI volume. We present angular and energy resolution performance of the NIC derived from accelerator tests.
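
    A hedged sketch of the kinematic reconstruction described above: neglecting the small binding-energy contribution, momentum conservation gives the incident neutron momentum as the vector sum of the proton and triton momenta, so the incident direction is that sum, normalised. Variable names and the example numbers are illustrative only.

```python
import numpy as np

def neutron_direction(p_proton, p_triton):
    """Unit vector of the incident neutron from the fragment momenta (MeV/c)."""
    p_n = np.asarray(p_proton, dtype=float) + np.asarray(p_triton, dtype=float)
    return p_n / np.linalg.norm(p_n)

# Example event with made-up fragment momenta measured in the 3-DTI volume:
print(neutron_direction([30.0, 5.0, 2.0], [10.0, -3.0, 1.0]))
```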

  13. A Motionless Camera

    NASA Technical Reports Server (NTRS)

    1994-01-01

    Omniview, a motionless, noiseless, exceptionally versatile camera was developed for NASA as a receiving device for guiding space robots. The system can see in one direction and provide as many as four views simultaneously. Developed by Omniview, Inc. (formerly TRI) under a NASA Small Business Innovation Research (SBIR) grant, the system's image transformation electronics produce a real-time image from anywhere within a hemispherical field. Lens distortion is removed, and a corrected "flat" view appears on a monitor. Key elements are a high resolution charge coupled device (CCD), image correction circuitry and a microcomputer for image processing. The system can be adapted to existing installations. Applications include security and surveillance, teleconferencing, imaging, virtual reality, broadcast video and military operations. Omniview technology is now called IPIX. The company was founded in 1986 as TeleRobotics International, became Omniview in 1995, and changed its name to Interactive Pictures Corporation in 1997.

  14. Dynamics of an oscillating bubble in a narrow gap

    NASA Astrophysics Data System (ADS)

    Azam, Fahad Ibn; Karri, Badarinath; Ohl, Siew-Wan; Klaseboer, Evert; Khoo, Boo Cheong

    2013-10-01

    The complex dynamics of a single bubble of a few millimeters in size oscillating inside a narrow fluid-filled gap between two parallel plates is studied using high-speed videography. Two synchronized high-speed cameras were used to observe both the side and front views of the bubble. The front-view images show bubble expansion and collapse with the formation of concentric dark and bright rings. The simultaneous recordings reveal the mechanism behind these rings. The side-view images reveal two different types of collapse behavior of the bubble including a previously unreported collapse phenomenon that is observed as the gap width is changed. At narrow widths, the bubble collapses towards the center of the gap; when the width is increased, the bubble splits before collapsing towards the walls. The bubble dynamics is also observed to be unaffected by the hydrophobic or hydrophilic nature of the plate surface due to the presence of a thin film of liquid between each of the plates and the bubble throughout the bubble lifetime. It is revealed that such systems do not behave as quasi-two-dimensional systems; three-dimensional effects are important.

  15. 3. Elevation view of entire midsection using ultrawide angle lens. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. Elevation view of entire midsection using ultrawide angle lens. Note opened south doors and closed north doors. The following photo WA-203-C-4 is similar except the camera position was moved right to include the slope of the south end. - Puget Sound Naval Shipyard, Munitions Storage Bunker, Naval Ammunitions Depot, South of Campbell Trail, Bremerton, Kitsap County, WA

  16. Bundle Adjustment for Multi-Camera Systems with Points at Infinity

    NASA Astrophysics Data System (ADS)

    Schneider, J.; Schindler, F.; Läbe, T.; Förstner, W.

    2012-07-01

    We present a novel approach for a rigorous bundle adjustment for omnidirectional and multi-view cameras, which enables an efficient maximum-likelihood estimation with image and scene points at infinity. Multi-camera systems are used to increase the resolution, to combine cameras with different spectral sensitivities (Z/I DMC, Vexcel Ultracam) or - like omnidirectional cameras - to augment the effective aperture angle (Blom Pictometry, Rollei Panoscan Mark III). Additionally, multi-camera systems gain in importance for the acquisition of complex 3D structures. For stabilizing camera orientations - especially rotations - one should generally use points at the horizon observed over long periods of time within the bundle adjustment, which classical bundle adjustment programs are not capable of. We use a minimal representation of homogeneous coordinates for image and scene points. Instead of eliminating the scale factor of the homogeneous vectors by Euclidean normalization, we normalize the homogeneous coordinates spherically. This way we can use images of omnidirectional cameras with a single viewpoint, such as fisheye cameras, and scene points which are far away or at infinity. We demonstrate the feasibility and the potential of our approach on real data taken with a single camera, the stereo camera FinePix Real 3D W3 from Fujifilm and the multi-camera system Ladybug 3 from Point Grey.
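
    A minimal sketch contrasting the two normalisations mentioned above: Euclidean normalisation divides by the last homogeneous coordinate and breaks down for points at infinity, whereas spherical normalisation simply scales the vector to unit length, so directions to points at or near infinity stay well conditioned.

```python
import numpy as np

def normalize_euclidean(x_h):
    """Classical normalisation: divide by the homogeneous scale (fails as w -> 0)."""
    return x_h / x_h[-1]

def normalize_spherical(x_h):
    """Spherical normalisation: scale the homogeneous vector to unit length."""
    return x_h / np.linalg.norm(x_h)

far_point = np.array([1000.0, -500.0, 2000.0, 1e-9])   # scene point nearly at infinity
print(normalize_spherical(far_point))    # well behaved direction vector
print(normalize_euclidean(far_point))    # huge, numerically useless values
```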

  17. CPAPIR: a wide-field infrared camera for the Observatoire du Mont Megantic

    NASA Astrophysics Data System (ADS)

    Artigau, Etienne; Doyon, Rene; Vallee, Philippe; Riopel, Martin; Nadeau, Daniel

    2004-09-01

    CPAPIR is a wide-field infrared camera for use at the Observatoire du mont Megantic and CTIO 1.5 m telescopes. The camera will be primarily a survey instrument with a half-degree field of view, making it one of the most efficient of its kind. CPAPIR will provide broad and narrow band filters within its 0.8 to 2.5 μm bandpass. The camera is based on a Hawaii-2 2048x2048 HgCdTe detector.

  18. The underwater camera calibration based on virtual camera lens distortion

    NASA Astrophysics Data System (ADS)

    Qin, Dahui; Mao, Ting; Cheng, Peng; Zhang, Zhiliang

    2011-08-01

    Machine vision is becoming more and more popular underwater. It is a challenge to calibrate a camera underwater because of the complicated light ray paths through the water and air environments. In this paper we first analyze the characteristics of the camera when light passes from air to water. We then propose a new method that uses a high-level camera distortion model to compensate for the deviation caused by light refraction as rays pass through the water and air media. Finally, experimental results show that the high-level distortion model can simulate the effect of underwater light refraction on the camera's images in the process of underwater camera calibration.
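
    A hedged sketch of the general idea of absorbing refraction into a higher-order radial distortion model; the coefficients k1-k3 below are placeholders that would be estimated from an underwater calibration, not values from the paper.

```python
def apply_radial_distortion(x, y, k1, k2, k3):
    """Map ideal normalised image coordinates to distorted ones with a polynomial
    radial model, used here as a stand-in for the refraction effect at the
    water/air interface."""
    r2 = x * x + y * y
    factor = 1.0 + k1 * r2 + k2 * r2 ** 2 + k3 * r2 ** 3
    return x * factor, y * factor

# Example with made-up coefficients:
print(apply_radial_distortion(0.2, -0.1, k1=0.15, k2=0.02, k3=0.005))
```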

  19. Camera Calibration for Uav Application Using Sensor of Mobile Camera

    NASA Astrophysics Data System (ADS)

    Takahashi, Y.; Chikatsu, H.

    2015-05-01

    Recently, 3D measurements using small unmanned aerial vehicles (UAVs) have increased in Japan, because small UAVs are easily available at low cost and the analysis software can easily create 3D models. However, small UAVs have a problem: they have very short flight times and a small payload. In particular, as the payload of a small UAV increases, its flight time decreases. Therefore, it is advantageous to use lightweight sensors in small UAVs. A mobile camera is lightweight and has many sensors, such as an accelerometer, a magnetic field sensor, and a gyroscope. Moreover, these sensors can be used simultaneously. Therefore, the authors think that the problems of small UAVs can be solved using a mobile camera. The authors performed camera calibration using a test target to evaluate the sensor values measured with a mobile camera. Consequently, the authors confirmed the same accuracy as normal camera calibration.
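
    A minimal sketch of a test-target calibration of the kind described, using OpenCV's standard chessboard routines; the file names and board size are assumptions, and this is not the authors' code.

```python
import glob

import cv2
import numpy as np

board = (9, 6)                                    # inner chessboard corners (assumed)
objp = np.zeros((board[0] * board[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:board[0], 0:board[1]].T.reshape(-1, 2)

obj_points, img_points, image_size = [], [], None
for fname in glob.glob("target_*.jpg"):           # hypothetical calibration images
    gray = cv2.cvtColor(cv2.imread(fname), cv2.COLOR_BGR2GRAY)
    image_size = gray.shape[::-1]
    found, corners = cv2.findChessboardCorners(gray, board, None)
    if found:
        obj_points.append(objp)
        img_points.append(corners)

# Intrinsic matrix and lens distortion coefficients of the mobile camera
ret, mtx, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, image_size, None, None)
print("RMS reprojection error:", ret)
print("camera matrix:\n", mtx)
```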

  20. Design and modeling of the hybrid portable gamma camera system

    SciTech Connect

    Smith, L.E.; He, Z.; Wehe, D.K.; Knoll, G.F.; Wilderman, S.J.

    1998-06-01

    The combination of a mechanically-collimated camera with an electronically-collimated camera offers both the high efficiency and good angular resolution typical in a mechanically-collimated camera for lower energies and the uncoupling of spatial resolution and efficiency provided by an electronically-collimated camera at higher energies. The application is an industrial gamma-ray imaging system with good angular resolution and efficiency over a broad energy range: 50 keV to 3 MeV. The design and performance modeling of the Hybrid Portable Gamma Camera, currently being built, is described here. The optimization of the Anger-logic first detector module in terms of spatial and energy resolution is accomplished using a Monte Carlo optical photon modeling code and Cramer-Rao lower bound calculations. Approximately 6 mm spatial resolution and 7.5% FWHM (statistical contribution only) energy resolution for a 140 keV incident energy are expected for the 100 x 100 x 10 mm³ NaI(Tl) first detector. Analytical calculations of angular resolution components and efficiency for the Hybrid Portable Gamma Camera are compared to Monte Carlo calculations of the same quantities. The expected angular resolution performance for on-axis point sources, a central scattering angle of 30° and a detector separation distance of 35 cm ranges from 3-5° FWHM over the sensitive energy range. Intrinsic efficiency results over the same energy range are also presented.
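
    The electronically-collimated (Compton) part of such a system relies on the standard Compton relation; a small sketch, with energies in keV, of how a scattering angle is recovered from the energies deposited in the first and second detectors. This is the textbook formula, not code from the cited work.

```python
import math

M_E_C2 = 511.0   # electron rest energy, keV

def compton_scatter_angle(e1_kev, e2_kev):
    """Scattering angle (degrees) from the energy E1 deposited in the first
    detector and the energy E2 absorbed in the second detector, assuming the
    photon is fully absorbed in the second detector."""
    e_incident = e1_kev + e2_kev
    e_scattered = e2_kev
    cos_theta = 1.0 - M_E_C2 * (1.0 / e_scattered - 1.0 / e_incident)
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_theta))))

print(compton_scatter_angle(60.0, 140.0))   # example event, made-up energies
```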

  1. Reading Angles in Maps

    ERIC Educational Resources Information Center

    Izard, Véronique; O'Donnell, Evan; Spelke, Elizabeth S.

    2014-01-01

    Preschool children can navigate by simple geometric maps of the environment, but the nature of the geometric relations they use in map reading remains unclear. Here, children were tested specifically on their sensitivity to angle. Forty-eight children (age 47:15-53:30 months) were presented with fragments of geometric maps, in which angle sections…

  3. The Advanced Camera for the Hubble Space Telescope

    NASA Astrophysics Data System (ADS)

    Ford, H.; Broadhurst, T.; Feldman, P.; Bartko, F.; Bely, P.; Brown, R.; Burrows, C.; Clampin, M.; Crocker, J.; Hartig, G.; Postman, M.; Sparks, W.; White, R.; Cheng, E.; Kimble, R.; Neff, S.; Illingworth, G.; Lesser, M.; Miley, G.; Woodruff, R.

    1995-05-01

    The JHU and Ball Aerospace Advanced Camera for the HST will have a high throughput, wide field (200'' × 200''), optical and I-band camera which is critically sampled at 1000 nm, a high resolution optical and near-UV camera critically sampled at 500 nm, and a high throughput, far-UV camera. The AC's survey capability will be optimized for optical and NIR studies of the early Universe. The optimization is achieved by combining a novel, three-mirror optical design for the wide field camera with high reflectivity optical and NIR mirror and window coatings, a large format CCD optimized for the NIR, and a camera orientation chosen to minimize the time required to move to an adjacent field and begin a new exposure. The AC will increase HST's capability for surveys and discovery in the NIR by at least a factor of 10. We will use ~ 350 CVZ orbits to take contiguous deep V- and I-band WFC images of 0.7 square degrees of sky to investigate the formation and evolution of galaxies and clusters of galaxies, and the nature and large scale distribution of dark matter. In the second survey, we will use Surface Brightness Fluctuations in deep WFC I-band images of early type galaxies to map large scale flow. We will use narrow band and polarimetric HRC and WFC images to address QSOs and AGNs, our second major science area. The cornerstone of our approach to building the AC within the cost and schedule constraints set out in the NASA AO is reliance on STIS design and technology. The detectors and electronics for the far-UV and high resolution cameras are STIS design, and, in fact, may be STIS flight spares. Approximately 80% of the AC electronics modules and mechanisms are "build to print" from STIS drawings.

  4. Omnidirectional narrow bandpass filters based on one-dimensional superconductor-dielectric photonic crystal heterostructures

    NASA Astrophysics Data System (ADS)

    Barvestani, Jamal

    2015-01-01

    By using the transfer matrix method, narrow passbands of TE waves in one-dimensional superconductor-dielectric photonic crystal heterostructures are presented. Various superconductors within the two-fluid model are considered. The results show that, by selecting proper widths for the superconductor and dielectric layers and a proper choice of materials, a single narrow passband in the visible region can be obtained. The behavior of these passbands versus the temperature of the superconductors, the external magnetic field and the incident angle is considered. We have shown that it is possible to obtain omnidirectional passbands by examining the temperature, the dilation factor of the half part of a heterostructure and the other parameters of the heterostructures. These tunable narrow passbands may be useful in the design of narrow-band filters or multichannel filters.
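
    A minimal sketch of a transfer (characteristic) matrix calculation for a 1-D layer stack at normal incidence, where TE and TM coincide. For the superconducting layers of the cited work the constant refractive index would be replaced by the dispersive two-fluid permittivity; constant indices and a quarter-wave example stack are used here only for brevity.

```python
import numpy as np

def layer_matrix(n, d, wavelength):
    """Characteristic matrix of a homogeneous layer at normal incidence."""
    delta = 2.0 * np.pi * n * d / wavelength
    return np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                     [1j * n * np.sin(delta), np.cos(delta)]])

def transmittance(layers, wavelength, n_in=1.0, n_out=1.0):
    """Intensity transmittance of a 1-D stack.

    `layers` is a list of (refractive_index, thickness) pairs, listed in the
    order the light traverses them, between semi-infinite media n_in / n_out.
    """
    m = np.identity(2, dtype=complex)
    for n, d in layers:
        m = m @ layer_matrix(n, d, wavelength)
    b, c = m @ np.array([1.0, n_out])
    return 4.0 * n_in * n_out / abs(n_in * b + c) ** 2

# Quarter-wave Bragg mirror example (design wavelength 550 nm, made-up indices):
lam0 = 550.0
stack = [(2.3, lam0 / (4 * 2.3)), (1.45, lam0 / (4 * 1.45))] * 8
print(transmittance(stack, 550.0))   # deep in the stop band -> close to 0
```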

  5. Radiation camera motion correction system

    DOEpatents

    Hoffer, P.B.

    1973-12-18

    The device determines the ratio of the intensity of radiation received by a radiation camera from two separate portions of the object. A correction signal is developed to maintain this ratio at a substantially constant value and this correction signal is combined with the camera signal to correct for object motion. (Official Gazette)

  6. Camera artifacts in IUE spectra

    NASA Technical Reports Server (NTRS)

    Bruegman, O. W.; Crenshaw, D. M.

    1994-01-01

    This study of emission line mimicking features in the IUE cameras has produced an atlas of artifacts in high-dispersion images, with an accompanying table of prominent artifacts, a table of prominent artifacts in the raw images, and a medium image of the sky background for each IUE camera.

  7. An Educational PET Camera Model

    ERIC Educational Resources Information Center

    Johansson, K. E.; Nilsson, Ch.; Tegner, P. E.

    2006-01-01

    Positron emission tomography (PET) cameras are now in widespread use in hospitals. A model of a PET camera has been installed in Stockholm House of Science and is used to explain the principles of PET to school pupils as described here.

  8. Multi-PSPMT scintillation camera

    SciTech Connect

    Pani, R.; Pellegrini, R.; Trotta, G.; Scopinaro, F.; Soluri, A.; Vincentis, G. de; Scafe, R.; Pergola, A.

    1999-06-01

    Gamma ray imaging is usually accomplished by the use of a relatively large scintillating crystal coupled to either a number of photomultipliers (PMTs) (Anger Camera) or to a single large Position Sensitive PMT (PSPMT). Recently the development of new diagnostic techniques, such as scintimammography and radio-guided surgery, have highlighted a number of significant limitations of the Anger camera in such imaging procedures. In this paper a dedicated gamma camera is proposed for clinical applications with the aim of improving image quality by utilizing detectors with an appropriate size and shape for the part of the body under examination. This novel scintillation camera is based upon an array of PSPMTs (Hamamatsu R5900-C8). The basic concept of this camera is identical to the Anger Camera with the exception of the substitution of PSPMTs for the PMTs. In this configuration it is possible to use the high resolution of the PSPMTs and still correctly position events lying between PSPMTs. In this work the test configuration is a 2 by 2 array of PSPMTs. Some advantages of this camera are: spatial resolution less than 2 mm FWHM, good linearity, thickness less than 3 cm, light weight, lower cost than equivalent area PSPMT, large detection area when coupled to scintillating arrays, small dead boundary zone (< 3 mm) and flexibility in the shape of the camera.
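
    A minimal sketch of the Anger-style positioning that lets events lying between PSPMTs be located: the event coordinate is the signal-weighted centroid of the tube (or anode) positions. The positions and signal values below are illustrative only.

```python
import numpy as np

def anger_position(positions, signals):
    """Signal-weighted centroid (Anger logic).

    positions: (N, 2) array of PSPMT / anode centre coordinates in mm
    signals:   (N,) array of corresponding pulse amplitudes
    """
    positions = np.asarray(positions, dtype=float)
    signals = np.asarray(signals, dtype=float)
    return signals @ positions / signals.sum()

# Example event shared between two adjacent PSPMTs (made-up numbers):
print(anger_position([[0.0, 0.0], [26.0, 0.0]], [0.7, 0.3]))   # -> [7.8, 0.]
```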

  9. Digital Cameras for Student Use.

    ERIC Educational Resources Information Center

    Simpson, Carol

    1997-01-01

    Describes the features, equipment and operations of digital cameras and compares three different digital cameras for use in education. Price, technology requirements, features, transfer software, and accessories for the Kodak DC25, Olympus D-200L and Casio QV-100 are presented in a comparison table. (AEF)

  10. Airborne ballistic camera tracking systems

    NASA Technical Reports Server (NTRS)

    Redish, W. L.

    1976-01-01

    An operational airborne ballistic camera tracking system was tested for operational and data reduction feasibility. The acquisition and data processing requirements of the system are discussed. Suggestions for future improvements are also noted. A description of the data reduction mathematics is outlined. Results from a successful reentry test mission are tabulated. The test mission indicated that airborne ballistic camera tracking systems are feasible.

  11. The "All Sky Camera Network"

    ERIC Educational Resources Information Center

    Caldwell, Andy

    2005-01-01

    In 2001, the "All Sky Camera Network" came to life as an outreach program to connect the Denver Museum of Nature and Science (DMNS) exhibit "Space Odyssey" with Colorado schools. The network is comprised of cameras placed strategically at schools throughout Colorado to capture fireballs--rare events that produce meteorites. Meteorites have great…

  12. Mars Exploration Rover engineering cameras

    USGS Publications Warehouse

    Maki, J.N.; Bell, J.F., III; Herkenhoff, K. E.; Squyres, S. W.; Kiely, A.; Klimesh, M.; Schwochert, M.; Litwin, T.; Willson, R.; Johnson, Aaron H.; Maimone, M.; Baumgartner, E.; Collins, A.; Wadsworth, M.; Elliot, S.T.; Dingizian, A.; Brown, D.; Hagerott, E.C.; Scherr, L.; Deen, R.; Alexander, D.; Lorre, J.

    2003-01-01

    NASA's Mars Exploration Rover (MER) Mission will place a total of 20 cameras (10 per rover) onto the surface of Mars in early 2004. Fourteen of the 20 cameras are designated as engineering cameras and will support the operation of the vehicles on the Martian surface. Images returned from the engineering cameras will also be of significant importance to the scientific community for investigative studies of rock and soil morphology. The Navigation cameras (Navcams, two per rover) are a mast-mounted stereo pair each with a 45° square field of view (FOV) and an angular resolution of 0.82 milliradians per pixel (mrad/pixel). The Hazard Avoidance cameras (Hazcams, four per rover) are a body-mounted, front- and rear-facing set of stereo pairs, each with a 124° square FOV and an angular resolution of 2.1 mrad/pixel. The Descent camera (one per rover), mounted to the lander, has a 45° square FOV and will return images with spatial resolutions of ~4 m/pixel. All of the engineering cameras utilize broadband visible filters and 1024 x 1024 pixel detectors. Copyright 2003 by the American Geophysical Union.

  13. Overview of Neutrino Mixing Models and Their Mixing Angle Predictions

    SciTech Connect

    Albright, Carl H.

    2009-11-01

    An overview of neutrino-mixing models is presented with emphasis on the types of horizontal flavor and vertical family symmetries that have been invoked. Distributions for the mixing angles of many models are displayed. Ways to differentiate among the models and to narrow the list of viable models are discussed.

  14. LISS-4 camera for Resourcesat

    NASA Astrophysics Data System (ADS)

    Paul, Sandip; Dave, Himanshu; Dewan, Chirag; Kumar, Pradeep; Sansowa, Satwinder Singh; Dave, Amit; Sharma, B. N.; Verma, Anurag

    2006-12-01

    The Indian Remote Sensing Satellites use indigenously developed high resolution cameras for generating data related to vegetation, landform/geomorphic and geological boundaries. Data from this camera are used for preparing maps at 1:12500 scale for national level policy development for town planning, vegetation etc. The LISS-4 camera was launched onboard the Resourcesat-1 satellite by ISRO in 2003. LISS-4 is a high-resolution multi-spectral camera with three spectral bands, a resolution of 5.8 m and a swath of 23 km from an altitude of 817 km. The panchromatic mode provides a swath of 70 km and a 5-day revisit. This paper briefly discusses the configuration of the LISS-4 camera of Resourcesat-1, its onboard performance and also the changes in the camera being developed for Resourcesat-2. The LISS-4 camera images the earth in push-broom mode. It is designed around a three-mirror unobscured telescope, three linear 12K CCDs and associated electronics for each band. The three spectral bands are realized by splitting the focal plane in the along-track direction using an isosceles prism. High-speed camera electronics is designed for each detector with 12-bit digitization and digital double sampling of the video. Seven-bit data, selected from the 10 MSBs by telecommand, are transmitted. The total dynamic range of the sensor covers up to 100% albedo. The camera structure has heritage from IRS-1C/D. The optical elements are precisely glued to specially designed flexure mounts. The camera is assembled onto a rotating deck on the spacecraft to facilitate ±26° steering in the pitch-yaw plane. The camera is held on the spacecraft in a stowed condition before deployment. The excellent imagery from the LISS-4 camera onboard Resourcesat-1 is routinely used worldwide. A second such camera is being developed for the Resourcesat-2 launch in 2007 with similar performance. The camera electronics has been optimized and miniaturized; the size and weight are reduced to one third, and the power to half, of the values in Resourcesat-1.

  15. Coherent infrared imaging camera (CIRIC)

    SciTech Connect

    Hutchinson, D.P.; Simpson, M.L.; Bennett, C.A.; Richards, R.K.; Emery, M.S.; Crutcher, R.I.; Sitter, D.N. Jr.; Wachter, E.A.; Huston, M.A.

    1995-07-01

    New developments in 2-D, wide-bandwidth HgCdTe (MCT) and GaAs quantum-well infrared photodetectors (QWIP) coupled with Monolithic Microwave Integrated Circuit (MMIC) technology are now making focal plane array coherent infrared (IR) cameras viable. Unlike conventional IR cameras which provide only thermal data about a scene or target, a coherent camera based on optical heterodyne interferometry will also provide spectral and range information. Each pixel of the camera, consisting of a single photo-sensitive heterodyne mixer followed by an intermediate frequency amplifier and illuminated by a separate local oscillator beam, constitutes a complete optical heterodyne receiver. Applications of coherent IR cameras are numerous and include target surveillance, range detection, chemical plume evolution, monitoring stack plume emissions, and wind shear detection.

  16. IMAX camera (12-IML-1)

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The IMAX camera system is used to record on-orbit activities of interest to the public. Because of the extremely high resolution of the IMAX camera, projector, and audio systems, the audience is afforded a motion picture experience unlike any other. IMAX and OMNIMAX motion picture systems were designed to create motion picture images of superior quality and audience impact. The IMAX camera is a 65 mm, single lens, reflex viewing design with a 15 perforation per frame horizontal pull across. The frame size is 2.06 x 2.77 inches. Film travels through the camera at a rate of 336 feet per minute when the camera is running at the standard 24 frames/sec.

  17. PET with the HIDAC camera?

    NASA Astrophysics Data System (ADS)

    Townsend, D. W.

    1988-06-01

    In 1982 the first prototype high density avalanche chamber (HIDAC) positron camera became operational in the Division of Nuclear Medicine of Geneva University Hospital. The camera consisted of dual 20 cm × 20 cm HIDAC detectors mounted on a rotating gantry. In 1984, these detectors were replaced by 30 cm × 30 cm detectors with improved performance and reliability. Since then, the larger detectors have undergone clinical evaluation. This article discusses certain aspects of the evaluation program and the conclusions that can be drawn from the results. The potential of the HIDAC camera for quantitative positron emission tomography (PET) is critically examined, and its performance compared with a state-of-the-art, commercial ring camera. Guidelines for the design of a future HIDAC camera are suggested.

  18. Photoelectric angle converter

    NASA Astrophysics Data System (ADS)

    Podzharenko, Volodymyr A.; Kulakov, Pavlo I.

    2001-06-01

    A photoelectric transmitter of the angle of rotation is proposed, in which the output voltage is a linear function of the input quantity. The transmitter uses a linear phototransducer based on a photodiode-operational amplifier pair, whose output voltage is a linear function of the area of the illuminated photosensitive layer, and a light-flux modulator of special shape, which ensures a linear dependence of this area on the angle of rotation. The transmitter has good frequency properties and can be used for dynamic measurements of angular velocity and angle of rotation, in precision drive systems and automatic control systems.

  19. Camera sensitivity study

    NASA Astrophysics Data System (ADS)

    Schlueter, Jonathan; Murphey, Yi L.; Miller, John W. V.; Shridhar, Malayappan; Luo, Yun; Khairallah, Farid

    2004-12-01

    As the cost/performance ratio of vision systems improves with time, new classes of applications become feasible. One such area, automotive applications, is currently being investigated. Applications include occupant detection, collision avoidance and lane tracking. Interest in occupant detection has been spurred by federal automotive safety rules in response to injuries and fatalities caused by deployment of occupant-side air bags. In principle, a vision system could control airbag deployment to prevent this type of mishap. Employing vision technology here, however, presents a variety of challenges, which include controlling costs, inability to control illumination, developing and training a reliable classification system, and loss of performance due to production variations arising from manufacturing tolerances and customer options. This paper describes the measures that have been developed to evaluate the sensitivity of an occupant detection system to these types of variations. Two procedures are described for evaluating how sensitive the classifier is to camera variations. The first procedure is based on classification accuracy while the second evaluates feature differences.

  20. Proportional counter radiation camera

    DOEpatents

    Borkowski, C.J.; Kopp, M.K.

    1974-01-15

    A gas-filled proportional counter camera that images photon emitting sources is described. A two-dimensional, position-sensitive proportional multiwire counter is provided as the detector. The counter consists of a high-voltage anode screen sandwiched between orthogonally disposed planar arrays of multiple parallel strung, resistively coupled cathode wires. Two terminals from each of the cathode arrays are connected to separate timing circuitry to obtain separate X and Y coordinate signal values from pulse shape measurements to define the position of an event within the counter arrays which may be recorded by various means for data display. The counter is further provided with a linear drift field which effectively enlarges the active gas volume of the counter and constrains the recoil electrons produced from ionizing radiation entering the counter to drift perpendicularly toward the planar detection arrays. A collimator is interposed between a subject to be imaged and the counter to transmit only the radiation from the subject which has a perpendicular trajectory with respect to the planar cathode arrays of the detector. (Official Gazette)

  1. Narrow Vertical Caves: Mapping Volcanic Fissure Geometries

    NASA Astrophysics Data System (ADS)

    Parcheta, C.; Nash, J.; Parness, A.; Mitchell, K. L.; Pavlov, C. A.

    2015-10-01

    Volcanic conduits are difficult to quantify, but their geometry fundamentally influences how eruptions occur. We robotically map old fissure conduits - elongated narrow cracks in the ground that transported magma to the surface during an eruption.

  2. Developments towards a filter wheel hyperspectral camera for planetary exploration

    NASA Astrophysics Data System (ADS)

    Gunn, M.; Langstaff, D. P.; Barnes, D.

    2011-10-01

    The benefits of hyperspectral imaging in remote sensing applications are well established and it is now routinely exploited in terrestrial applications. However the restrictions imposed on mass and power consumption and the extreme operating conditions encountered in extra-terrestrial environments have limited its widespread use for planetary exploration. Instead multispectral camera systems with typically 10-12 discrete filters are employed, providing only coarse spectral information. By exploiting the properties of interference filters off axis it is possible to obtain additional spectral information. Recent advances in filter technology have made it possible to develop a simple and lightweight wide angle hyperspectral camera employing a filter wheel. The theory of operation and early test results from a prototype camera system are presented.
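
    The off-axis behaviour exploited here follows the usual blue-shift of an interference filter's passband with angle of incidence; a small sketch of that standard relation, where n_eff is the filter's effective index (a value that would come from the filter data sheet, assumed here).

```python
import math

def tilted_centre_wavelength(lambda_0, theta_deg, n_eff):
    """Centre wavelength of an interference filter tilted by theta (degrees)."""
    s = math.sin(math.radians(theta_deg)) / n_eff
    return lambda_0 * math.sqrt(1.0 - s * s)

# Example: a 650 nm filter with an assumed effective index of 2.0
for theta in (0, 10, 20, 30):
    print(theta, round(tilted_centre_wavelength(650.0, theta, 2.0), 1))
```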

  3. Testing of the Apollo 15 Metric Camera System.

    NASA Technical Reports Server (NTRS)

    Helmering, R. J.; Alspaugh, D. H.

    1972-01-01

    Description of tests conducted (1) to assess the quality of Apollo 15 Metric Camera System data and (2) to develop production procedures for total block reduction. Three strips of metric photography over the Hadley Rille area were selected for the tests. These photographs were utilized in a series of evaluation tests culminating in an orbitally constrained block triangulation solution. Results show that film deformations up to 25 and 5 microns are present in the mapping and stellar materials, respectively. Stellar reductions can provide mapping camera orientations with an accuracy that is consistent with the accuracies of other parameters in the triangulation solutions. Pointing accuracies of 4 to 10 microns can be expected for the mapping camera materials, depending on variations in resolution caused by changing sun angle conditions.

  4. Star Identification Algorithm for Uncalibrated, Wide FOV Cameras

    NASA Astrophysics Data System (ADS)

    Ajdadi, Mohamad Javad; Ghafarzadeh, Mahdi; Taheri, Mojtaba; Mosadeq, Ehsan; Khakian Ghomi, Mahdi

    2015-06-01

    A novel method is proposed for star identification via uncalibrated cameras with wide fields of view (FOVs). In this approach some of the triangles created by the stars in the FOV are selected for pattern recognition. The triangles are selected considering the sensitivity of their interior angles to the calibration error. The algorithm is based on the intersection between sets of triangles that are found in the database for each selected triangle of the image. By this method, most of the image stars contribute to pattern recognition and thereby it is very robust against the noise and the calibration error. The algorithm is performed on 150 night sky images, which are taken by an uncalibrated camera in FOV of 114 12 with a success rate of 94% and no false positives. Based on the identification approach, an adaptive method is also developed for calibrating and obtaining the projection function of an uncalibrated camera.
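
    A hedged sketch of the triangle-based matching idea: interior angles are computed for a star triplet in the image and compared against a catalogue of precomputed triangles within a tolerance. The catalogue structure, tolerance and planar approximation are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def interior_angles(p1, p2, p3):
    """Interior angles (degrees, sorted) of the triangle p1-p2-p3
    (image coordinates; planar approximation used here)."""
    pts = [np.asarray(p, float) for p in (p1, p2, p3)]
    angles = []
    for i in range(3):
        a = pts[(i + 1) % 3] - pts[i]
        b = pts[(i + 2) % 3] - pts[i]
        c = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
        angles.append(np.degrees(np.arccos(np.clip(c, -1.0, 1.0))))
    return sorted(angles)

def match_triangle(image_triplet, catalogue, tol_deg=0.2):
    """Return ids of catalogue triangles whose sorted interior angles agree with
    the image triangle to within tol_deg.  `catalogue` maps an id to a
    precomputed, sorted angle triple."""
    ang = interior_angles(*image_triplet)
    return [tid for tid, cat_ang in catalogue.items()
            if max(abs(a - b) for a, b in zip(ang, cat_ang)) < tol_deg]
```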

  5. Angles, Time, and Proportion

    ERIC Educational Resources Information Center

    Pagni, David L.

    2005-01-01

    This article describes an investigation making connections between the time on an analog clock and the angle between the minute hand and the hour hand. It was posed by a middle school mathematics teacher. (Contains 8 tables and 6 figures.)
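
    The connection investigated here reduces to a short formula: the hour hand moves 30° per hour plus 0.5° per minute, the minute hand 6° per minute, so the hands are separated by |30H - 5.5M| degrees, folded into the 0-180° range. A small sketch:

```python
def hand_angle(hour, minute):
    """Smaller angle (degrees) between the hour and minute hands."""
    angle = abs(30.0 * (hour % 12) - 5.5 * minute)
    return min(angle, 360.0 - angle)

print(hand_angle(3, 0))    # 90.0
print(hand_angle(9, 30))   # 105.0
```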

  6. CCD Camera Observations

    NASA Astrophysics Data System (ADS)

    Buchheim, Bob; Argyle, R. W.

    One night late in 1918, astronomer William Milburn, observing the region of Cassiopeia from Reverend T.H.E.C. Espin's observatory in Tow Law (England), discovered a hitherto unrecorded double star (Wright 1993). He reported it to Rev. Espin, who measured the pair using his 24-in. reflector: the fainter star was 6.0 arcsec from the primary, at position angle 162.4° (i.e. the fainter star was south-by-southeast from the primary) (Espin 1919). Some time later, it was recognized that the astrograph of the Vatican Observatory had taken an image of the same star-field a dozen years earlier, in late 1906. At that earlier epoch, the fainter star had been separated from the brighter one by only 4.8 arcsec, at position angle 186.2° (i.e. almost due south). Were these stars a binary pair, or were they just two unrelated stars sailing past each other? Some additional measurements might have begun to answer this question. If the secondary star was following a curved path, that would be a clue of orbital motion; if it followed a straight-line path, that would be a clue that these are just two stars passing in the night. Unfortunately, nobody took the trouble to re-examine this pair for almost a century, until the 2MASS astrometric/photometric survey recorded it in late 1998. After almost another decade, this amateur astronomer took some CCD images of the field in 2007, and added another data point on the star's trajectory, as shown in Fig. 15.1.

  7. ``Magic Angle Precession''

    NASA Astrophysics Data System (ADS)

    Binder, Bernd

    2008-01-01

    An advanced and exact geometric description of nonlinear precession dynamics modeling very accurately natural and artificial couplings showing Lorentz symmetry is derived. In the linear description it is usually ignored that the geometric phase of relativistic motion couples back to the orbital motion providing for a non-linear recursive precession dynamics. The high coupling strength in the nonlinear case is found to be a gravitomagnetic charge proportional to the precession angle and angular velocity generated by geometric phases, which are induced by high-speed relativistic rotations and are relevant to propulsion technologies but also to basic interactions. In the quantum range some magic precession angles indicating strong coupling in a phase-locked chaotic system are identified, emerging from a discrete time dynamical system known as the cosine map showing bifurcations at special precession angles relevant to heavy nuclei stability. The "Magic Angle Precession" (MAP) dynamics can be simulated and visualized by cones rolling in or on each other, where the apex and precession angles are indexed by spin, charge or precession quantum numbers, and corresponding magic angles. The most extreme relativistic warping and twisting effect is given by the Dirac spinor half spin constellation with "Hyperdiamond" MAP, which resembles quark confinement.

  8. 'Magic Angle Precession'

    SciTech Connect

    Binder, Bernd

    2008-01-21

    An advanced and exact geometric description of nonlinear precession dynamics modeling very accurately natural and artificial couplings showing Lorentz symmetry is derived. In the linear description it is usually ignored that the geometric phase of relativistic motion couples back to the orbital motion providing for a non-linear recursive precession dynamics. The high coupling strength in the nonlinear case is found to be a gravitomagnetic charge proportional to the precession angle and angular velocity generated by geometric phases, which are induced by high-speed relativistic rotations and are relevant to propulsion technologies but also to basic interactions. In the quantum range some magic precession angles indicating strong coupling in a phase-locked chaotic system are identified, emerging from a discrete time dynamical system known as the cosine map showing bifurcations at special precession angles relevant to heavy nuclei stability. The 'Magic Angle Precession' (MAP) dynamics can be simulated and visualized by cones rolling in or on each other, where the apex and precession angles are indexed by spin, charge or precession quantum numbers, and corresponding magic angles. The most extreme relativistic warping and twisting effect is given by the Dirac spinor half spin constellation with 'Hyperdiamond' MAP, which resembles quark confinement.

  9. Mass movement slope streaks imaged by the Mars Orbiter Camera

    NASA Astrophysics Data System (ADS)

    Sullivan, Robert; Thomas, Peter; Veverka, Joseph; Malin, Michael; Edgett, Kenneth S.

    2001-10-01

    Narrow, fan-shaped dark streaks on steep Martian slopes were originally observed in Viking Orbiter images, but a definitive explanation was not possible because of resolution limitations. Pictures acquired by the Mars Orbiter Camera (MOC) aboard the Mars Global Surveyor (MGS) spacecraft show innumerable examples of dark slope streaks distributed widely, but not uniformly, across the brighter equatorial regions, as well as individual details of these features that were not visible in Viking Orbiter data. Dark slope streaks (as well as much rarer bright slope streaks) represent one of the most widespread and easily recognized styles of mass movement currently affecting the Martian surface. New dark streaks have formed since Viking and even during the MGS mission, confirming earlier suppositions that higher contrast dark streaks are younger, and fade (brighten) with time. The darkest slope streaks represent ~10% contrast with surrounding slope materials. No small outcrops supplying dark material (or bright material, for bright streaks) have been found at streak apexes. Digitate downslope ends indicate slope streak formation involves a ground-hugging flow subject to deflection by minor topographic obstacles. The model we favor explains most dark slope streaks as scars from dust avalanches following oversteepening of air fall deposits. This process is analogous to terrestrial avalanches of oversteepened dry, loose snow which produce shallow avalanche scars with similar morphologies. Low angles of internal friction typically 10-30° for terrestrial loess and clay materials suggest that mass movement of (low-cohesion) Martian dusty air fall is possible on a wide range of gradients. Martian gravity, presumed low density of the air fall deposits, and thin (unresolved by MOC) failed layer depths imply extremely low cohesive strength at time of failure, consistent with expectations for an air fall deposit of dust particles. As speed increases during a dust avalanche, a growing fraction of the avalanching dust particles acquires sufficient kinetic energy to be lost to the atmosphere in suspension, limiting the momentum of the descending avalanche front. The equilibrium speed, where rate of mass lost to the atmosphere is balanced by mass continually entrained as the avalanche front descends, decreases with decreasing gradient. This mechanism explains observations from MOC images indicating slope streaks formed with little reserve kinetic energy for run-outs on to valley floors and explains why large distal deposits of displaced material are not found at downslope streak ends. The mass movement process of dark (and bright) slope streak formation through dust avalanches involves renewable sources of dust only, leaving underlying slope materials unaffected. Areas where dark and bright slope streaks currently form and fade in cycles are closely correlated with low thermal inertia and probably represent regions where dust currently is accumulating, not just residing.

  10. Accuracy in fixing ship's positions by camera survey of bearings

    NASA Astrophysics Data System (ADS)

    Naus, Krzysztof; Wąż, Mariusz

    2011-01-01

    The paper presents the results of research on the possibility of fixing a ship's position coordinates from bearings on navigational marks measured with a CCD camera. The accuracy of the determination of the ship's position coordinates, expressed in terms of the mean error, was assumed to be the basic criterion of this estimation. The first part of the paper describes the method of determining the resolution and the mean error of the angle measurement taken with a camera, and also the method of determining the mean error of the position coordinates when two or more bearings are measured. Three software applications were developed for producing navigational sea charts with accuracy areas mapped onto them. The second part contains the results of a study of the accuracy of fixing ship position coordinates, carried out in the Gulf of Gdansk, using bearings obtained with Rolleiflex and Sony cameras. The results are presented in the form of diagrams of the mean error of the angle measurement, and also as navigational charts with accuracy fields mapped on. In the final part, based on the results obtained, the applicability of CCD cameras to automating the process of coastal navigation is discussed.
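
    A minimal sketch of a two-bearing fix of the kind evaluated in the paper, in a local flat-Earth approximation: each bearing to a charted mark defines a line of position, and the ship lies at their intersection. Coordinates are metres east/north; the marks and bearings are made-up example values.

```python
import numpy as np

def two_bearing_fix(mark1, brg1_deg, mark2, brg2_deg):
    """Ship position from bearings (degrees, clockwise from north) taken to two
    charted marks, in a local east/north metre frame."""
    rows, rhs = [], []
    for (mx, my), brg in ((mark1, brg1_deg), (mark2, brg2_deg)):
        dx, dy = np.sin(np.radians(brg)), np.cos(np.radians(brg))
        # the ship lies on the line through the mark with direction (dx, dy)
        rows.append([dy, -dx])
        rhs.append(dy * mx - dx * my)
    return np.linalg.solve(np.array(rows), np.array(rhs))

# Example: two marks, bearings 045 deg and 315 deg -> fix at the origin
print(two_bearing_fix((1000.0, 1000.0), 45.0, (-1000.0, 1000.0), 315.0))
```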

  11. Observation of Planetary Motion Using a Digital Camera

    ERIC Educational Resources Information Center

    Meyn, Jan-Peter

    2008-01-01

    A digital SLR camera with a standard lens (50 mm focal length, f/1.4) on a fixed tripod is used to obtain photographs of the sky which contain stars up to 8[superscript m] apparent magnitude. The angle of view is large enough to ensure visual identification of the photograph with a large sky region in a stellar map. The resolution is sufficient to…

  13. Performance of new low-cost 1/3" security cameras for meteor surveillance

    NASA Astrophysics Data System (ADS)

    Samuels, Dave; Wray, James; Gural, Peter S.; Jenniskens, Peter

    2014-02-01

    It has been almost 5 years since the CAMS (Cameras for All-sky Meteor Surveillance) system specifications were designed for video meteor surveillance. CAMS has been based on a relatively expensive black-and-white Watec WAT-902H2 Ultimate camera, which uses a 1/2" sensor. In this paper, we investigate the ability of new, lower cost color cameras based on smaller 1/3" sensors to be able to perform adequately for CAMS. We did not expect them to equal or outperform the sensitivity for the same field of view of the Watec 1/2" camera, but the goal was to see if they could perform within the tolerances of the sensitivity requirements for the CAMS project. Their lower cost brings deployment of meteor surveillance cameras within reach of amateur astronomers and makes it possible to deploy many more cameras to increase yield. The lens focal length is matched to the elevation angle of the camera to maintain an image scale and spatial resolution close to that of the standard CAMS camera and lens combination, crucial for obtaining sufficiently accurate orbital elements. An all-sky array based on 16 such cameras, to be operated from a single computer, was built and the performance of individual cameras was tested.

  14. Vision Sensors and Cameras

    NASA Astrophysics Data System (ADS)

    Hoefflinger, Bernd

    Silicon charge-coupled-device (CCD) imagers have been and are a specialty market ruled by a few companies for decades. Based on CMOS technologies, active-pixel sensors (APS) began to appear in 1990 at the 1 μm technology node. These pixels allow random access, global shutters, and they are compatible with focal-plane imaging systems combining sensing and first-level image processing. The progress towards smaller features and towards ultra-low leakage currents has provided reduced dark currents and μm-size pixels. All chips offer Mega-pixel resolution, and many have very high sensitivities equivalent to ASA 12.800. As a result, HDTV video cameras will become a commodity. Because charge-integration sensors suffer from a limited dynamic range, significant processing effort is spent on multiple exposure and piece-wise analog-digital conversion to reach ranges >10,000:1. The fundamental alternative is log-converting pixels with an eye-like response. This offers a range of almost a million to 1, constant contrast sensitivity and constant colors, important features in professional, technical and medical applications. 3D retino-morphic stacking of sensing and processing on top of each other is being revisited with sub-100 nm CMOS circuits and with TSV technology. With sensor outputs directly on top of neurons, neural focal-plane processing will regain momentum, and new levels of intelligent vision will be achieved. The industry push towards thinned wafers and TSV enables backside-illuminated and other pixels with a 100% fill-factor. 3D vision, which relies on stereo or on time-of-flight, high-speed circuitry, will also benefit from scaled-down CMOS technologies both because of their size as well as their higher speed.

  15. Dark energy survey and camera

    SciTech Connect

    William Wester

    2004-08-16

    The authors describe the Dark Energy Survey and Camera. The survey will image 5000 sq. deg. in the southern sky to collect 300 million galaxies, 30,000 galaxy clusters and 2000 Type Ia supernovae. They expect to derive a value for the dark energy equation of state parameter, w, to a precision of 5% by combining four distinct measurement techniques. They describe the mosaic camera that will consist of CCDs with enhanced sensitivity in the near infrared. The camera will be mounted at the prime focus of the 4m Blanco telescope.

  16. A liquid xenon radioisotope camera.

    NASA Technical Reports Server (NTRS)

    Zaklad, H.; Derenzo, S. E.; Muller, R. A.; Smadja, G.; Smits, R. G.; Alvarez, L. W.

    1972-01-01

    A new type of gamma-ray camera is discussed that makes use of electron avalanches in liquid xenon and is currently under development. It is shown that such a radioisotope camera promises many advantages over any other existing gamma-ray cameras. Spatial resolution better than 1 mm and counting rates higher than one million C/sec are possible. An energy resolution of 11% FWHM has recently been achieved with a collimated Hg-203 source using a parallel-plate ionization chamber containing a Frisch grid.

  17. Exposure interlock for oscilloscope cameras

    NASA Technical Reports Server (NTRS)

    Spitzer, C. R.; Stainback, J. D. (Inventor)

    1973-01-01

    An exposure interlock has been developed for oscilloscope cameras which cuts off ambient light from the oscilloscope screen before the shutter of the camera is tripped. A flap is provided which may be selectively positioned to an open position, which enables viewing of the oscilloscope screen, and a closed position, which cuts off the oscilloscope screen from view and simultaneously cuts off ambient light from the oscilloscope screen. A mechanical interlock requires the flap to be in its closed position before the camera shutter can be tripped, thereby preventing overexposure of the film.

  18. Focal Plane Metrology for the LSST Camera

    SciTech Connect

    Rasmussen, Andrew P.; Hale, Layton; Kim, Peter; Lee, Eric; Perl, Martin; Schindler, Rafe; Takacs, Peter; Thurston, Timothy (SLAC)

    2007-01-10

    Meeting the science goals for the Large Synoptic Survey Telescope (LSST) translates into a demanding set of imaging performance requirements for the optical system over a wide (3.5°) field of view. In turn, meeting those imaging requirements necessitates maintaining precise control of the focal plane surface (10 μm P-V) over the entire field of view (640 mm diameter) at the operating temperature (T ~ -100°C) and over the operational elevation angle range. We briefly describe the hierarchical design approach for the LSST Camera focal plane and the baseline design for assembling the flat focal plane at room temperature. Preliminary results of gravity load and thermal distortion calculations are provided, and early metrological verification of candidate materials under cold thermal conditions are presented. A detailed, generalized method for stitching together sparse metrology data originating from differential, non-contact metrological data acquisition spanning multiple (non-continuous) sensor surfaces making up the focal plane, is described and demonstrated. Finally, we describe some in situ alignment verification alternatives, some of which may be integrated into the camera's focal plane.

  19. Focal plane metrology for the LSST camera

    NASA Astrophysics Data System (ADS)

    Rasmussen, Andrew P.; Hale, Layton; Kim, Peter; Lee, Eric; Perl, Martin; Schindler, Rafe; Takacs, Peter; Thurston, Timothy

    2006-06-01

    Meeting the science goals for the Large Synoptic Survey Telescope (LSST) translates into a demanding set of imaging performance requirements for the optical system over a wide (3.5°) field of view. In turn, meeting those imaging requirements necessitates maintaining precise control of the focal plane surface (10 μm P-V) over the entire field of view (640 mm diameter) at the operating temperature (T ~ -100°C) and over the operational elevation angle range. We briefly describe the hierarchical design approach for the LSST Camera focal plane and the baseline design for assembling the flat focal plane at room temperature. Preliminary results of gravity load and thermal distortion calculations are provided, and early metrological verification of candidate materials under cold thermal conditions are presented. A detailed, generalized method for stitching together sparse metrology data originating from differential, non-contact metrological data acquisition spanning multiple (non-continuous) sensor surfaces making up the focal plane, is described and demonstrated. Finally, we describe some in situ alignment verification alternatives, some of which may be integrated into the camera's focal plane.

  20. X-ray Pinhole Camera Measurements

    SciTech Connect

    Nelson, D. S.; Berninger, M. J.; Flores, P. A.; Good, D. E.; Henderson, D. J.; Hogge, K. W.; Huber, S. R.; Lutz, S. S.; Mitchell, S. E.; Howe, R. A.; Mitton, C. V.; Molina, I.; Bozman, D. R.; Cordova, S. R.; Mitchell, D. R.; Oliver, B. V.; Ormond, E. C.

    2013-07-01

    The development of the rod pinch diode [1] has led to high-resolution radiography for dynamic events such as explosive tests. Rod pinch diodes use a small diameter anode rod, which extends through the aperture of a cathode plate. Electrons borne off the aperture surface can self-insulate and pinch onto the tip of the rod, creating an intense, small x-ray source (Primary Pinch). This source has been utilized as the main diagnostic on numerous experiments that include high-value, single-shot events. In such applications there is an emphasis on machine reliability, x-ray reproducibility, and x-ray quality [2]. In tests with the baseline rod pinch diode, we have observed that an additional pinch (Secondary Pinch) occurs at the interface near the anode rod and the rod holder. This suggests that stray electrons exist that are not associated with the Primary Pinch. In this paper we present measurements on both pinches using an x-ray pinhole camera. The camera is placed downstream of the Primary Pinch at an angle of 60° with respect to the diode centerline. This diagnostic will be employed to diagnose x-ray reproducibility and quality. In addition, we will investigate the performance of hybrid diodes relating to the formation of the Primary and Secondary Pinches.

  1. Auto-preview camera orientation for environment perception on a mobile robot

    NASA Astrophysics Data System (ADS)

    Radovnikovich, Micho; Vempaty, Pavan K.; Cheok, Ka C.

    2010-01-01

    Using wide-angle or omnidirectional camera lenses to increase a mobile robot's field of view introduces nonlinearity in the image due to the 'fish-eye' effect. This complicates distance perception and increases image processing overhead. Using multiple cameras avoids the fish-eye complications, but requires more electrical and processing power to interface them to a computer. By controlling the orientation of a single camera, both of these disadvantages are minimized while still allowing the robot to preview a wider area. In addition, controlling the orientation allows the robot to optimize its environment perception by looking only where the most useful information can be discovered. In this paper, a technique is presented that creates a two-dimensional map of objects of interest surrounding a mobile robot equipped with a panning camera on a telescoping shaft. Before attempting to negotiate a difficult path planning situation, the robot takes snapshots at different camera heights and pan angles and then produces a single map of the surrounding area. Distance perception is performed by making calibration measurements of the camera and applying coordinate transformations to project the camera's findings into the vehicle's coordinate frame. To test the system, obstacles and lines were placed to form a chicane. Several snapshots were taken with different camera orientations, and the information from each was stitched together to yield a very useful map of the surrounding area for the robot to use to plan a path through the chicane.
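
    The projection of camera detections into the vehicle frame described above amounts to a rigid-body transformation built from the calibrated pan angle and the camera's mounting position. A minimal sketch of that step follows; the pan angle, shaft height and mount offset values are hypothetical and not taken from the paper.

        import numpy as np

        def camera_to_vehicle(point_cam, pan_rad, mount_offset):
            """Rotate a point from the panning camera frame into the vehicle frame
            and translate by the camera's mounting position on the robot."""
            c, s = np.cos(pan_rad), np.sin(pan_rad)
            R = np.array([[c, -s, 0.0],
                          [s,  c, 0.0],
                          [0.0, 0.0, 1.0]])   # rotation about the vertical (pan) axis
            return R @ np.asarray(point_cam) + np.asarray(mount_offset)

        # Example: obstacle seen 2 m ahead of the camera, camera panned 30 deg left,
        # mounted 0.3 m forward of the vehicle origin on a 1.2 m telescoping shaft.
        obstacle_vehicle = camera_to_vehicle([2.0, 0.0, 0.0],
                                             pan_rad=np.deg2rad(30.0),
                                             mount_offset=[0.3, 0.0, 1.2])
        print(obstacle_vehicle)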

  2. The champagne angle.

    PubMed

    Pemberton, P L; Calder, I; O'Sullivan, C; Crockard, H A

    2002-04-01

    A patient's observation led us to investigate whether drinking from a champagne flute required more cranio-cervical extension than drinking from other types of wine glasses. We measured the cranio-cervical extension required by normal volunteers to drink from four different types of glass. The mean [95% confidence interval] extension from the neutral position required to drain each glass was: narrow flute 40 degrees [35-44]; wide flute 22 degrees [19-25]; wine glass 26 degrees [24-29]; champagne saucer 0 degrees [-1 to 2]. Drinking from the narrow-rimmed champagne flute required significantly more extension than the other types of glass (p < 0.001), and 73% of the total available cranio-cervical extension. PMID:11949646

  3. An Inexpensive Digital Infrared Camera

    ERIC Educational Resources Information Center

    Mills, Allan

    2012-01-01

    Details are given for the conversion of an inexpensive webcam to a camera specifically sensitive to the near infrared (700-1000 nm). Some experiments and practical applications are suggested and illustrated. (Contains 9 figures.)

  4. Construction of multichannel camera gamuts

    NASA Astrophysics Data System (ADS)

    Helling, Stephan

    2006-01-01

    Device gamuts are commonly defined for output devices, such as monitors or printers. In this paper, a definition of gamuts of input devices will be examined, considering multispectral cameras as examples. A method appropriate to calculate them as a function of the camera model and the spectral reconstruction algorithm will be proposed. The method will be applied to multispectral camera models with a variable number of channels. The characteristics of the resulting gamuts will be shown and examined as a function of the number of channels. Implications on the minimum number of channels needed will be derived. The method proposed here to characterize input devices can be used in addition to common quality criteria such as color distances like ΔE00, spectral errors, etc. The advantage of the proposed method is the independence of any given spectral data set. This makes it a quality criterion universal for linear (multispectral) cameras and reconstruction algorithms.

  5. The emission of narrow-band Jovian kilometric radiation

    NASA Technical Reports Server (NTRS)

    Fung, S. F.; Papadopoulos, K.

    1987-01-01

    A model based on the nonlinear coupling of electrostatic plasma waves is proposed to explain the emission of the narrow-band Jovian kilometric radiation (nKOM) observed by the Voyager spacecraft. It is shown that upper-hybrid branch electrostatic waves propagating through the inhomogeneities in the outer periphery of the Io plasma torus can attain the proper geometry for localized upconversion interactions leading to pump depletion. Plasma waves propagating into a weak density gradient and reflected at the critical layer interact with the incident waves leading to the electromagnetic emission, which is beamed at large angles with respect to the background magnetic field. In general, both L-O and R-X mode waves can be generated. The observed power and net polarization (L-O) are consistent with pump depletion of electrostatic waves at a level of about 10 mV/m. A possible excitation mechanism for the electrostatic waves is also discussed.

  6. Fast camera objective designs for spectrograph of Mont Megantique telescope

    NASA Astrophysics Data System (ADS)

    Thibault, Simon; Wang, Min

    2004-02-01

    All-reflective optics is conventionally required for extended spectral-region observations in astronomical spectrographs. However, the spatial resolution is usually not good enough when a large-size CCD is used for observation with all-reflective optics. In this paper, an all-refractive design has been investigated for a fast (F/1.55), wide-angle camera objective with large spectral coverage, from the UV through the visible and up to the NIR, for use with a large-size CCD on the focal plane of the spectrograph of the Mont Megantique telescope. Achromatic and apochromatic conditions have been investigated for axial and lateral color control. The new proposed solutions have been optimized with two to three different glass combinations in order to obtain higher throughput over the large spectral coverage, especially in the UV region. The number of components has been minimized to reduce the inherent light loss. The monochromatic aberrations have been corrected and controlled by optimizing lens bending and shapes so that the camera reaches the CCD pixel resolution. Ray-tracing results demonstrate the good optical performance of the camera, covering the 350 nm to 1000 nm spectral region with high resolution. A broadband AR coating, enhanced in the UV region, will be used on each surface of the lenses in the camera. The final throughput for the designed camera has been estimated and is given in the paper.

  7. Astronomy and the camera obscura

    NASA Astrophysics Data System (ADS)

    Feist, M.

    2000-02-01

    The camera obscura (from Latin meaning darkened chamber) is a simple optical device with a long history. In the form considered here, it can be traced back to 1550. It had its heyday during the Victorian era when it was to be found at the seaside as a tourist attraction or sideshow. It was also used as an artist's drawing aid and, in 1620, the famous astronomer-mathematician, Johannes Kepler used a small tent camera obscura to trace the scenery.

  8. The future of consumer cameras

    NASA Astrophysics Data System (ADS)

    Battiato, Sebastiano; Moltisanti, Marco

    2015-03-01

    In the last two decades multimedia, and in particular imaging devices (camcorders, tablets, mobile phones, etc.), have been dramatically diffused. Moreover, the increase in their computational performance, combined with higher storage capability, allows them to process large amounts of data. In this paper an overview of the current trends of the consumer camera market and technology will be given, providing also some details about the recent past (from the digital still camera up to today) and forthcoming key issues.

  9. Solid State Television Camera (CID)

    NASA Technical Reports Server (NTRS)

    Steele, D. W.; Green, W. T.

    1976-01-01

    The design, development and test are described of a charge injection device (CID) camera using a 244x248 element array. A number of video signal processing functions are included which maximize the output video dynamic range while retaining the inherently good resolution response of the CID. Some of the unique features of the camera are: low light level performance, high S/N ratio, antiblooming, geometric distortion, sequential scanning and AGC.

  10. Streak camera dynamic range optimization

    SciTech Connect

    Wiedwald, J.D.; Lerche, R.A.

    1987-09-01

    The LLNL optical streak camera is used by the Laser Fusion Program in a wide range of applications. Many of these applications require a large recorded dynamic range. Recent work has focused on maximizing the dynamic range of the streak camera recording system. For our streak cameras, image intensifier saturation limits the upper end of the dynamic range. We have developed procedures to set the image intensifier gain such that the system dynamic range is maximized. Specifically, the gain is set such that a single streak tube photoelectron is recorded with an exposure of about five times the recording system noise. This ensures detection of single photoelectrons, while not consuming intensifier or recording system dynamic range through excessive intensifier gain. The optimum intensifier gain has been determined for two types of film and for a lens-coupled CCD camera. We have determined that by recording the streak camera image with a CCD camera, the system is shot-noise limited up to the onset of image intensifier nonlinearity. When recording on film, the film determines the noise at high exposure levels. There is discussion of the effects of slit width and image intensifier saturation on dynamic range. 8 refs.
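
    For concreteness, the gain-setting rule quoted above (a single photoelectron recorded at about five times the recording-system noise) can be turned into a back-of-the-envelope calculation; the noise and responsivity numbers below are placeholders, not values from the report.

        # Minimal sketch of the gain-setting rule: choose the image-intensifier gain
        # so that one streak-tube photoelectron produces ~5x the recording-system noise.
        recording_noise = 20.0              # rms noise of the recording chain, in output counts (assumed)
        counts_per_pe_at_unit_gain = 0.04   # output counts per photoelectron at gain 1 (assumed)
        target_snr_single_pe = 5.0

        required_gain = target_snr_single_pe * recording_noise / counts_per_pe_at_unit_gain
        print(f"set image-intensifier gain to about {required_gain:.0f}")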

  11. Wide Dynamic Range CCD Camera

    NASA Astrophysics Data System (ADS)

    Younse, J. M.; Gove, R. J.; Penz, P. A.; Russell, D. E.

    1984-11-01

    A liquid crystal attenuator (LCA) operated as a variable neutral density filter has been attached to a charge-coupled device (CCD) imager to extend the dynamic range of a solid-state TV camera by an order of magnitude. Many applications are best served by a camera with a dynamic range of several thousand. For example, outside security systems must operate unattended with "dawn-to-dusk" lighting conditions. Although this can be achieved with available auto-iris lens assemblies, more elegant solutions which provide the small size, low power, high reliability advantages of solid state technology are now available. This paper will describe one such unique way of achieving these dynamic ranges using standard optics by making the CCD imager's glass cover a controllable neutral density filter. The liquid crystal attenuator's structure and theoretical properties for this application will be described along with measured transmittance. A small integrated TV camera which utilizes a "virtual-phase" CCD sensor coupled to a LCA will be described and test results for a number of the camera's optical and electrical parameters will be given. These include the following camera parameters: dynamic range, Modulation Transfer Function (MTF), spectral response, and uniformity. Also described will be circuitry which senses the ambient scene illuminance and automatically provides feedback signals to appropriately adjust the transmittance of the LCA. Finally, image photographs using this camera, under various scene illuminations, will be shown.

  12. The virtual gamma camera room.

    PubMed

    Penrose, J M; Trowbridge, E A; Tindale, W B

    1996-05-01

    The installation of a gamma camera is time-consuming and costly and, once installed, the camera position is unlikely to be altered during its working life. Poor choice of camera position therefore has long-term consequences. Additional equipment such as collimators and carts, the operator's workstation and wall-mounted display monitors must also be situated to maximize access and ease of use. The layout of a gamma camera room can be optimized prior to installation by creating a virtual environment. Super-Scape VRT software running on an upgraded 486 PC microprocessor was used to create a 'virtual camera room'. The simulation included an operator's viewpoint and a controlled tour of the room. Equipment could be repositioned as required, allowing potential problems to be identified at the design stage. Access for bed-ridden patients, operator ergonomics, operator and patient visibility were addressed. The display can also be used for patient education. Creation of a virtual environment is a valuable tool which allows different camera systems to be compared interactively in terms of dimensions, extent of movement and use of a defined space. Such a system also has applications in radiopharmacy design and simulation. PMID:8736511

  13. Teleconferencing system using virtual camera

    NASA Astrophysics Data System (ADS)

    Shibuichi, Daisuke; Tanaka, Tsukasa; Terashima, Nobuyoshi; Tominaga, Hideyoshi

    2000-05-01

    Teleconferencing systems are becoming more popular because of advances in image processing and broadband networks. Nevertheless, communicating with someone at a remote location through a teleconferencing system still presents problems because of the difficulty of establishing and maintaining eye contact. Eye contact is essential to having a natural dialog. The purpose of our study is to make eye contact possible during dialog by using image processing, with no special devices such as color markers, sensors worn by the users, or IR cameras. The proposed teleconferencing system is composed of a computer, a display attached to the computer, and four cameras. We define a virtual camera as a camera that exists virtually in 3D space. By using the proposed method, we can acquire a front view of a person as if taken with the virtual camera. The image taken with the virtual camera is generated by extracting corresponding feature points among the four face images. Feature point sets among the four face images are automatically matched by using Epipolar Plane Images (EPIs). The users can establish eye contact by acquiring the front face view, and moreover, they can also obtain various views of the image because 3D points of the object can be extracted from the EPIs. Through these facilities, the proposed system will provide users with better communication than previous systems. In this paper, we describe the concept, the implementation, and the evaluation from various perspectives.

  14. Narrowing of intersensory speech perception in infancy.

    PubMed

    Pons, Ferran; Lewkowicz, David J; Soto-Faraco, Salvador; Sebastián-Gallés, Núria

    2009-06-30

    The conventional view is that perceptual/cognitive development is an incremental process of acquisition. Several striking findings have revealed, however, that the sensitivity to non-native languages, faces, vocalizations, and music that is present early in life declines as infants acquire experience with native perceptual inputs. In the language domain, the decline in sensitivity is reflected in a process of perceptual narrowing that is thought to play a critical role during the acquisition of a native-language phonological system. Here, we provide evidence that such a decline also occurs in infant response to multisensory speech. We found that infant intersensory response to a non-native phonetic contrast narrows between 6 and 11 months of age, suggesting that the perceptual system becomes increasingly more tuned to key native-language audiovisual correspondences. Our findings lend support to the notion that perceptual narrowing is a domain-general as well as a pan-sensory developmental process. PMID:19541648

  15. Discovery of a narrow line quasar

    NASA Technical Reports Server (NTRS)

    Stocke, J.; Liebert, J.; Maccacaro, T.; Griffiths, R. E.; Steiner, J. E.

    1982-01-01

    A stellar object is reported which, while having X-ray and optical luminosities typical of quasars, has narrow permitted and forbidden emission lines over the observed spectral range. The narrow-line spectrum is high-excitation, the Balmer lines seem to be recombinational, and a redder optical spectrum than that of most quasars is exhibited, despite detection as a weak radio source. The object does not conform to the relationships between H-beta parameters and X-ray flux previously claimed for a large sample of the active galactic nuclei. Because reddish quasars with narrow lines, such as the object identified, may not be found by the standard techniques for the discovery of quasars, the object may be a prototype of a new class of quasars analogous to high-luminosity Seyfert type 2 galaxies. It is suggested that these objects cannot comprise more than 10% of all quasars.

  16. Multi-Angle View of the Canary Islands

    NASA Technical Reports Server (NTRS)

    2000-01-01

    A multi-angle view of the Canary Islands in a dust storm, 29 February 2000. At left is a true-color image taken by the Multi-angle Imaging SpectroRadiometer (MISR) instrument on NASA's Terra satellite. This image was captured by the MISR camera looking at a 70.5-degree angle to the surface, ahead of the spacecraft. The middle image was taken by the MISR downward-looking (nadir) camera, and the right image is from the aftward 70.5-degree camera. The images are reproduced using the same radiometric scale, so variations in brightness, color, and contrast represent true variations in surface and atmospheric reflectance with angle. Windblown dust from the Sahara Desert is apparent in all three images, and is much brighter in the oblique views. This illustrates how MISR's oblique imaging capability makes the instrument a sensitive detector of dust and other particles in the atmosphere. Data for all channels are presented in a Space Oblique Mercator map projection to facilitate their co-registration. The images are about 400 km (250 miles) wide, with a spatial resolution of about 1.1 kilometers (1,200 yards). North is toward the top. MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is a division of the California Institute of Technology.

  17. Design, modeling and performance of a hybrid portable gamma camera

    NASA Astrophysics Data System (ADS)

    Smith, Leon Eric

    The combination of a mechanically-collimated gamma-ray camera with an electronically-collimated gamma camera offers both the high efficiency and good angular resolution typical in a mechanically-collimated camera for lower photon energies and the uncoupling of spatial resolution and efficiency provided by an electronically-collimated camera at higher energies. The design, construction, performance modeling and measured performance of the Hybrid Portable Gamma Camera (HPGC) are presented here. Intended for industrial use, the HPGC offers good angular resolution and efficiency over a broad energy range (50 keV to 2 MeV) by combining a MURA coded aperture camera with a Compton scatter camera in a single system. The HPGC consists of two detector modules: (1) a NaI(Tl) scintillator with Anger logic readout and (2) a CsI(Na) pixellated crystal viewed by a position-sensitive photomultiplier tube. Analytical calculations of angular resolution components and efficiency for the HPGC were compared to Monte Carlo calculations of the same quantities. The predicted angular resolution performance for on-axis point sources, a central scattering angle of 45° and a detector separation distance of 35 cm ranges from 3.5-6° FWHM over the sensitive energy range. The mechanical collimation intrinsic efficiency for energies up to 800 keV varies from 0.50 to 0.05 while the electronic collimation intrinsic efficiency for energies above 400 keV is 7.0×10⁻⁴ to 5×10⁻⁵. The experimentally measured angular resolution and efficiency values show good agreement with the modeling predictions for incident energies of 412 keV and 662 keV. Although work has been done on mechanical collimation cameras and electronic collimation cameras operating independently, no truly hybrid imaging system has been constructed that uses the same gamma ray for both mechanical collimation and electronic collimation information. This dissertation compares the relative information per photon for three imaging modalities: mechanical collimation, electronic collimation and hybrid collimation. The analysis is done for point sources at two incident energies (412 keV and 662 keV) in the medium energy range of operation for the HPGC (400 keV to 800 keV) where neither mechanical collimation nor electronic collimation performs particularly well acting independently. A tool from estimation theory called resolution-variance analysis is used to compare the three modalities. Results show that hybrid collimation is superior to mechanical and electronic collimation at both 412 keV and 662 keV over the resolution range likely to be used for such a camera.
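
    The electronic (Compton) collimation described here reconstructs, for each event, a cone of possible source directions from the energies deposited in the two detector modules. A minimal sketch of that kinematic step is given below, assuming the scattered photon is fully absorbed in the second detector; the example energies are illustrative and not taken from the dissertation.

        import numpy as np

        M_E_C2 = 511.0  # electron rest energy in keV

        def compton_cone_angle(e_scatter_keV, e_absorb_keV):
            """Scattering angle (deg) from the energy deposited in the scatter detector
            (e_scatter) and in the absorption detector (e_absorb), assuming the
            scattered photon is fully absorbed."""
            e_total = e_scatter_keV + e_absorb_keV
            cos_theta = 1.0 - M_E_C2 * (1.0 / e_absorb_keV - 1.0 / e_total)
            if not -1.0 <= cos_theta <= 1.0:
                raise ValueError("energies inconsistent with Compton kinematics")
            return np.degrees(np.arccos(cos_theta))

        # Example: a 662 keV photon depositing 200 keV in the first (scatter) detector.
        print(compton_cone_angle(200.0, 462.0))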

  18. II-VI Narrow-Bandgap Semiconductors for Optoelectronics

    NASA Astrophysics Data System (ADS)

    Baker, Ian

    The field of narrow-gap II-VI materials is dominated by the compound semiconductor mercury cadmium telluride (Hg1-xCdxTe, or MCT), which supports a large industry in infrared detectors, cameras and infrared systems. It is probably true to say that HgCdTe is the third most studied semiconductor after silicon and gallium arsenide. Hg1-xCdxTe is the material most widely used in high-performance infrared detectors at present. By changing the composition x the spectral response of the detector can be made to cover the range from 1 μm to beyond 17 μm. The advantages of this system arise from a number of features, notably: close lattice matching, high optical absorption coefficient, low carrier generation rate, high electron mobility and readily available doping techniques. These advantages mean that very sensitive infrared detectors can be produced at relatively high operating temperatures. Hg1-xCdxTe multilayers can be readily grown in vapor-phase epitaxial processes. This provides the device engineer with complex doping and composition profiles that can be used to further enhance the electro-optic performance, leading to low-cost, large-area detectors in the future. The main purpose of this chapter is to describe the applications, device physics and technology of II-VI narrow-bandgap devices, focusing on HgCdTe but also including Hg1-xMnxTe and Hg1-xZnxTe. It concludes with a review of the research and development programs into third-generation infrared detector technology (so-called GEN III detectors) being performed in centers around the world.
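
    To illustrate how the composition x tunes the spectral response, the sketch below evaluates an empirical expression for the Hg1-xCdxTe bandgap (the commonly cited Hansen-Schmit-Casselman fit) and the corresponding cutoff wavelength. The expression and coefficients are quoted from the general literature rather than from this chapter, so treat them as an assumption.

        def hgcdte_bandgap_eV(x, T=77.0):
            """Empirical Hg1-xCdxTe bandgap (eV) vs. CdTe fraction x and temperature T (K),
            following the commonly cited Hansen-Schmit-Casselman fit."""
            return (-0.302 + 1.93 * x - 0.810 * x**2 + 0.832 * x**3
                    + 5.35e-4 * T * (1.0 - 2.0 * x))

        def cutoff_wavelength_um(x, T=77.0):
            """Cutoff wavelength in micrometres from lambda_c ~ 1.24 / Eg(eV)."""
            return 1.24 / hgcdte_bandgap_eV(x, T)

        for x in (0.20, 0.30, 0.40):
            print(f"x = {x:.2f}: Eg = {hgcdte_bandgap_eV(x):.3f} eV, "
                  f"cutoff ~ {cutoff_wavelength_um(x):.1f} um at 77 K")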

  19. Efficient, Narrow-Pass-Band Optical Filters

    NASA Technical Reports Server (NTRS)

    Sandford, Stephen P.

    1996-01-01

    Optical filters with both narrow pass bands and high efficiencies can be fabricated to design specifications. They offer tremendous improvements in performance for a number of optical (including infrared) systems. In fiber-optic and free-space communication systems, the precise frequency discrimination afforded by the narrow pass bands of these filters provides higher channel capacities. In active and passive remote sensors like lidar and gas-filter-correlation radiometers, the increased efficiencies afforded by these filters enhance detection of small signals against large background noise. In addition, the sizes, weights, and power requirements of many optical and infrared systems can be reduced by taking advantage of the gains in signal-to-noise ratios delivered by these filters.

  20. The MC and LFC cameras. [metric camera (MC); large format camera (LFC)

    NASA Technical Reports Server (NTRS)

    Norton, Clarice L.; Schroeder, Manfried; Mollberg, Bernard

    1986-01-01

    The characteristics of the shuttle-borne Large Format Camera are listed. The LFC focal plane format was 23 by 46 cm, double the usual size, thereby acquiring approximately double the ground area. Forward motion compensation was employed. With the stable platform (shuttle) it was possible to use the slow exposure, high resolution, Kodak aerial films; 3414 and 3412 black and white, SO-242 color, and SO-131 aerochrome infrared. The camera was designed to maintain stability during varying temperature extremes of space.

  1. Casting and Angling.

    ERIC Educational Resources Information Center

    Smith, Julian W.

    As part of a series of books and pamphlets on outdoor education, this manual consists of easy-to-follow instructions for fishing activities dealing with casting and angling. The manual may be used as a part of the regular physical education program in schools and colleges or as a club activity for the accomplished weekend fisherman or the…

  2. Cross-ratio-based line scan camera calibration using a planar pattern

    NASA Astrophysics Data System (ADS)

    Li, Dongdong; Wen, Gongjian; Qiu, Shaohua

    2016-01-01

    A flexible new technique is proposed to calibrate the geometric model of line scan cameras. In this technique, the line scan camera is rigidly coupled to a calibrated frame camera to establish a pair of stereo cameras. The linear displacements and rotation angles between the two cameras are fixed but unknown. This technique only requires the pair of stereo cameras to observe a specially designed planar pattern shown at a few (at least two) different orientations. At each orientation, a stereo pair is obtained including a linear array image and a frame image. Radial distortion of the line scan camera is modeled. The calibration scheme includes two stages. First, point correspondences are established from the pattern geometry and the projective invariance of cross-ratio. Second, with a two-step calibration procedure, the intrinsic parameters of the line scan camera are recovered from several stereo pairs together with the rigid transform parameters between the pair of stereo cameras. Both computer simulation and real data experiments are conducted to test the precision and robustness of the calibration algorithm, and very good calibration results have been obtained. Compared with classical techniques which use three-dimensional calibration objects or controllable moving platforms, our technique is affordable and flexible in close-range photogrammetric applications.
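
    The projective invariance of the cross-ratio exploited for establishing point correspondences can be illustrated in a few lines of code. The sketch below is a generic illustration, not the authors' implementation: it checks that the cross-ratio of four collinear points is preserved under an arbitrary 1D projective (Möbius) mapping.

        import numpy as np

        def cross_ratio(a, b, c, d):
            """Cross-ratio (AC * BD) / (BC * AD) of four collinear points given by
            their 1D coordinates along the line."""
            return ((c - a) * (d - b)) / ((c - b) * (d - a))

        def projective_1d(x, h):
            """Apply a 1D homography x -> (h00*x + h01) / (h10*x + h11)."""
            return (h[0, 0] * x + h[0, 1]) / (h[1, 0] * x + h[1, 1])

        pts = np.array([0.0, 1.0, 3.0, 7.0])       # four collinear pattern points
        H = np.array([[2.0, 1.0], [0.5, 3.0]])     # arbitrary projective map (assumed)
        mapped = projective_1d(pts, H)

        print(cross_ratio(*pts))      # cross-ratio before mapping
        print(cross_ratio(*mapped))   # identical (up to rounding) after mapping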

  3. Sub-Camera Calibration of a Penta-Camera

    NASA Astrophysics Data System (ADS)

    Jacobsen, K.; Gerke, M.

    2016-03-01

    Penta cameras consisting of a nadir and four inclined cameras are becoming more and more popular, having the advantage of also imaging facades in built-up areas from four directions. Such system cameras require a boresight calibration of the geometric relation of the cameras to each other, but also a calibration of the sub-cameras. Based on data sets of the ISPRS/EuroSDR benchmark for multi-platform photogrammetry, the inner orientation of the used IGI Penta DigiCAM has been analyzed. The required image coordinates of the blocks Dortmund and Zeche Zollern have been determined by Pix4Dmapper and have been independently adjusted and analyzed by the program system BLUH. With 4.1 million image points in 314 images, respectively 3.9 million image points in 248 images, a dense matching was provided by Pix4Dmapper. With up to 19, respectively 29, images per object point the images are well connected; nevertheless, the high numbers of images per object point are concentrated in the block centres, while the inclined images outside the block centre are satisfactorily but not very strongly connected. This leads to very high values for the Student test (T-test) of the finally used additional parameters; in other words, the additional parameters are highly significant. The estimated radial symmetric distortion of the nadir sub-camera corresponds to the laboratory calibration of IGI, but there are still radial symmetric distortions also for the inclined cameras with a size exceeding 5 μm, even though they were reported as negligible based on the laboratory calibration. Radial and tangential effects at the image corners are limited but still present. Remarkable angular affine systematic image errors can be seen, especially in the block Zeche Zollern. Such deformations are unusual for digital matrix cameras, but they can be caused by the correlation between inner and exterior orientation if only parallel flight lines are used. With the exception of the angular affinity, the systematic image errors for corresponding cameras of both blocks show the same trend, but as usual for block adjustments with self-calibration, they still show significant differences. Based on the very high number of image points, the remaining image residuals can be safely determined by overlaying and averaging the image residuals corresponding to their image coordinates. The size of the systematic image errors not covered by the used additional parameters is in the range of a square mean of 0.1 pixels, corresponding to 0.6 μm. They are not the same for both blocks, but show some similarities for corresponding cameras. In general, a bundle block adjustment with a satisfying set of additional parameters, checked by remaining systematic errors, is required to use the whole geometric potential of the penta camera. Especially for object points on facades, often seen in only two images taken with a limited base length, the correct handling of systematic image errors is important. At least in the analyzed data sets, the self-calibration of sub-cameras by bundle block adjustment suffers from the correlation of the inner to the exterior orientation due to missing crossing flight directions. As usual, the systematic image errors differ from block to block, even without the influence of the correlation to the exterior orientation.

  4. Photometric Calibration of Consumer Video Cameras

    NASA Technical Reports Server (NTRS)

    Suggs, Robert; Swift, Wesley, Jr.

    2007-01-01

    Equipment and techniques have been developed to implement a method of photometric calibration of consumer video cameras for imaging of objects that are sufficiently narrow or sufficiently distant to be optically equivalent to point or line sources. Heretofore, it has been difficult to calibrate consumer video cameras, especially in cases of image saturation, because they exhibit nonlinear responses with dynamic ranges much smaller than those of scientific-grade video cameras. The present method not only takes this difficulty in stride but also makes it possible to extend effective dynamic ranges to several powers of ten beyond saturation levels. The method will likely be primarily useful in astronomical photometry. There are also potential commercial applications in medical and industrial imaging of point or line sources in the presence of saturation. This development was prompted by the need to measure brightnesses of debris in amateur video images of the breakup of the Space Shuttle Columbia. The purpose of these measurements is to use the brightness values to estimate relative masses of debris objects. In most of the images, the brightness of the main body of Columbia was found to exceed the dynamic ranges of the cameras. A similar problem arose a few years ago in the analysis of video images of Leonid meteors. The present method is a refined version of the calibration method developed to solve the Leonid calibration problem. In this method, one performs an end-to-end calibration of the entire imaging system, including not only the imaging optics and imaging photodetector array but also analog tape recording and playback equipment (if used) and any frame grabber or other analog-to-digital converter (if used). To automatically incorporate the effects of nonlinearity and any other distortions into the calibration, the calibration images are processed in precisely the same manner as are the images of meteors, space-shuttle debris, or other objects that one seeks to analyze. The light source used to generate the calibration images is an artificial variable star comprising a Newtonian collimator illuminated by a light source modulated by a rotating variable neutral-density filter. This source acts as a point source, the brightness of which varies at a known rate. A video camera to be calibrated is aimed at this source. Fixed neutral-density filters are inserted in or removed from the light path as needed to make the video image of the source appear to fluctuate between dark and saturated bright. The resulting video-image data are analyzed by use of custom software that determines the integrated signal in each video frame and determines the system response curve (measured output signal versus input brightness). These determinations constitute the calibration, which is thereafter used in automatic, frame-by-frame processing of the data from the video images to be analyzed.
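
    The end-to-end calibration described above reduces to recording the integrated signal produced by a source of known, varying brightness and then inverting that response curve when measuring unknown objects. A minimal sketch of the inversion step is shown below, assuming monotonic calibration data; the arrays are made up for illustration and are not from the article.

        import numpy as np

        # Calibration data: known input brightness of the artificial star vs. the
        # integrated signal measured in each video frame (illustrative values only).
        known_brightness = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])
        measured_signal  = np.array([5.0, 14.0, 42.0, 100.0, 180.0, 230.0])  # nonlinear, saturating

        def signal_to_brightness(signal):
            """Invert the measured response curve by interpolation to recover the
            brightness of a point/line source from its integrated frame signal."""
            return np.interp(signal, measured_signal, known_brightness)

        # Apply the calibration to integrated signals from frames of an unknown object.
        print(signal_to_brightness([50.0, 150.0, 220.0]))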

  5. The fly's eye camera system

    NASA Astrophysics Data System (ADS)

    Mészáros, L.; Pál, A.; Csépány, G.; Jaskó, A.; Vida, K.; Oláh, K.; Mezö, G.

    2014-12-01

    We introduce the Fly's Eye Camera System, an all-sky monitoring device intended to perform time domain astronomy. This camera system design will provide complementary data sets for other synoptic sky surveys such as LSST or Pan-STARRS. The effective field of view is obtained by 19 cameras arranged in a spherical mosaic form. These individual cameras of the device stand on a hexapod mount that is fully capable of achieving sidereal tracking for the subsequent exposures. This platform has many advantages. First of all, it requires only one type of moving component and does not include unique parts. Hence this design not only eliminates problems implied by unique elements, but the redundancy of the hexapod allows smooth operations even if one or two of the legs are stuck. In addition, it can calibrate itself by observed stars independently of both the geographical location (including the northern and southern hemispheres) and the polar alignment of the full mount. All mechanical elements and electronics are designed within our institute, Konkoly Observatory. Currently, our instrument is in the testing phase with an operating hexapod and a reduced number of cameras.

  6. The Clementine longwave infrared camera

    SciTech Connect

    Priest, R.E.; Lewis, I.T.; Sewall, N.R.; Park, H.S.; Shannon, M.J.; Ledebuhr, A.G.; Pleasance, L.D.; Massie, M.A.; Metschuleit, K.

    1995-04-01

    The Clementine mission provided the first ever complete, systematic surface mapping of the moon from the ultra-violet to the near-infrared regions. More than 1.7 million images of the moon, earth and space were returned from this mission. The longwave-infrared (LWIR) camera supplemented the UV/Visible and near-infrared mapping cameras providing limited strip coverage of the moon, giving insight to the thermal properties of the soils. This camera provided ~100 m spatial resolution at 400 km periselene, and a 7 km across-track swath. This 2.1 kg camera using a 128 x 128 Mercury-Cadmium-Telluride (MCT) FPA viewed thermal emission of the lunar surface and lunar horizon in the 8.0 to 9.5 μm wavelength region. A description of this light-weight, low power LWIR camera along with a summary of lessons learned is presented. Design goals and preliminary on-orbit performance estimates are addressed in terms of meeting the mission's primary objective for flight qualifying the sensors for future Department of Defense flights.

  7. CARTOGAM: a portable gamma camera

    NASA Astrophysics Data System (ADS)

    Gal, O.; Izac, C.; Lainé, F.; Nguyen, A.

    1997-02-01

    The gamma camera is designed to map radioactive sources against a visible background in quasi-real time. The device is intended to locate sources from a distance during the preparation of interventions in active areas of nuclear installations, making it possible to optimize interventions, especially with respect to dosimetry. The camera consists of a double-cone collimator, a scintillator and an intensified CCD camera. This detection chain provides both gamma images and visible images. Even though it is wrapped in a denal shield, the camera is still portable (mass < 15 kg) and compact (external diameter = 8 cm). The angular resolution is of the order of one degree for gamma rays of 1 MeV. In a few minutes, the device is able to measure a dose rate of 10 μGy/h delivered, for instance, by a 90 mCi source of 60Co located at 10 m from the detector. The first images recorded in the laboratory will be presented and will illustrate the performance obtained with this camera.

  8. Narrow-Band Applications of Communications Satellites.

    ERIC Educational Resources Information Center

    Cowlan, Bert; Horowitz, Andrew

    This paper attempts to describe the advantages of "narrow-band" applications of communications satellites for education. It begins by discussing the general controversy surrounding the use of satellites in education, by placing the concern within the larger context of the general debate over the uses of new technologies in education, and by…

  9. Narrow Feshbach Dance of Two Trapped Atoms

    NASA Astrophysics Data System (ADS)

    Lopez Valdez, Nicolas; Timmermans, Eddy; Tsai, Shan-Wen

    2012-06-01

    Near a narrow Feshbach resonance (with magnetic field width 10 mG or smaller) the ultra-cold atom interactions acquire an effective range that can be comparable to the average inter-particle distance. Although requiring a more accurate magnetic field control than their broad counterparts, the narrow Feshbach resonances can free cold atom physics from its straightjacket of the contact interaction paradigm. The finite-range effects can give rise to roton features in the phonon dispersion of dilute Bose-Einstein condensates (BEC's) and BEC's can support a ground state with modulated density patterns that breaks translational symmetry. We show that the finite range interaction is the consequence of the time-delay in atom-atom collisions. The narrow regime is also the parameter region in which the interacting atoms can spend a significant fraction of their time in the spin-rearranged (also called ``closed'') channel. To study the interaction physics we describe two atoms in a harmonic trap, interacting near a narrow resonance. We find the fraction of time that the atoms spend in the closed channel at fixed magnetic field and we study the time evolution of this system under conditions of a time-varying magnetic field.

  10. WIDE-FIELD ASTRONOMICAL MULTISCALE CAMERAS

    SciTech Connect

    Marks, Daniel L.; Brady, David J.

    2013-05-15

    In order to produce sufficiently low aberrations with a large aperture, telescopes have a limited field of view. Because of this narrow field, large areas of the sky at a given time are unobserved. We propose several telescopes based on monocentric reflective, catadioptric, and refractive objectives that may be scaled to wide fields of view and achieve 1.1 arcsecond resolution, which in most locations is the practical seeing limit of the atmosphere. The reflective and Schmidt catadioptric objectives have relatively simple configurations and enable large fields to be captured at the expense of the obscuration of the mirror by secondary optics, a defect that may be managed by image plane design. The refractive telescope design does not have an obscuration but the objective has substantial bulk. The refractive design is a 38 gigapixel camera which consists of a single monocentric objective and 4272 microcameras. Monocentric multiscale telescopes, with their wide fields of view, may observe phenomena that might otherwise be unnoticed, such as supernovae, glint from orbital space debris, and near-earth objects.

  11. Wide field camera observations of Baade's Window

    NASA Technical Reports Server (NTRS)

    Holtzman, Jon A.; Light, R. M.; Baum, William A.; Worthey, Guy; Faber, S. M.; Hunter, Deidre A.; O'Neil, Earl J., Jr.; Kreidl, Tobias J.; Groth, E. J.; Westphal, James A.

    1993-01-01

    We have observed a field in Baade's Window using the Wide Field Camera (WFC) of the Hubble Space Telescope (HST) and obtain V- and I-band photometry down to V approximately 22.5. These data go several magnitudes fainter than previously obtained from the ground. The location of the break in the luminosity function suggests that there are a significant number of intermediate age (less than 10 Gyr) stars in the Galactic bulge. This conclusion rests on the assumptions that the extinction towards our field is similar to that seen in other parts of Baade's Window, that the distance to the bulge is approximately 8 kpc, and that we can determine fairly accurate zero points for the HST photometry. Changes in any one of these assumptions could increase the inferred age, but a conspiracy of lower reddening, a shorter distance to the bulge, and/or photometric zero-point errors would be needed to imply a population entirely older than 10 Gyr. We infer an initial mass function slope for the main-sequence stars, and find that it is consistent with that measured in the solar neighborhood; unfortunately, the slope is poorly constrained because we sample only a narrow range of stellar mass and because of uncertainties in the observed luminosity function at the faint end.

  12. Method for shaping and aiming narrow beams. [sonar mapping and target identification

    NASA Technical Reports Server (NTRS)

    Heyser, R. C. (Inventor)

    1981-01-01

    A sonar method and apparatus is described which utilizes a linear frequency chirp in a transmitter/receiver having a correlator to synthesize a narrow beamwidth pattern from otherwise broad-beam transducers when there is relative velocity between the transmitter/receiver and the target. The chirp is produced in a generator with a bandwidth, B, and duration, T, chosen so that the time-bandwidth product, TB, is increased for a narrower angle. A replica of the chirp produced in the generator is time delayed and Doppler shifted for use as a reference in the receiver for correlation of received chirps from targets. This reference is Doppler shifted to select targets preferentially, thereby not only synthesizing a narrow beam but also aiming the beam in azimuth and elevation.
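
    The core of the method is that a long linear FM chirp, correlated against a replica that has been Doppler shifted to a chosen relative velocity, produces a strong peak only for targets whose Doppler matches the reference; other targets decorrelate. The sketch below demonstrates that selectivity with made-up chirp and Doppler parameters; it is an illustration of the correlation principle, not the patented apparatus.

        import numpy as np

        fs, T, B = 100_000.0, 0.05, 5_000.0           # sample rate (Hz), chirp length (s), bandwidth (Hz) -- assumed
        t = np.arange(0, T, 1.0 / fs)
        chirp = np.exp(1j * np.pi * (B / T) * t**2)   # baseband linear FM chirp

        def doppler_shift(signal, f_d):
            """Apply a Doppler frequency shift f_d (Hz) to a baseband signal."""
            return signal * np.exp(2j * np.pi * f_d * t)

        echo = doppler_shift(chirp, f_d=400.0)        # echo from a target with 400 Hz Doppler

        for f_ref in (0.0, 200.0, 400.0):             # references "aimed" at different Doppler shifts
            reference = doppler_shift(chirp, f_ref)
            peak = np.abs(np.vdot(reference, echo)) / len(t)
            print(f"reference Doppler {f_ref:5.0f} Hz -> correlation peak {peak:.3f}")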

  13. Dynamic calibration of pan-tilt-zoom cameras for traffic monitoring.

    PubMed

    Song, Kai-Tai; Tai, Jen-Chao

    2006-10-01

    Pan-tilt-zoom (PTZ) cameras have been widely used in recent years for monitoring and surveillance applications. These cameras provide flexible view selection as well as a wider observation range. This makes them suitable for vision-based traffic monitoring and enforcement systems. To employ PTZ cameras for image measurement applications, one first needs to calibrate the camera to obtain meaningful results. For instance, the accuracy of estimating vehicle speed depends on the accuracy of camera calibration and that of vehicle tracking results. This paper presents a novel calibration method for a PTZ camera overlooking a traffic scene. The proposed approach requires no manual operation to select the positions of special features. It automatically uses a set of parallel lane markings and the lane width to compute the camera parameters, namely, focal length, tilt angle, and pan angle. Image processing procedures have been developed for automatically finding parallel lane markings. Interesting experimental results are presented to validate the robustness and accuracy of the proposed method. PMID:17036815
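
    Once the camera is calibrated, vehicle speed estimation reduces to mapping the tracked image positions onto road-plane coordinates and differencing over time. The sketch below assumes such an image-to-road homography is already available; the matrix and pixel tracks are invented for illustration and this is not the paper's code.

        import numpy as np

        def image_to_road(pixel, H):
            """Map an image point (u, v) to road-plane coordinates (x, y) in metres
            using a 3x3 homography H obtained from the camera calibration."""
            u, v = pixel
            x, y, w = H @ np.array([u, v, 1.0])
            return np.array([x / w, y / w])

        # Hypothetical homography and two tracked positions of the same vehicle,
        # observed 0.5 s apart.
        H = np.array([[0.02, 0.00, -6.0],
                      [0.00, 0.05, -12.0],
                      [0.00, 0.001, 1.0]])
        p1, p2, dt = (320.0, 240.0), (322.0, 300.0), 0.5

        displacement = image_to_road(p2, H) - image_to_road(p1, H)
        speed_mps = np.linalg.norm(displacement) / dt
        print(f"estimated speed: {speed_mps:.1f} m/s ({speed_mps * 3.6:.1f} km/h)")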

  14. A testbed for wide-field, high-resolution, gigapixel-class cameras.

    PubMed

    Kittle, David S; Marks, Daniel L; Son, Hui S; Kim, Jungsang; Brady, David J

    2013-05-01

    The high resolution and wide field of view (FOV) of the AWARE (Advanced Wide FOV Architectures for Image Reconstruction and Exploitation) gigapixel class cameras present new challenges in calibration, mechanical testing, and optical performance evaluation. The AWARE system integrates an array of micro-cameras in a multiscale design to achieve gigapixel sampling at video rates. Alignment and optical testing of the micro-cameras is vital in compositing engines, which require pixel-level accurate mappings over the entire array of cameras. A testbed has been developed to automatically calibrate and measure the optical performance of the entire camera array. This testbed utilizes translation and rotation stages to project a ray into any micro-camera of the AWARE system. A spatial light modulator is projected through a telescope to form an arbitrary object space pattern at infinity. This collimated source is then reflected by an elevation stage mirror for pointing through the aperture of the objective into the micro-optics and eventually the detector of the micro-camera. Different targets can be projected with the spatial light modulator for measuring the modulation transfer function (MTF) of the system, fiducials in the overlap regions for registration and compositing, distortion mapping, illumination profiles, thermal stability, and focus calibration. The mathematics of the testbed mechanics are derived for finding the positions of the stages to achieve a particular incident angle into the camera, along with calibration steps for alignment of the camera and testbed coordinate axes. Measurement results for the AWARE-2 gigapixel camera are presented for MTF, focus calibration, illumination profile, fiducial mapping across the micro-camera for registration and distortion correction, thermal stability, and alignment of the camera on the testbed. PMID:23742532

  15. A testbed for wide-field, high-resolution, gigapixel-class cameras

    NASA Astrophysics Data System (ADS)

    Kittle, David S.; Marks, Daniel L.; Son, Hui S.; Kim, Jungsang; Brady, David J.

    2013-05-01

    The high resolution and wide field of view (FOV) of the AWARE (Advanced Wide FOV Architectures for Image Reconstruction and Exploitation) gigapixel class cameras present new challenges in calibration, mechanical testing, and optical performance evaluation. The AWARE system integrates an array of micro-cameras in a multiscale design to achieve gigapixel sampling at video rates. Alignment and optical testing of the micro-cameras is vital in compositing engines, which require pixel-level accurate mappings over the entire array of cameras. A testbed has been developed to automatically calibrate and measure the optical performance of the entire camera array. This testbed utilizes translation and rotation stages to project a ray into any micro-camera of the AWARE system. A spatial light modulator is projected through a telescope to form an arbitrary object space pattern at infinity. This collimated source is then reflected by an elevation stage mirror for pointing through the aperture of the objective into the micro-optics and eventually the detector of the micro-camera. Different targets can be projected with the spatial light modulator for measuring the modulation transfer function (MTF) of the system, fiducials in the overlap regions for registration and compositing, distortion mapping, illumination profiles, thermal stability, and focus calibration. The mathematics of the testbed mechanics are derived for finding the positions of the stages to achieve a particular incident angle into the camera, along with calibration steps for alignment of the camera and testbed coordinate axes. Measurement results for the AWARE-2 gigapixel camera are presented for MTF, focus calibration, illumination profile, fiducial mapping across the micro-camera for registration and distortion correction, thermal stability, and alignment of the camera on the testbed.

  16. Cameras for semiconductor process control

    NASA Technical Reports Server (NTRS)

    Porter, W. A.; Parker, D. L.

    1977-01-01

    The application of X-ray topography to semiconductor process control is described, considering the novel features of the high speed camera and the difficulties associated with this technique. The most significant results on the effects of material defects on device performance are presented, including results obtained using wafers processed entirely within this institute. Defects were identified using the X-ray camera and correlations made with probe data. Also included are temperature dependent effects of material defects. Recent applications and improvements of X-ray topographs of silicon-on-sapphire and gallium arsenide are presented with a description of a real time TV system prototype and of the most recent vacuum chuck design. Discussion is included of our promotion of the use of the camera by various semiconductor manufacturers.

  17. The GISMO-2 Bolometer Camera

    NASA Technical Reports Server (NTRS)

    Staguhn, Johannes G.; Benford, Dominic J.; Fixsen, Dale J.; Hilton, Gene; Irwin, Kent D.; Jhabvala, Christine A.; Kovacs, Attila; Leclercq, Samuel; Maher, Stephen F.; Miller, Timothy M.; Moseley, Samuel H.; Sharp, Elemer H.; Wollack, Edward J.

    2012-01-01

    We present the concept for the GISMO-2 bolometer camera, which we are building for background-limited operation at the IRAM 30 m telescope on Pico Veleta, Spain. GISMO-2 will operate simultaneously in the 1 mm and 2 mm atmospheric windows. The 1 mm channel uses a 32 x 40 TES-based Backshort Under Grid (BUG) bolometer array, and the 2 mm channel operates with a 16 x 16 BUG array. The camera utilizes almost the entire field of view provided by the telescope. The optical design of GISMO-2 was strongly influenced by our experience with the GISMO 2 mm bolometer camera, which is successfully operating at the 30 m telescope. GISMO is accessible to the astronomical community through the regular IRAM call for proposals.

  18. Dark Energy Camera for Blanco

    SciTech Connect

    Binder, Gary A.; /Caltech /SLAC

    2010-08-25

    In order to make accurate measurements of dark energy, a system is needed to monitor the focus and alignment of the Dark Energy Camera (DECam) to be located on the Blanco 4m Telescope for the upcoming Dark Energy Survey. One new approach under development is to fit out-of-focus star images to a point spread function from which information about the focus and tilt of the camera can be obtained. As a first test of a new algorithm using this idea, simulated star images produced from a model of DECam in the optics software Zemax were fitted. Then, real images from the Mosaic II imager currently installed on the Blanco telescope were used to investigate the algorithm's capabilities. A number of problems with the algorithm were found, and more work is needed to understand its limitations and improve its capabilities so it can reliably predict camera alignment and focus.

  19. Perceptual Color Characterization of Cameras

    PubMed Central

    Vazquez-Corral, Javier; Connah, David; Bertalmío, Marcelo

    2014-01-01

    Color camera characterization, mapping outputs from the camera sensors to an independent color space, such as XYZ, is an important step in the camera processing pipeline. Until now, this procedure has been primarily solved by using a 3 × 3 matrix obtained via a least-squares optimization. In this paper, we propose to use the spherical sampling method, recently published by Finlayson et al., to perform a perceptual color characterization. In particular, we search for the 3 × 3 matrix that minimizes three different perceptual errors, one pixel based and two spatially based. For the pixel-based case, we minimize the CIE ΔE error, while for the spatial-based case, we minimize both the S-CIELAB error and the CID error measure. Our results demonstrate an improvement of approximately 3% for the ΔE error, 7% for the S-CIELAB error and 13% for the CID error measures. PMID:25490586
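
    The baseline against which the perceptual approach is compared, a 3 × 3 matrix obtained by least squares, can be written in a few lines. The sketch below solves for the matrix that maps camera RGB responses to XYZ in the least-squares sense; the training data are random values used purely for illustration, not the authors' data set.

        import numpy as np

        # Training data: camera responses and corresponding "measured" XYZ values
        # for a set of colour patches (random values here, purely for illustration).
        rng = np.random.default_rng(0)
        rgb = rng.uniform(0.0, 1.0, size=(24, 3))          # 24 patches, camera RGB
        true_M = np.array([[0.6, 0.3, 0.1],
                           [0.2, 0.7, 0.1],
                           [0.0, 0.1, 0.9]])
        xyz = rgb @ true_M.T + 0.01 * rng.normal(size=(24, 3))

        # Least-squares estimate of the 3x3 characterization matrix M (XYZ ~ M @ RGB).
        M, *_ = np.linalg.lstsq(rgb, xyz, rcond=None)
        M = M.T
        print(np.round(M, 3))

        # Apply the characterization to a new camera response.
        print(M @ np.array([0.2, 0.5, 0.3]))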

  20. Perceptual color characterization of cameras.

    PubMed

    Vazquez-Corral, Javier; Connah, David; Bertalmío, Marcelo

    2014-01-01

    Color camera characterization, mapping outputs from the camera sensors to an independent color space, such as XYZ, is an important step in the camera processing pipeline. Until now, this procedure has been primarily solved by using a 3 × 3 matrix obtained via a least-squares optimization. In this paper, we propose to use the spherical sampling method, recently published by Finlayson et al., to perform a perceptual color characterization. In particular, we search for the 3 × 3 matrix that minimizes three different perceptual errors, one pixel based and two spatially based. For the pixel-based case, we minimize the CIE ΔE error, while for the spatial-based case, we minimize both the S-CIELAB error and the CID error measure. Our results demonstrate an improvement of approximately 3% for the ΔE error, 7% for the S-CIELAB error and 13% for the CID error measures. PMID:25490586

  1. Camera-on-a-Chip

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Jet Propulsion Laboratory's research on a second generation, solid-state image sensor technology has resulted in the Complementary Metal-Oxide Semiconductor (CMOS) Active Pixel Sensor, establishing an alternative to the Charge-Coupled Device (CCD). Photobit Corporation, the leading supplier of CMOS image sensors, has commercialized two products of their own based on this technology: the PB-100 and PB-300. These devices are cameras on a chip, combining all camera functions. CMOS "active-pixel" digital image sensors offer several advantages over CCDs, a technology used in video and still-camera applications for 30 years. The CMOS sensors draw less energy, they use the same manufacturing platform as most microprocessors and memory chips, and they allow on-chip programming of frame size, exposure, and other parameters.

  2. A Different Angle on Perspective

    ERIC Educational Resources Information Center

    Frantz, Marc

    2012-01-01

    When a plane figure is photographed from different viewpoints, lengths and angles appear distorted. Hence it is often assumed that lengths, angles, protractors, and compasses have no place in projective geometry. Here we describe a sense in which certain angles are preserved by projective transformations. These angles can be constructed with…

  3. A Different Angle on Perspective

    ERIC Educational Resources Information Center

    Frantz, Marc

    2012-01-01

    When a plane figure is photographed from different viewpoints, lengths and angles appear distorted. Hence it is often assumed that lengths, angles, protractors, and compasses have no place in projective geometry. Here we describe a sense in which certain angles are preserved by projective transformations. These angles can be constructed with

  4. Angles in the Sky?

    NASA Astrophysics Data System (ADS)

    Behr, Bradford

    2005-09-01

    Tycho Brahe lived and worked in the late 1500s, before the telescope was invented. He made highly accurate observations of the positions of planets, stars, and comets using large angle-measuring devices of his own design. You can use his techniques to observe the sky as well. For example, the degree, a common unit of measurement in astronomy, can be measured by holding your fist at arm's length up to the sky. Open your fist and observe the distance across the sky covered by the width of your pinky fingernail. That is, roughly, a degree! After some practice, and knowing that the sky appears to drift about one degree every four minutes, you can estimate elapsed time by measuring the angular distance that the Moon appears to have moved and multiplying that number by four. You can also estimate distances and sizes of things. These are not precise measurements, but rough estimates that can give you a "close-enough" answer.
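
    As a worked example of the timing rule above (assuming the usual approximation that the sky drifts about 360° in 24 hours, i.e. one degree every four minutes):

        # If the Moon appears to have drifted 5 degrees across the sky,
        # the elapsed time is roughly 5 * 4 = 20 minutes.
        MINUTES_PER_DEGREE = 4          # 24 h / 360 degrees
        moved_degrees = 5
        print(moved_degrees * MINUTES_PER_DEGREE)   # -> 20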

  5. Optically trapped atomic resonant devices for narrow linewidth spectral imaging

    NASA Astrophysics Data System (ADS)

    Qian, Lipeng

    This thesis focuses on the development of atomic resonant devices for spectroscopic applications. The primary emphasis is on the imaging properties of optically thick atomic resonant fluorescent filters and their applications. In addition, this thesis presents a new concept for producing very narrow linewidth light from an atomic vapor lamp pumped by a nanosecond pulse system. This research was motivated by applications for missile warning systems, and presents an innovative approach to a wide angle, ultra narrow linewidth imaging filter using a potassium vapor cell. The approach is to image onto and collect the fluorescent photons emitted from the surface of an optically thick potassium vapor cell, generating a 2 GHz pass-band imaging filter. This linewidth is narrow enough to fall within a Fraunhofer dark zone in the solar spectrum, thus making the detection solar blind. Experiments are conducted to measure the absorption line shape of the potassium resonant filter, the quantum efficiency of the fluorescent behavior, and the resolution of the fluorescent image. Fluorescent images with different spatial frequency components are analyzed by using a discrete Fourier transform, and the imaging capability of the fluorescent filter is described by its Modulation Transfer Function. For the detection of radiation that is spectrally broader than the linewidth of the potassium imaging filter, the fluorescent image is seen to be blurred by diffuse fluorescence from the slightly off resonant photons. To correct this, an ultra-thin potassium imaging filter is developed and characterized. The imaging property of the ultra-thin potassium imaging cell is tested with a potassium seeded flame, yielding an image resolution of ~20 lines per mm. The physics behind the atomic resonant fluorescent filter is radiation trapping. The diffusion process of the resonant photons trapped in the atomic vapor is theoretically described in this thesis. A Monte Carlo method is used to simulate the absorption and fluorescence. The optimum resolution of the fluorescent image is predicted by simulation. Radiation trapping is also shown to be useful for the generation of ultra-narrow linewidth light from an atomic vapor flash lamp. A 2 nanosecond, high voltage pulse is used to excite low pressure mercury vapor mixed with noble gases, producing high intensity emission at the mercury resonant line at 253.7 nm. With a nanosecond pumping time and high electrical current, the radiation intensity of the mercury discharge is increased significantly compared to a normal glow discharge lamp, while simultaneously suppressing the formation of an arc discharge. By avoiding the arc discharge, discrete spectral lines of mercury were kept at narrow bandwidth. Due to radiation trapping, the emission linewidth from the nanosecond mercury lamp decreases with time and produces ultra-narrow linewidth emission 100 ns after the excitation; this linewidth is verified by absorption measurements through a low pressure mercury absorption filter. The lamp is used along with mercury absorption filters for spectroscopic applications, including Filtered Rayleigh Scattering with different CO2 pressures and Raman scattering from methanol.

  6. Laser angle sensor

    NASA Technical Reports Server (NTRS)

    Pond, C. R.; Texeira, P. D.

    1985-01-01

    A laser angle measurement system was designed and fabricated for NASA Langley Research Center. The instrument is a fringe counting interferometer that monitors the pitch attitude of a model in a wind tunnel. A laser source and detector are mounted above the model. Interference fringes are generated by a small passive element on the model. The fringe count is accumulated and displayed by a processor in the wind tunnel control room. This report includes optical and electrical schematics, system maintenance and operation procedures.

  7. Measurement of small angle using phase shifted Lau interferometry

    NASA Astrophysics Data System (ADS)

    Disawal, Reena; Dhanotia, Jitendra; Prakash, Shashi

    2014-10-01

    An incoherent white-light source illuminates a set of two identical gratings placed in tandem, resulting in the generation of a Fresnel image. This image is projected onto a reflecting object and the reflected images from the object are projected onto a third grating. The resulting moiré fringes are recorded using a CCD camera. The inclination angle of the object is a function of the interferometric phase. Phase shifting interferometry has been used for the determination of the interferometric phase. Hence, accurate determination of the small tilt angle of the object surface could be successfully undertaken. The technique is automated and provides high precision in measurement.
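
    One standard way to recover the interferometric phase from phase-shifted fringe frames is the four-step algorithm, with frames recorded at shifts of 0, π/2, π and 3π/2. The sketch below assumes that particular variant and uses synthetic one-dimensional fringes; the abstract does not state which phase-shifting scheme was actually employed.

        import numpy as np

        def four_step_phase(i1, i2, i3, i4):
            """Wrapped phase from four frames shifted by 0, pi/2, pi, 3*pi/2."""
            return np.arctan2(i4 - i2, i1 - i3)

        # Synthetic fringe frames standing in for the CCD images.
        x = np.linspace(0, 4 * np.pi, 512)
        true_phase = 0.3 * x                       # tilt-induced linear phase ramp
        frames = [1.0 + np.cos(true_phase + k * np.pi / 2) for k in range(4)]

        wrapped = four_step_phase(*frames)
        unwrapped = np.unwrap(wrapped)
        print(unwrapped[-1] - unwrapped[0])        # total phase change across the field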

  8. Small Angle Neutron Scattering

    SciTech Connect

    Urban, Volker S

    2012-01-01

    Small Angle Neutron Scattering (SANS) probes structural details at the nanometer scale in a non-destructive way. This article gives an introduction to scientists who have no prior small-angle scattering knowledge, but who seek a technique that allows elucidating structural information in challenging situations that thwart approaches by other methods. SANS is applicable to a wide variety of materials including metals and alloys, ceramics, concrete, glasses, polymers, composites and biological materials. Isotope and magnetic interactions provide unique methods for labeling and contrast variation to highlight specific structural features of interest. In situ studies of a material's responses to temperature, pressure, shear, magnetic and electric fields, etc., are feasible as a result of the high penetrating power of neutrons. SANS provides statistical information on significant structural features averaged over the probed sample volume, and one can use SANS to quantify with high precision the structural details that are observed, for example, in electron microscopy. Neutron scattering is non-destructive; there is no need to cut specimens into thin sections, and neutrons penetrate deeply, providing information on the bulk material, free from surface effects. The basic principles of a SANS experiment are fairly simple, but the measurement, analysis and interpretation of small angle scattering data involves theoretical concepts that are unique to the technique and that are not widely known. This article includes a concise description of the basics, as well as practical know-how that is essential for a successful SANS experiment.

  9. Angle states in quantum mechanics

    NASA Astrophysics Data System (ADS)

    de la Torre, A. C.; Iguain, J. L.

    1998-12-01

    Angle states and angle operators are defined for a system with arbitrary angular momentum. They provide a reasonable formalization of the concept of angle provided that we accept that the angular orientation is quantized. The angle operator is the generator of boosts in angular momentum and is, almost everywhere, linearly related to the logarithm of the shift operator. Angle states for fermions and bosons behave differently under parity transformation.

  10. Exploring the Moon at High-Resolution: First Results From the Lunar Reconnaissance Orbiter Camera (LROC)

    NASA Astrophysics Data System (ADS)

    Robinson, Mark; Hiesinger, Harald; McEwen, Alfred; Jolliff, Brad; Thomas, Peter C.; Turtle, Elizabeth; Eliason, Eric; Malin, Mike; Ravine, A.; Bowman-Cisneros, Ernest

    The Lunar Reconnaissance Orbiter (LRO) spacecraft was launched on an Atlas V 401 rocket from the Cape Canaveral Air Force Station Launch Complex 41 on June 18, 2009. After spending four days in Earth-Moon transit, the spacecraft entered a three month commissioning phase in an elliptical 30×200 km orbit. On September 15, 2009, LRO began its planned one-year nominal mapping mission in a quasi-circular 50 km orbit. A multi-year extended mission in a fixed 30×200 km orbit is optional. The Lunar Reconnaissance Orbiter Camera (LROC) consists of a Wide Angle Camera (WAC) and two Narrow Angle Cameras (NACs). The WAC is a 7-color push-frame camera, which images the Moon at 100 and 400 m/pixel in the visible and UV, respectively, while the two NACs are monochrome narrow-angle linescan imagers with 0.5 m/pixel spatial resolution. LROC was specifically designed to address two of the primary LRO mission requirements and six other key science objectives, including 1) assessment of meter- and smaller-scale features in order to select safe sites for potential lunar landings near polar resources and elsewhere on the Moon; 2) acquire multi-temporal synoptic 100 m/pixel images of the poles during every orbit to unambiguously identify regions of permanent shadow and permanent or near permanent illumination; 3) meter-scale mapping of regions with permanent or near-permanent illumination of polar massifs; 4) repeat observations of potential landing sites and other regions to derive high resolution topography; 5) global multispectral observations in seven wavelengths to characterize lunar resources, particularly ilmenite; 6) a global 100-m/pixel basemap with incidence angles (60°-80°) favorable for morphological interpretations; 7) sub-meter imaging of a variety of geologic units to characterize their physical properties, the variability of the regolith, and other key science questions; 8) meter-scale coverage overlapping with Apollo-era panoramic images (1-2 m/pixel) to document the number of small impacts since 1971-1972. LROC allows us to determine the recent impact rate of bolides in the size range of 0.5 to 10 meters, which is currently not well known. Determining the impact rate at these sizes enables engineering remediation measures for future surface operations and interplanetary travel. The WAC has imaged nearly the entire Moon in seven wavelengths. A preliminary global WAC stereo-based topographic model is in preparation [1] and global color processing is underway [2]. As the mission progresses repeat global coverage will be obtained as lighting conditions change providing a robust photometric dataset. The NACs are revealing a wealth of morphologic features at the meter scale providing the engineering and science constraints needed to support future lunar exploration. All of the Apollo landing sites have been imaged, as well as the majority of robotic landing and impact sites. Through the use of off-nadir slews a collection of stereo pairs is being acquired that enable 5-m scale topographic mapping [3-7]. Impact morphologies (terraces, impact melt, rays, etc.) are preserved in exquisite detail at all Copernican craters and are enabling new studies of impact mechanics and crater size-frequency distribution measurements [8-12]. Other topical studies including, for example, lunar pyroclastics, domes, and tectonics are underway [e.g., 10-17].
The first PDS data release of LROC data will be in March 2010, and will include all images from the commissioning phase and the first 3 months of the mapping phase. [1] Scholten et al. (2010) 41st LPSC, #2111; [2] Denevi et al. (2010a) 41st LPSC, #2263; [3] Beyer et al. (2010) 41st LPSC, #2678; [4] Archinal et al. (2010) 41st LPSC, #2609; [5] Mattson et al. (2010) 41st LPSC, #1871; [6] Tran et al. (2010) 41st LPSC, #2515; [7] Oberst et al. (2010) 41st LPSC, #2051; [8] Bray et al. (2010) 41st LPSC, #2371; [9] Denevi et al. (2010b) 41st LPSC, #2582; [10] Hiesinger et al. (2010a) 41st LPSC, #2278; [11] Hiesinger et al. (2010b) 41st LPSC, #2304; [12] van der Bogert et al. (2010) 41st LPSC, #2165; [13] Plescia et al. (2010) 41st LPSC, #2160; [14] Lawrence et al. (2010) 41st LPSC, #1906; [15] Gaddis et al. (2010) 41st LPSC, #2059; [16] Watters et al. (2010) 41st LPSC, #1863; [17] Garry et al. (2010) 41st LPSC, #2278.

  11. The role of contact angle on unstable flow formation during infiltration and drainage in wettable porous media

    NASA Astrophysics Data System (ADS)

    Wallach, Rony; Margolis, Michal; Graber, Ellen R.

    2013-10-01

    The impact of contact angle on 2-D spatial and temporal water-content distribution during infiltration and drainage was experimentally studied. The 0.3-0.5 mm fraction of a quartz dune sand was treated and turned subcritically repellent (contact angle of 33°, 48°, 56°, and 75° for S33, S48, S56, and S75, respectively). The media were packed uniformly in transparent flow chambers and water was supplied to the surface as a point source at different rates (1-20 ml/min). A sequence of gray-value images was taken by a CCD camera during infiltration and subsequent drainage; gray values were converted to volumetric water content by water volume balance. Narrow and long plumes with water accumulation behind the downward moving wetting front (tip) and a negative water gradient above it (tail) developed in the S56 and S75 media during infiltration at lower water application rates. The plumes became bulbous with spatially uniform water-content distribution as water application rates increased. All plumes in these media propagated downward at a constant rate during infiltration and did not change their shape during drainage. In contrast, regular plume shapes were observed in the S33 and S48 media at all flow rates, and drainage profiles were nonmonotonic with a transition plane at the depth that water reached during infiltration. Given that the studied media have similar pore-size distributions, the conclusion is that imbibition hindered by the nonzero contact angle induced pressure buildup at the wetting front (dynamic water-entry value) that controlled the plume shape and internal water-content distribution during infiltration and drainage.

  12. Measurement of the surface wavelength distribution of narrow-band radiation by a colorimetric method

    SciTech Connect

    Kraiskii, A V; Mironova, T V; Sultanov, T T

    2010-09-10

    A method is suggested for determining the wavelength of narrow-band light from a digital photograph of a radiating surface. The digital camera used should be appropriately calibrated. The accuracy of the wavelength measurement is better than 1 nm. The method was tested on the yellow doublet of the mercury spectrum and on the adjacent continuum of the incandescent lamp radiation spectrum. By means of the suggested method, the homogeneity of holographic sensor swelling was studied in stationary and transient cases. (laser applications and other topics in quantum electronics)
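
    One way such a colorimetric readout can be realized, sketched here under assumptions the abstract does not spell out, is to calibrate the camera's colour response (e.g. hue) against known wavelengths beforehand and then invert that calibration by interpolation. All numbers below are invented for illustration.

        import numpy as np

        # Illustrative calibration table: camera hue (degrees) vs. wavelength (nm),
        # measured in advance with a monochromator.  Values are placeholders.
        hue_cal = np.array([250.0, 210.0, 160.0, 110.0, 60.0, 20.0])
        wl_cal = np.array([450.0, 480.0, 520.0, 560.0, 590.0, 620.0])

        def wavelength_from_hue(hue):
            """Interpolate the calibration curve (hue must lie inside the table)."""
            order = np.argsort(hue_cal)              # np.interp needs ascending x
            return np.interp(hue, hue_cal[order], wl_cal[order])

        print(wavelength_from_hue(185.0))            # hue measured on the radiating surface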

  13. Contact Angle Measurements Using a Simplified Experimental Setup

    ERIC Educational Resources Information Center

    Lamour, Guillaume; Hamraoui, Ahmed; Buvailo, Andrii; Xing, Yangjun; Keuleyan, Sean; Prakash, Vivek; Eftekhari-Bafrooei, Ali; Borguet, Eric

    2010-01-01

    A basic and affordable experimental apparatus is described that measures the static contact angle of a liquid drop in contact with a solid. The image of the drop is made with a simple digital camera by taking a picture that is magnified by an optical lens. The profile of the drop is then processed with ImageJ free software. The ImageJ contact…

  14. Contact Angle Measurements Using a Simplified Experimental Setup

    ERIC Educational Resources Information Center

    Lamour, Guillaume; Hamraoui, Ahmed; Buvailo, Andrii; Xing, Yangjun; Keuleyan, Sean; Prakash, Vivek; Eftekhari-Bafrooei, Ali; Borguet, Eric

    2010-01-01

    A basic and affordable experimental apparatus is described that measures the static contact angle of a liquid drop in contact with a solid. The image of the drop is made with a simple digital camera by taking a picture that is magnified by an optical lens. The profile of the drop is then processed with ImageJ free software. The ImageJ contact

  15. Development of filter exchangeable 3CCD camera for multispectral imaging acquisition

    NASA Astrophysics Data System (ADS)

    Lee, Hoyoung; Park, Soo Hyun; Kim, Moon S.; Noh, Sang Ha

    2012-05-01

    There are many methods to acquire multispectral images, but a dynamic band-selective, area-scan multispectral camera has not been developed yet. This research focused on the development of a filter exchangeable 3CCD camera, modified from a conventional 3CCD camera. The camera consists of an F-mount lens, an image splitter without dichroic coating, three bandpass filters, three image sensors, a filter exchangeable frame, and an electric circuit for parallel image signal processing. In addition, firmware and application software have been developed. Remarkable improvements compared to a conventional 3CCD camera are its redesigned image splitter and filter exchangeable frame. Computer simulation was required to visualize the ray paths inside the prism when redesigning the image splitter. The dimensions of the splitter were then determined by computer simulation with options of BK7 glass and non-dichroic coating; these properties were considered to obtain full-wavelength rays on all film planes. The image splitter was verified with two line lasers with narrow wavebands. The filter exchangeable frame is designed to allow swapping bandpass filters without changing the displacement of the image sensors on the film plane. The developed 3CCD camera was evaluated for the detection of scab and bruise on Fuji apples. As a result, the filter exchangeable 3CCD camera could provide meaningful functionality for various multispectral applications that need to exchange bandpass filters.

  16. Payload topography camera of Chang'e-3

    NASA Astrophysics Data System (ADS)

    Yu, Guo-Bin; Liu, En-Hai; Zhao, Ru-Jin; Zhong, Jie; Zhou, Xiang-Dong; Zhou, Wu-Lin; Wang, Jin; Chen, Yuan-Pei; Hao, Yong-Jie

    2015-11-01

    Chang'e-3 was China's first soft-landing lunar probe that achieved a successful roving exploration on the Moon. A topography camera functioning as the lander's “eye” was one of the main scientific payloads installed on the lander. It was composed of a camera probe, an electronic component that performed image compression, and a cable assembly. Its exploration mission was to obtain optical images of the lunar topography in the landing zone for investigation and research. It also observed rover movement on the lunar surface and finished taking pictures of the lander and rover. After starting up successfully, the topography camera obtained static images and video of rover movement from different directions, 360° panoramic pictures of the lunar surface around the lander from multiple angles, and numerous pictures of the Earth. All images of the rover, lunar surface, and the Earth were clear, and those of the Chinese national flag were recorded in true color. This paper describes the exploration mission, system design, working principle, quality assessment of image compression, and color correction of the topography camera. Finally, test results from the lunar surface are provided to serve as a reference for scientific data processing and application.

  17. Characterization of a PET Camera Optimized for ProstateImaging

    SciTech Connect

    Huber, Jennifer S.; Choong, Woon-Seng; Moses, William W.; Qi,Jinyi; Hu, Jicun; Wang, G.C.; Wilson, David; Oh, Sang; Huesman, RonaldH.; Derenzo, Stephen E.

    2005-11-11

    We present the characterization of a positron emission tomograph for prostate imaging that centers a patient between a pair of external curved detector banks (ellipse: 45 cm minor, 70 cm major axis). The distance between detector banks adjusts to allow patient access and to position the detectors as closely as possible for maximum sensitivity with patients of various sizes. Each bank is composed of two axial rows of 20 HR+ block detectors for a total of 80 detectors in the camera. The individual detectors are angled in the transaxial plane to point towards the prostate to reduce resolution degradation in that region. The detectors are read out by modified HRRT data acquisition electronics. Compared to a standard whole-body PET camera, our dedicated-prostate camera has the same sensitivity and resolution, less background (fewer randoms and a lower scatter fraction) and a lower cost. We have completed construction of the camera. Characterization data and reconstructed images of several phantoms are shown. Sensitivity for a point source in the center is 946 cps/μCi. Spatial resolution is 4 mm FWHM in the central region.

  18. Interference-induced angle-independent acoustical transparency

    SciTech Connect

    Qi, Lehua; Yu, Gaokun; Wang, Ning; Wang, Xinlong; Wang, Guibo

    2014-12-21

    It is revealed that the Fano-like interference leads to the extraordinary acoustic transmission through a slab metamaterial of thickness much smaller than the wavelength, with each unit cell consisting of a Helmholtz resonator and a narrow subwavelength slit. More importantly, both the theoretical analysis and experimental measurement show that the angle-independent acoustical transparency can be realized by grafting a Helmholtz resonator and a quarter-wave resonator to the wall of a narrow subwavelength slit in each unit cell of a slit array. The observed phenomenon results from the interferences between the waves propagating in the slit, those re-radiated by the Helmholtz resonator, and those re-radiated by the quarter-wave resonator. The proposed design may find its applications in designing angle-independent acoustical filters and controlling the phase of the transmitted waves.

  19. Current Propagation in Narrow Bipolar Pulses

    NASA Astrophysics Data System (ADS)

    Watson, S. S.; Marshall, T. C.

    2005-12-01

    We model the observed electric fields of a particular narrow bipolar pulse (NBP) published in Eack [2004]. We assume an exponential growth of current carriers due to a runaway breakdown avalanche and show that this leads to a corresponding increase in current. With specific input values for discharge altitude, length, current, and propagation velocity, the model does a good job of reproducing the observed near and far electric fields. The ability of the model to reproduce the observed electric fields is an indication that our assumptions concerning the runaway avalanche may be correct, and this indication is further strengthened by the inability of the simple transmission line model to reproduce simultaneously both the near and far electric fields. Eack, K. B. (2004), Electrical characteristics of narrow bipolar events, Geophys. Res. Lett., 31, L20102, doi:10.1029/2004GL021117.

  20. Creep turns linear in narrow ferromagnetic nanostrips

    NASA Astrophysics Data System (ADS)

    Leliaert, Jonathan; van de Wiele, Ben; Vansteenkiste, Arne; Laurson, Lasse; Durin, Gianfranco; Dupré, Luc; van Waeyenberge, Bartel

    2016-02-01

    The motion of domain walls in magnetic materials is a typical example of a creep process, usually characterised by a stretched exponential velocity-force relation. By performing large-scale micromagnetic simulations, and analyzing an extended 1D model which takes the effects of finite temperatures and material defects into account, we show that this creep scaling law breaks down in sufficiently narrow ferromagnetic strips. Our analysis of current-driven transverse domain wall motion in disordered Permalloy nanostrips reveals instead a creep regime with a linear dependence of the domain wall velocity on the applied field or current density. This originates from the essentially point-like nature of domain walls moving in narrow, line-like disordered nanostrips. An analogous linear relation is found also by analyzing existing experimental data on field-driven domain wall motion in perpendicularly magnetised media.

  1. Creep turns linear in narrow ferromagnetic nanostrips

    PubMed Central

    Leliaert, Jonathan; Van de Wiele, Ben; Vansteenkiste, Arne; Laurson, Lasse; Durin, Gianfranco; Dupré, Luc; Van Waeyenberge, Bartel

    2016-01-01

    The motion of domain walls in magnetic materials is a typical example of a creep process, usually characterised by a stretched exponential velocity-force relation. By performing large-scale micromagnetic simulations, and analyzing an extended 1D model which takes the effects of finite temperatures and material defects into account, we show that this creep scaling law breaks down in sufficiently narrow ferromagnetic strips. Our analysis of current-driven transverse domain wall motion in disordered Permalloy nanostrips reveals instead a creep regime with a linear dependence of the domain wall velocity on the applied field or current density. This originates from the essentially point-like nature of domain walls moving in narrow, line-like disordered nanostrips. An analogous linear relation is found also by analyzing existing experimental data on field-driven domain wall motion in perpendicularly magnetised media. PMID:26843125

  2. Creep turns linear in narrow ferromagnetic nanostrips.

    PubMed

    Leliaert, Jonathan; Van de Wiele, Ben; Vansteenkiste, Arne; Laurson, Lasse; Durin, Gianfranco; Dupré, Luc; Van Waeyenberge, Bartel

    2016-01-01

    The motion of domain walls in magnetic materials is a typical example of a creep process, usually characterised by a stretched exponential velocity-force relation. By performing large-scale micromagnetic simulations, and analyzing an extended 1D model which takes the effects of finite temperatures and material defects into account, we show that this creep scaling law breaks down in sufficiently narrow ferromagnetic strips. Our analysis of current-driven transverse domain wall motion in disordered Permalloy nanostrips reveals instead a creep regime with a linear dependence of the domain wall velocity on the applied field or current density. This originates from the essentially point-like nature of domain walls moving in narrow, line-like disordered nanostrips. An analogous linear relation is found also by analyzing existing experimental data on field-driven domain wall motion in perpendicularly magnetised media. PMID:26843125

  3. High speed multiwire photon camera

    NASA Technical Reports Server (NTRS)

    Lacy, Jeffrey L. (Inventor)

    1991-01-01

    An improved multiwire proportional counter camera having particular utility in the field of clinical nuclear medicine imaging. The detector utilizes direct coupled, low impedance, high speed delay lines, the segments of which are capacitor-inductor networks. A pile-up rejection test is provided to reject confused events otherwise caused by multiple ionization events occurring during the readout window.

  4. High speed multiwire photon camera

    NASA Technical Reports Server (NTRS)

    Lacy, Jeffrey L. (Inventor)

    1989-01-01

    An improved multiwire proportional counter camera having particular utility in the field of clinical nuclear medicine imaging. The detector utilizes direct coupled, low impedance, high speed delay lines, the segments of which are capacitor-inductor networks. A pile-up rejection test is provided to reject confused events otherwise caused by multiple ionization events occurring during the readout window.

  5. Camera assisted multimodal user interaction

    NASA Astrophysics Data System (ADS)

    Hannuksela, Jari; Silvén, Olli; Ronkainen, Sami; Alenius, Sakari; Vehviläinen, Markku

    2010-01-01

    Since more processing power and new sensing and display technologies are already available in mobile devices, there has been increased interest in building systems that communicate via different modalities such as speech, gesture, expression, and touch. In context-identification-based user interfaces, these independent modalities are combined to create new ways for users to interact with hand-helds. While these are unlikely to completely replace traditional interfaces, they will considerably enrich and improve the user experience and task performance. We demonstrate a set of novel user interface concepts that rely on the built-in sensors of modern mobile devices for recognizing the context and sequences of actions. In particular, we use the camera to detect whether the user is watching the device, for instance, to make the decision to turn on the display backlight. In our approach the motion sensors are first employed to detect handling of the device. Then, based on ambient illumination information provided by a light sensor, the cameras are turned on. The frontal camera is used for face detection, while the back camera provides supplemental contextual information. The subsequent applications triggered by the context can be, for example, image capturing or bar code reading.

  6. Measuring Distances Using Digital Cameras

    ERIC Educational Resources Information Center

    Kendal, Dave

    2007-01-01

    This paper presents a generic method of calculating accurate horizontal and vertical object distances from digital images taken with any digital camera and lens combination, where the object plane is parallel to the image plane or tilted in the vertical plane. This method was developed for a project investigating the size, density and spatial…

  7. Stratoscope 2 integrating television camera

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The development, construction, test and delivery of an integrating television camera for use as the primary data sensor on Flight 9 of Stratoscope 2 is described. The system block diagrams are presented along with the performance data, and definition of the interface of the telescope with the power, telemetry, and communication system.

  8. New Rotating Prism Drum Camera

    NASA Astrophysics Data System (ADS)

    Hochtl, W.; Wittwer, W.

    1983-03-01

    The car industry is using high speed photography for crash testing and other dynamic testing of components. For this purpose high speed cameras with frame rates between 400 and 1000 frames per second are being used, whereas the time period to be measured is in the range of about 100 - 250 ms.

  9. Camera lens adapter magnifies image

    NASA Technical Reports Server (NTRS)

    Moffitt, F. L.

    1967-01-01

    Polaroid Land camera with an illuminated 7-power magnifier adapted to the lens, photographs weld flaws. The flaws are located by inspection with a 10-power magnifying glass and then photographed with this device, thus providing immediate pictorial data for use in remedial procedures.

  10. Directing Performers for the Cameras.

    ERIC Educational Resources Information Center

    Wilson, George P., Jr.

    An excellent way for an undergraduate, novice director of television and film to pick up background experience in directing performers for cameras is by participating in nonbroadcast-film activities, such as theatre, dance, and variety acts, both as performer and as director. This document describes the varieties of activities, including creative,…

  11. The Camera Comes to Court.

    ERIC Educational Resources Information Center

    Floren, Leola

    After the Lindbergh kidnapping trial in 1935, the American Bar Association sought to eliminate electronic equipment from courtroom proceedings. Eventually, all but two states adopted regulations applying that ban to some extent, and a 1965 Supreme Court decision encouraged the banning of television cameras at trials as well. Currently, some states…

  12. The Camera Comes to Court.

    ERIC Educational Resources Information Center

    Floren, Leola

    After the Lindbergh kidnapping trial in 1935, the American Bar Association sought to eliminate electronic equipment from courtroom proceedings. Eventually, all but two states adopted regulations applying that ban to some extent, and a 1965 Supreme Court decision encouraged the banning of television cameras at trials as well. Currently, some states

  13. OSIRIS camera barrel optomechanical design

    NASA Astrophysics Data System (ADS)

    Farah, Alejandro; Tejada, Carlos; Gonzalez, Jesus; Cobos, Francisco J.; Sanchez, Beatriz; Fuentes, Javier; Ruiz, Elfego

    2004-09-01

    A Camera Barrel, located in the OSIRIS imager/spectrograph for the Gran Telescopio Canarias (GTC), is described in this article. The barrel design has been developed by the Institute for Astronomy of the University of Mexico (IA-UNAM), in collaboration with the Institute for Astrophysics of Canarias (IAC), Spain. The barrel is being manufactured by the Engineering Center for Industrial Development (CIDESI) at Queretaro, Mexico. The Camera Barrel includes a set of eight lenses (three doublets and two singlets), with their respective supports and cells, as well as two subsystems: the Focusing Unit, which is a mechanism that modifies the relative position of the first doublet; and the Passive Displacement Unit (PDU), which uses the third doublet as a thermal compensator to maintain the camera focal length and image quality when the ambient temperature changes. This article includes a brief description of the scientific instrument, describes the design criteria related to performance justification, and summarizes the specifications related to misalignment errors and generated stresses. The Camera Barrel components are described, and analytical calculations, FEA simulations and error budgets are also included.

  14. Television Camera Operator. Student's Manual.

    ERIC Educational Resources Information Center

    Grimes, L. A., Jr.

    This student manual is one in a series of individualized instructional materials for use under the supervision of an instructor. The self-contained manual was developed for persons training to become television camera operators. Each assignment has all the information needed, including a list of objectives that should be met and exercise questions…

  15. Gamma-ray camera flyby

    SciTech Connect

    2010-01-01

    Animation based on an actual classroom demonstration of the prototype CCI-2 gamma-ray camera's ability to image a hidden radioactive source, a cesium-137 line source, in three dimensions. For more information see http://newscenter.lbl.gov/feature-stories/2010/06/02/applied-nuclear-physics/.

  16. Inexpensive scintillation camera study device.

    PubMed

    Brandt, H M; Baard, W P; Heerden, P D

    1977-05-01

    A commercially available inexpensive calculator was modified and mounted next to one of the display oscilloscopes on a scintillation-camera console. This enabled the technologist to dial in each patient's identification number, which then appeared on every frame of the 35-mm film used. By using this device, labeling errors have been reduced to a minimum. PMID:870639

  17. Recent advances in digital camera optics

    NASA Astrophysics Data System (ADS)

    Ishiguro, Keizo

    2012-10-01

    The digital camera market has expanded enormously in the last ten years. The zoom lens for a digital camera is the key factor that determines the camera body size and image quality. Its technologies have been based on several advances in analog technology, including methods of aspherical lens manufacturing and mechanisms for image stabilization. Panasonic is one of the pioneers of both technologies. I will introduce previous trends in zoom lens optics as well as original optical technologies of the Panasonic digital camera "LUMIX", and in addition the optics in 3D camera systems. Besides, I would like to consider future trends in digital cameras.

  18. Analysis of Reference Sources for the Characterization and Calibration of Infrared Cameras

    NASA Astrophysics Data System (ADS)

    Gutschwager, B.; Taubert, D.; Hollandt, J.

    2015-03-01

    This paper gives an analysis of the radiometric properties of different types of reference sources applied for the characterization and calibration of infrared cameras. For the absolute radiance measurement with an infrared camera, a metrological characterization and calibration of the instrument are essential. Similar to the calibration of radiation thermometers, this calibration is generally performed with reference sources of known radiance. As infrared cameras are optically and electronically more complex than radiation thermometers, which are equipped with a single element detector, the applied reference sources have to be carefully characterized and limitations in their performance have to be considered. Each pixel of the image measured with an infrared camera should depict correctly the desired physical quantity value of the projected object area. This should be achieved for all relevant conditions of observation, e.g., at different distances or at different incident angles. The performance of cavity radiators and plate radiators is analyzed based on ray-tracing calculations and spatially and angularly resolved radiance measurements with radiation thermometers and cameras. Relevant components of a calibration facility for infrared cameras at PTB are presented with their specifications. A first analysis of the relevant characteristics of the applied infrared calibration sources and infrared cameras is presented as the essential basic information for the realization of the calibration of infrared cameras.

  19. Laser angle measurement system

    NASA Technical Reports Server (NTRS)

    Pond, C. R.; Texeira, P. D.; Wilbert, R. E.

    1980-01-01

    The design and fabrication of a laser angle measurement system is described. The instrument is a fringe counting interferometer that monitors the pitch attitude of a model in a wind tunnel. A laser source and detector are mounted above the model. Interference fringes are generated by a small passive element on the model. The fringe count is accumulated and displayed by a processor in the wind tunnel control room. Optical and electrical schematics, system maintenance and operation procedures are included, and the results of a demonstration test are given.

  20. LDEF yaw and pitch angle estimates

    NASA Technical Reports Server (NTRS)

    Banks, Bruce A.; Gebauer, Linda

    1992-01-01

    Quantification of the LDEF yaw and pitch misorientations is crucial to the knowledge of atomic oxygen exposure of samples placed on LDEF. Video camera documentation of the LDEF spacecraft prior to grapple attachment, atomic oxygen shadows on experiment trays and longerons, and a pinhole atomic oxygen camera placed on LDEF provided sources of documentation of the yaw and pitch misorientation. Based on uncertainty-weighted averaging of data, the LDEF yaw offset was found to be 8.1 plus or minus 0.6 degrees, allowing higher atomic oxygen exposure of row 12 than initially anticipated. The LDEF pitch angle offset was found to be 0.8 plus or minus 0.4 degrees, such that the space end was tipped forward toward the direction of travel. The resulting consequences of the yaw and pitch misorientation of LDEF on the atomic oxygen fluence is a factor of 2.16 increase for samples located on row 12, and a factor of 1.18 increase for samples located on the space end compared to that which would be expected for perfect orientation.
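
    The uncertainty-weighted averaging mentioned above is the standard inverse-variance mean; a minimal sketch follows, with invented placeholder estimates (only the combined 8.1 ± 0.6° result comes from the abstract).

        import numpy as np

        def inverse_variance_mean(values, sigmas):
            """Uncertainty-weighted average and its 1-sigma uncertainty."""
            w = 1.0 / np.asarray(sigmas, dtype=float) ** 2
            mean = np.sum(w * np.asarray(values, dtype=float)) / np.sum(w)
            return mean, 1.0 / np.sqrt(np.sum(w))

        # Placeholder yaw estimates (degrees) from independent lines of evidence.
        print(inverse_variance_mean([8.4, 7.6, 8.3], [0.9, 1.2, 1.0]))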

  1. Ortho-Rectification of Narrow Band Multi-Spectral Imagery Assisted by Dslr RGB Imagery Acquired by a Fixed-Wing Uas

    NASA Astrophysics Data System (ADS)

    Rau, J.-Y.; Jhan, J.-P.; Huang, C.-Y.

    2015-08-01

    Miniature Multiple Camera Array (MiniMCA-12) is a frame-based multilens/multispectral sensor composed of 12 lenses with narrow band filters. Due to its small size and light weight, it is suitable to mount on an Unmanned Aerial System (UAS) for acquiring high spectral, spatial and temporal resolution imagery used in various remote sensing applications. However, because its wavelength range is only 10 nm, the resulting images have low resolution and signal-to-noise ratio, which makes them unsuitable for image matching and digital surface model (DSM) generation. In the meantime, since the spectral correlation among all 12 bands of MiniMCA images is low, it is difficult to perform tie-point matching and aerial triangulation at the same time. In this study, we thus propose the use of a DSLR camera to assist automatic aerial triangulation of MiniMCA-12 imagery and to produce a higher spatial resolution DSM for MiniMCA-12 ortho-image generation. Depending on the maximum payload weight of the UAS used, these two kinds of sensors can be carried at the same time or individually. In this study, we adopt a fixed-wing UAS to carry a Canon EOS 5D Mark2 DSLR camera and a MiniMCA-12 multi-spectral camera. To perform automatic aerial triangulation between the DSLR camera and the MiniMCA-12, we choose one master band from MiniMCA-12 whose spectral range overlaps with the DSLR camera. However, since all lenses of MiniMCA-12 have different perspective centers and viewing angles, the original 12 channels have a significant band misregistration effect. Thus, the first issue encountered is to reduce the band misregistration effect. Because all 12 MiniMCA lenses are frame-based, their spatial offsets are smaller than 15 cm and all images overlap by almost 98%; we thus propose a modified projective transformation (MPT) method together with two systematic error correction procedures to register all 12 bands of imagery in the same image space. It means that the 12 bands of images acquired at the same exposure time will have the same interior orientation parameters (IOPs) and exterior orientation parameters (EOPs) after band-to-band registration (BBR). Thus, in the aerial triangulation stage, the master band of MiniMCA-12 is treated as a reference channel to link with the DSLR RGB images. That is, all reference images from the master band of MiniMCA-12 and all RGB images are triangulated at the same time in the same coordinate system of ground control points (GCPs). Since the spatial resolution of the RGB images is higher than that of the MiniMCA-12, the GCPs can be marked on the RGB images only, even if they cannot be recognized on the MiniMCA images. Furthermore, a one meter gridded digital surface model (DSM) is created from the RGB images and applied to the MiniMCA imagery for ortho-rectification. Quantitative error analyses show that the proposed BBR scheme achieves an average misregistration residual length of 0.33 pixels, and the co-registration errors among the 12 MiniMCA ortho-images and between the MiniMCA and Canon RGB ortho-images are all less than 0.6 pixels. The experimental results demonstrate that the proposed method is robust, reliable and accurate for future remote sensing applications.
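
    The band-to-band registration step can be illustrated with a generic projective (homography) warp between a slave band and the master band, given matched tie points. The sketch below uses OpenCV and random placeholder points; it is only an outline of the general idea, not the modified projective transformation or the systematic error corrections developed in the paper.

        import cv2
        import numpy as np

        # Matched tie-point coordinates (N, 2) between one MiniMCA band and the
        # master band.  Placeholders here; in practice they come from matching.
        pts_slave = (np.random.rand(50, 2) * 1000).astype(np.float32)
        pts_master = pts_slave + np.float32([3.0, -2.0])     # fake constant offset

        # Estimate a projective transformation (homography) robustly with RANSAC.
        H, inliers = cv2.findHomography(pts_slave, pts_master, cv2.RANSAC, 1.0)

        # Resample the slave band onto the master band's image space.
        slave_band = np.zeros((1024, 1280), dtype=np.uint8)  # placeholder image
        registered = cv2.warpPerspective(slave_band, H, (1280, 1024))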

  2. Variable angle correlation spectroscopy

    SciTech Connect

    Lee, Y K

    1994-05-01

    In this dissertation, a novel nuclear magnetic resonance (NMR) technique, variable angle correlation spectroscopy (VACSY), is described and demonstrated with ¹³C nuclei in rapidly rotating samples. These experiments focus on one of the basic problems in solid state NMR: how to extract the wealth of information contained in the anisotropic component of the NMR signal while still maintaining spectral resolution. Analysis of the anisotropic spectral patterns from polycrystalline systems reveals information concerning molecular structure and dynamics, yet in all but the simplest of systems, the overlap of spectral patterns from chemically distinct sites renders the spectral analysis difficult if not impossible. One solution to this problem is to perform multi-dimensional experiments where the high-resolution, isotropic spectrum in one dimension is correlated with the anisotropic spectral patterns in the other dimensions. The VACSY technique incorporates the angle between the spinner axis and the static magnetic field as an experimental parameter that may be incremented during the course of the experiment to help correlate the isotropic and anisotropic components of the spectrum. The two-dimensional version of the VACSY experiment is used to extract the chemical shift anisotropy tensor values from multi-site organic molecules, study molecular dynamics in the intermediate time regime, and examine the ordering properties of partially oriented samples. The VACSY technique is then extended to three-dimensional experiments to study slow molecular reorientations in a multi-site polymer system.

  3. Analysis and protection of stray light for the space camera at geosynchronous orbit

    NASA Astrophysics Data System (ADS)

    Jin, Xiaorui; Lin, Li

    2012-11-01

    Stray light is the general term for all unwanted light propagation in an optical system. The influence of stray light differs according to the optical system's structure. A large area-array camera at geosynchronous orbit faces a more serious influence of stray light, especially for small incidence angles of sunlight on the system, and is in dire need of a detailed stray light analysis for the basic configuration of the optical system. In this paper, the influence of stray light on a space camera and the necessity of eliminating stray light are presented. The definitions of the stray light coefficient and the PST (point source transmittance) are briefly reviewed. In TracePro, an analysis of the impact of sunlight incident at different angles on the space camera was made, with the stray light coefficient used for quantitative evaluation. The design principles of the inner and outer hoods are presented for the R-C (Ritchey-Chrétien) optical system. On this basis, in order to reduce stray light interference for the space camera, the primary and secondary mirror hoods were designed. Finally, when the incidence angle of sunlight on the space camera is more than 3°, the stray light coefficient is less than 2%. This meets the engineering requirements.
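
    For reference, the point source transmittance is conventionally defined as the ratio of the irradiance an off-axis point source produces at the focal plane to the irradiance it produces at the entrance aperture (the paper may use a slightly different normalization):

        \mathrm{PST}(\theta) = \frac{E_{\mathrm{focal\,plane}}(\theta)}{E_{\mathrm{entrance}}(\theta)}

    The stray light coefficient used for the quantitative evaluation can likewise be read as the fraction of the flux reaching the image plane that arrives via non-imaging paths.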

  4. Lytro camera technology: theory, algorithms, performance analysis

    NASA Astrophysics Data System (ADS)

    Georgiev, Todor; Yu, Zhan; Lumsdaine, Andrew; Goma, Sergio

    2013-03-01

    The Lytro camera is the first implementation of a plenoptic camera for the consumer market. We consider it a successful example of the miniaturization, aided by the increase in computational power, that characterizes mobile computational photography. The plenoptic camera approach to radiance capture uses a microlens array as an imaging system focused on the focal plane of the main camera lens. This paper analyzes the performance of the Lytro camera from a system-level perspective, considering the Lytro camera as a black box, and uses our interpretation of the Lytro image data saved by the camera. We present our findings based on our interpretation of the Lytro camera file structure, image calibration and image rendering; in this context, artifacts and final image resolution are discussed.

  5. A method for measuring the base angle of axicon lens based on chromatic dispersion

    NASA Astrophysics Data System (ADS)

    Zhang, Yunbo; Zeng, Aijun; Wang, Ying; Huang, Huijie

    2015-07-01

    A method for measuring the base angle of an axicon lens is presented. This method utilizes two coaxial laser beams with different wavelengths. When the two laser beams pass through the axicon lens, a small divergence angle arises between them as a result of chromatic dispersion. After being collected by an achromatic lens, the two beams generate two spots on an image camera. The base angle can be determined from the distance between the two spots recorded by the image sensor. Furthermore, this method can also be used to calculate the cone angle of the axicon lens.
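
    Under a small-angle, thin-prism approximation the geometry reduces to a short calculation: each wavelength is deflected by roughly (n(λ) − 1)·α, so the angular split between the two beams is Δn·α and the spot separation after an achromatic lens of focal length f is about f·Δn·α. The sketch below inverts that relation; the refractive indices, focal length and separation are illustrative values only, and the paper's actual data reduction may differ.

        import math

        # Illustrative values (not from the paper).
        n_short = 1.4580          # axicon refractive index at the shorter wavelength
        n_long = 1.4533           # refractive index at the longer wavelength
        f = 0.200                 # focal length of the achromatic lens [m]
        spot_separation = 85e-6   # measured distance between the two spots [m]

        # Small-angle model: separation ~ f * (n_short - n_long) * alpha
        alpha = spot_separation / (f * (n_short - n_long))   # base angle [rad]
        print(math.degrees(alpha))                            # ~5.2 degrees here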

  6. A 10-microm infrared camera.

    PubMed

    Arens, J F; Jernigan, J G; Peck, M C; Dobson, C A; Kilk, E; Lacy, J; Gaalema, S

    1987-09-15

    An IR camera has been built at the University of California at Berkeley for astronomical observations. The camera has been used primarily for high angular resolution imaging at mid-IR wavelengths. It has been tested at the University of Arizona 61- and 90-in. telescopes near Tucson and the NASA Infrared Telescope Facility on Mauna Kea, HI. In the observations the system has been used as an imager with interference coated and Fabry-Perot filters. These measurements have demonstrated a sensitivity consistent with photon shot noise, showing that the system is limited by the radiation from the telescope and atmosphere. Measurements of read noise, crosstalk, and hysteresis have been made in our laboratory. PMID:20490151

  7. The Dark Energy Camera (DECam)

    NASA Astrophysics Data System (ADS)

    DePoy, D. L.; Abbott, T.; Annis, J.; Antonik, M.; Barceló, M.; Bernstein, R.; Bigelow, B.; Brooks, D.; Buckley-Geer, E.; Campa, J.; Cardiel, L.; Castander, F.; Castilla, J.; Cease, H.; Chappa, S.; Dede, E.; Derylo, G.; Diehl, H. T.; Doel, P.; DeVicente, J.; Estrada, J.; Finley, D.; Flaugher, B.; Gaztanaga, E.; Gerdes, D.; Gladders, M.; Guarino, V.; Gutierrez, G.; Hamilton, J.; Haney, M.; Holland, S.; Honscheid, K.; Huffman, D.; Karliner, I.; Kau, D.; Kent, S.; Kozlovsky, M.; Kubik, D.; Kuehn, K.; Kuhlmann, S.; Kuk, K.; Leger, F.; Lin, H.; Martinez, G.; Martinez, M.; Merritt, W.; Mohr, J.; Moore, P.; Moore, T.; Nord, B.; Ogando, R.; Olsen, J.; Onal, B.; Peoples, J.; Qian, T.; Roe, N.; Sanchez, E.; Scarpine, V.; Schmidt, R.; Schmitt, R.; Schubnell, M.; Schultz, K.; Selen, M.; Shaw, T.; Simaitis, V.; Slaughter, J.; Smith, C.; Spinka, H.; Stefanik, A.; Stuermer, W.; Talaga, R.; Tarle, G.; Thaler, J.; Tucker, D.; Walker, A.; Worswick, S.; Zhao, A.

    2008-07-01

    We describe the Dark Energy Camera (DECam), which will be the primary instrument used in the Dark Energy Survey. DECam will be a 3 sq. deg. mosaic camera mounted at the prime focus of the Blanco 4m telescope at the Cerro Tololo Inter-American Observatory (CTIO). DECam includes a large mosaic CCD focal plane, a five element optical corrector, five filters (g,r,i,z,Y), and the associated infrastructure for operation in the prime focus cage. The focal plane consists of 62 2K x 4K CCD modules (0.27"/pixel) arranged in a hexagon inscribed within the roughly 2.2 degree diameter field of view. The CCDs will be 250 micron thick fully-depleted CCDs that have been developed at the Lawrence Berkeley National Laboratory (LBNL). Production of the CCDs and fabrication of the optics, mechanical structure, mechanisms, and control system for DECam are underway; delivery of the instrument to CTIO is scheduled for 2010.

  8. Electronographic cameras for space astronomy.

    NASA Technical Reports Server (NTRS)

    Carruthers, G. R.; Opal, C. B.

    1972-01-01

    Magnetically-focused electronographic cameras have been under development at the Naval Research Laboratory for use in far-ultraviolet imagery and spectrography, primarily in astronomical and optical-geophysical observations from sounding rockets and space vehicles. Most of this work has been with cameras incorporating internal optics of the Schmidt or wide-field all-reflecting types. More recently, we have begun development of electronographic spectrographs incorporating an internal concave grating, operating at normal or grazing incidence. We are also developing electronographic image tubes of the conventional end-window photocathode type, for far-ultraviolet imagery at the focus of a large space telescope, with image formats up to 120 mm in diameter.

  9. Development of Nikon Space Camera

    NASA Astrophysics Data System (ADS)

    Goto, Tetsuro

    After Soviet cosmonaut Gagarin succeeded as the first human to orbit the Earth in 1961, American astronaut Glenn succeeded in a similar mission the following year, 1962, aboard the Friendship 7 spacecraft for the Mercury-Atlas 6 mission. Since before this event, the National Aeronautics and Space Administration (NASA) has used a large amount of imaging equipment to successfully record major astronomical phenomena and acquire analysis data. Nikon has made significant contributions to the American space program since the Apollo Program by continuously providing NASA with space cameras that meet their strict demands in terms of reliability, quality and durability under the most extreme conditions. The following details our achievements and specifics regarding modifications necessary for use in space, and also touches on space cameras provided by manufacturers other than Nikon, for which information may be quite limited.

  10. Combustion pinhole-camera system

    DOEpatents

    Witte, A.B.

    1982-05-19

    A pinhole camera system is described utilizing a sealed optical-purge assembly which provides optical access into a coal combustor or other energy conversion reactors. The camera system basically consists of a focused-purge pinhole optical port assembly, a conventional TV vidicon receiver, an external, variable density light filter which is coupled electronically to the vidicon automatic gain control (agc). The key component of this system is the focused-purge pinhole optical port assembly which utilizes a purging inert gas to keep debris from entering the port and a lens arrangement which transfers the pinhole to the outside of the port assembly. One additional feature of the port assembly is that it is not flush with the interior of the combustor.

  11. Switchable viewing angle display with a compact directional backlight and striped diffuser.

    PubMed

    Wang, Yi-Jun; Lu, Jian-Gang; Chao, Wei-Chung; Shieh, Han-Ping D

    2015-08-10

    A compact high-directionality backlight module combined with a striped diffuser is proposed to achieve an adjustable viewing angle for eco-display. The micro-prisms on the compact light guide plate guide the emitting rays to the normal viewing angle, whereas a set of striped diffusers scatter the rays to a wide viewing angle. View cones of ± 10° / ± 55° were obtained for narrow/wide viewing modes with 88% / 85% uniformity of spatial luminance, respectively. Compared with the conventional backlight, the optical efficiencies were increased by factors of 1.47 and 1.38 in narrow and wide viewing modes, respectively. In addition, only 5% of power consumption was needed when the backlight worked in private narrow viewing mode to maintain the same luminance as that of a conventional backlight. PMID:26367992

  12. Heterodyne Interferometer Angle Metrology

    NASA Technical Reports Server (NTRS)

    Hahn, Inseob; Weilert, Mark A.; Wang, Xu; Goullioud, Renaud

    2010-01-01

    A compact, high-resolution angle measurement instrument has been developed that is based on a heterodyne interferometer. The common-path heterodyne interferometer metrology is used to measure displacements of a reflective target surface. In the interferometer setup, an optical mask is used to sample the measurement laser beam reflecting back from a target surface. Angular rotations, around two orthogonal axes in a plane perpendicular to the measurement-beam propagation direction, are determined simultaneously from the relative displacement measurement of the target surface. The device is used in a tracking telescope system where pitch and yaw measurements of a flat mirror were simultaneously performed with a sensitivity of 0.1 nrad per second and a measuring range of 0.15 mrad at a working distance of the order of a meter. The nonlinearity of the device was also measured to be less than one percent over the measurement range.
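
    For small rotations, the pitch and yaw follow from the differential displacement of the sampled beam footprints divided by their separation. The sketch below shows only that small-angle relation; the sampling geometry, names and numbers are illustrative assumptions, not the instrument's actual mask layout.

        def tilt_angles(dx_plus, dx_minus, dy_plus, dy_minus, baseline_m):
            """Small-angle pitch/yaw (radians) from relative displacements (metres)
            measured at two sampled beam footprints separated by baseline_m on each axis."""
            yaw = (dx_plus - dx_minus) / baseline_m    # rotation about the vertical axis
            pitch = (dy_plus - dy_minus) / baseline_m  # rotation about the horizontal axis
            return pitch, yaw

        # Illustration: a 2 pm differential displacement over a 20 mm baseline
        # corresponds to 1e-10 rad, i.e. the 0.1 nrad level quoted above.
        print(tilt_angles(1e-12, -1e-12, 0.0, 0.0, 0.02))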

  13. Sun angle calculator

    NASA Technical Reports Server (NTRS)

    Flippin, A.; Schmitt, A. L. (Inventor)

    1976-01-01

    A circular computer and system is disclosed for determining the sun angle relative to the horizon from any given place and at any time. The computer includes transparent, rotatably mounted discs on both sides of the circular disc member. Printed on one side of the circular disc member are outer and inner circular sets of indicia respectively representative of site longitude and Greenwich Mean Time. Printed on an associated one of the rotatable discs is a set of indicia representative of Solar Time. Printed on the other side of the circular disc member are parallel lines representative of latitude between diametral representations of North and South poles. Elliptical lines extending between the North and South poles are proportionally disposed on the surface to scale Solar Time in hours.
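
    For comparison, the elevation angle that such a disc computer reads off graphically can be computed from the standard spherical-astronomy relation; the short sketch below assumes the solar declination and local solar time are already known.

        import math

        def solar_elevation_deg(latitude_deg, declination_deg, hour_angle_deg):
            """sin(alt) = sin(lat)*sin(dec) + cos(lat)*cos(dec)*cos(H), where the hour
            angle H is 15 degrees per hour of solar time away from local noon."""
            lat, dec, h = map(math.radians, (latitude_deg, declination_deg, hour_angle_deg))
            sin_alt = math.sin(lat) * math.sin(dec) + math.cos(lat) * math.cos(dec) * math.cos(h)
            return math.degrees(math.asin(sin_alt))

        # 40 deg N latitude, solstice declination +23.4 deg, 3 h after solar noon (H = 45 deg)
        print(round(solar_elevation_deg(40.0, 23.4, 45.0), 1))  # roughly 49 degrees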

  14. Graphic design of pinhole cameras

    NASA Technical Reports Server (NTRS)

    Edwards, H. B.; Chu, W. P.

    1979-01-01

    The paper describes a graphic technique for the analysis and optimization of pinhole size and focal length. The technique is based on the use of the transfer function of optical elements described by Scott (1959) to construct the transfer function of a circular pinhole camera. This transfer function is the response of a component or system to a pattern of lines having a sinusoidally varying radiance at varying spatial frequencies. Some specific examples of graphic design are presented.
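
    The size/focal-length trade-off behind the graphic technique can also be illustrated numerically: a smaller pinhole reduces the geometric blur but increases the diffraction blur. The sketch below uses a simple blur-diameter criterion as a rough stand-in for the paper's transfer-function construction; the wavelength and search range are illustrative.

        import math

        def combined_blur(d, wavelength, focal_length):
            """Quadrature sum of geometric and diffraction blur diameters (metres):
            the geometric image of a distant point is roughly the pinhole diameter d,
            and the diffraction (Airy) blur is about 2.44*wavelength*focal_length/d."""
            return math.hypot(d, 2.44 * wavelength * focal_length / d)

        def optimum_pinhole(wavelength, focal_length):
            # Coarse numeric search; analytically the minimum lies near sqrt(2.44*wavelength*f).
            candidates = [i * 1e-6 for i in range(50, 2000)]
            return min(candidates, key=lambda d: combined_blur(d, wavelength, focal_length))

        # 550 nm light and a 100 mm focal length give an optimum near 0.37 mm.
        print(optimum_pinhole(550e-9, 0.1))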

  15. ISO camera array development status

    NASA Technical Reports Server (NTRS)

    Sibille, F.; Cesarsky, C.; Agnese, P.; Rouan, D.

    1989-01-01

    A short outline is given of the Infrared Space Observatory Camera (ISOCAM), one of the 4 instruments onboard the Infrared Space Observatory (ISO), with the current status of its two 32x32 arrays, an InSb charge injection device (CID) and a Si:Ga direct read-out (DRO), and the results of the in orbit radiation simulation with gamma ray sources. A tentative technique for the evaluation of the flat fielding accuracy is also proposed.

  16. MAMBA All-sky Camera

    NASA Astrophysics Data System (ADS)

    Pier, E.; Jim, K.; Hadmack, M.

    MAMBA is an actively calibrated, thermal, all-sky camera designed to measure the precipitable water vapor (PWV) column in any arbitrary direction. This has applications in the calibration of remote sensing, astronomy, radio frequency transmissions, meteorology, and climatology. MAMBA produces an all-sky cloud map, day or night, which is useful in astronomy and SSA for telescope aiming. The system is based on a new optical design and an extensive set of atmospheric radiative transfer models.

  17. The Flow of Gases in Narrow Channels

    NASA Technical Reports Server (NTRS)

    Rasmussen, R E H

    1951-01-01

    Measurements were made of the flow of gases through various narrow channels a few microns wide at average pressures from 0.00003 to 40 cm. Hg. The flow rate, defined as the product of pressure and volume rate of flow at unit pressure difference, first decreased linearly with decrease in mean pressure in the channel, in agreement with laminar-flow theory, reached a minimum when the mean path length was approximately equal to the channel width, and then increased to a constant value. The product of flow rate and square root of molecular number was approximately the same function of mean path length for all gases for a given channel.
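
    The flow-rate minimum reported here occurs where the mean free path is comparable to the channel width (Knudsen number near one). The sketch below evaluates that condition from kinetic theory; the molecular diameter and temperature are illustrative assumptions.

        import math

        K_B = 1.380649e-23  # Boltzmann constant, J/K

        def mean_free_path(pressure_pa, temperature_k, molecular_diameter_m):
            """Kinetic-theory mean free path: lambda = k*T / (sqrt(2)*pi*d**2*p)."""
            return K_B * temperature_k / (
                math.sqrt(2.0) * math.pi * molecular_diameter_m ** 2 * pressure_pa)

        def knudsen_number(pressure_pa, temperature_k, molecular_diameter_m, channel_width_m):
            """Kn of order 1 marks the transition regime where the flow-rate minimum appears."""
            return mean_free_path(pressure_pa, temperature_k, molecular_diameter_m) / channel_width_m

        # Pressure at which the mean free path equals a 5-micron channel width,
        # for an air-like molecule (d ~ 3.7e-10 m) at 300 K:
        p = K_B * 300.0 / (math.sqrt(2.0) * math.pi * (3.7e-10) ** 2 * 5e-6)
        print(p)  # about 1.4e3 Pa, i.e. of order 1 cm Hg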

  18. Superparamagnetic colloids confined in narrow corrugated substrates

    NASA Astrophysics Data System (ADS)

    Herrera-Velarde, S.; Castañeda-Priego, R.

    2008-04-01

    We report a Brownian dynamics simulation study of the structure and dynamics of superparamagnetic colloids subject to external substrate potentials and confined in narrow channels. Our study is motivated by the importance of phenomena like commensurable-incommensurable phase transitions, anomalous diffusion, and stochastic activation processes that are closely related to the system under investigation. We focus mainly on the role of the substrate in the order-disorder mechanisms that lead to a rich variety of commensurate and incommensurate phases, as well as its effect on the single-file diffusion in interacting systems and the depinning transition in one dimension.

  19. Recording of essential ballistic data with a new generation of digital ballistic range camera

    NASA Astrophysics Data System (ADS)

    Haddleton, Graham P.; Honour, Jo

    2007-01-01

    Scientists and engineers still need to record essential parameters during the design and testing of new (or refined) munitions. These essential data, such as velocities, spin, pitch and yaw angles, sabot discard, impact angles, target penetrations, behind-target effects and post-impact delays, need to be recorded during dynamic, high-velocity, and dangerous firings. Traditionally these parameters have been recorded on high-speed film cameras. With the demise of film as a recording medium, a new generation of electronic digital recording cameras has become the accepted method of allowing these parameters to be recorded and analysed. Their obvious advantage over film is instant access to records and the ability for almost instant analysis. This paper details results obtained using a new specially designed ballistic range camera manufactured by Specialised Imaging Ltd.

  20. MTF measurement and imaging quality evaluation of digital camera with slanted-edge method

    NASA Astrophysics Data System (ADS)

    Xiang, Chunchang; Chen, Xinhua; Chen, Yuheng; Zhou, Jiankang; Shen, Weimin

    2010-11-01

    The Modulation Transfer Function (MTF) is the spatial frequency response of an imaging system and now serves as an objective figure of merit for evaluating both lens and camera quality. The slanted-edge method and its principle for measuring the MTF of a digital camera are introduced in this paper. The setup and software for testing digital cameras are established and developed, respectively. Measurement results with different tilt angles of the knife edge are compared to discuss the influence of the tilt angle. Careful denoising of the knife-edge image is also performed to reduce the noise sensitivity of the measurement. Comparisons have been made between the results obtained by the slanted-edge method and by the grating-target technique, and their deviation is analyzed.
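
    A minimal version of the slanted-edge computation is sketched below: locate the edge with sub-pixel precision, accumulate an oversampled edge spread function, differentiate it to the line spread function, and take the Fourier transform. It is a simplified illustration, not the paper's calibrated procedure or the full ISO 12233 algorithm.

        import numpy as np

        def slanted_edge_mtf(roi, oversample=4):
            """Greatly simplified slanted-edge MTF estimate for a region `roi` containing
            one near-vertical dark/bright edge.  Returns spatial frequency
            (cycles/pixel) and the MTF normalised to 1 at zero frequency."""
            rows, cols = roi.shape
            img = roi.astype(float)
            x = np.arange(cols)
            # Sub-pixel edge position per row from the centroid of the row derivative.
            deriv = np.abs(np.diff(img, axis=1))
            centers = (deriv * (x[:-1] + 0.5)).sum(axis=1) / deriv.sum(axis=1)
            slope, intercept = np.polyfit(np.arange(rows), centers, 1)
            # Signed distance of every pixel from the fitted edge -> oversampled ESF.
            dist = x[None, :] - (slope * np.arange(rows)[:, None] + intercept)
            bins = np.round(dist * oversample).astype(int)
            bins -= bins.min()
            counts = np.bincount(bins.ravel())
            esf = np.bincount(bins.ravel(), weights=img.ravel()) / np.maximum(counts, 1)
            # The LSF is the derivative of the ESF; a window tames noise at the tails.
            lsf = np.diff(esf) * np.hamming(len(esf) - 1)
            mtf = np.abs(np.fft.rfft(lsf))
            freq = np.fft.rfftfreq(len(lsf), d=1.0 / oversample)
            return freq, mtf / mtf[0]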

  1. Silicone Contamination Camera Developed for Shuttle Payloads

    NASA Technical Reports Server (NTRS)

    1996-01-01

    On many shuttle missions, silicone contamination from unknown sources from within or external to the shuttle payload bay has been a chronic problem plaguing experiment payloads. There is currently a wide range of silicone usage on the shuttle. Silicones are used to coat the shuttle tiles to enhance their ability to shed rain, and over 100 kg of RTV 560 silicone is used to seal white tiles to the shuttle surfaces. Silicones are also used in electronic components, potting compounds, and thermal control blankets. Efforts to date to identify and eliminate the sources of silicone contamination have not been highly successful and have created much controversy. To identify the sources of silicone contamination on the space shuttle, the NASA Lewis Research Center developed a contamination camera. This specially designed pinhole camera utilizes low-Earth-orbit atomic oxygen to develop a picture that identifies sources of silicone contamination on shuttle-launched payloads. The volatile silicone species travel through the aperture of the pinhole camera, and since volatile silicone species lose their hydrocarbon functionalities under atomic oxygen attack, the silicone adheres to the substrate as SiO_x. This glassy deposit should be spatially arranged in the image of the sources of silicone contamination. To view the contamination image, one can use ultrasensitive thickness measurement techniques, such as scanning variable-angle ellipsometry, to map the surface topography of the camera's substrate. The demonstration of a functional contamination camera would resolve the controversial debate concerning the amount and location of contamination sources, would allow corrective actions to be taken, and would demonstrate a useful tool for contamination documentation on future shuttle payloads, with near negligible effect on cost and weight.

  2. Image quality testing of assembled IR camera modules

    NASA Astrophysics Data System (ADS)

    Winters, Daniel; Erichsen, Patrik

    2013-10-01

    Infrared (IR) camera modules for the LWIR (8-12 μm) band that combine IR imaging optics with microbolometer focal plane array (FPA) sensors and readout electronics are increasingly becoming a mass-market product. At the same time, steady improvements in sensor resolution in the higher-priced markets raise the imaging-performance requirements for objectives and for proper alignment between objective and FPA. This puts pressure on camera manufacturers and system integrators to assess the image quality of finished camera modules in a cost-efficient and automated way for quality control or during end-of-line testing. In this paper we present recent development work done in the field of image quality testing of IR camera modules. This technology provides a wealth of additional information in contrast to more traditional test methods like minimum resolvable temperature difference (MRTD), which give only a subjective overall test result. Parameters that can be measured are image quality via the modulation transfer function (MTF), broadband or with various bandpass filters, on- and off-axis, and optical parameters such as effective focal length (EFL) and distortion. If the camera module allows refocusing the optics, additional parameters like best focus plane, image plane tilt, auto-focus quality, and chief ray angle can be characterized. Additionally, the homogeneity and response of the sensor with the optics can be characterized in order to calculate the appropriate tables for non-uniformity correction (NUC). The technology can also be used to control active alignment methods during mechanical assembly of optics to high resolution sensors. Other important points discussed are the flexibility of the technology to test IR modules with different form factors and electrical interfaces and, last but not least, its suitability for fully automated measurements in mass production.

  3. Bio-inspired hemispherical compound eye camera

    NASA Astrophysics Data System (ADS)

    Xiao, Jianliang; Song, Young Min; Xie, Yizhu; Malyarchuk, Viktor; Jung, Inhwa; Choi, Ki-Joong; Liu, Zhuangjian; Park, Hyunsung; Lu, Chaofeng; Kim, Rak-Hwan; Li, Rui; Crozier, Kenneth B.; Huang, Yonggang; Rogers, John A.

    2014-03-01

    Compound eyes in arthropods demonstrate distinct imaging characteristics from human eyes, with wide angle field of view, low aberrations, high acuity to motion and infinite depth of field. Artificial imaging systems with similar geometries and properties are of great interest for many applications. However, the challenges in building such systems with hemispherical, compound apposition layouts cannot be met through established planar sensor technologies and conventional optics. We present our recent progress in combining optics, materials, mechanics and integration schemes to build fully functional artificial compound eye cameras. Nearly full hemispherical shapes (about 160 degrees) with densely packed artificial ommatidia were realized. The number of ommatidia (180) is comparable to those of the eyes of fire ants and bark beetles. The devices combine elastomeric compound optical elements with deformable arrays of thin silicon photodetectors, which were fabricated in the planar geometries and then integrated and elastically transformed to hemispherical shapes. Imaging results and quantitative ray-tracing-based simulations illustrate key features of operation. These general strategies seem to be applicable to other compound eye devices, such as those inspired by moths and lacewings (refracting superposition eyes), lobster and shrimp (reflecting superposition eyes), and houseflies (neural superposition eyes).

  4. Cooling the dark energy camera instrument

    SciTech Connect

    Schmitt, R.L.; Cease, H.; DePoy, D.; Diehl, H.T.; Estrada, J.; Flaugher, B.; Kuhlmann, S.; Onal, Birce; Stefanik, A.; /Fermilab

    2008-06-01

    DECam, the camera for the Dark Energy Survey (DES), is undergoing general design and component testing. For an overview see DePoy, et al. in these proceedings. For a description of the imager, see Cease, et al. in these proceedings. The CCD instrument will be mounted at the prime focus of the CTIO Blanco 4m telescope. The instrument temperature will be 173K with a heat load of 113W. In similar applications, cooling CCD instruments at the prime focus has been accomplished by three general methods. Liquid nitrogen reservoirs have been constructed to operate in any orientation, pulse tube cryocoolers have been used when tilt angles are limited, and Joule-Thomson or Stirling cryocoolers have been used with smaller heat loads. Gifford-McMahon cooling has been used at the Cassegrain focus but not at the prime focus. For DES, the combined requirements of high heat load, temperature stability, low vibration, operation in any orientation, liquid nitrogen cost and limited available space led to the design of a pumped, closed-loop, circulating nitrogen system. At zenith the instrument will be twelve meters above the pump/cryocooler station. This cooling system is expected to have a 10,000 hour maintenance interval. This paper describes the engineering basis, including the thermal model, unbalanced forces, cooldown time, and the single- and two-phase flow model.

  5. Cooling the Dark Energy Camera instrument

    NASA Astrophysics Data System (ADS)

    Schmitt, R. L.; Cease, H.; DePoy, D.; Diehl, H. T.; Estrada, J.; Flaugher, B.; Kuhlmann, S.; Onal, Birce; Stefanik, A.

    2008-07-01

    DECam, the camera for the Dark Energy Survey (DES), is undergoing general design and component testing. For an overview see DePoy, et al. in these proceedings. For a description of the imager, see Cease, et al. in these proceedings. The CCD instrument will be mounted at the prime focus of the CTIO Blanco 4m telescope. The instrument temperature will be 173K with a heat load of 113W. In similar applications, cooling CCD instruments at the prime focus has been accomplished by three general methods. Liquid nitrogen reservoirs have been constructed to operate in any orientation, pulse tube cryocoolers have been used when tilt angles are limited, and Joule-Thomson or Stirling cryocoolers have been used with smaller heat loads. Gifford-McMahon cooling has been used at the Cassegrain focus but not at the prime focus. For DES, the combined requirements of high heat load, temperature stability, low vibration, operation in any orientation, liquid nitrogen cost and limited available space led to the design of a pumped, closed-loop, circulating nitrogen system. At zenith the instrument will be twelve meters above the pump/cryocooler station. This cooling system is expected to have a 10,000 hour maintenance interval. This paper describes the engineering basis, including the thermal model, unbalanced forces, cooldown time, and the single- and two-phase flow model.

  6. Speckle Camera Imaging of the Planet Pluto

    NASA Astrophysics Data System (ADS)

    Howell, Steve B.; Horch, Elliott P.; Everett, Mark E.; Ciardi, David R.

    2012-10-01

    We have obtained optical wavelength (692 nm and 880 nm) speckle imaging of the planet Pluto and its largest moon Charon. Using our DSSI speckle camera attached to the Gemini North 8 m telescope, we collected high resolution imaging with an angular resolution of ~20 mas, a value at the Gemini-N telescope diffraction limit. We have produced for this binary system the first speckle reconstructed images, from which we can measure not only the orbital separation and position angle for Charon, but also the diameters of the two bodies. Our measurements of these parameters agree, within the uncertainties, with the current best values for Pluto and Charon. The Gemini-N speckle observations of Pluto are presented to illustrate the capabilities of our instrument and the robust production of high accuracy, high spatial resolution reconstructed images. We hope our results will suggest additional applications of high resolution speckle imaging for other objects within our solar system and beyond. Based on observations obtained at the Gemini Observatory, which is operated by the Association of Universities for Research in Astronomy, Inc., under a cooperative agreement with the National Science Foundation on behalf of the Gemini partnership: the National Science Foundation (United States), the Science and Technology Facilities Council (United Kingdom), the National Research Council (Canada), CONICYT (Chile), the Australian Research Council (Australia), Ministério da Ciência, Tecnologia e Inovação (Brazil) and Ministerio de Ciencia, Tecnología e Innovación Productiva (Argentina).

  7. 16 CFR 501.1 - Camera film.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 16 Commercial Practices 1 2010-01-01 2010-01-01 false Camera film. 501.1 Section 501.1 Commercial... 500 § 501.1 Camera film. Camera film packaged and labeled for retail sale is exempt from the net... should be expressed, provided: (a) The net quantity of contents on packages of movie film and bulk...

  8. 16 CFR 501.1 - Camera film.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 16 Commercial Practices 1 2011-01-01 2011-01-01 false Camera film. 501.1 Section 501.1 Commercial... 500 § 501.1 Camera film. Camera film packaged and labeled for retail sale is exempt from the net... should be expressed, provided: (a) The net quantity of contents on packages of movie film and bulk...

  9. 16 CFR 501.1 - Camera film.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 16 Commercial Practices 1 2014-01-01 2014-01-01 false Camera film. 501.1 Section 501.1 Commercial Practices FEDERAL TRADE COMMISSION RULES, REGULATIONS, STATEMENT OF GENERAL POLICY OR INTERPRETATION AND... 500 § 501.1 Camera film. Camera film packaged and labeled for retail sale is exempt from the...

  10. 21 CFR 886.1120 - Ophthalmic camera.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 8 2014-04-01 2014-04-01 false Ophthalmic camera. 886.1120 Section 886.1120 Food... DEVICES OPHTHALMIC DEVICES Diagnostic Devices § 886.1120 Ophthalmic camera. (a) Identification. An ophthalmic camera is an AC-powered device intended to take photographs of the eye and the surrounding...

  11. 21 CFR 886.1120 - Ophthalmic camera.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 8 2013-04-01 2013-04-01 false Ophthalmic camera. 886.1120 Section 886.1120 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES OPHTHALMIC DEVICES Diagnostic Devices § 886.1120 Ophthalmic camera. (a) Identification. An ophthalmic camera is an AC-powered device...

  12. 21 CFR 892.1110 - Positron camera.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 8 2012-04-01 2012-04-01 false Positron camera. 892.1110 Section 892.1110 Food... DEVICES RADIOLOGY DEVICES Diagnostic Devices § 892.1110 Positron camera. (a) Identification. A positron camera is a device intended to image the distribution of positron-emitting radionuclides in the...

  13. 21 CFR 886.1120 - Ophthalmic camera.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 8 2012-04-01 2012-04-01 false Opthalmic camera. 886.1120 Section 886.1120 Food... DEVICES OPHTHALMIC DEVICES Diagnostic Devices § 886.1120 Opthalmic camera. (a) Identification. An ophthalmic camera is an AC-powered device intended to take photographs of the eye and the surrounding...

  14. 21 CFR 892.1110 - Positron camera.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Positron camera. 892.1110 Section 892.1110 Food... DEVICES RADIOLOGY DEVICES Diagnostic Devices § 892.1110 Positron camera. (a) Identification. A positron camera is a device intended to image the distribution of positron-emitting radionuclides in the...

  15. 16 CFR 501.1 - Camera film.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 16 Commercial Practices 1 2012-01-01 2012-01-01 false Camera film. 501.1 Section 501.1 Commercial Practices FEDERAL TRADE COMMISSION RULES, REGULATIONS, STATEMENT OF GENERAL POLICY OR INTERPRETATION AND... 500 § 501.1 Camera film. Camera film packaged and labeled for retail sale is exempt from the...

  16. 21 CFR 886.1120 - Ophthalmic camera.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Opthalmic camera. 886.1120 Section 886.1120 Food... DEVICES OPHTHALMIC DEVICES Diagnostic Devices § 886.1120 Opthalmic camera. (a) Identification. An ophthalmic camera is an AC-powered device intended to take photographs of the eye and the surrounding...

  17. 21 CFR 892.1110 - Positron camera.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 8 2013-04-01 2013-04-01 false Positron camera. 892.1110 Section 892.1110 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES RADIOLOGY DEVICES Diagnostic Devices § 892.1110 Positron camera. (a) Identification. A positron camera is a device intended to image...

  18. 16 CFR 501.1 - Camera film.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 16 Commercial Practices 1 2013-01-01 2013-01-01 false Camera film. 501.1 Section 501.1 Commercial Practices FEDERAL TRADE COMMISSION RULES, REGULATIONS, STATEMENT OF GENERAL POLICY OR INTERPRETATION AND... 500 § 501.1 Camera film. Camera film packaged and labeled for retail sale is exempt from the...

  19. 21 CFR 886.1120 - Ophthalmic camera.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Opthalmic camera. 886.1120 Section 886.1120 Food... DEVICES OPHTHALMIC DEVICES Diagnostic Devices § 886.1120 Opthalmic camera. (a) Identification. An ophthalmic camera is an AC-powered device intended to take photographs of the eye and the surrounding...

  20. 21 CFR 892.1110 - Positron camera.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 8 2014-04-01 2014-04-01 false Positron camera. 892.1110 Section 892.1110 Food... DEVICES RADIOLOGY DEVICES Diagnostic Devices § 892.1110 Positron camera. (a) Identification. A positron camera is a device intended to image the distribution of positron-emitting radionuclides in the...

  1. An auto-focusing CCD camera mount

    NASA Astrophysics Data System (ADS)

    Arbour, R. W.

    1994-08-01

    The traditional methods of focusing a CCD camera are either time consuming, difficult or, more importantly, indecisive. This paper describes a device designed to allow the observer to be confident that the camera will always be properly focused by sensing a selected star image and automatically adjusting the camera's focal position.
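
    The record does not spell out the focus search itself here; one common approach is to measure the width of the selected star image at several focuser positions and fit for the minimum, roughly as sketched below. The focuser/camera objects and the parabolic fit are assumptions for illustration only.

        import numpy as np

        def star_width(image):
            """RMS width of the star image, from background-subtracted second moments."""
            img = image.astype(float) - np.median(image)
            img[img < 0] = 0.0
            y, x = np.indices(img.shape)
            total = img.sum()
            cx, cy = (img * x).sum() / total, (img * y).sum() / total
            return np.sqrt((img * ((x - cx) ** 2 + (y - cy) ** 2)).sum() / total)

        def best_focus(focus_positions, widths):
            """Vertex of a parabola fitted through (position, width) samples."""
            a, b, _ = np.polyfit(focus_positions, widths, 2)
            return -b / (2.0 * a)

        # Hypothetical use with a motorised focuser exposing move() and a camera exposing expose():
        # widths = []
        # for pos in positions:
        #     focuser.move(pos)
        #     widths.append(star_width(camera.expose()))
        # focuser.move(best_focus(positions, widths))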

  2. Solid-state array cameras.

    PubMed

    Strull, G; List, W F; Irwin, E L; Farnsworth, D L

    1972-05-01

    Over the past few years there has been growing interest in the rapidly maturing technology of totally solid-state imaging. This paper presents a synopsis of developments made in this field at the Westinghouse ATL facilities, with emphasis on row-column organized monolithic arrays of diffused junction phototransistors. The complete processing sequence applicable to the fabrication of modern high-density arrays is described, from wafer ingot preparation to final sensor testing. Special steps found necessary for high-yield processing, such as surface etching prior to both sawing and lapping, are discussed along with the rationale behind their adoption. Camera systems built around matrix array photosensors are presented in a historical time-wise progression, beginning with the first 50 x 50 element converter developed in 1965 and running through the most recent 400 x 500 element system delivered in 1972. The freedom of mechanical architecture made available to system designers by solid-state array cameras is noted from the description of a bare-chip packaged cubic-inch camera. Hybrid scan systems employing one-dimensional line arrays are cited, and the basic tradeoffs to their use are listed. PMID:20119094

  3. Unassisted 3D camera calibration

    NASA Astrophysics Data System (ADS)

    Atanassov, Kalin; Ramachandra, Vikas; Nash, James; Goma, Sergio R.

    2012-03-01

    With the rapid growth of 3D technology, 3D image capture has become a critical part of the 3D feature set on mobile phones. 3D image quality is affected by the scene geometry as well as on-the-device processing. An automatic 3D system usually assumes known camera poses accomplished by factory calibration using a special chart. In real life settings, pose parameters estimated by factory calibration can be negatively impacted by movements of the lens barrel due to shaking, focusing, or camera drop. If any of these factors displaces the optical axes of either or both cameras, vertical disparity might exceed the maximum tolerable margin and the 3D user may experience eye strain or headaches. To make 3D capture more practical, one needs to consider unassisted (on arbitrary scenes) calibration. In this paper, we propose an algorithm that relies on detection and matching of keypoints between left and right images. Frames containing erroneous matches, along with frames with insufficiently rich keypoint constellations, are detected and discarded. Roll, pitch, yaw, and scale differences between left and right frames are then estimated. The algorithm performance is evaluated in terms of the remaining vertical disparity as compared to the maximum tolerable vertical disparity.
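
    The sketch below illustrates the keypoint-based idea under stated assumptions (OpenCV ORB features and a RANSAC similarity fit); it reports the roll and scale differences and the residual vertical disparity, while pitch/yaw recovery, which needs the full camera model, is omitted.

        import cv2
        import numpy as np

        def stereo_roll_scale_vdisp(left, right, min_matches=30):
            """Rough left/right alignment check from matched keypoints."""
            orb = cv2.ORB_create(nfeatures=2000)
            kp1, des1 = orb.detectAndCompute(left, None)
            kp2, des2 = orb.detectAndCompute(right, None)
            if des1 is None or des2 is None:
                return None  # keypoint constellation too poor; discard this frame pair
            matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)
            if len(matches) < min_matches:
                return None
            p1 = np.float32([kp1[m.queryIdx].pt for m in matches])
            p2 = np.float32([kp2[m.trainIdx].pt for m in matches])
            M, inliers = cv2.estimateAffinePartial2D(p1, p2, method=cv2.RANSAC)
            if M is None:
                return None
            roll = np.degrees(np.arctan2(M[1, 0], M[0, 0]))   # rotation between views
            scale = np.hypot(M[0, 0], M[1, 0])                # zoom/scale mismatch
            v_disp = np.median(p2[:, 1] - p1[:, 1])           # residual vertical disparity (pixels)
            return roll, scale, v_disp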

  4. Motion detection with camera shake

    NASA Astrophysics Data System (ADS)

    Kazui, Masato; Itoh, Masaya; Yaemori, Hiroki; Takauji, Hidenori; Kaneko, Shun'ichi

    2009-05-01

    A method for detecting an object's motion in images that suffer from camera shake or camera egomotion is proposed. This approach is based on edge orientation codes and on the entropy calculated from a histogram of the edge orientation codes. Here, entropy is extended to spatio-temporal entropy. We consider that the spatio-temporal entropy calculated from time-series orientation codes can represent motion complexity, e.g., the motion of a pedestrian. Our method can reject false positives caused by camera shake or background motion. Before the motion filtering, object candidates are detected by a frame-subtraction-based method. After the filtering, over-detected candidates are evaluated using the spatio-temporal entropy, and false positives are then rejected by a threshold. This method could reject 79% to 96% of all false positives in road roller and escalator scenes. The motion filtering decreased the detection rate somewhat because of motion coherency or small apparent motion of a target. In such cases, we need to introduce a tracking method such as a Particle Filter or Mean Shift Tracker. The running speed of our method is 32 to 46 ms per frame with a 160×120 pixel image on an Intel Pentium 4 CPU at 2.8 GHz. We think that this is fast enough for real-time detection. In addition, our method can be used as pre-processing for classifiers based on support vector machines or boosting.
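
    A rough sketch of the idea follows: quantise gradient directions into orientation codes, then compute the per-pixel entropy of those codes over a short time window. The bin count and window length are assumptions, not the paper's exact parameters.

        import numpy as np

        def orientation_codes(frame, n_codes=16):
            """Quantise gradient direction into n_codes bins (edge orientation codes)."""
            gy, gx = np.gradient(frame.astype(float))
            angle = np.arctan2(gy, gx)                          # -pi .. pi
            return ((angle + np.pi) / (2 * np.pi) * n_codes).astype(int) % n_codes

        def spatiotemporal_entropy(frames, n_codes=16):
            """Per-pixel entropy of orientation codes over a time window.  High entropy
            means the local edge direction keeps changing (complex motion such as a
            pedestrian); codes that stay constant, as under global shake or a static
            background, give low entropy and can be rejected by a threshold."""
            codes = np.stack([orientation_codes(f, n_codes) for f in frames])  # (T, H, W)
            t = codes.shape[0]
            hist = np.stack([(codes == k).sum(axis=0) for k in range(n_codes)])
            p = hist / float(t)
            with np.errstate(divide="ignore", invalid="ignore"):
                ent = -np.nansum(np.where(p > 0, p * np.log2(p), 0.0), axis=0)
            return ent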

  5. Coaxial fundus camera for ophthalmology

    NASA Astrophysics Data System (ADS)

    de Matos, Luciana; Castro, Guilherme; Castro Neto, Jarbas C.

    2015-09-01

    A fundus camera for ophthalmology is a high-definition device which must provide low-light illumination of the human retina, high resolution at the retina, and a reflection-free image. Those constraints make its optical design very sophisticated, but the most difficult to comply with are the reflection-free illumination and the final alignment, due to the high number of non-coaxial optical components in the system. Reflections of the illumination, both in the objective and at the cornea, mask image quality, and a poor alignment makes the sophisticated optical design useless. In this work we developed a totally axial optical system for a non-mydriatic fundus camera. The illumination is performed by a LED ring, coaxial with the optical system and composed of IR or visible LEDs. The illumination ring is projected by the objective lens onto the cornea. The objective, LED illuminator, and CCD lens are coaxial, making the final alignment easy to perform. The CCD + capture lens module is a CCTV camera with built-in autofocus and zoom, added to a 175 mm focal length doublet corrected for infinity, making the system easy to operate and very compact.

  6. NARROW-K-BAND OBSERVATIONS OF THE GJ 1214 SYSTEM

    SciTech Connect

    Colón, Knicole D.; Gaidos, Eric

    2013-10-10

    GJ 1214 is a nearby M dwarf star that hosts a transiting super-Earth-size planet, making this system an excellent target for atmospheric studies. Most studies find that the transmission spectrum of GJ 1214b is flat, which favors either a high mean molecular weight or cloudy/hazy hydrogen (H) rich atmosphere model. Photometry at short wavelengths (<0.7 μm) and in the K band can discriminate the most between these different atmosphere models for GJ 1214b, but current observations do not have sufficiently high precision. We present photometry of seven transits of GJ 1214b through a narrow K-band (2.141 μm) filter with the Wide Field Camera on the 3.8 m United Kingdom Infrared Telescope. Our photometric precision is typically 1.7 × 10⁻³ (for a single transit), comparable with other ground-based observations of GJ 1214b. We measure a planet-star radius ratio of 0.1158 ± 0.0013, which, along with other studies, also supports a flat transmission spectrum for GJ 1214b. Since this does not exclude a scenario where GJ 1214b has an H-rich envelope with heavy elements that are sequestered below a cloud/haze layer, we compare K-band observations with models of H₂ collision-induced absorption in an atmosphere for a range of temperatures. While we find no evidence for deviation from a flat spectrum (slope s = 0.0016 ± 0.0038), an H₂-dominated upper atmosphere (<60 mbar) cannot be excluded. More precise observations at <0.7 μm and in the K band, as well as a uniform analysis of all published data, would be useful for establishing more robust limits on atmosphere models for GJ 1214b.
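
    As a quick arithmetic check, the quoted radius ratio converts directly into a transit depth (the fractional flux drop, neglecting limb darkening):

        rp_over_rs = 0.1158
        depth = rp_over_rs ** 2
        print(f"{depth:.4f}")  # about 0.0134, i.e. a ~1.3% drop in flux during transit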

  7. Narrow field electromagnetic sensor system and method

    DOEpatents

    McEwan, Thomas E.

    1996-01-01

    A narrow field electromagnetic sensor system and method of sensing a characteristic of an object provide the capability to realize a characteristic of an object such as density, thickness, or presence, for any desired coordinate position on the object. One application is imaging. The sensor can also be used as an obstruction detector or an electronic trip wire with a narrow field without the disadvantages of impaired performance when exposed to dirt, snow, rain, or sunlight. The sensor employs a transmitter for transmitting a sequence of electromagnetic signals in response to a transmit timing signal, a receiver for sampling only the initial direct RF path of the electromagnetic signal while excluding all other electromagnetic signals in response to a receive timing signal, and a signal processor for processing the sampled direct RF path electromagnetic signal and providing an indication of the characteristic of an object. Usually, the electromagnetic signal is a short RF burst and the obstruction must provide a substantially complete eclipse of the direct RF path. By employing time-of-flight techniques, a timing circuit controls the receiver to sample only the initial direct RF path of the electromagnetic signal while not sampling indirect path electromagnetic signals. The sensor system also incorporates circuitry for ultra-wideband spread spectrum operation that reduces interference to and from other RF services while allowing co-location of multiple electronic sensors without the need for frequency assignments.

  8. Narrow field electromagnetic sensor system and method

    DOEpatents

    McEwan, T.E.

    1996-11-19

    A narrow field electromagnetic sensor system and method of sensing a characteristic of an object provide the capability to realize a characteristic of an object such as density, thickness, or presence, for any desired coordinate position on the object. One application is imaging. The sensor can also be used as an obstruction detector or an electronic trip wire with a narrow field without the disadvantages of impaired performance when exposed to dirt, snow, rain, or sunlight. The sensor employs a transmitter for transmitting a sequence of electromagnetic signals in response to a transmit timing signal, a receiver for sampling only the initial direct RF path of the electromagnetic signal while excluding all other electromagnetic signals in response to a receive timing signal, and a signal processor for processing the sampled direct RF path electromagnetic signal and providing an indication of the characteristic of an object. Usually, the electromagnetic signal is a short RF burst and the obstruction must provide a substantially complete eclipse of the direct RF path. By employing time-of-flight techniques, a timing circuit controls the receiver to sample only the initial direct RF path of the electromagnetic signal while not sampling indirect path electromagnetic signals. The sensor system also incorporates circuitry for ultra-wideband spread spectrum operation that reduces interference to and from other RF services while allowing co-location of multiple electronic sensors without the need for frequency assignments. 12 figs.

  9. Studies of narrow autoionizing resonances in gadolinium

    SciTech Connect

    Bushaw, Bruce A.; Nortershauser, W.; Blaum, K.; Wendt, Klaus

    2003-06-30

    The autoionization (AI) spectrum of gadolinium between the first and second limits has been investigated by triple-resonance excitation with high-resolution cw lasers. A large number of narrow AI resonances have been observed and assigned total angular momentum J values. The resonances are further divided into members of AI Rydberg series converging to the second limit or other "interloping" levels. Fine structure in the Rydberg series has been identified and interpreted in terms of Jc-j coupling. A number of detailed studies have been performed on the interloping resonances: These include lifetime determination by lineshape analysis, isotope shifts, hyperfine structure, and photoionization saturation parameters. The electronic structure of the interloping levels is discussed in terms of these studies. Linewidths generally decrease with increasing total angular momentum and the J = 7 resonances are extremely narrow with Lorentzian widths ranging from < 1 MHz up to 157 MHz. The strongest resonances are found to have cross-sections of ≈10⁻¹² cm² and photoionization can be saturated with powers available from cw diode lasers.

  10. Equilibrium contact angle or the most-stable contact angle?

    PubMed

    Montes Ruiz-Cabello, F J; Rodríguez-Valverde, M A; Cabrerizo-Vílchez, M A

    2014-04-01

    It is well established that the equilibrium contact angle in a thermodynamic framework is an "unattainable" contact angle. Instead, the most-stable contact angle, obtained from mechanical stimuli of the system, is indeed experimentally accessible. Monitoring the susceptibility of a sessile drop to a mechanical stimulus makes it possible to identify the most stable drop configuration within the practical range of contact angle hysteresis. Two different stimuli may be used with sessile drops: mechanical vibration and tilting. The drop most stable against vibration should reveal the changeless contact angle, whereas against the gravity force it should reveal the highest resistance to sliding. After the corresponding mechanical stimulus, once the excited drop configuration is examined, the focus is on the contact angle of the initial drop configuration. This methodology requires adequately mapping the static drop configurations with different stable contact angles. The most-stable contact angle, together with the advancing and receding contact angles, completes the description of the physically realizable configurations of a solid-liquid system. Since the most-stable contact angle is energetically significant, it may be used in the Wenzel, Cassie or Cassie-Baxter equations accordingly, or for surface energy evaluation. PMID:24140073

  11. Sentinel node detection in an animal study: evaluation of a new portable gamma camera.

    PubMed

    Kopelman, Doron; Blevis, Ira; Iosilevsky, Galina; Hatoum, Ossama A; Zaretzki, Assaf; Shofti, Rona; Salmon, Tal; Israel, Ora; Hashmonai, Moshe

    2007-01-01

    We tested the capacity of a newly developed portable gamma camera to precisely locate sentinel nodes after injection of a radiotracer. Two sets of experiments were performed on eight pigs under general anesthesia. A 99mTc-nanocolloid and dye complex was injected in the submuscular layer of the small bowel in the first set and subcutaneously in the knee region in the second set of experiments. Image acquisition of the sentinel nodes was performed with the camera placed at various angles. A mosaic of images was obtained encompassing the injection sites, lymphatic pathways, and sentinel lymph nodes. Three-dimensional visualizations were obtained, allowing the precise location and complete excision of these nodes. The use of the portable gamma camera allowed the rapid visualization of the lymphatic pathways leading from the injection sites to the sentinel nodes and the precise location of these nodes. The camera was also useful for verifying the complete removal of the labeled target tissues. PMID:17972472

  12. Passive Millimeter Wave Camera (PMMWC) at TRW

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Engineers at TRW, Redondo Beach, California, inspect the Passive Millimeter Wave Camera, a weather-piercing camera designed to see through fog, clouds, smoke and dust. Operating in the millimeter wave portion of the electromagnetic spectrum, the camera creates visual-like video images of objects, people, runways, obstacles and the horizon. A demonstration camera (shown in photo) has been completed and is scheduled for checkout tests and flight demonstration. Engineer (left) holds a compact, lightweight circuit board containing 40 complete radiometers, including antenna, monolithic millimeter wave integrated circuit (MMIC) receivers and signal processing and readout electronics that forms the basis for the camera's 1040-element focal plane array.

  13. Passive Millimeter Wave Camera (PMMWC) at TRW

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Engineers at TRW, Redondo Beach, California, inspect the Passive Millimeter Wave Camera, a weather-piercing camera designed to 'see' through fog, clouds, smoke and dust. Operating in the millimeter wave portion of the electromagnetic spectrum, the camera creates visual-like video images of objects, people, runways, obstacles and the horizon. A demonstration camera (shown in photo) has been completed and is scheduled for checkout tests and flight demonstration. Engineer (left) holds a compact, lightweight circuit board containing 40 complete radiometers, including antenna, monolithic millimeter wave integrated circuit (MMIC) receivers and signal processing and readout electronics that forms the basis for the camera's 1040-element focal plane array.

  14. HHEBBES! All sky camera system: status update

    NASA Astrophysics Data System (ADS)

    Bettonvil, F.

    2015-01-01

    A status update is given of the HHEBBES! all-sky camera system. HHEBBES!, an automatic camera for capturing bright meteor trails, is based on a DSLR camera and a liquid crystal chopper for measuring the angular velocity. The purposes of the system are to a) recover meteorites and b) identify origin/parental bodies. In 2015, two new cameras were rolled out: BINGO!, similar to HHEBBES! and also in The Netherlands, and POgLED, in Serbia. BINGO! is the first camera equipped with a longer focal length fisheye lens, to further increase the accuracy. Several minor improvements have been made and the data reduction pipeline was used for processing two prominent Dutch fireballs.

  15. Mini gamma camera, camera system and method of use

    DOEpatents

    Majewski, Stanislaw; Weisenberger, Andrew G.; Wojcik, Randolph F.

    2001-01-01

    A gamma camera comprising essentially and in order from the front outer or gamma ray impinging surface: 1) a collimator, 2) a scintillator layer, 3) a light guide, 4) an array of position sensitive, high resolution photomultiplier tubes, and 5) printed circuitry for receipt of the output of the photomultipliers. There is also described a system wherein the output supplied by the high resolution, position sensitive photomultiplier tubes is communicated to: a) a digitizer and b) a computer where it is processed using advanced image processing techniques and a specific algorithm to calculate the center of gravity of any abnormality observed during imaging, and c) optional image display and telecommunications ports.
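
    The record names a center-of-gravity calculation without detailing it; a generic intensity-weighted centroid over a thresholded hot spot, as sketched below, is one plausible reading and is purely illustrative.

        import numpy as np

        def hot_spot_center_of_gravity(counts, threshold_fraction=0.5):
            """Intensity-weighted centroid (x, y) of the brightest region of a gamma image."""
            img = counts.astype(float)
            mask = img >= threshold_fraction * img.max()   # keep only the hot spot
            y, x = np.indices(img.shape)
            w = img * mask
            total = w.sum()
            return (w * x).sum() / total, (w * y).sum() / total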

  16. 33 CFR 117.561 - Kent Island Narrows.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 33 Navigation and Navigable Waters 1 2010-07-01 2010-07-01 false Kent Island Narrows. 117.561... DRAWBRIDGE OPERATION REGULATIONS Specific Requirements Maryland § 117.561 Kent Island Narrows. The draw of the U.S. Route 50/301 bridge, mile 1.0, Kent Island Narrows, operates as follows: (a) From November...

  17. Promoting L2 Vocabulary Learning through Narrow Reading

    ERIC Educational Resources Information Center

    Kang, Eun Young

    2015-01-01

    Krashen (2004) has advocated that narrow reading, i.e., reading a series of texts addressing one specific topic, is an effective method to grow vocabulary. While narrow reading has been championed to have many advantages for L2 vocabulary learning, there remains a relative dearth of empirical studies that test the impact of narrow reading on L2…

  18. 2. Photocopied July 1971 from photostat Jordan Narrows Folder #1, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. Photocopied July 1971 from photostat Jordan Narrows Folder #1, Engineering Department, Utah Power and Light Co., Salt Lake City, Utah. JORDAN NARROWS STATION. PLAN AND SECTION. - Salt Lake City Water & Electrical Power Company, Jordan Narrows Hydroelectric Plant, Jordan River, Riverton, Salt Lake County, UT

  19. 3D camera tracking from disparity images

    NASA Astrophysics Data System (ADS)

    Kim, Kiyoung; Woo, Woontack

    2005-07-01

    In this paper, we propose a robust camera tracking method that uses disparity images computed from the known parameters of a 3D camera and multiple epipolar constraints. We assume that the baselines between lenses in the 3D camera and the intrinsic parameters are known. The proposed method reduces the camera motion uncertainty encountered during camera tracking. Specifically, we first obtain corresponding feature points between initial lenses using a normalized correlation method. In conjunction with the matched features, we get disparity images. When the camera moves, the corresponding feature points obtained from each lens of the 3D camera are robustly tracked via the Kanade-Lucas-Tomasi (KLT) tracking algorithm. Secondly, the relative pose parameters of each lens are calculated via Essential matrices. Essential matrices are computed from the Fundamental matrix, which is calculated using the normalized 8-point algorithm with a RANSAC scheme. Then, we determine the scale factor of the translation matrix by d-motion. This is required because the camera motion obtained from the Essential matrix is only determined up to scale. Finally, we optimize the camera motion using multiple epipolar constraints between lenses and d-motion constraints computed from disparity images. The proposed method can be widely adopted in Augmented Reality (AR) applications, 3D reconstruction using a 3D camera, and surveillance systems which need not only depth information but also camera motion parameters in real time.
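
    A sketch of the pose-recovery step, under the stated assumptions of OpenCV and known intrinsics K, is shown below; the d-motion scale determination and the multi-lens optimisation described above are not reproduced.

        import cv2
        import numpy as np

        def relative_pose(pts_prev, pts_curr, K):
            """Relative rotation R and unit translation t between two views of one lens:
            fundamental matrix by RANSAC, essential matrix via the intrinsics, then pose
            recovery.  The translation is only known up to scale at this point."""
            F, mask = cv2.findFundamentalMat(pts_prev, pts_curr, cv2.FM_RANSAC, 1.0, 0.99)
            if F is None:
                return None
            E = K.T @ F @ K
            inliers = mask.ravel().astype(bool)
            _, R, t, _ = cv2.recoverPose(E, pts_prev[inliers], pts_curr[inliers], K)
            return R, t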

  20. Optimum Projection Angle for Attaining Maximum Distance in a Soccer Punt Kick

    PubMed Central

    Linthorne, Nicholas P.; Patel, Dipesh S.

    2011-01-01

    To produce the greatest horizontal distance in a punt kick, the ball must be projected at an appropriate angle. Here, we investigated the optimum projection angle that maximises the distance attained in a punt kick by a soccer goalkeeper. Two male players performed many maximum-effort kicks using projection angles of between 10° and 90°. The kicks were recorded by a video camera at 100 Hz and a 2D biomechanical analysis was conducted to obtain measures of the projection velocity, projection angle, projection height, ball spin rate, and foot velocity at impact. Each player’s optimum projection angle was calculated by substituting mathematical equations for the relationships between the projection variables into the equations for the aerodynamic flight of a soccer ball. The calculated optimum projection angles were in agreement with the players’ preferred projection angles (40° and 44°). In projectile sports even a small dependence of projection velocity on projection angle is sufficient to produce a substantial shift in the optimum projection angle away from 45°. In the punt kicks studied here, the optimum projection angle was close to 45° because the projection velocity of the ball remained almost constant across all projection angles. This result is in contrast to throwing and jumping for maximum distance, where the projection velocity the athlete is able to achieve decreases substantially with increasing projection angle and so the optimum projection angle is well below 45°. Key points: The optimum projection angle that maximizes the distance of a punt kick by a soccer goalkeeper is about 45°. The optimum projection angle is close to 45° because the projection velocity of the ball is almost the same at all projection angles. This result is in contrast to throwing and jumping for maximum distance, where the optimum projection angle is well below 45° because the projection velocity the athlete is able to achieve decreases substantially with increasing projection angle. PMID:24149315
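
    The effect described above can be reproduced with a very simple numerical model; the sketch below uses vacuum flight (no drag or spin, unlike the full aerodynamic analysis) and purely illustrative speeds, yet shows how an angle-dependent release speed pulls the optimum below 45°.

        import math

        def flight_distance(speed, angle_deg, height=0.0, g=9.81):
            """Vacuum projectile range from release height `height`; drag and spin ignored."""
            a = math.radians(angle_deg)
            vx, vy = speed * math.cos(a), speed * math.sin(a)
            t = (vy + math.sqrt(vy ** 2 + 2.0 * g * height)) / g
            return vx * t

        def optimum_angle(speed_of_angle, height=0.0):
            """Best projection angle when the release speed may depend on the angle."""
            return max(range(1, 90), key=lambda a: flight_distance(speed_of_angle(a), a, height))

        # Constant release speed (as measured for the punt kicks): optimum stays at 45 degrees.
        print(optimum_angle(lambda a: 25.0))
        # Speed that falls off with angle (typical of throwing and jumping): optimum drops to ~33 degrees.
        print(optimum_angle(lambda a: 25.0 - 0.15 * a))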

  1. High-speed measurement of nozzle swing angle of rocket engine based on monocular vision

    NASA Astrophysics Data System (ADS)

    Qu, Yufu; Yang, Haijuan

    2015-02-01

    A nozzle angle measurement system based on monocular vision is proposed to achieve high-speed, non-contact measurement of the swing angle of a rocket engine nozzle. The measurement system consists of two illumination sources, a lens, a target board with spots, a high-speed camera, an image acquisition card and a PC. The target board with spots was fixed on the end of the rocket engine nozzle. The image of the target board, which moved along with the nozzle swing, was captured by the high-speed camera and transferred to the PC by the image acquisition card. A data processing algorithm was then used to obtain the swing angle of the engine nozzle. Experiments show that the accuracy of the swing angle measurement was 0.2° and the measurement frequency was up to 500 Hz.
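
    A common monocular route to this kind of measurement is a PnP pose fit of the known target-board geometry; the sketch below assumes that route and is not necessarily the authors' algorithm, and the reference ("zero swing") orientation and point ordering are assumptions.

        import cv2
        import numpy as np

        def board_rotation_deg(object_points, image_points, K, dist_coeffs):
            """Rotation angle (degrees) of the target board in one frame, from solvePnP.
            `object_points` are the known 3-D spot positions on the board and
            `image_points` their detected pixel locations, in matching order."""
            ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, dist_coeffs)
            if not ok:
                return None
            return float(np.degrees(np.linalg.norm(rvec)))  # total rotation about the rvec axis

        # The swing angle would then be this rotation measured relative to the
        # board pose recorded at the nozzle's rest position.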

  2. Photometric Calibration and Image Stitching for a Large Field of View Multi-Camera System

    PubMed Central

    Lu, Yu; Wang, Keyi; Fan, Gongshu

    2016-01-01

    A new compact large field of view (FOV) multi-camera system is introduced. The camera is based on seven tiny complementary metal-oxide-semiconductor sensor modules covering over a 160° × 160° FOV. Although image stitching has been studied extensively, sensor and lens differences have not been considered in previous multi-camera devices. In this study, we have calibrated the photometric characteristics of the multi-camera device. Lenses were not mounted on the sensor during the radiometric response calibration, to eliminate the influence of the focusing effect of uniform light from an integrating sphere. The linearity range of the radiometric response, non-linear response characteristics, sensitivity, and dark current of the camera response function are presented. The R, G, and B channels have different responses for the same illuminance. Vignetting artifact patterns have been tested. The actual luminance of the object is retrieved from the sensor calibration results and is used to blend images so that panoramas reflect the scene luminance more faithfully. This compensates for the limitations of stitching methods that rely on smoothing alone. The dynamic range limitation can be resolved by using multiple cameras that cover a large field of view instead of a single image sensor with a wide-angle lens; the dynamic range is expanded by 48-fold in this system. We can obtain seven images in one shot with this multi-camera system, at 13 frames per second. PMID:27077857
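
    The per-sensor corrections described above amount to dark subtraction, vignetting (flat-field) division and response linearisation before the frames are blended; a minimal sketch, assuming the calibration tables already exist, is:

        import numpy as np

        def radiance_from_raw(raw, dark, flat, inverse_response=None):
            """Convert one sensor's raw frame to relative scene luminance before blending.
            Dark-frame subtraction removes the offset and dark current, division by a
            normalised flat removes the vignetting pattern, and an optional per-channel
            inverse response curve linearises the values."""
            corrected = (raw.astype(float) - dark) / np.clip(flat / flat.mean(), 1e-6, None)
            if inverse_response is not None:
                corrected = inverse_response(corrected)
            return corrected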

  3. Photometric Calibration and Image Stitching for a Large Field of View Multi-Camera System.

    PubMed

    Lu, Yu; Wang, Keyi; Fan, Gongshu

    2016-01-01

    A new compact large field of view (FOV) multi-camera system is introduced. The camera is based on seven tiny complementary metal-oxide-semiconductor sensor modules covering over a 160° × 160° FOV. Although image stitching has been studied extensively, sensor and lens differences have not been considered in previous multi-camera devices. In this study, we have calibrated the photometric characteristics of the multi-camera device. Lenses were not mounted on the sensor during the radiometric response calibration, to eliminate the influence of the focusing effect of uniform light from an integrating sphere. The linearity range of the radiometric response, non-linear response characteristics, sensitivity, and dark current of the camera response function are presented. The R, G, and B channels have different responses for the same illuminance. Vignetting artifact patterns have been tested. The actual luminance of the object is retrieved from the sensor calibration results and is used to blend images so that panoramas reflect the scene luminance more faithfully. This compensates for the limitations of stitching methods that rely on smoothing alone. The dynamic range limitation can be resolved by using multiple cameras that cover a large field of view instead of a single image sensor with a wide-angle lens; the dynamic range is expanded by 48-fold in this system. We can obtain seven images in one shot with this multi-camera system, at 13 frames per second. PMID:27077857

  4. Rover mast calibration, exact camera pointing, and camera handoff for visual target tracking

    NASA Technical Reports Server (NTRS)

    Kim, Won S.; Ansar, Adnan I.; Steele, Robert D.

    2005-01-01

    This paper presents three technical elements that we have developed to improve the accuracy of visual target tracking for single-sol approach-and-instrument placement in future Mars rover missions. An accurate, straightforward method of rover mast calibration is achieved by using a total station, a camera calibration target, and four prism targets mounted on the rover. The method was applied to the Rocky8 rover mast calibration and yielded a 1.1-pixel rms residual error. Camera pointing requires inverse kinematic solutions for mast pan and tilt angles such that the target image appears right at the center of the camera image. Two issues were raised. Mast camera frames are in general not parallel to the masthead base frame. Further, the optical axis of the camera model in general does not pass through the center of the image. Despite these issues, we managed to derive non-iterative closed-form exact solutions, which were verified with Matlab routines. Actual camera pointing experiments over 50 random target image points yielded less than 1.3-pixel rms pointing error. Finally, a purely geometric method for camera handoff using stereo views of the target has been developed. Experimental test runs show less than 2.5 pixels error on the high-resolution Navcam for Pancam-to-Navcam handoff, and less than 4 pixels error on the lower-resolution Hazcam for Navcam-to-Hazcam handoff.
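
    For intuition, the idealised pan/tilt pointing solution, which ignores exactly the offset camera frames and off-centre optical axis that the paper's closed-form solutions account for, is:

        import math

        def pan_tilt_for_target(x, y, z):
            """Pan and tilt angles (radians) that point a camera boresight at a target
            (x, y, z) given in the masthead base frame, assuming the optical axis
            passes through the pan/tilt intersection (the idealisation noted above)."""
            pan = math.atan2(y, x)
            tilt = math.atan2(z, math.hypot(x, y))
            return pan, tilt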

  5. Advanced camera image data acquisition system for Pi-of-the-Sky

    NASA Astrophysics Data System (ADS)

    Kwiatkowski, Maciej; Kasprowicz, Grzegorz; Pozniak, Krzysztof; Romaniuk, Ryszard; Wrochna, Grzegorz

    2008-11-01

    The paper describes a new generation of high performance, remotely controlled CCD cameras designed for astronomical applications. A completely new camera PCB was designed, manufactured, tested and commissioned. The CCD chip was positioned differently than previously, resulting in better performance of the astronomical video data acquisition system. The camera was built using a low-noise, 4Mpixel CCD circuit by STA. The electronic circuit of the camera is highly parameterized and reconfigurable, as well as modular in comparison with the first-generation solution, due to the application of open software solutions and an FPGA circuit, an Altera Cyclone EP1C6. New algorithms were implemented in the FPGA chip. The following advanced electronic circuits were used in the camera system: the CY7C68013a microcontroller (8051 core) by Cypress, the AD9826 image processor by Analog Devices, the RTL8169s GigEth interface by Realtek, SDRAM AT45DB642 memory by Atmel, and an ARM926EJ-S AT91SAM9260 microprocessor by ARM and Atmel. Software solutions for the camera and its remote control, as well as image data acquisition, are based only on open source platforms. The ISI and V4L2 API image interfaces, the AMBA/AHB data bus, and the INDI protocol are used. The camera will be replicated in 20 pieces and is designed for continuous on-line, wide-angle observations of the sky in the Pi-of-the-Sky research program.

  6. Characterization of the Series 1000 Camera System

    SciTech Connect

    Kimbrough, J; Moody, J; Bell, P; Landen, O

    2004-04-07

    The National Ignition Facility requires a compact network addressable scientific grade CCD camera for use in diagnostics ranging from streak cameras to gated x-ray imaging cameras. Due to the limited space inside the diagnostic, an analog and digital input/output option in the camera controller permits control of both the camera and the diagnostic by a single Ethernet link. The system consists of a Spectral Instruments Series 1000 camera, a PC104+ controller, and power supply. The 4k by 4k CCD camera has a dynamic range of 70 dB with less than 14 electron read noise at a 1MHz readout rate. The PC104+ controller includes 16 analog inputs, 4 analog outputs and 16 digital input/output lines for interfacing to diagnostic instrumentation. A description of the system and performance characterization is reported.

  7. Characterization of the series 1000 camera system

    SciTech Connect

    Kimbrough, J.R.; Moody, J.D.; Bell, P.M.; Landen, O.L.

    2004-10-01

    The National Ignition Facility requires a compact network addressable scientific grade charge coupled device (CCD) camera for use in diagnostics ranging from streak cameras to gated x-ray imaging cameras. Due to the limited space inside the diagnostic, an analog and digital input/output option in the camera controller permits control of both the camera and the diagnostic by a single Ethernet link. The system consists of a Spectral Instruments Series 1000 camera, a PC104+ controller, and power supply. The 4k by 4k CCD camera has a dynamic range of 70 dB with less than 14 electron read noise at a 1 MHz readout rate. The PC104+ controller includes 16 analog inputs, four analog outputs, and 16 digital input/output lines for interfacing to diagnostic instrumentation. A description of the system and performance characterization is reported.

  8. Research on evaluation method of CMOS camera

    NASA Astrophysics Data System (ADS)

    Zhang, Shaoqiang; Han, Weiqiang; Cui, Lanfang

    2014-09-01

    In some professional imaging applications, key parameters of a CMOS camera must be tested in order to evaluate the performance of the device. To meet this requirement, this paper proposes a complete test method for evaluating CMOS cameras. Because CMOS cameras exhibit large fixed-pattern noise, the method uses a pixel-based 'photon transfer curve' approach to measure the gain and read noise of the camera. The advantage of this method is that it effectively eliminates the error introduced by response nonlinearity. The cause of the photoelectric response nonlinearity of CMOS cameras is then analyzed theoretically, and a formula for the response nonlinearity is derived. Finally, the proposed method is used to test a 2560 x 2048 pixel CMOS camera, and the validity and feasibility of the method are analyzed.
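
    For reference, a conventional photon transfer curve extracts gain and read noise from the variance-versus-mean relation of flat-field frame pairs; differencing the two frames of each pair removes fixed-pattern noise. The sketch below shows that standard construction under stated assumptions; the paper's pixel-based variant, which fits the curve for each pixel individually to suppress nonlinearity errors, is not reproduced here, and all names are illustrative.

```python
import numpy as np

def photon_transfer(flat_pairs, bias):
    """Estimate camera gain (e-/DN) and read noise (e-) from pairs of
    flat-field frames taken at increasing exposure levels.

    flat_pairs : list of (frame_a, frame_b) 2-D arrays in DN
    bias       : master bias frame in DN
    """
    means, variances = [], []
    for a, b in flat_pairs:
        a = a.astype(float) - bias
        b = b.astype(float) - bias
        means.append(0.5 * (a.mean() + b.mean()))
        # variance of the difference cancels fixed-pattern noise;
        # divide by 2 to recover the single-frame temporal variance
        variances.append(np.var(a - b) / 2.0)
    # shot-noise regime: variance_DN = mean_DN / gain + read_noise_DN**2
    slope, intercept = np.polyfit(means, variances, 1)
    gain = 1.0 / slope                               # e- per DN
    read_noise_e = gain * np.sqrt(max(intercept, 0.0))
    return gain, read_noise_e
```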

  9. Cerebellopontine Angle Epidermoids

    PubMed Central

    Doyle, Karen Jo; De la Cruz, Antonio

    1996-01-01

    Epidermoids, or congenital cholesteatomas, constitute about 0.2% to 1.5% of intracranial tumors, and 3% to 5% of tumors of the cerebellopontine angle (CPA). We review the surgical management of CPA epidermoids in 13 patients at the House Ear Clinic for the years 1978 to 1993. There were seven male and six female patients, ranging in age from 27 to 59 years (average, 40 years). Tumors ranged in size from 3.5 cm to 7.0 cm, and the surgical approach was tailored to the tumor extent and location. All patients complained at presentation of unilateral hearing loss, and nine had poor speech discrimination (less than 50%) preoperatively. Serviceable hearing was preserved in two patients. Two patients presented with facial nerve symptoms, and four cases had postoperative permanent facial nerve paralysis (House-Brackmann Grade V or VI). There were no surgical deaths. Four patients required second surgeries to remove residual cholesteatoma. Compared with prior series, we describe a higher rate of total tumor removal, as well as a higher rate of second operations, indicating a more aggressive approach to these lesions. PMID:17170950

  10. AWiFS camera for Resourcesat

    NASA Astrophysics Data System (ADS)

    Dave, Himanshu; Dewan, Chirag; Paul, Sandip; Sarkar, S. S.; Pandya, Himanshu; Joshi, S. R.; Mishra, Ashish; Detroja, Manoj

    2006-12-01

    Remote sensors have been developed and used extensively worldwide on aircraft and space platforms, and India has developed and launched many sensors into space to survey natural resources. The AWiFS is one such camera, launched onboard the Resourcesat-1 satellite by ISRO in 2003. It is a medium-resolution camera with a 5-day revisit, designed for studies related to forestry, vegetation, soil, snow and disaster warning. The camera provides 56 m (nadir) resolution from 817 km altitude in three visible bands and one SWIR band. This paper covers the configuration of the Resourcesat-1 AWiFS camera, its onboard performance, and highlights of the camera being developed for Resourcesat-2. The AWiFS is realized with two identical cameras, AWiFS-A and AWiFS-B, which together cover a large field of view of 48°. Each camera consists of independent collecting optics with associated 6000-element detectors and electronics for the four bands. The visible bands use linear silicon CCDs with 10 μm x 7 μm elements, while the SWIR band uses 13 μm staggered InGaAs linear active-pixel detectors. The camera electronics are custom designed for each detector based on detector and system requirements. The camera covers the full dynamic range up to 100% albedo with a single gain setting and 12-bit digitization, of which the 10 MSBs are transmitted. The saturation radiance of each band can also be selected by telecommand. The camera provides a very high SNR of about 700 near saturation. The camera components are housed in specially designed Invar structures. The AWiFS camera onboard Resourcesat-1 is providing excellent imagery, and the data are routinely used worldwide. The AWiFS for Resourcesat-2 is being developed with the overall performance specifications remaining the same; its camera electronics have been miniaturized, reducing the number of hardware packages, their size, and their weight to one third.
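
    As illustrative arithmetic (not stated in the record), the quoted 56 m nadir resolution from 817 km corresponds to a per-pixel angular footprint of about 68.5 microradians; combining that with the 10 μm visible-band pixel pitch under the simple pinhole relation GSD = H·p/f would imply an effective focal length of roughly 146 mm. The actual optical design is not given here, so the derived number is only a consistency check.

```python
altitude_m  = 817e3     # orbital altitude quoted above
gsd_m       = 56.0      # nadir ground sample distance quoted above
pixel_pitch = 10e-6     # 10 micron visible-band pixel pitch quoted above

ifov_rad = gsd_m / altitude_m                  # instantaneous field of view per pixel
focal_length_m = pixel_pitch / ifov_rad        # implied effective focal length (pinhole model)

print(f"IFOV ~ {ifov_rad * 1e6:.1f} microradians per pixel")
print(f"implied focal length ~ {focal_length_m * 1e3:.0f} mm")
```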

  11. Second-Generation Multi-Angle Imaging Spectroradiometer

    NASA Technical Reports Server (NTRS)

    Macenka, Steven; Hovland, Larry; Preston, Daniel; Zellers, Brian; Downing, Kevin

    2004-01-01

    A report discusses an early phase in the development of the MISR-2, a second, improved version of the Multi-angle Imaging SpectroRadiometer (MISR), which has been in orbit around the Earth aboard NASA's Terra spacecraft since 1999. Like the MISR, the MISR-2 would contain a pushbroom array of nine charge-coupled-device (CCD) cameras, one aimed at the nadir and the others aimed at different angles sideways from the nadir. The major improvements embodied in the MISR-2 would be the following: a new folded-reflective-optics design would render the MISR-2 only a third as massive as the MISR; smaller filters and electronic circuits would enable a reduction in volume to a sixth of that of the MISR; the MISR-2 would generate images in two infrared spectral bands in addition to the blue, green, red, and near-infrared spectral bands of the MISR; miniature polarization filters would be incorporated to add a polarization-sensing capability; and calibration would be performed nonintrusively by use of a gimbaled tenth camera. The main accomplishment thus far has been the construction of an extremely compact all-reflective-optics CCD camera to demonstrate feasibility.

  12. An XMM-Newton observation of the extreme narrow-line Seyfert 1 galaxy Mrk 359

    NASA Astrophysics Data System (ADS)

    O'Brien, P. T.; Page, K.; Reeves, J. N.; Pounds, K.; Turner, M. J. L.; Puchnarewicz, E. M.

    2001-11-01

    We present XMM-Newton observations of Mrk 359, the first narrow-line Seyfert 1 galaxy (NLS1) discovered. Even among NLS1s, Mrk 359 is an extreme case with extraordinarily narrow optical emission lines. The XMM-Newton data show that Mrk 359 has a significant soft X-ray excess which displays only weak absorption and emission features. The 2-10 keV continuum, including reflection, is flatter than that of the typical NLS1, with Γ ~ 1.84. A strong emission line of equivalent width ~200 eV is also observed, centred near 6.4 keV. We fit this emission with two line components of approximately equal strength: a broad iron line from an accretion disc and a narrow, unresolved core. The unresolved line core has an equivalent width of ~120 eV and is consistent with fluorescence from neutral iron in distant reprocessing gas, possibly in the form of a `molecular torus'. Comparison of the narrow-line strengths in Mrk 359 and other low- to moderate-luminosity Seyfert 1 galaxies with those in QSOs suggests that the solid angle subtended by the distant reprocessing gas decreases with increasing active galactic nucleus luminosity.
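
    The equivalent width quoted above is the line flux normalised to the local continuum, integrated over energy. The sketch below evaluates that standard definition for a toy power-law continuum plus a narrow Gaussian line, with parameters chosen only to loosely match the values quoted above; it illustrates the quantity, not the spectral fit actually performed on the XMM-Newton data.

```python
import numpy as np

def equivalent_width(energy_kev, flux, continuum):
    """Equivalent width (same units as energy_kev) of an emission line,
    defined as EW = integral of (F - F_c) / F_c over energy."""
    return np.trapz((flux - continuum) / continuum, energy_kev)

# toy example: power-law continuum (photon index ~1.84 quoted above)
# plus a narrow Gaussian line centred at 6.4 keV
E = np.linspace(4.0, 9.0, 2000)
cont = E ** -1.84
line = 0.05 * np.exp(-0.5 * ((E - 6.4) / 0.05) ** 2)
print(f"EW ~ {equivalent_width(E, cont + line, cont) * 1e3:.0f} eV")   # ~190 eV for these toy parameters
```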

  13. Angle-resolved scattering spectroscopy of explosives using an external cavity quantum cascade laser

    SciTech Connect

    Suter, Jonathan D.; Bernacki, Bruce E.; Phillips, Mark C.

    2012-04-01

    We investigate angle-resolved scattering from solid explosive residues on a car door for non-contact sensing geometries. Illumination from a mid-infrared external-cavity quantum cascade laser, tuned between 7 and 8 microns, was detected both with a sensitive single-point detector and with a hyperspectral imaging camera. Spectral scattering phenomena are discussed and possibilities for hyperspectral imaging at large scattering angles are outlined.

  14. Gated narrow escape time for molecular signaling.

    PubMed

    Reingruber, Jürgen; Holcman, David

    2009-10-01

    The mean time for a diffusing ligand to activate a target protein located on the surface of a microdomain can regulate cellular signaling. When the ligand switches between various states induced by chemical interactions or conformational changes, while target activation occurs in only one state, this activation time is affected. We investigate these dynamics using new equations for the sojourn times spent in each state. For two states, we obtain exact solutions in one dimension, and asymptotic ones confirmed by Brownian simulations in three dimensions. We find that the activation time is quite sensitive to changes in the switching rates, which can be used to modulate signaling. Interestingly, our analysis reveals that activation can be fast even though the ligand spends most of the time "hidden" in the nonactivating state. Finally, we obtain a new formula for the narrow escape time in the presence of switching. PMID:19905605
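
    To make the gating idea concrete, the sketch below runs a crude one-dimensional Monte Carlo in which a Brownian particle on an interval is absorbed at one end only while it is in the activating state, and switches between states as a two-state Markov process. The geometry, parameters, and discretisation are invented for illustration; they are not the authors' formulation, which is analytic and treats the three-dimensional narrow-escape geometry.

```python
import numpy as np

rng = np.random.default_rng(0)

def gated_escape_time(L=1.0, D=1.0, k_on=5.0, k_off=5.0, dt=1e-4):
    """One realisation of a 1-D gated escape time: a Brownian particle on
    [0, L] (reflecting at L) is absorbed at x = 0 only while it is in the
    activating state."""
    x, state, t = L / 2.0, 0, 0.0          # start mid-domain, non-activating state
    sigma = np.sqrt(2.0 * D * dt)
    while True:
        t += dt
        x += sigma * rng.standard_normal()
        if x > L:                          # reflecting wall at x = L
            x = 2.0 * L - x
        # two-state Markov switching: non-activating (0) <-> activating (1)
        rate = k_on if state == 0 else k_off
        if rng.random() < rate * dt:
            state = 1 - state
        if x <= 0.0:
            if state == 1:
                return t                   # absorbed: target activated
            x = -x                         # gate closed: reflect instead

mean_t = np.mean([gated_escape_time() for _ in range(200)])
print(f"estimated mean activation time ~ {mean_t:.2f}")
```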

  15. Line Narrowing Parameter Measurement by Modulation Spectroscopy

    NASA Technical Reports Server (NTRS)

    Dharamsi, Amin N.

    1998-01-01

    Accurate characterization of Oxygen A-band line parameters by wavelength modulation spectroscopy with tunable diode lasers is ongoing research at Old Dominion University, under sponsorship from NASA Langley Research Center. The work proposed here will be undertaken under the guidance of Dr. William Chu and Dr. Lamont Poole of the Aerosol Research Branch at NASA Langley Research Center in Hampton, Virginia. The research was started about two years ago and utilizes wavelength modulation absorption spectroscopy with higher-harmonic detection, a technique that we developed at Old Dominion University, to obtain the absorption line characteristics of the Oxygen A-band rovibronic lines. Accurate characterization of this absorption band is needed for processing of data from experiments such as the NASA Stratospheric Aerosol and Gas Experiment III (SAGE III), part of the US Mission to Planet Earth. The research work for the Summer Fellowship undertook a measurement of the Dicke line-narrowing parameters of the Oxygen A-band lines using wavelength modulation spectroscopy. Our previous theoretical results had indicated that such a measurement could be done sensitively and conveniently with this type of spectroscopy; in particular, they had indicated that the signal magnitude would depend on pressure in a manner that is very sensitive to the narrowing parameter. One of the major tasks undertaken during the summer of 1998 was to establish experimentally that these theoretical predictions were correct. This was done successfully, and the results of the work are being prepared for publication. Experimental results were obtained in which the magnitude of the signal was measured as a function of pressure for various harmonic detection orders (N = 1, 2, 3, 4, 5). A comparison with theoretical results was made, and the agreement between theory and experiment was very good. More importantly, it was shown that the measurement yields a very sensitive technique for obtaining the narrowing parameter that describes the deviation of Oxygen A-band lines from the Voigt profile. In particular, the best fits were obtained consistently when the narrowing parameter value used was 0.022 cm⁻¹ atm⁻¹. Previous work, upon which the current work was based, has resulted in several accurate measurements of properties of particular lines of the Oxygen A band; for example, it has yielded the collision cross sections of several lines, including the RQ(13,14) and RR(15,15) lines. A major achievement of the work was also the demonstration that the technique we have developed can accurately probe the structure of the absorption lineshape function; in particular, the method is very well suited for experimentally probing the characteristics of lines in their wings. This work was accepted for publication in the Journal of Applied Physics and is scheduled to appear in the December 15, 1998 issue.
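
    The harmonic-detection signal referred to above is, in the usual wavelength-modulation-spectroscopy picture, the N-th Fourier component of the lineshape sampled over one modulation cycle. The sketch below shows only that standard construction for a toy Lorentzian profile; the actual analysis uses a narrowed (non-Voigt) profile and fits the pressure dependence of the harmonic signals, which is not reproduced here.

```python
import numpy as np

def nth_harmonic(lineshape, nu_center, mod_amp, n, npts=2048):
    """n-th harmonic WMS signal at detuning nu_center: the n-th Fourier
    cosine coefficient of the lineshape over one modulation cycle."""
    theta = np.linspace(0.0, 2.0 * np.pi, npts, endpoint=False)
    f = lineshape(nu_center + mod_amp * np.cos(theta))
    return 2.0 * np.mean(f * np.cos(n * theta))

# toy Lorentzian lineshape with half-width 1 (arbitrary units);
# the real analysis uses a Dicke-narrowed profile instead
lorentz = lambda nu: 1.0 / (1.0 + nu ** 2)

detuning = np.linspace(-4.0, 4.0, 401)
wms_2f = [nth_harmonic(lorentz, d, mod_amp=2.2, n=2) for d in detuning]
print(f"peak 2f signal ~ {max(wms_2f):.3f}")
```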

  16. Robotic chair at steep and narrow stairways

    NASA Astrophysics Data System (ADS)

    Imazato, Masahiro; Yamaguchi, Masahiro; Moromugi, Shunji; Ishimatsu, Takakazu

    2007-12-01

    A robotic chair has been developed to support the mobility of elderly and disabled people living in houses with steep and narrow stairways. To deal with this mobility problem, the robotic chair has a compact, original configuration: it moves vertically by actuation of electric cylinders and horizontally by push-pull operation provided by a caregiver. To ensure safe navigation, every action of the chair is checked by the operator. Up-and-down motions of the robotic chair on the stairway are executed through combinations of motor and cylinder actuations. The performance of the robotic chair was evaluated through two kinds of experiments, which confirmed its excellent capability.

  17. [Differential diagnosis of a narrow QRS tachycardia].

    PubMed

    Lewalter, Thorsten

    2015-09-01

    The differential diagnosis of a narrow QRS tachycardia requires, on the one hand, knowledge of the clinical data of the tachycardia patient; on the other hand, a systematic, step-by-step analysis of the electrocardiogram (ECG) is the most successful approach. Apart from the question of regularity or irregularity of the QRS complexes, the presence and detection of P waves is also of importance. The P-wave timing in relation to the preceding and following QRS complexes, as well as the numerical relationship of P waves to QRS complexes, allows a well-founded suspected diagnosis to be reached in most cases. Even the differentiation between atrioventricular (AV) nodal reentrant tachycardia (AVNRT) and orthodromic AV reentrant tachycardia (AVRT), e.g. via an accessory pathway, is in most cases possible from the surface ECG. Obviously, there are constellations which require an invasive electrophysiological procedure for a definitive diagnosis. PMID:26287273

  18. Nondecaying Hydrodynamic Interactions along Narrow Channels

    NASA Astrophysics Data System (ADS)

    Misiunas, Karolis; Pagliara, Stefano; Lauga, Eric; Lister, John R.; Keyser, Ulrich F.

    2015-07-01

    Particle-particle interactions are of paramount importance in every multibody system as they determine the collective behavior and coupling strength. Many well-known interactions such as electrostatic, van der Waals, or screened Coulomb interactions, decay exponentially or with negative powers of the particle spacing r. Similarly, hydrodynamic interactions between particles undergoing Brownian motion decay as 1/r in bulk, and are assumed to decay in small channels. Such interactions are ubiquitous in biological and technological systems. Here we confine two particles undergoing Brownian motion in narrow, microfluidic channels and study their coupling through hydrodynamic interactions. Our experiments show that the hydrodynamic particle-particle interactions are distance independent in these channels. This finding is of fundamental importance for the interpretation of experiments where dense mixtures of particles or molecules diffuse through finite length, water-filled channels or pore networks.

  19. Nondecaying Hydrodynamic Interactions along Narrow Channels.

    PubMed

    Misiunas, Karolis; Pagliara, Stefano; Lauga, Eric; Lister, John R; Keyser, Ulrich F

    2015-07-17

    Particle-particle interactions are of paramount importance in every multibody system as they determine the collective behavior and coupling strength. Many well-known interactions such as electrostatic, van der Waals, or screened Coulomb interactions, decay exponentially or with negative powers of the particle spacing r. Similarly, hydrodynamic interactions between particles undergoing Brownian motion decay as 1/r in bulk, and are assumed to decay in small channels. Such interactions are ubiquitous in biological and technological systems. Here we confine two particles undergoing Brownian motion in narrow, microfluidic channels and study their coupling through hydrodynamic interactions. Our experiments show that the hydrodynamic particle-particle interactions are distance independent in these channels. This finding is of fundamental importance for the interpretation of experiments where dense mixtures of particles or molecules diffuse through finite length, water-filled channels or pore networks. PMID:26230830

  20. Dynamic-angle spinning and double rotation of quadrupolar nuclei

    SciTech Connect

    Mueller, K.T. (California Univ., Berkeley, CA. Dept. of Chemistry)

    1991-07-01

    Nuclear magnetic resonance (NMR) spectroscopy of quadrupolar nuclei is complicated by the coupling of the electric quadrupole moment of the nucleus to local variations in the electric field. The quadrupolar interaction is a useful source of information about local molecular structure in solids, but it tends to broaden resonance lines, causing crowding and overlap in NMR spectra. Magic-angle spinning, which is routinely used to produce high resolution spectra of spin-1/2 nuclei like carbon-13 and silicon-29, is incapable of fully narrowing resonances from quadrupolar nuclei when anisotropic second-order quadrupolar interactions are present. Two new sample-spinning techniques are introduced here that completely average the second-order quadrupolar coupling. Narrow resonance lines are obtained and individual resonances from distinct nuclear sites are identified. In dynamic-angle spinning (DAS) a rotor containing a powdered sample is reoriented between discrete angles with respect to the high magnetic field. Evolution under anisotropic interactions at the different angles cancels, leaving only the isotropic evolution of the spin system. In the second technique, double rotation (DOR), a small rotor spins within a larger rotor so that the sample traces out a complicated trajectory in space. The relative orientation of the rotors and the orientation of the larger rotor within the magnetic field are selected to average both first- and second-order anisotropic broadening. The theory of quadrupolar interactions, coherent averaging theory, and motional narrowing by sample reorientation are reviewed with emphasis on the chemical shift anisotropy and second-order quadrupolar interactions experienced by half-odd integer spin quadrupolar nuclei. The DAS and DOR techniques are introduced and illustrated with application to common quadrupolar systems such as sodium-23 and oxygen-17 nuclei in solids.
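
    The angle choices behind DAS and DOR follow from zeroing the second- and fourth-rank Legendre terms that scale the anisotropic broadenings. The sketch below is a numerical illustration of that standard condition, not code from the thesis; it assumes numpy and scipy are available, and recovers the magic angle and the P4 zero used in DOR as well as the equal-dwell-time DAS angle pair.

```python
import numpy as np
from scipy.optimize import brentq

P2 = lambda x: 0.5 * (3 * x**2 - 1)
P4 = lambda x: 0.125 * (35 * x**4 - 30 * x**2 + 3)

# DOR: outer rotor at the magic angle (P2 = 0), inner rotor where P4 = 0
magic = np.degrees(np.arccos(brentq(P2, 0.0, 1.0)))
p4_zero = np.degrees(np.arccos(brentq(P4, 0.6, 1.0)))
print(f"DOR angles: outer ~ {magic:.2f} deg, inner ~ {p4_zero:.2f} deg")

# DAS with equal time spent at each angle: the P2 and P4 terms must cancel
# pairwise, i.e. P2(cos t1) = -P2(cos t2) and P4(cos t1) = -P4(cos t2)
def paired_cos(c1):
    # cosine of the partner angle that cancels the P2 contribution
    return brentq(lambda c: P2(c) + P2(c1), np.sqrt(1.0 / 3.0), 1.0)

def das_residual(t1):
    c1 = np.cos(t1)
    return P4(c1) + P4(paired_cos(c1))

t1 = brentq(das_residual, np.radians(60.0), np.radians(89.0))
t2 = np.arccos(paired_cos(np.cos(t1)))
print(f"DAS angle pair ~ {np.degrees(t1):.2f} deg and {np.degrees(t2):.2f} deg")
```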