Science.gov

Sample records for camera narrow angle

  1. Reconditioning of Cassini Narrow-Angle Camera

    NASA Technical Reports Server (NTRS)

    2002-01-01

    These five images of single stars, taken at different times with the narrow-angle camera on NASA's Cassini spacecraft, show the effects of haze collecting on the camera's optics, then successful removal of the haze by warming treatments.

    The image on the left was taken on May 25, 2001, before the haze problem occurred. It shows a star named HD339457.

    The second image from left, taken May 30, 2001, shows the effect of haze that collected on the optics when the camera cooled back down after a routine-maintenance heating to 30 degrees Celsius (86 degrees Fahrenheit). The star is Maia, one of the Pleiades.

    The third image was taken on October 26, 2001, after a weeklong decontamination treatment at minus 7 C (19 F). The star is Spica.

    The fourth image was taken of Spica January 30, 2002, after a weeklong decontamination treatment at 4 C (39 F).

    The final image, also of Spica, was taken July 9, 2002, following three additional decontamination treatments at 4 C (39 F) for two months, one month, then another month.

    Cassini, on its way toward arrival at Saturn in 2004, is a cooperative project of NASA, the European Space Agency and the Italian Space Agency. The Jet Propulsion Laboratory, a division of the California Institute of Technology in Pasadena, manages the Cassini mission for NASA's Office of Space Science, Washington, D.C.

  2. Flight Calibration of the LROC Narrow Angle Camera

    NASA Astrophysics Data System (ADS)

    Humm, D. C.; Tschimmel, M.; Brylow, S. M.; Mahanti, P.; Tran, T. N.; Braden, S. E.; Wiseman, S.; Danton, J.; Eliason, E. M.; Robinson, M. S.

    2016-04-01

Characterization and calibration are vital for instrument commanding and image interpretation in remote sensing. The Lunar Reconnaissance Orbiter Camera Narrow Angle Camera (LROC NAC) takes 500 Mpixel greyscale images of lunar scenes at 0.5 meters/pixel. It uses two nominally identical line scan cameras for a larger crosstrack field of view. Stray light, spatial crosstalk, and nonlinearity were characterized using flight images of the Earth and the lunar limb. These are important for imaging shadowed craters, studying ~1 meter size objects, and photometry, respectively. Background, nonlinearity, and flatfield corrections have been implemented in the calibration pipeline. An eight-column pattern in the background is corrected. The detector is linear for DN = 600-2000, but a signal-dependent additive correction is required and applied for DN < 600. A predictive model of detector temperature and dark level was developed to command the dark level offset. This avoids images with a cutoff at DN = 0 and minimizes quantization error in companding. Absolute radiometric calibration is derived from comparison of NAC images with ground-based images taken with the Robotic Lunar Observatory (ROLO) at much lower spatial resolution but with the same photometric angles.
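
    The 12-to-8 bit companding mentioned above amounts to a lookup-table compression. A minimal sketch in Python, with purely illustrative breakpoints (the actual LROC tables are optimized per signal level and are not reproduced here):

```python
import numpy as np

def build_companding_lut(breakpoints=((0, 0), (1024, 128), (4095, 255))):
    """Piecewise-linear companding table: 12-bit DN (0-4095) -> 8-bit (0-255).
    Breakpoints are hypothetical, not the flight tables."""
    dn12 = np.arange(4096)
    xs = [b[0] for b in breakpoints]
    ys = [b[1] for b in breakpoints]
    return np.round(np.interp(dn12, xs, ys)).astype(np.uint8)

lut = build_companding_lut()
raw = np.array([0, 600, 2000, 4095])   # example 12-bit DN values
compressed = lut[raw]                  # 8-bit companded values
```

    A monotonic table of this kind concentrates quantization error at high DN, where shot noise dominates anyway, which is the usual rationale for companding.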

  3. Lunar Reconnaissance Orbiter Camera Narrow Angle Cameras: Laboratory and Initial Flight Calibration

    NASA Astrophysics Data System (ADS)

    Humm, D. C.; Tschimmel, M.; Denevi, B. W.; Lawrence, S.; Mahanti, P.; Tran, T. N.; Thomas, P. C.; Eliason, E.; Robinson, M. S.

    2009-12-01

    The Lunar Reconnaissance Orbiter Camera (LROC) has two identical Narrow Angle Cameras (NACs). Each NAC is a monochrome pushbroom scanner, providing images with a pixel scale of 50 cm from a 50-km orbit. A single NAC image has a swath width of 2.5 km and a length of up to 26 km. The NACs are mounted to acquire side-by-side imaging for a combined swath width of 5 km. The NAC is designed to fully characterize future human and robotic landing sites in terms of scientific and resource merit, trafficability, and hazards. The North and South poles will be mapped at 1-meter-scale poleward of 85.5 degrees latitude. Stereo coverage is achieved by pointing the NACs off-nadir, which requires planning in advance. Read noise is 91 and 93 e- and the full well capacity is 334,000 and 352,000 e- for NAC-L and NAC-R respectively. Signal-to-noise ranges from 42 for low-reflectance material with 70 degree illumination to 230 for high-reflectance material with 0 degree illumination. Longer exposure times and 2x binning are available to further increase signal-to-noise with loss of spatial resolution. Lossy data compression from 12 bits to 8 bits uses a companding table selected from a set optimized for different signal levels. A model of focal plane temperatures based on flight data is used to command dark levels for individual images, optimizing the performance of the companding tables and providing good matching of the NAC-L and NAC-R images even before calibration. The preliminary NAC calibration pipeline includes a correction for nonlinearity at low signal levels with an offset applied for DN>600 and a logistic function for DN<600. Flight images taken on the limb of the Moon provide a measure of stray light performance. Averages over many lines of images provide a measure of flat field performance in flight. These are comparable with laboratory data taken with a diffusely reflecting uniform panel.
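
    The quoted signal-to-noise figures are consistent with a standard shot-noise-plus-read-noise detector model; the sketch below illustrates that model, not the authors' exact radiometric calculation:

```python
import math

def snr(signal_e, read_noise_e):
    """Shot-noise-limited SNR for a line-scan detector:
    noise = sqrt(signal + read_noise^2); dark current is neglected."""
    return signal_e / math.sqrt(signal_e + read_noise_e ** 2)

# With the NAC-L read noise of 91 e-, ~4.8 ke- of signal gives SNR ~ 42,
# while a nearly full well (~334 ke-) approaches the shot-noise limit
# sqrt(S) ~ 578.
```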

  4. Extracting Accurate and Precise Topography from Lroc Narrow Angle Camera Stereo Observations

    NASA Astrophysics Data System (ADS)

    Henriksen, M. R.; Manheim, M. R.; Speyerer, E. J.; Robinson, M. S.; LROC Team

    2016-06-01

The Lunar Reconnaissance Orbiter Camera (LROC) includes two identical Narrow Angle Cameras (NAC) that acquire meter scale imaging. Stereo observations are acquired by imaging from two or more orbits, including at least one off-nadir slew. Digital terrain models (DTMs) generated from the stereo observations are controlled to Lunar Orbiter Laser Altimeter (LOLA) elevation profiles. With current processing methods, DTMs have absolute accuracies commensurate with the uncertainties of the LOLA profiles (~10 m horizontally and ~1 m vertically) and relative horizontal and vertical precisions better than the pixel scale of the DTMs (2 to 5 m). The NAC stereo pairs and derived DTMs represent an invaluable tool for science and exploration purposes. We computed slope statistics from 81 highland and 31 mare DTMs across a range of baselines. Overlapping DTMs of single stereo sets were also combined to form larger area DTM mosaics, enabling detailed characterization of large geomorphic features and providing a key resource for future exploration planning. Currently, two percent of the lunar surface is imaged in NAC stereo, and continued acquisition of stereo observations will serve to strengthen our knowledge of the Moon and geologic processes that occur on all the terrestrial planets.

  5. Extracting accurate and precise topography from LROC narrow angle camera stereo observations

    NASA Astrophysics Data System (ADS)

    Henriksen, M. R.; Manheim, M. R.; Burns, K. N.; Seymour, P.; Speyerer, E. J.; Deran, A.; Boyd, A. K.; Howington-Kraus, E.; Rosiek, M. R.; Archinal, B. A.; Robinson, M. S.

    2017-02-01

The Lunar Reconnaissance Orbiter Camera (LROC) includes two identical Narrow Angle Cameras (NAC) that each provide 0.5 to 2.0 m scale images of the lunar surface. Although not designed as a stereo system, LROC can acquire NAC stereo observations over two or more orbits using at least one off-nadir slew. Digital terrain models (DTMs) are generated from sets of stereo images and registered to profiles from the Lunar Orbiter Laser Altimeter (LOLA) to improve absolute accuracy. With current processing methods, DTMs have absolute accuracies better than the uncertainties of the LOLA profiles and relative vertical and horizontal precisions less than the pixel scale of the DTMs (2-5 m). We computed slope statistics from 81 highland and 31 mare DTMs across a range of baselines. For a baseline of 15 m the highland mean slope parameters are: median = 9.1°, mean = 11.0°, standard deviation = 7.0°. For the mare the mean slope parameters are: median = 3.5°, mean = 4.9°, standard deviation = 4.5°. The slope values for the highland terrain are steeper than previously reported, likely due to a bias in targeting of the NAC DTMs toward higher relief features in the highland terrain. Overlapping DTMs of single stereo sets were also combined to form larger area DTM mosaics that enable detailed characterization of large geomorphic features. From one DTM mosaic we mapped a large viscous flow related to the Orientale basin ejecta and estimated its thickness and volume to exceed 300 m and 500 km³, respectively. Despite its ∼3.8 billion year age the flow still exhibits unconfined margin slopes above 30°, in some cases exceeding the angle of repose, consistent with deposition of material rich in impact melt. We show that the NAC stereo pairs and derived DTMs represent an invaluable tool for science and exploration purposes. At this date about 2% of the lunar surface is imaged in high-resolution stereo, and continued acquisition of stereo observations will serve to strengthen our knowledge of the Moon and the geologic processes that occur on the terrestrial planets.
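
    Baseline-dependent slope statistics like those above can be computed by differencing a DTM over the chosen baseline. A minimal sketch on a synthetic tilted-plane DTM (not LROC data):

```python
import numpy as np

def slope_deg(dtm, pixel_m, baseline_m):
    """Bidirectional slope (degrees) at a given baseline, computed by
    differencing the DTM over step = baseline / pixel size."""
    step = max(1, int(round(baseline_m / pixel_m)))
    dz_x = (dtm[:, step:] - dtm[:, :-step]) / baseline_m
    dz_y = (dtm[step:, :] - dtm[:-step, :]) / baseline_m
    n = min(dz_x.shape[0], dz_y.shape[0])
    m = min(dz_x.shape[1], dz_y.shape[1])
    grad = np.hypot(dz_x[:n, :m], dz_y[:n, :m])
    return np.degrees(np.arctan(grad))

# Example: a plane tilted 10 degrees in x, sampled at 5 m/pixel
x = np.arange(50) * 5.0
dtm = np.tile(np.tan(np.radians(10.0)) * x, (50, 1))
s = slope_deg(dtm, pixel_m=5.0, baseline_m=15.0)
stats = dict(median=np.median(s), mean=s.mean(), std=s.std())
```

    On real DTMs the median, mean, and standard deviation of `s` vary with the chosen baseline, which is why the abstract reports them at a stated 15 m baseline.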

  6. Two Years of Digital Terrain Model Production Using the Lunar Reconnaissance Orbiter Narrow Angle Camera

    NASA Astrophysics Data System (ADS)

    Burns, K.; Robinson, M. S.; Speyerer, E.; LROC Science Team

    2011-12-01

One of the primary objectives of the Lunar Reconnaissance Orbiter Camera (LROC) is to gather stereo observations with the Narrow Angle Camera (NAC). These stereo observations are used to generate digital terrain models (DTMs). The NAC has a pixel scale of 0.5 to 2.0 meters but was not designed for stereo observations, and thus requires the spacecraft to roll off-nadir to acquire these images. Slews interfere with the data collection of the other instruments, so opportunities are currently limited to four per day. Arizona State University has produced DTMs from 95 stereo pairs for 11 Constellation Project (CxP) sites (Aristarchus, Copernicus crater, Gruithuisen domes, Hortensius domes, Ina D-caldera, Lichtenberg crater, Mare Ingenii, Marius hills, Reiner Gamma, South Pole-Aitken Rim, Sulpicius Gallus) as well as 30 other regions of scientific interest (including: Bhabha crater, highest and lowest elevation points, Highland Ponds, Kugler Anuchin, Linne Crater, Planck Crater, Slipher crater, Sears Crater, Mandel'shtam Crater, Virtanen Graben, Compton/Belkovich, Rumker Domes, King Crater, Luna 16/20/23/24 landing sites, Ranger 6 landing site, Wiener F Crater, Apollo 11/14/15/17, fresh craters, impact melt flows, Larmor Q crater, Mare Tranquillitatis pit, Hansteen Alpha, Moore F Crater, and Lassell Massif). To generate DTMs, the USGS ISIS software and SOCET SET® from BAE Systems are used. To increase the absolute accuracy of the DTMs, data obtained from the Lunar Orbiter Laser Altimeter (LOLA) are used to coregister the NAC images and define the geodetic reference frame. NAC DTMs have been used in the examination of several sites, e.g. Compton-Belkovich, Marius Hills and Ina D-caldera [1-3]. LROC will continue to acquire high-resolution stereo images throughout the science phase of the mission and any extended mission opportunities, thus providing a vital dataset for scientific research as well as future human and robotic exploration. [1] B.L. Jolliff (2011) Nature

  7. Narrow Angle movie

    NASA Technical Reports Server (NTRS)

    1999-01-01

    This brief three-frame movie of the Moon was made from three Cassini narrow-angle images as the spacecraft passed by the Moon on the way to its closest approach with Earth on August 17, 1999. The purpose of this particular set of images was to calibrate the spectral response of the narrow-angle camera and to test its 'on-chip summing mode' data compression technique in flight. From left to right, they show the Moon in the green, blue and ultraviolet regions of the spectrum in 40, 60 and 80 millisecond exposures, respectively. All three images have been scaled so that the brightness of Crisium basin, the dark circular region in the upper right, is the same in each image. The spatial scale in the blue and ultraviolet images is 1.4 miles per pixel (2.3 kilometers). The original scale in the green image (which was captured in the usual manner and then reduced in size by 2x2 pixel summing within the camera system) was 2.8 miles per pixel (4.6 kilometers). It has been enlarged for display to the same scale as the other two. The imaging data were processed and released by the Cassini Imaging Central Laboratory for Operations (CICLOPS) at the University of Arizona's Lunar and Planetary Laboratory, Tucson, AZ.

    Photo Credit: NASA/JPL/Cassini Imaging Team/University of Arizona

    Cassini, launched in 1997, is a joint mission of NASA, the European Space Agency and Italian Space Agency. The mission is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Space Science, Washington DC. JPL is a division of the California Institute of Technology, Pasadena, CA.

  8. NFC - Narrow Field Camera

    NASA Astrophysics Data System (ADS)

    Koukal, J.; Srba, J.; Gorková, S.

    2015-01-01

We introduce a low-cost CCTV video system for faint meteor monitoring and describe the first results from 5 months of two-station operation. Our system, called NFC (Narrow Field Camera), with a meteor limiting magnitude around +6.5 mag, allows research on the trajectories of less massive meteoroids within individual parent meteor showers and the sporadic background. At present, 4 stations (2 pairs with coordinated fields of view) of the NFC system are operated in the frame of CEMeNt (Central European Meteor Network). The heart of each NFC station is a sensitive CCTV camera, the Watec 902 H2, and a fast cinematographic lens, the Meopta Meostigmat 1/50 - 52.5 mm (50 mm focal length and fixed aperture f/1.0). In this paper we present the first results based on 1595 individual meteors, 368 of which were recorded from two stations simultaneously. This data set allows the first empirical verification of theoretical assumptions about NFC system capabilities (stellar and meteor magnitude limit, meteor apparent brightness distribution, and accuracy of single-station measurements) and the first low-mass meteoroid trajectory calculations. Our experimental data clearly demonstrate the capability of the proposed system to register low-mass meteors and show that calculations based on NFC data lead to a significant refinement of the orbital elements of low-mass meteoroids.

  9. High-resolution topomapping of candidate MER landing sites with Mars Orbiter Camera narrow-angle images

    USGS Publications Warehouse

    Kirk, R.L.; Howington-Kraus, E.; Redding, B.; Galuszka, D.; Hare, T.M.; Archinal, B.A.; Soderblom, L.A.; Barrett, J.M.

    2003-01-01

We analyzed narrow-angle Mars Orbiter Camera (MOC-NA) images to produce high-resolution digital elevation models (DEMs) in order to provide topographic and slope information needed to assess the safety of candidate landing sites for the Mars Exploration Rovers (MER), and we assessed the accuracy of our results by a variety of tests. The mapping techniques developed also support geoscientific studies and can be used with all present and planned Mars-orbiting scanner cameras. Photogrammetric analysis of MOC stereopairs yields DEMs with 3-pixel (typically 10 m) horizontal resolution, vertical precision consistent with ±0.22 pixel matching errors (typically a few meters), and slope errors of 1-3°. These DEMs are controlled to the Mars Orbiter Laser Altimeter (MOLA) global data set and are consistent with it at the limits of resolution. Photoclinometry yields DEMs with single-pixel (typically ~3 m) horizontal resolution and submeter vertical precision. Where the surface albedo is uniform, the dominant error is a 10-20% relative uncertainty in the amplitude of topography and slopes after "calibrating" photoclinometry against a stereo DEM to account for the influence of atmospheric haze. We mapped portions of seven candidate MER sites and the Mars Pathfinder site. Safety of the final four sites (Elysium, Gusev, Isidis, and Meridiani) was assessed by mission engineers by simulating landings on our DEMs of "hazard units" mapped in the sites, with results weighted by the probability of landing on those units; summary slope statistics show that most hazard units are smooth, with only small areas of etched terrain in Gusev crater posing a slope hazard.
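
    The stereo vertical precision quoted above follows the standard error-propagation rule EP = ρ · GSD / (B/H), where ρ is the matching error in pixels, GSD the ground sample distance, and B/H the stereo base-to-height ratio. A one-line sketch with illustrative numbers (the B/H value below is assumed, not taken from the record):

```python
def expected_vertical_precision(matching_error_px, gsd_m, base_to_height):
    """EP = rho * GSD / (B/H): standard stereo error propagation."""
    return matching_error_px * gsd_m / base_to_height

# 0.22 px matching error, 3 m MOC-NA pixels, assumed B/H of 0.4
# -> ~1.7 m vertical precision, consistent with "a few meters".
```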

  10. Calculating video meteor positions in a narrow-angle field with AIP4Win software - Comparison with the positions obtained by SPOSH cameras in a wide-angle field

    NASA Astrophysics Data System (ADS)

    Tsamis, Vagelis; Margonis, Anastasios; Christou, Apostolos

    2013-01-01

    We present an alternative way to calculate the positions of meteors captured in a narrow video field with a Watec camera and a 28 mm aspherical lens (FOV 11 degrees) by using Astronomical Image Processing for Windows, V2, a classic astrometry and photometry software. We have calculated positions for two Perseid meteors in Lyra which were recorded in August 2010, at Mt. Parnon, Greece. We then compare our astrometry position results with the results obtained by SPOSH cameras (FOV 120 degrees) for the same meteors.

  11. Photogrammetric measurement of 3D freeform millimetre-sized objects with micro features: an experimental validation of the close-range camera calibration model for narrow angles of view

    NASA Astrophysics Data System (ADS)

    Percoco, Gianluca; Sánchez Salmerón, Antonio J.

    2015-09-01

The measurement of millimetre- and micro-scale features is performed by high-cost systems based on technologies with narrow working ranges to accurately control the position of the sensors. Photogrammetry would lower the costs of 3D inspection of micro-features and would also be applicable to the inspection of non-removable micro parts of large objects. Unfortunately, the behaviour of photogrammetry when applied to micro-features is not well characterized. In this paper, the authors address these issues towards the application of digital close-range photogrammetry (DCRP) at the micro-scale, taking into account that the literature includes research papers stating that an angle of view (AOV) around 10° is the lower limit for application of the traditional pinhole close-range calibration model (CRCM), which is the basis of DCRP. First, a general calibration procedure is introduced, with the aid of an open-source software library, to calibrate narrow-AOV cameras with the CRCM. Subsequently, the procedure is validated using a reflex camera with a 60 mm macro lens, equipped with extension tubes (20 and 32 mm) achieving magnification of up to approximately 2 times, to verify the literature findings with experimental photogrammetric 3D measurements of millimetre-sized objects with micro-features. The limitation posed by the laser printing technology used to produce the two-dimensional pattern on common paper has been overcome using an accurate pattern manufactured with a photolithographic process. The results of the experimental activity prove that the CRCM is valid for AOVs down to 3.4° and that DCRP results are comparable with the results of existing and more expensive commercial techniques.
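
    The narrow angles of view in such a setup follow from basic geometry: extension tubes increase magnification and the effective image distance, shrinking the AOV. A thin-lens sketch (sensor width and numbers are illustrative, not the paper's exact configuration):

```python
import math

def angle_of_view_deg(sensor_width_mm, focal_mm, magnification=0.0):
    """Thin-lens approximation: the image distance grows to f*(1 + m) at
    magnification m, so AOV = 2*atan(w / (2*f*(1 + m)))."""
    return math.degrees(
        2.0 * math.atan(sensor_width_mm / (2.0 * focal_mm * (1.0 + magnification)))
    )

# Illustrative: a 24 mm wide sensor behind a 60 mm macro lens narrows
# from ~22.6 deg at infinity focus to ~7.6 deg at 2x magnification.
```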

  12. A wide-angle camera module for disposable endoscopy

    NASA Astrophysics Data System (ADS)

    Shim, Dongha; Yeon, Jesun; Yi, Jason; Park, Jongwon; Park, Soo Nam; Lee, Nanhee

    2016-08-01

A wide-angle miniaturized camera module for disposable endoscopes is demonstrated in this paper. A lens module with a 150° angle of view (AOV) is designed and manufactured. All-plastic injection-molded lenses and a commercial CMOS image sensor are employed to reduce the manufacturing cost. The image sensor and LED illumination unit are assembled with the lens module. The camera module does not include a camera processor, further reducing its size and cost. The size of the camera module is 5.5 × 5.5 × 22.3 mm³. The diagonal field of view (FOV) of the camera module is measured to be 110°. A prototype of a disposable endoscope is implemented to perform pre-clinical animal testing. The esophagus of an adult beagle dog is observed. These results demonstrate the feasibility of a cost-effective and high-performance camera module for disposable endoscopy.

  13. [Clinical features of narrow-angle glaucoma in the native residents of Turkmenistan].

    PubMed

    Orazmukhammedov, Iu G

    1991-01-01

The author analyzes the specific clinical features of narrow-angle glaucoma in the native residents of Turkmenistan and in residents of European origin. He found some specific anatomic and functional features that may be considered factors predisposing to the development of narrow-angle glaucoma in Turkmenistan natives.

  14. Association between choroidal thickness and anterior chamber segment in eyes with narrow or open-angle

    PubMed Central

    Li, Song-Feng; Wu, Ge-Wei; Chen, Chang-Xi; Shen, Ling; Zhang, Zhi-Bao; Gao, Fei; Wang, Ning-Li

    2016-01-01

AIM To investigate the relationship between choroidal thickness and the anterior chamber segment in subjects with narrow- or open-angle eyes. METHODS The subfoveal choroidal thickness was measured with enhanced depth-imaging optical coherence tomography and anterior chamber parameters were measured with ultrasound biomicroscopy in one eye of 23 subjects with open-angle eyes and 38 subjects with narrow-angle eyes. The mean age was 59.52±7.04y for narrow-angle subjects and 60.76±7.23y for open-angle subjects (P=0.514). Multivariate linear regression analysis was performed to assess the association between choroidal thickness and narrow-angle parameters. RESULTS There were no differences in subfoveal choroidal thickness between open- and narrow-angle subjects (P=0.231). Anterior chamber parameters, including central anterior chamber depth, trabecular iris angle, iris thickness 500 µm from the scleral spur (IT500), and ciliary body thickness at 1 mm and 2 mm from the scleral spur (CBT1, CBT2), showed significant differences between the two groups (P<0.05). Subfoveal choroidal thickness showed a negative correlation only with anterior chamber depth (β=-0.496, P=0.016) in the open-angle group and with age (β=-0.442, P=0.003) and IT500 (β=-0.399, P=0.008) in the narrow-angle group. However, subfoveal choroidal thickness was not correlated with trabecular iris angle, anterior chamber depth, ciliary body thickness, or central corneal thickness in the narrow-angle group. CONCLUSION Choroidal thickness does not differ between the two groups and is not correlated with anterior chamber parameters in narrow-angle subjects, suggesting a lack of relationship between choroidal thickness and primary angle-closure glaucoma. PMID:27588269

  15. Improved wide-angle, fisheye and omnidirectional camera calibration

    NASA Astrophysics Data System (ADS)

    Urban, Steffen; Leitloff, Jens; Hinz, Stefan

    2015-10-01

In this paper an improved method for calibrating wide-angle, fisheye and omnidirectional imaging systems is presented. We extend the calibration procedure proposed by Scaramuzza et al. by replacing the residual function and jointly refining all parameters. In doing so, we achieve a more stable, robust and accurate calibration (by up to a factor of 7) and can reduce the number of necessary calibration steps from five to three. After introducing the camera model and highlighting the differences from the current calibration procedure, we perform a comprehensive performance evaluation using several data sets and show the impact of the proposed calibration procedure on the calibration results.

  16. Improved iris localization by using wide and narrow field of view cameras for iris recognition

    NASA Astrophysics Data System (ADS)

    Kim, Yeong Gon; Shin, Kwang Yong; Park, Kang Ryoung

    2013-10-01

    Biometrics is a method of identifying individuals by their physiological or behavioral characteristics. Among other biometric identifiers, iris recognition has been widely used for various applications that require a high level of security. When a conventional iris recognition camera is used, the size and position of the iris region in a captured image vary according to the X, Y positions of a user's eye and the Z distance between a user and the camera. Therefore, the searching area of the iris detection algorithm is increased, which can inevitably decrease both the detection speed and accuracy. To solve these problems, we propose a new method of iris localization that uses wide field of view (WFOV) and narrow field of view (NFOV) cameras. Our study is new as compared to previous studies in the following four ways. First, the device used in our research acquires three images, one each of the face and both irises, using one WFOV and two NFOV cameras simultaneously. The relation between the WFOV and NFOV cameras is determined by simple geometric transformation without complex calibration. Second, the Z distance (between a user's eye and the iris camera) is estimated based on the iris size in the WFOV image and anthropometric data of the size of the human iris. Third, the accuracy of the geometric transformation between the WFOV and NFOV cameras is enhanced by using multiple matrices of the transformation according to the Z distance. Fourth, the searching region for iris localization in the NFOV image is significantly reduced based on the detected iris region in the WFOV image and the matrix of geometric transformation corresponding to the estimated Z distance. Experimental results showed that the performance of the proposed iris localization method is better than that of conventional methods in terms of accuracy and processing time.
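
    The Z-distance estimate from iris size in the WFOV image reduces, in the simplest case, to the pinhole relation Z = f·D/d. A sketch with hypothetical camera parameters (the paper uses calibrated geometric transformation matrices rather than this bare model):

```python
def estimate_z_mm(focal_mm, iris_px, pixel_pitch_mm, iris_diameter_mm=11.7):
    """Pinhole-model distance estimate: Z = f * D / d, where d is the iris
    image size on the sensor. 11.7 mm is a commonly cited mean human
    iris (visible) diameter from anthropometric data."""
    d_mm = iris_px * pixel_pitch_mm
    return focal_mm * iris_diameter_mm / d_mm

# Hypothetical numbers: 50 mm lens, iris spanning 130 px at 0.006 mm/px
# -> Z = 750 mm.
```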

  17. Characterizing Geometric Distortion of the Lunar Reconnaissance Orbiter Wide Angle Camera

    NASA Astrophysics Data System (ADS)

    Speyerer, E.; Wagner, R.; Robinson, M. S.; Becker, K. J.; Anderson, J.; Thomas, P. C.

    2011-12-01

Each month the Lunar Reconnaissance Orbiter (LRO) Wide Angle Camera (WAC) provides 100 m scale images of nearly the entire Moon under a different range of lighting conditions [1]. Pre-flight calibration efforts provided a baseline for correcting the geometric distortion present in the WAC; however, residual errors of 1-2 pixels existed with this original model. In-flight calibration enables the derivation of a precise correction for geometric distortion to provide sub-pixel map projection accuracy. For the in-flight calibration, we compared WAC images to high-resolution (0.5-2.0 meter scale) images provided by the Narrow Angle Camera (NAC). Since the NAC has a very narrow field of view (2.86°), its geometric accuracy is well characterized. The additions of the WAC-derived 100 m/pixel digital terrain model (GLD100) [2] and refined ephemeris provided by LOLA [3] have improved our efforts to remove small distortion artifacts in the WAC camera model. Since the NAC field of view is always in the same cross-track location in the WAC frame, NAC and WAC images of the same regions, under similar lighting conditions, were map projected. Hundreds of NAC (truth image) and WAC images were then co-registered using an automatic registration algorithm in ISIS [4]. This output was fed into a second ISIS program (fplanemap) that converted the registration offsets to focal plane coordinates for the distorted (original) and undistorted (corrected location derived from the truth image) pixels [4]. With this dataset, offsets in the WAC distortion model were identified and accounted for with a new 2D Taylor series function that has been added to the existing radial model. This technique improves the accurate placement of each pixel across the sensor in target space. We have applied this correction to the 643 nm band and will derive the coefficients for the remaining bands. Once this study is complete, a new camera model, instrument kernel (IK), and frames kernel (FK) will be released.
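
    A radial distortion model augmented by a 2D Taylor-series residual term, as described, can be sketched as follows (coefficients and term order are illustrative; the actual WAC model lives in the mission's ISIS camera kernels):

```python
import numpy as np

def correct_focal_plane(x, y, k1, k2, ax, ay):
    """Apply a radial model (coefficients k1, k2) plus a 2D Taylor-series
    residual correction (per-axis coefficients over [1, x, y, x^2, x*y, y^2])
    to focal-plane coordinates. All coefficients here are illustrative."""
    r2 = x * x + y * y
    radial = 1.0 + k1 * r2 + k2 * r2 * r2
    terms = np.array([1.0, x, y, x * x, x * y, y * y])
    return x * radial + terms @ np.asarray(ax), y * radial + terms @ np.asarray(ay)

# With all coefficients zero the correction reduces to the identity.
```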

  18. 10. 22"x34" original blueprint, Variable-Angle Launcher, "SIDE VIEW CAMERA CAR-STEEL ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

10. 22"x34" original blueprint, Variable-Angle Launcher, "SIDE VIEW CAMERA CAR-STEEL FRAME AND AXLES" drawn at 1/2"=1'-0". (BUORD Sketch # 209124). - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  19. 13. 22"x34" original vellum, Variable-Angle Launcher, "SIDEVIEW CAMERA CAR TRACK ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

13. 22"x34" original vellum, Variable-Angle Launcher, "SIDEVIEW CAMERA CAR TRACK DETAILS" drawn at 1/4"=1'-0" (BUORD Sketch # 208078, PAPW 908). - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  20. Spectral data of specular reflectance, narrow-angle transmittance and angle-resolved surface scattering of materials for solar concentrators.

    PubMed

    Good, Philipp; Cooper, Thomas; Querci, Marco; Wiik, Nicolay; Ambrosetti, Gianluca; Steinfeld, Aldo

    2016-03-01

    The spectral specular reflectance of conventional and novel reflective materials for solar concentrators is measured with an acceptance angle of 17.5 mrad over the wavelength range 300-2500 nm at incidence angles 15-60° using a spectroscopic goniometry system. The same experimental setup is used to determine the spectral narrow-angle transmittance of semi-transparent materials for solar collector covers at incidence angles 0-60°. In addition, the angle-resolved surface scattering of reflective materials is recorded by an area-scan CCD detector over the spectral range 350-1050 nm. A comprehensive summary, discussion, and interpretation of the results are included in the associated research article "Spectral reflectance, transmittance, and angular scattering of materials for solar concentrators" in Solar Energy Materials and Solar Cells.

  2. What are the benefits of having multiple camera angles?

    Atmospheric Science Data Center

    2014-12-08

    ... zenith angles allowed by the scanner or push-broom design of the sensor. The accumulation of multiangular observations with such ... More detailed information is provided on the MISR web site Science Goals and Objectives page.   ...

  3. Metamaterial based narrow bandwidth angle-of-incidence independent transmission filters for hyperspectral imaging

    NASA Astrophysics Data System (ADS)

    Crouse, David T.

    2016-05-01

In this work, hyperbolic metamaterials are integrated within Bragg transmission filters with the purpose of eliminating the dependence of the center wavelength of a narrow bandwidth transmission peak on the angle of incidence of the incoming TM polarized beam. The structure is composed of a multi-layer stack of dielectric materials with an array of metal wires vertically penetrating the entire structure. Two types of modeling methods are used to simulate the optical properties of the structure: a coupled wave algorithm that uses a transfer matrix method, and finite element modeling. It is shown that narrow band transmission filters can be designed such that the center wavelength of the transmission peak for TM polarized incident light does not change as the angle of incidence of an incoming beam changes. The method is applied to different hypothetical structures operating in the near infrared, mid wave infrared and long wave infrared. A structure operating at 1.5 GHz is designed.

  4. Narrow-angle tail radio sources and the distribution of galaxy orbits in Abell clusters

    NASA Technical Reports Server (NTRS)

    O'Dea, Christopher P.; Sarazin, Craig L.; Owen, Frazer N.

    1987-01-01

    The present data on the orientations of the tails with respect to the cluster centers of a sample of 70 narrow-angle-tail (NAT) radio sources in Abell clusters show the distribution of tail angles to be inconsistent with purely radial or circular orbits in all the samples, while being consistent with isotropic orbits in (1) the whole sample, (2) the sample of NATs far from the cluster center, and (3) the samples of morphologically regular Abell clusters. Evidence for very radial orbits is found, however, in the sample of NATs near the cluster center. If these results can be generalized to all cluster galaxies, then the presence of radial orbits near the center of Abell clusters suggests that violent relaxation may not have been fully effective even within the cores of the regular clusters.

  5. A single camera photogrammetry system for multi-angle fast localization of EEG electrodes.

    PubMed

    Qian, Shuo; Sheng, Yang

    2011-11-01

Photogrammetry has become an effective method for the determination of electroencephalography (EEG) electrode positions in three dimensions (3D). Capturing multi-angle images of the electrodes on the head is a fundamental objective in the design of a photogrammetry system for EEG localization. Methods in previous studies are all based on the use of either a rotating camera or multiple cameras, which are time-consuming or not cost-effective, respectively. This study presents a novel photogrammetry system that can acquire multi-angle head images simultaneously from a single camera position. By aligning two planar mirrors at an angle of 51.4°, seven views of the head with 25 electrodes are captured simultaneously by a digital camera placed in front of them. A complete set of algorithms for electrode recognition, matching, and 3D reconstruction is developed. The elapsed time of the whole localization procedure is about 3 min, and the camera calibration computation takes about 1 min after the measurement of the calibration points. The positioning accuracy, with a maximum error of 1.19 mm, is acceptable. Experimental results demonstrate that the proposed system provides a fast and cost-effective method for EEG positioning.
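
The seven-view geometry follows from the classic two-mirror image count: two planar mirrors meeting at angle θ form 360°/θ − 1 reflected images, plus the direct view. A small sketch using the 51.4° value from the abstract:

```python
def total_views(mirror_angle_deg):
    """Direct view plus the mirror images formed between two planar mirrors
    meeting at the given angle (classic result: 360/theta - 1 images,
    plus 1 for the direct view)."""
    images = round(360.0 / mirror_angle_deg) - 1
    return images + 1

print(total_views(51.4))   # 7 views, as in the paper's setup (51.4 ~ 360/7)
print(total_views(90.0))   # 4 views for perpendicular mirrors
```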

  6. Narrow-angle tail radio sources and evidence for radial orbits in Abell clusters

    NASA Technical Reports Server (NTRS)

    O'Dea, Christopher P.; Owen, Frazer N.; Sarazin, Craig L.

    1986-01-01

    Published observational data on the tail orientations (TOs) of 60 narrow-angle-tail (NAT) radio sources in Abell clusters of galaxies are analyzed statistically using a maximum-likelihood approach. The results are presented in a table, and it is found that the observed TO distributions in the whole sample and in subsamples of morphologically regular NATs and NATs with pericentric distances d greater than 500 kpc are consistent with isotropic orbits, whereas the TOs for NATs with d less than 500 kpc are consistent with highly radial orbits. If radial orbits were observed near the centers of other types of cluster galaxies as well, it could be inferred that violent relaxation during cluster formation was incomplete, and that clusters form by spherical collapse and secondary infall, as proposed by Gunn (1977).

  7. The first demonstration of the concept of "narrow-FOV Si/CdTe semiconductor Compton camera"

    NASA Astrophysics Data System (ADS)

Ichinohe, Yuto; Uchida, Yuusuke; Watanabe, Shin; Edahiro, Ikumi; Hayashi, Katsuhiro; Kawano, Takafumi; Ohno, Masanori; Ohta, Masayuki; Takeda, Shin'ichiro; Fukazawa, Yasushi; Katsuragawa, Miho; Nakazawa, Kazuhiro; Odaka, Hirokazu; Tajima, Hiroyasu; Takahashi, Hiromitsu; Takahashi, Tadayuki; Yuasa, Takayuki

    2016-01-01

The Soft Gamma-ray Detector (SGD), to be deployed on board the ASTRO-H satellite, has been developed to provide the highest-sensitivity observations of celestial sources in the energy band of 60-600 keV by employing a detector concept in which the field of view of a Compton camera is restricted to a few degrees by a BGO shield (narrow-FOV Compton camera). In this concept, the background from outside the FOV can be heavily suppressed by requiring the incident direction of the gamma ray, as reconstructed by the Compton camera, to be consistent with the narrow FOV. We demonstrate, for the first time, the validity of the concept using background data taken on the ground during the thermal vacuum test and the low-temperature environment test of the flight model of the SGD. We show that the measured background level is suppressed to less than 10% by combining event rejection based on the anti-coincidence trigger of the active BGO shield with Compton event reconstruction techniques. More than 75% of the signals from the field of view are retained after background rejection, which clearly demonstrates the improvement in signal-to-noise ratio. The estimated effective area of 22.8 cm² meets the mission requirement even though not all of the operational parameters of the instrument have been fully optimized yet.
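
The Compton event reconstruction used to constrain the incident direction rests on the standard Compton kinematics relation between the deposited energies and the scattering angle. A hedged sketch (the detector-layer names and the 200 keV example are illustrative, not SGD data):

```python
import math

MEC2 = 511.0  # electron rest energy in keV

def compton_angle_deg(e_scatter, e_absorb):
    """Compton scattering angle from the energy deposited in the scatterer
    (e_scatter) and in the absorber (e_absorb), both in keV:
        cos(theta) = 1 - mec2 * (1/E2 - 1/(E1 + E2))
    Returns None when the energies are kinematically inconsistent,
    which is the basis for rejecting background events."""
    e_total = e_scatter + e_absorb
    cos_t = 1.0 - MEC2 * (1.0 / e_absorb - 1.0 / e_total)
    if not -1.0 <= cos_t <= 1.0:
        return None  # forbidden combination: reject the event
    return math.degrees(math.acos(cos_t))

# Illustrative event: a 200 keV photon depositing 50 keV in a Si layer
# and 150 keV in a CdTe layer.
print(compton_angle_deg(50.0, 150.0))
```

In a narrow-FOV instrument, the reconstructed cone of possible incident directions is then checked for overlap with the shield-defined aperture; events whose cones fall entirely outside it are discarded.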

  8. Topview stereo: combining vehicle-mounted wide-angle cameras to a distance sensor array

    NASA Astrophysics Data System (ADS)

    Houben, Sebastian

    2015-03-01

The variety of vehicle-mounted sensors required to fulfill a growing number of driver assistance tasks has become a substantial factor in automobile manufacturing cost. We present a stereo distance method exploiting the overlapping fields of view of a multi-camera fisheye surround-view system of the kind used for near-range vehicle surveillance tasks, e.g. in parking maneuvers. Hence, we aim at creating a new input signal from sensors that are already installed. Particular properties of wide-angle cameras (e.g. strongly varying resolution across the image) demand an adaptation of the image processing pipeline to several problems that do not arise in classical stereo vision performed with cameras carefully designed for this purpose. We introduce the algorithms for rectification, correspondence analysis, and regularization of the disparity image, discuss the causes of the caveats shown and how to avoid them, and present first results on a prototype topview setup.

  9. Narrowed Aortoseptal Angle Is Related to Increased Central Blood Pressure and Aortic Pressure Wave Reflection

    PubMed Central

    Olafiranye, Oladipupo; Ibrahim, Mediha; Kamran, Haroon; Venner-Jones, Kinda; McFarlane, Samy I.; Salciccioli, Louis; Lazar, Jason M.

    2012-01-01

    The left ventricular (LV) aortoseptal angle (ASA) decreases with age, and is associated with basal septal hypertrophy (septal bulge). Enhanced arterial pressure wave reflection is known to impact LV hypertrophy. We assessed whether ASA is related to central blood pressure (BP) and augmentation index (AI), a measure of the reflected pressure wave. We studied 75 subjects (age 62 ± 16 years; 66% female) who were referred for transthoracic echocardiography and had radial artery applanation tonometry within 24 h. Peripheral systolic BP (P-SBP), peripheral diastolic BP (P-DBP), and peripheral pulse pressure (P-PP) were obtained by sphygmomanometry. Central BPs (C-SBP, C-DBP, C-PP) and AI were derived from applanation tonometry. AI was corrected for heart rate (AI75). The basal septal wall thickness (SWT), mid SWT and ASA were measured using the parasternal long axis echocardiographic view. Mean ASA and AI75 were 117 ± 11° and 22 ± 11%, respectively. ASA correlated with AI75 (r = −0.31, p ≤ 0.01), C-SBP (r = −0.24, p = 0.04), C-PP (r = −0.29, p = 0.01), but only showed a trend towards significance with P-SBP (r = −0.2, p = 0.09) and P-PP (r = −0.21, p = 0.08). Interestingly, C-PP was correlated with basal SWT (r = 0.27, p = 0.02) but not with mid SWT (r = 0.19, p = 0.11). On multivariate linear regression analysis, adjusted for age, gender, weight, and mean arterial pressure, AI75 was an independent predictor of ASA (p = 0.02). Our results suggest that a narrowed ASA is related to increased pressure wave reflection and higher central BP. Further studies are needed to determine whether narrowed LV ASA is a cause or consequence of enhanced wave reflection and whether other factors are involved. PMID:22969773

  10. Miniature optical planar camera based on a wide-angle metasurface doublet corrected for monochromatic aberrations

    NASA Astrophysics Data System (ADS)

    Arbabi, Amir; Arbabi, Ehsan; Kamali, Seyedeh Mahsa; Horie, Yu; Han, Seunghoon; Faraon, Andrei

    2016-11-01

    Optical metasurfaces are two-dimensional arrays of nano-scatterers that modify optical wavefronts at subwavelength spatial resolution. They are poised to revolutionize optics by enabling complex low-cost systems where multiple metasurfaces are lithographically stacked and integrated with electronics. For imaging applications, metasurface stacks can perform sophisticated image corrections and can be directly integrated with image sensors. Here we demonstrate this concept with a miniature flat camera integrating a monolithic metasurface lens doublet corrected for monochromatic aberrations, and an image sensor. The doublet lens, which acts as a fisheye photographic objective, has a small f-number of 0.9, an angle-of-view larger than 60° × 60°, and operates at 850 nm wavelength with 70% focusing efficiency. The camera exhibits nearly diffraction-limited image quality, which indicates the potential of this technology in the development of optical systems for microscopy, photography, and computer vision.

  11. Miniature optical planar camera based on a wide-angle metasurface doublet corrected for monochromatic aberrations

    PubMed Central

    Arbabi, Amir; Arbabi, Ehsan; Kamali, Seyedeh Mahsa; Horie, Yu; Han, Seunghoon; Faraon, Andrei

    2016-01-01

    Optical metasurfaces are two-dimensional arrays of nano-scatterers that modify optical wavefronts at subwavelength spatial resolution. They are poised to revolutionize optics by enabling complex low-cost systems where multiple metasurfaces are lithographically stacked and integrated with electronics. For imaging applications, metasurface stacks can perform sophisticated image corrections and can be directly integrated with image sensors. Here we demonstrate this concept with a miniature flat camera integrating a monolithic metasurface lens doublet corrected for monochromatic aberrations, and an image sensor. The doublet lens, which acts as a fisheye photographic objective, has a small f-number of 0.9, an angle-of-view larger than 60° × 60°, and operates at 850 nm wavelength with 70% focusing efficiency. The camera exhibits nearly diffraction-limited image quality, which indicates the potential of this technology in the development of optical systems for microscopy, photography, and computer vision. PMID:27892454

  12. Visible-infrared achromatic imaging by wavefront coding with wide-angle automobile camera

    NASA Astrophysics Data System (ADS)

    Ohta, Mitsuhiko; Sakita, Koichi; Shimano, Takeshi; Sugiyama, Takashi; Shibasaki, Susumu

    2016-09-01

We perform an experiment on achromatic imaging with wavefront coding (WFC) using a wide-angle automobile lens. Our original annular phase mask for WFC was inserted into the lens, for which the difference between the focal positions at 400 nm and at 950 nm is 0.10 mm. We acquired images of objects using a WFC camera with this lens under both visible and infrared light. As a result, the removal of chromatic aberration by the WFC system was successfully demonstrated. Moreover, we fabricated a demonstration setup simulating the use of a night-vision camera in an automobile and showed the effectiveness of the WFC system.

  13. High-precision portable instrument to measure position angles of a video camera for bird flight research

    NASA Astrophysics Data System (ADS)

    Delinger, W. G.; Willis, W. R.

    1988-05-01

    A battery-powered portable instrument for research on the aerodynamics of bird flight has been built to automatically measure and record the horizontal and vertical angles at which a video camera is pointed as an operator videotapes a soaring bird. Each angle was measured to a precision of about 20 arc seconds or better. Two complete systems were constructed, and a triangulation method was used so the same bird in flight could be videotaped by two cameras at different locations to establish the radius vectors from an origin to the bird. The angle information was generated by rotary transducers attached to the camera mounts, and the angle values along with timing data were stored in the semiconductor memory of a single-board computer. The equipment has been successfully tested in the field and promises to have a wider application where a portable instrument is required to measure angles to high precision.
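
The triangulation step described above — two cameras at known locations turning recorded angle pairs into a position vector for the bird — can be sketched as follows. The coordinate frame and sample geometry are assumptions for illustration, not values from the instrument:

```python
import math
import numpy as np

def ray_from_angles(origin, azimuth_deg, elevation_deg):
    """Unit pointing vector from a camera's horizontal (azimuth, from north
    toward east) and vertical (elevation) angles.
    Assumed frame: x = east, y = north, z = up."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    d = np.array([math.cos(el) * math.sin(az),
                  math.cos(el) * math.cos(az),
                  math.sin(el)])
    return np.asarray(origin, dtype=float), d

def triangulate(o1, d1, o2, d2):
    """Midpoint of the shortest segment between the two camera rays --
    a simple least-squares estimate of the target position."""
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    w = o1 - o2
    denom = a * c - b * b            # zero only for parallel rays
    t1 = (b * (d2 @ w) - c * (d1 @ w)) / denom
    t2 = (a * (d2 @ w) - b * (d1 @ w)) / denom
    return 0.5 * ((o1 + t1 * d1) + (o2 + t2 * d2))

# Illustrative geometry: a bird at (100, 200, 50) m seen from two sites.
bird = np.array([100.0, 200.0, 50.0])
site1, site2 = np.zeros(3), np.array([300.0, 0.0, 0.0])
rays = []
for site in (site1, site2):
    v = bird - site
    az = math.degrees(math.atan2(v[0], v[1]))                    # angles each
    el = math.degrees(math.atan2(v[2], math.hypot(v[0], v[1])))  # camera records
    rays.append(ray_from_angles(site, az, el))
(o1, d1), (o2, d2) = rays
print(triangulate(o1, d1, o2, d2))   # recovers the bird's position
```

With ~20 arcsecond angular precision, the dominant position error grows with range and with how close to parallel the two rays are, which is why the two camera sites are separated.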

  14. Possible association of glaucoma with pectinate ligament dysplasia and narrowing of the iridocorneal angle in Shiba Inu dogs in Japan.

    PubMed

    Kato, Kumiko; Sasaki, Nobuo; Matsunaga, Satoru; Mochizuki, Manabu; Nishimura, Ryohei; Ogawa, Hiroyuki

    2006-01-01

The purpose of the study was to describe the shape of the pectinate ligament (PL) and to estimate the width of the iridocorneal angle (ICA) in glaucomatous and non-glaucomatous eyes of Shiba Inu dogs in Japan. One hundred fourteen Shiba Inu dogs were presented to the Veterinary Medical Center at the University of Tokyo between June 1998 and June 2003. Among these, 46 dogs had glaucoma; the remaining 68 dogs were presented for routine vaccinations or heartworm testing and were used as a control population. A complete ophthalmic examination and gonioscopy were performed in all the dogs. The PL and ICA were evaluated by gonioscopy and goniophotographs. Of the 46 dogs with glaucoma, 17 (37%) were affected bilaterally and 29 (63%) unilaterally. Of the 29 dogs with unilateral glaucoma, the ICA of the normotensive eye was slightly narrow in 2 eyes (7%), narrow in 12 eyes (41%), and closed in 15 eyes (52%). Among the normotensive control group, the ICA was open in 13 dogs (19%), slightly narrow in 29 (43%), narrow in 22 (32%), and closed in 4 (6%). In dogs with narrow and slightly narrow ICAs, the PLs were thickened and, in some cases, formed a solid sheet. The majority of dogs with glaucoma had changes in both the ICA and the PL. A narrow or slightly narrow ICA was also detected in the majority of normotensive control dogs. This study suggests that ICA narrowing and PL thickening are common abnormalities in Shiba Inu dogs in Japan and may predispose these dogs to glaucoma.

  15. A New Approach to Micro-arcsecond Astrometry with SIM Allowing Early Mission Narrow Angle Measurements of Compelling Astronomical Targets

    NASA Technical Reports Server (NTRS)

    Shaklan, Stuart; Pan, Xiaopei

    2004-01-01

    The Space Interferometry Mission (SIM) is capable of detecting and measuring the mass of terrestrial planets around stars other than our own. It can measure the mass of black holes and the visual orbits of radio and x-ray binary sources. SIM makes possible a new level of understanding of complex astrophysical processes. SIM achieves its high precision in the so-called narrow-angle regime. This is defined by a 1 degree diameter field in which the position of a target star is measured with respect to a set of reference stars. The observation is performed in two parts: first, SIM observes a grid of stars that spans the full sky. After a few years, repeated observations of the grid allow one to determine the orientation of the interferometer baseline. Second, throughout the mission, SIM periodically observes in the narrow-angle mode. Every narrow-angle observation is linked to the grid to determine the precise attitude and length of the baseline. The narrow angle process demands patience. It is not until five years after launch that SIM achieves its ultimate accuracy of 1 microarcsecond. The accuracy is degraded by a factor of approx. 2 at mid-mission. Our work proposes a technique for narrow angle astrometry that does not rely on the measurement of grid stars. This technique, called Gridless Narrow Angle Astrometry (GNAA) can obtain microarcsecond accuracy and can detect extra-solar planets and other exciting objects with a few days of observation. It can be applied as early as during the first six months of in-orbit calibration (IOC). The motivations for doing this are strong. First, and obviously, it is an insurance policy against a catastrophic mid-mission failure. Second, at the start of the mission, with several space-based interferometers in the planning or implementation phase, NASA will be eager to capture the public's imagination with interferometric science. Third, early results and a technique that can duplicate those results throughout the mission will

  16. Evaluation of the Quality of Action Cameras with Wide-Angle Lenses in Uav Photogrammetry

    NASA Astrophysics Data System (ADS)

    Hastedt, H.; Ekkel, T.; Luhmann, T.

    2016-06-01

The application of light-weight cameras in UAV photogrammetry is required due to payload restrictions. In general, consumer cameras with a normal lens type are used on UAV systems. The availability of action cameras such as the GoPro Hero4 Black, which includes a wide-angle (fish-eye) lens, offers new perspectives for UAV projects. In these investigations, different calibration procedures for fish-eye lenses are evaluated in order to quantify their accuracy potential in UAV photogrammetry. The GoPro Hero4 is evaluated using different acquisition modes, and we investigate the extent to which the standard calibration approaches in OpenCV or Agisoft PhotoScan/Lens can be applied to the evaluation processes in UAV photogrammetry. Different calibration setups and processing procedures are therefore assessed and discussed. Additionally, pre-correction of the initial distortion by GoPro Studio and its application for photogrammetric purposes is evaluated. An experimental setup with a set of control points and a prospective flight scenario is chosen to evaluate the processing results using Agisoft PhotoScan, analysing the extent to which pre-calibration and pre-correction of a GoPro Hero4 reinforce the reliability and accuracy of a flight scenario.
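
The OpenCV fish-eye calibration mentioned above is built on the equidistant projection model with a polynomial distortion term, θ_d = θ(1 + k1·θ² + k2·θ⁴ + k3·θ⁶ + k4·θ⁸). A minimal numpy sketch of that forward projection — the camera parameters below are illustrative, and this is not the paper's calibration pipeline:

```python
import numpy as np

def fisheye_project(points_3d, f, c, k):
    """Project camera-frame 3D points with the equidistant fish-eye model
    used by OpenCV's cv2.fisheye module:
        theta_d = theta * (1 + k1*t^2 + k2*t^4 + k3*t^6 + k4*t^8)
    f = (fx, fy) focal lengths, c = (cx, cy) principal point,
    k = (k1, k2, k3, k4) distortion coefficients."""
    X, Y, Z = np.asarray(points_3d, dtype=float).T
    r = np.hypot(X, Y)
    theta = np.arctan2(r, Z)                 # angle from the optical axis
    t2 = theta * theta
    theta_d = theta * (1 + k[0]*t2 + k[1]*t2**2 + k[2]*t2**3 + k[3]*t2**4)
    # On-axis points (r == 0) map straight to the principal point.
    scale = np.divide(theta_d, r, out=np.zeros_like(r), where=r > 0)
    u = f[0] * scale * X + c[0]
    v = f[1] * scale * Y + c[1]
    return np.stack([u, v], axis=1)

# Hypothetical intrinsics: an on-axis point and a 45-degree off-axis point.
pts = np.array([[0.0, 0.0, 1.0], [1.0, 0.0, 1.0]])
uv = fisheye_project(pts, f=(400.0, 400.0), c=(320.0, 240.0),
                     k=(0.0, 0.0, 0.0, 0.0))
print(uv)
```

Calibration inverts this: given many observed (u, v) of known control points, it solves for f, c, and k, which is what the accuracy comparison between OpenCV and PhotoScan/Lens amounts to.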

  17. The faster the narrower: characteristic bulk velocities and jet opening angles of gamma-ray bursts

    NASA Astrophysics Data System (ADS)

    Ghirlanda, G.; Ghisellini, G.; Salvaterra, R.; Nava, L.; Burlon, D.; Tagliaferri, G.; Campana, S.; D'Avanzo, P.; Melandri, A.

    2013-01-01

The jet opening angle θjet and the bulk Lorentz factor Γ0 are crucial parameters for computing the energetics of gamma-ray bursts (GRBs). From the ~30 GRBs with measured θjet or Γ0 it is known that (i) the true energy Eγ, obtained by correcting the isotropic equivalent energy Eiso for the collimation factor ~θjet², clusters around 10^50-10^51 erg and is correlated with the peak energy Ep of the prompt emission, and (ii) the comoving-frame E'p and E'γ cluster around typical values. Current estimates of Γ0 and θjet are based on incomplete data samples and their observed distributions could be subject to biases. Through a population synthesis code we investigate whether different assumed intrinsic distributions of Γ0 and θjet can reproduce a set of observational constraints. Assuming that all bursts have the same E'p and E'γ in the comoving frame, we find that Γ0 and θjet cannot be distributed as single power laws. The best agreement between our simulation and the available data is obtained by assuming (a) log-normal distributions for θjet and Γ0 and (b) an intrinsic relation between the peak values of their distributions, i.e. θjet^2.5 Γ0 = const. On average, larger values of Γ0 (i.e. the 'faster' bursts) correspond to smaller values of θjet (i.e. the 'narrower' ones). We predict that ~6 per cent of the bursts that point towards us should not show any jet break in their afterglow light curve, since they have sin θjet < 1/Γ0. Finally, we estimate that the local rate of GRBs is ~0.3 per cent of all local Type Ib/c supernovae (SNe Ib/c) and ~4.3 per cent of local hypernovae, i.e. SNe Ib/c with broad lines.
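
The collimation factor ~θjet² quoted above comes from the solid-angle beaming factor (1 − cos θjet) ≈ θ²/2 for small opening angles. A small worked sketch (the example burst values are illustrative, not from the paper's sample):

```python
import math

def collimation_corrected_energy(e_iso, theta_jet_deg):
    """True jet energy from the isotropic-equivalent energy:
        E_gamma = (1 - cos(theta_jet)) * E_iso  ~  (theta^2 / 2) * E_iso
    for small opening angles -- the ~theta_jet^2 collimation factor."""
    theta = math.radians(theta_jet_deg)
    return (1.0 - math.cos(theta)) * e_iso

# An illustrative burst: E_iso = 1e53 erg with a 5-degree jet collapses to
# ~4e50 erg, inside the 10^50-10^51 erg cluster quoted in the abstract.
print(collimation_corrected_energy(1e53, 5.0))
```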

  18. Feasibility of laser trabeculoplasty in angle closure glaucoma: a review of favourable histopathological findings in narrow angles.

    PubMed

    Matos, Alexis Galeno; Asrani, Sanjay G; Paula, Jayter Silva

    2017-02-28

Selective laser trabeculoplasty (SLT) has been indicated as a safe and efficient treatment for primary open-angle glaucoma; however, recent studies have also shown positive results with the use of SLT in some clinical conditions related to primary angle-closure glaucoma (PACG). Despite the potential benefits of SLT in selected cases of PACG, the mechanisms underlying the modifications in the trabecular meshwork tissue of patients with PACG are poorly understood. This narrative review addresses both the current, limited knowledge about the histological changes observed in different forms of PACG and the clinical results of SLT treatment for PACG. Favourable outcomes of SLT in patients with PACG, specifically in areas of non-occluded angle, need further substantiation through large controlled clinical trials. A deeper understanding of the biomolecular changes in those areas is essential to improve both the technical details of the laser procedure and the clinical efficacy of SLT therapy.

  19. The Mars observer camera

    NASA Technical Reports Server (NTRS)

    Malin, M. C.; Danielson, G. E.; Ingersoll, A. P.; Masursky, H.; Veverka, J.; Soulanille, T.; Ravine, M.

    1987-01-01

    A camera designed to operate under the extreme constraints of the Mars Observer Mission was selected by NASA in April, 1986. Contingent upon final confirmation in mid-November, the Mars Observer Camera (MOC) will begin acquiring images of the surface and atmosphere of Mars in September-October 1991. The MOC incorporates both a wide angle system for low resolution global monitoring and intermediate resolution regional targeting, and a narrow angle system for high resolution selective surveys. Camera electronics provide control of image clocking and on-board, internal editing and buffering to match whatever spacecraft data system capabilities are allocated to the experiment. The objectives of the MOC experiment follow.

  20. Empirical Photometric Normalization for the Seven Band UV-VIS Lunar Reconnaissance Orbiter Wide Angle Camera

    NASA Astrophysics Data System (ADS)

    Boyd, A. K.; Robinson, M. S.; Nuno, R. G.; Sato, H.

    2014-12-01

We present results of a near-global (80°S to 80°N) seven-color Wide Angle Camera (WAC) photometric normalization and color analysis. Over 100,000 WAC color observations were calibrated to reflectance (radiance factor, I/F), and the photometric angles (i, e, g), latitude, and longitude were calculated and stored for each WAC pixel. Photometric angles were calculated using the WAC GLD100 [1], and a six-dimensional data set (3 spatial and 3 photometric) was reduced to three dimensions by photometrically normalizing the I/F with a global wavelength-dependent, 3rd-order multivariate polynomial. The multispectral mosaic was normalized to a standard viewing geometry (incidence angle = 30°, emission angle = 0°, phase angle = 30°). The WAC has a 60° cross-track field of view in color mode, which allows the acquisition of a near-global data set each month; however, the phase angle can change by as much as 60° across each image. These large changes in viewing geometry present challenges for the required photometric normalization. In the ratio of the 321 nm and 689 nm wavelengths, the Moon has a standard deviation of less than 3% in the highlands and 7% globally; thus, to allow confident identification of true color differences, the photometric normalization must be precise. Pyroclastic deposits in Marius Hills, Sinus Aestuum, and Mare Serenitatis are among the least reflective materials, with 643 nm normalized reflectance values less than 0.036. Low-reflectance deposits are generally concentrated close to the equator on the nearside, whereas high-reflectance materials are dispersed globally. The highest-reflectance materials occur at Giordano Bruno and Virtanen craters and are attributed to exposure of immature materials. Immature ejecta has a shallower spectral slope compared to the mean highlands spectrum (321 nm to 689 nm), and UV weathering characteristics can be seen when comparing Copernican ejecta of different ages [2]. Copernican ejecta is found to have 643 nm reflectance values greater than 0
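
The normalization step — dividing each I/F value by a photometric model evaluated at the observed angles and multiplying by the model at the standard geometry (i = 30°, e = 0°, g = 30°) — can be sketched as below. The 3rd-order polynomial form is generic and the coefficients are hypothetical; the WAC pipeline's actual polynomial and fitted coefficients are not given in the abstract:

```python
import numpy as np

def normalize_iof(iof, i_deg, e_deg, g_deg, coeffs):
    """Normalize a reflectance (I/F) value to the standard viewing geometry
    (i=30, e=0, g=30) by dividing out a 3rd-order polynomial photometric
    model f(i, e, g) and multiplying by the model at the standard angles.
    `coeffs` are 20 hypothetical coefficients, one per monomial
    i^p * e^q * g^r with p + q + r <= 3 (angles in radians)."""
    def model(i, e, g):
        terms = []
        for p in range(4):
            for q in range(4 - p):
                for r in range(4 - p - q):
                    terms.append(np.radians(i)**p
                                 * np.radians(e)**q
                                 * np.radians(g)**r)
        return np.dot(coeffs, terms)
    return iof * model(30.0, 0.0, 30.0) / model(i_deg, e_deg, g_deg)
```

With a constant model (only the zero-order coefficient nonzero), the correction is the identity; in practice the coefficients would be fit per wavelength band from the ~100,000 calibrated observations.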

  1. Small-angle approximation to the transfer of narrow laser beams in anisotropic scattering media

    NASA Technical Reports Server (NTRS)

    Box, M. A.; Deepak, A.

    1981-01-01

The broadening and the detected signal power of a laser beam traversing an anisotropic scattering medium were examined using the small-angle approximation to the radiative transfer equation, in which photons suffering large-angle deflections are neglected. To obtain tractable answers, simple Gaussian and non-Gaussian functions were assumed for the scattering phase function. Two other approximate approaches employed in the field to further simplify the small-angle approximation solutions are described, and the results obtained by one of them are compared with those obtained using the small-angle approximation. An exact method for obtaining the contribution of each higher order of scattering to the radiance field is examined, but no results are presented.

  2. Efficient skew-angle cladding-pumped tunable narrow-linewidth Yb-doped fiber laser.

    PubMed

    Jelger, Pär; Laurell, Fredrik

    2007-12-15

A skew-angle cladding-pumped tunable Yb-doped fiber laser is presented. The laser was tunable over more than 30 nm, from 1022 to 1055 nm, by employing a volume Bragg grating in a retroreflector configuration as one of the cavity delimiters. Output powers in excess of 4.3 W were recorded with a spectral bandwidth of 5 GHz and an M² value below 1.3 over the whole tuning range.

  3. Baffling system for the Wide Angle Camera (WAC) of ROSETTA mission

    NASA Astrophysics Data System (ADS)

    Brunello, Pierfrancesco; Peron, Fabio; Barbieri, Cesare; Fornasier, Sonia

    2000-10-01

After the experience of the GIOTTO fly-by of comet Halley in 1986, the European Space Agency planned to improve the scientific knowledge of these astronomical objects by means of an even more ambitious rendezvous mission with another comet (P/Wirtanen). This mission, named ROSETTA, will run from 2003 to 2013, ending after the comet perihelion phase and also including fly-bys of two asteroids of the main belt (140 Siwa and 4979 Otawara). The scientific priority of the mission is the in situ investigation of the cometary nucleus, with the aim of better understanding the formation and composition of planetesimals and their evolution over the last 4.5 billion years. In this context, the authors were involved in the design of the baffling for the Wide Angle Camera (WAC) of the imaging system (OSIRIS) carried on board the spacecraft. The scientific requirements for the WAC are: a large field of view (FOV) of 12° × 12° with a resolution of 100 μrad per pixel, UV response, and a contrast ratio of 10⁻⁴ in order to detect gaseous and dusty features close to the nucleus of the comet. To achieve these performances, a fairly novel class of optical solutions employing off-axis sections of concentric mirrors was explored. Regarding baffling, the peculiar demand was the rejection of stray light generated by the optics for sources within the FOV, since the optical entrance aperture is located at the level of the secondary mirror (instead of the primary, as usual). This paper describes the baffle design and analyzes its performance, calculated by numerical simulation with ray-tracing methods, at different angles of incidence of the light, for sources both outside and inside the field of view.

  4. A triple axis double crystal multiple reflection camera for ultra small angle X-ray scattering

    NASA Astrophysics Data System (ADS)

    Lambard, Jacques; Lesieur, Pierre; Zemb, Thomas

    1992-06-01

Extending the domain of small-angle X-ray scattering to smaller angles requires multiple-reflection crystals to collimate the beam. A double-crystal, triple-axis X-ray camera using multiple-reflection channel-cut crystals is described. Procedures for measuring the desmeared scattering cross-section on an absolute scale are described, as well as measurements from several typical samples: fibrils of collagen, 0.3 μm diameter silica spheres, 0.16 μm diameter interacting latex spheres, porous lignite coal, liquid crystals in a surfactant-water system, and a colloidal crystal of 0.32 μm diameter silica spheres.

  5. Numerical simulations of the bending of narrow-angle-tail radio jets by ram pressure or pressure gradients

    NASA Technical Reports Server (NTRS)

    Soker, Noam; Sarazin, Craig L.; O'Dea, Christopher P.

    1988-01-01

    Three-dimensional numerical hydrodynamic simulations are used to study the bending of radio jets. The simulations are compared with observations of jets in narrow-angle-tail radio sources. Two mechanisms for the observed bending are considered: direct bending of quasi-continuous jets by ram pressure from intergalactic gas and bending by pressure gradients in the interstellar gas of the host galaxy, the pressure gradients themselves being the result of ram pressure by intergalactic gas. It is shown that the pressure gradients are much less effective in bending jets, implying that the jets have roughly 30 times lower momentum fluxes if they are bent by this mechanism. Ram-pressure bending produces jets with 'kidney-shaped' cross sections; when observed from the side, these jets appear to have diffuse extensions on the downstream side. On the other hand, pressure-gradient bending causes the jets to be densest near their upstream side.

  6. Wide angle and narrow-band asymmetric absorption in visible and near-infrared regime through lossy Bragg stacks

    PubMed Central

    Shu, Shiwei; Zhan, Yawen; Lee, Chris; Lu, Jian; Li, Yang Yang

    2016-01-01

Absorbers are important components in various optical devices. Here we report a novel type of asymmetric absorber for the visible and near-infrared spectrum based on lossy Bragg stacks. The lossy Bragg stacks can achieve near-perfect absorption at one side and high reflection at the other within the narrow bands (several nm) of the resonance wavelengths, while displaying almost identical absorption/reflection responses for the rest of the spectrum. Meanwhile, this interesting wavelength-selective asymmetric absorption behavior persists over wide angles, does not depend on polarization, and can be ascribed to the lossy characteristics of the Bragg stacks. Moreover, interesting Fano resonances with easily tailorable peak profiles can be realized using the lossy Bragg stacks. PMID:27251768

  7. Wide-angle narrow-bandpass optical detection system optimally designed to have a large signal-to-noise ratio

    NASA Astrophysics Data System (ADS)

    Schweitzer, Naftali; Arieli, Yoel

    2000-02-01

    A method for achieving optimal design of a wide-angle narrow-bandpass optical detection system composed of a spherical interference filter and a circular photodetector is introduced. It was found that there is an optimal photodetector diameter that maximizes the signal-to-noise ratio (SNR) for a given filter configuration. We show how to optimize optical detection systems based on spherical interference filters for all the important parameters simultaneously. The SNR values of these systems are compared with the SNR values of spherical-step-filter-based detection systems. When large silicon photodetectors are used, the two systems have equal SNR values so that the more economical step-filter systems are preferable. The results given here in the near-infrared region can be used for the optimization of any configuration of a detection system based on a spherical interference filter and a silicon photodetector working at the same wavelength range, without further calculations.
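The optimum the abstract describes comes from a simple competition: the in-band signal saturates with detector diameter (off-axis rays fall out of the angle-shifted passband) while noise keeps growing with detector area. A numerical sketch under assumed, illustrative numbers (Gaussian passband rolloff, millimetre scales; not the authors' model):

```python
import numpy as np

# Toy model (assumed, not the authors' formulation): with a spherical
# interference filter, the passband shifts with incidence angle, so the
# effective in-band transmission T(r) rolls off toward the edge of the
# detector; model that rolloff as a Gaussian. Noise is taken to scale
# with the square root of the detector area.
def snr_vs_diameter(diameter_mm, sigma_mm=4.0, n=2000):
    r = np.linspace(0.0, diameter_mm / 2.0, n)
    T = np.exp(-(r / sigma_mm) ** 2)                # in-band transmission
    signal = float(np.sum(T * 2.0 * np.pi * r) * (r[1] - r[0]))
    noise = np.sqrt(np.pi) * diameter_mm / 2.0      # proportional to sqrt(area)
    return signal / noise

diameters = np.linspace(0.5, 30.0, 300)             # mm, hypothetical range
snr = np.array([snr_vs_diameter(d) for d in diameters])
d_opt = diameters[np.argmax(snr)]
print(f"SNR peaks at detector diameter ~{d_opt:.1f} mm")
```

For small diameters the collected signal grows with area, so the SNR grows with d; once the passband rolloff saturates the signal, the area-driven noise wins and the SNR falls again, so there is an interior optimum diameter, as the abstract reports.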

  8. Development of the Multi-Angle Snowflake Camera for Improved Hydrometeor Habit-Diameter-Mass-Fallspeed Parameterizations

    DTIC Science & Technology

    2011-05-20

storms. Completion of the Multi-angle Snowflake Camera (MASC) will provide for the first time automated measurement of these properties. The...data that can be collected for hydrometeors at the point that MASC development is completed. • Stereoscopic polarized color images – Color and...10 µm at 1/100,000th second shutter speed. • Volume and mass – Calculated through stereoscopic reconstruction. • Fall speed – Vertically stacked

  9. Early direct-injection, low-temperature combustion of diesel fuel in an optical engine utilizing a 15-hole, dual-row, narrow-included-angle nozzle.

    SciTech Connect

    Gehrke, Christopher R.; Radovanovic, Michael S.; Milam, David M.; Martin, Glen C.; Mueller, Charles J.

    2008-04-01

Low-temperature combustion of diesel fuel was studied in a heavy-duty, single-cylinder optical engine employing a 15-hole, dual-row, narrow-included-angle nozzle (10 holes × 70° and 5 holes × 35°) with 103-µm-diameter orifices. This nozzle configuration provided the spray targeting necessary to contain the direct-injected diesel fuel within the piston bowl for injection timings as early as 70° before top dead center. Spray-visualization movies, acquired using a high-speed camera, show that impingement of liquid fuel on the piston surface can result when the in-cylinder temperature and density at the time of injection are sufficiently low. Seven single- and two-parameter sweeps around a 4.82-bar gross indicated mean effective pressure load point were performed to map the sensitivity of the combustion and emissions to variations in injection timing, injection pressure, equivalence ratio, simulated exhaust-gas recirculation, intake temperature, intake boost pressure, and load. High-speed movies of natural luminosity were acquired by viewing through a window in the cylinder wall and through a window in the piston to provide quasi-3D information about the combustion process. These movies revealed that advanced combustion phasing resulted in intense pool fires within the piston bowl, after the end of significant heat release. These pool fires are a result of fuel films created when the injected fuel impinged on the piston surface. The emissions results showed a strong correlation with pool-fire activity. Smoke and NOx emissions rose steadily as pool-fire intensity increased, whereas HC and CO showed a dramatic increase with near-zero pool-fire activity.

  10. Miniature Wide-Angle Lens for Small-Pixel Electronic Camera

    NASA Technical Reports Server (NTRS)

Mouroulis, Pantazis; Blazejewski, Edward

    2009-01-01

A proposed wide-angle lens would be especially well suited for an electronic camera in which the focal plane is occupied by an image sensor with small pixels. The design of the lens is intended to satisfy requirements for compactness, high image quality, and reasonably low cost, while addressing issues peculiar to the operation of small-pixel image sensors. Hence, this design is expected to enable the development of a new generation of compact, high-performance electronic cameras. The example lens has a 60 degree field of view and a relative aperture (f-number) of 3.2. The main issues affecting the design are also discussed.

  11. Mars Observer camera

    NASA Technical Reports Server (NTRS)

    Malin, M. C.; Danielson, G. E.; Ingersoll, A. P.; Masursky, H.; Veverka, J.; Ravine, M. A.; Soulanille, T. A.

    1992-01-01

    The Mars Observer camera (MOC) is a three-component system (one narrow-angle and two wide-angle cameras) designed to take high spatial resolution pictures of the surface of Mars and to obtain lower spatial resolution, synoptic coverage of the planet's surface and atmosphere. The cameras are based on the 'push broom' technique; that is, they do not take 'frames' but rather build pictures, one line at a time, as the spacecraft moves around the planet in its orbit. MOC is primarily a telescope for taking extremely high resolution pictures of selected locations on Mars. Using the narrow-angle camera, areas ranging from 2.8 km x 2.8 km to 2.8 km x 25.2 km (depending on available internal digital buffer memory) can be photographed at about 1.4 m/pixel. Additionally, lower-resolution pictures (to a lowest resolution of about 11 m/pixel) can be acquired by pixel averaging; these images can be much longer, ranging up to 2.8 x 500 km at 11 m/pixel. High-resolution data will be used to study sediments and sedimentary processes, polar processes and deposits, volcanism, and other geologic/geomorphic processes.

  12. Lunar Reconnaissance Orbiter Camera (LROC) instrument overview

    USGS Publications Warehouse

    Robinson, M.S.; Brylow, S.M.; Tschimmel, M.; Humm, D.; Lawrence, S.J.; Thomas, P.C.; Denevi, B.W.; Bowman-Cisneros, E.; Zerr, J.; Ravine, M.A.; Caplinger, M.A.; Ghaemi, F.T.; Schaffner, J.A.; Malin, M.C.; Mahanti, P.; Bartels, A.; Anderson, J.; Tran, T.N.; Eliason, E.M.; McEwen, A.S.; Turtle, E.; Jolliff, B.L.; Hiesinger, H.

    2010-01-01

The Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) and Narrow Angle Cameras (NACs) are on the NASA Lunar Reconnaissance Orbiter (LRO). The WAC is a 7-color push-frame camera (100 and 400 m/pixel visible and UV, respectively), while the two NACs are monochrome narrow-angle linescan imagers (0.5 m/pixel). The primary mission of LRO is to obtain measurements of the Moon that will enable future lunar human exploration. The overarching goals of the LROC investigation include landing site identification and certification, mapping of permanently shadowed and permanently sunlit regions, meter-scale mapping of polar regions, global multispectral imaging, a global morphology base map, characterization of regolith properties, and determination of current impact hazards.

  13. On an assessment of surface roughness estimates from lunar laser altimetry pulse-widths for the Moon from LOLA using LROC narrow-angle stereo DTMs.

    NASA Astrophysics Data System (ADS)

    Muller, Jan-Peter; Poole, William

    2013-04-01

Neumann et al. [1] proposed that laser altimetry pulse-widths could be employed to derive "within-footprint" surface roughness, as opposed to surface roughness estimated between laser altimetry pierce-points, such as the example for Mars [2] and more recently from the 4-pointed star-shaped LOLA (Lunar Reconnaissance Orbiter Laser Altimeter) onboard the NASA-LRO [3]. Since 2009, LOLA has been collecting extensive global laser altimetry data with a 5 m footprint and ~25 m between the 5 points in a star shape. In order to assess how accurately surface roughness (defined as simple RMS after slope correction) derived from LROC matches surface roughness derived from LOLA footprints, publicly released LROC-NA (LRO Camera Narrow Angle) 1 m Digital Terrain Models (DTMs) were employed to measure the surface roughness directly within each 5 m footprint. A set of 20 LROC-NA DTMs were examined. Initially the match-up between the LOLA and LROC-NA orthorectified images (ORIs) is assessed visually to ensure that the co-registration is better than the LOLA footprint resolution. For each LOLA footprint, the pulse-width geolocation is then retrieved and used to "cookie-cut" the surface roughness and slopes derived from the LROC-NA DTMs. The investigation, which includes data from a variety of different landforms, shows little, if any, correlation between surface roughness estimated from DTMs and LOLA pulse-widths at sub-footprint scale. In fact, there is only perceptible correlation between LOLA and LROC-DTMs at baselines of 40-60 m for surface roughness and 20 m for slopes. [1] Neumann et al. Mars Orbiter Laser Altimeter pulse width measurements and footprint-scale roughness. Geophysical Research Letters (2003) vol. 30 (11), paper 1561. DOI: 10.1029/2003GL017048 [2] Kreslavsky and Head. Kilometer-scale roughness of Mars: results from MOLA data analysis. J Geophys Res (2000) vol. 105 (E11) pp. 26695-26711. [3] Rosenburg et al. Global surface slopes and roughness of the
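The roughness measure used here, simple RMS after slope correction within each footprint, amounts to a plane fit plus residual RMS. A sketch with illustrative array sizes (a 5 m footprint cut from a 1 m DTM becomes a 5x5 patch of heights):

```python
import numpy as np

# Sketch of "simple RMS after slope correction" for one footprint:
# fit and remove a best-fit plane, then take the RMS of the residual
# heights. Sizes and values are illustrative, not from the LROC data.
def footprint_roughness(z):
    """z: 2-D array of DTM heights (m) inside one footprint."""
    ny, nx = z.shape
    x, y = np.meshgrid(np.arange(nx), np.arange(ny))
    A = np.column_stack([x.ravel(), y.ravel(), np.ones(x.size)])
    coef, *_ = np.linalg.lstsq(A, z.ravel(), rcond=None)  # best-fit plane
    residual = z.ravel() - A @ coef                       # slope-corrected
    return float(np.sqrt(np.mean(residual ** 2)))         # RMS roughness (m)

rng = np.random.default_rng(0)
plane = 0.3 * np.arange(5)[None, :] + 0.1 * np.arange(5)[:, None]
r_plane = footprint_roughness(plane)                      # pure tilt: ~0
r_noisy = footprint_roughness(plane + 0.05 * rng.standard_normal((5, 5)))
print(r_plane, r_noisy)
```

A tilted but smooth patch gives essentially zero roughness because the plane fit absorbs the slope, while added height noise survives the detrending, which is exactly the property that makes this a slope-independent roughness measure.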

  14. Matching the Best Viewing Angle in Depth Cameras for Biomass Estimation Based on Poplar Seedling Geometry

    PubMed Central

    Andújar, Dionisio; Fernández-Quintanilla, César; Dorado, José

    2015-01-01

In energy crops for biomass production a proper plant structure is important to optimize wood yields. A precise crop characterization in early stages may contribute to the choice of proper cropping techniques. This study assesses the potential of the Microsoft Kinect for Windows v.1 sensor to determine the best viewing angle of the sensor to estimate the plant biomass based on poplar seedling geometry. Kinect Fusion algorithms were used to generate a 3D point cloud from the depth video stream. The sensor was mounted in different positions facing the tree in order to obtain depth (RGB-D) images from different angles. Individuals of two different ages, i.e., one month and one year old, were scanned. Four different viewing angles were compared: top view (0°), 45° downwards view, front view (90°) and ground upwards view (−45°). The ground-truth used to validate the sensor readings consisted of a destructive sampling in which the height, leaf area and biomass (dry weight basis) were measured in each individual plant. The depth image models agreed well with 45°, 90° and −45° measurements in one-year poplar trees. Good correlations (0.88 to 0.92) between dry biomass and the area measured with the Kinect were found. In addition, plant height was accurately estimated with an error of a few centimeters. The comparison between different viewing angles revealed that top views showed poorer results because the top leaves occluded the rest of the tree. However, the other views led to good results. Conversely, small poplars showed better correlations with actual parameters from the top view (0°). Therefore, although the Microsoft Kinect for Windows v.1 sensor provides good opportunities for biomass estimation, the viewing angle must be chosen taking into account the developmental stage of the crop and the desired parameters. The results of this study indicate that Kinect is a promising tool for a rapid canopy characterization, i.e., for estimating crop biomass

  15. Matching the best viewing angle in depth cameras for biomass estimation based on poplar seedling geometry.

    PubMed

    Andújar, Dionisio; Fernández-Quintanilla, César; Dorado, José

    2015-06-04

In energy crops for biomass production a proper plant structure is important to optimize wood yields. A precise crop characterization in early stages may contribute to the choice of proper cropping techniques. This study assesses the potential of the Microsoft Kinect for Windows v.1 sensor to determine the best viewing angle of the sensor to estimate the plant biomass based on poplar seedling geometry. Kinect Fusion algorithms were used to generate a 3D point cloud from the depth video stream. The sensor was mounted in different positions facing the tree in order to obtain depth (RGB-D) images from different angles. Individuals of two different ages, i.e., one month and one year old, were scanned. Four different viewing angles were compared: top view (0°), 45° downwards view, front view (90°) and ground upwards view (-45°). The ground-truth used to validate the sensor readings consisted of a destructive sampling in which the height, leaf area and biomass (dry weight basis) were measured in each individual plant. The depth image models agreed well with 45°, 90° and -45° measurements in one-year poplar trees. Good correlations (0.88 to 0.92) between dry biomass and the area measured with the Kinect were found. In addition, plant height was accurately estimated with an error of a few centimeters. The comparison between different viewing angles revealed that top views showed poorer results because the top leaves occluded the rest of the tree. However, the other views led to good results. Conversely, small poplars showed better correlations with actual parameters from the top view (0°). Therefore, although the Microsoft Kinect for Windows v.1 sensor provides good opportunities for biomass estimation, the viewing angle must be chosen taking into account the developmental stage of the crop and the desired parameters. The results of this study indicate that Kinect is a promising tool for a rapid canopy characterization, i.e., for estimating crop biomass

  16. Large-area proportional counter camera for the US National Small-Angle Neutron Scattering Facility

    SciTech Connect

    Abele, R.K.; Allin, G.W.; Clay, W.T.; Fowler, C.E.; Kopp, M.K.

    1980-01-01

An engineering model of a multiwire position-sensitive proportional counter (PSPC) was developed, tested, and installed at the US National Small-Angle Neutron Scattering Facility at ORNL. The PSPC is based on the RC-encoding and time-difference decoding method to measure the spatial coordinates of the interaction loci of individual scattered neutrons. The active area of the PSPC is 65 cm × 65 cm, and the active depth is 3.6 cm. The spatial uncertainty in both coordinates is approx. 1.0 cm (FWHM) for thermal neutrons; thus, a matrix of 64 × 64 picture elements is resolved. The count rate capability for randomly detected neutrons is 10⁴ counts per second, with < 3% coincidence loss. The PSPC gas composition is 63% ³He, 32% Xe, and 5% CO₂ at an absolute pressure of approx. 3 × 10⁵ Pa (3 atm). The detection efficiency is approx. 90% for the 0.475-nm (4.75-Å) neutrons used in the scattering experiments.

  17. Comparing Laser Peripheral Iridotomy to Cataract Extraction in Narrow Angle Eyes Using Anterior Segment Optical Coherence Tomography

    PubMed Central

    Melese, Ephrem; Peterson, Jeffrey R.; Feldman, Robert M.; Baker, Laura A.; Bell, Nicholas P.; Chuang, Alice Z.

    2016-01-01

    Purpose To evaluate the changes in anterior chamber angle (ACA) parameters in primary angle closure (PAC) spectrum eyes before and after cataract extraction (CE) and compare to the changes after laser peripheral iridotomy (LPI) using anterior segment optical coherence tomography (ASOCT). Methods Twenty-eight PAC spectrum eyes of 18 participants who underwent CE and 34 PAC spectrum eyes of 21 participants who underwent LPI were included. ASOCT images with 3-dimensional mode angle analysis scans were taken with the CASIA SS-1000 (Tomey Corp., Nagoya, Japan) before and after CE or LPI. Mixed-effect model analysis was used to 1) compare best-corrected visual acuity, intraocular pressure, and ACA parameters before and after CE; 2) identify and estimate the effects of potential contributing factors affecting changes in ACA parameters; and 3) compare CE and LPI treatment groups. Results The increase in average angle parameters (TISA750 and TICV750) was significantly greater after CE than LPI. TICV750 increased by 102% (2.114 [±1.203] μL) after LPI and by 174% (4.546 [± 1.582] μL) after CE (P < 0.001). Change of TICV750 in the CE group was significantly affected by age (P = 0.002), race (P = 0.006), and intraocular lens power (P = 0.037). Conclusions CE results in greater anatomic changes in the ACA than LPI in PAC spectrum eyes. ASOCT may be used to follow anatomic changes in the angle after intervention. PMID:27606482

  18. Post-trial anatomical frame alignment procedure for comparison of 3D joint angle measurement from magnetic/inertial measurement units and camera-based systems.

    PubMed

    Li, Qingguo; Zhang, Jun-Tian

    2014-11-01

Magnetic and inertial measurement units (MIMUs) have been widely used as an alternative to traditional camera-based motion capture systems for 3D joint kinematics measurement. Since these sensors do not directly measure position, a pre-trial anatomical calibration, either with the assistance of a special protocol/apparatus or with another motion capture system, is required to establish the transformation matrices between the local sensor frame and the anatomical frame (AF) of each body segment on which the sensors are attached. Because the axes of AFs are often used as the rotational axes in the joint angle calculation, any difference in the AF determination will cause discrepancies in the calculated joint angles. Therefore, a direct comparison of joint angles between MIMU systems and camera-based systems is less meaningful because the calculated joint angles contain a systematic error due to the differences in the AF determination. To solve this problem a new post-trial AF alignment procedure is proposed. By correcting the AF misalignments, the joint angle differences caused by the difference in AF determination are eliminated and the remaining discrepancies are mainly from the measurement accuracy of the systems themselves. Lower limb joint angles from 30 walking trials were used to validate the effectiveness of the proposed AF alignment procedure. This technique could serve as a new means of calibrating magnetic/inertial sensor-based motion capture systems and correcting for AF misalignment in scenarios where joint angles are compared directly.
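The core of such a post-trial alignment can be sketched in a few lines. This is assumed math in the spirit of the abstract, not the authors' published procedure: if both systems track the same segment, the relative rotation between their anatomical frames is ideally one constant misalignment, which can be estimated by an orthogonal Procrustes average over frames and then removed.

```python
import numpy as np

# Assumed model: R_mimu[n] ~ R_cam[n] @ R_off for one constant
# misalignment rotation R_off. The least-squares estimate projects the
# summed cross term onto the nearest rotation matrix via SVD.
def estimate_af_misalignment(R_mimu, R_cam):
    """R_mimu, R_cam: (N, 3, 3) rotation matrices of one segment's AF."""
    M = np.einsum('nij,nik->jk', R_cam, R_mimu)   # sum of R_cam^T @ R_mimu
    U, _, Vt = np.linalg.svd(M)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])
    return U @ D @ Vt                             # nearest rotation matrix

def _random_rotation(rng):
    Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
    return Q if np.linalg.det(Q) > 0 else -Q

# synthetic check: recover a known constant misalignment
rng = np.random.default_rng(1)
R_off = _random_rotation(rng)
R_cam = np.stack([_random_rotation(rng) for _ in range(30)])
R_mimu = R_cam @ R_off
R_est = estimate_af_misalignment(R_mimu, R_cam)
# corrected MIMU frames R_mimu[n] @ R_est.T then match R_cam[n]
```

After the correction, any remaining joint-angle disagreement reflects the measurement accuracy of the systems themselves rather than the AF definitions, which is the point of the abstract's procedure.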

  19. 2. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH CAMERA STATION ABOVE LOOKING WEST TAKEN FROM RESERVOIR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  20. 7. VAL CAMERA CAR, DETAIL OF 'FLARE' OR TRAJECTORY CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. VAL CAMERA CAR, DETAIL OF 'FLARE' OR TRAJECTORY CAMERA INSIDE CAMERA CAR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  1. 6. VAL CAMERA CAR, DETAIL OF COMMUNICATION EQUIPMENT INSIDE CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    6. VAL CAMERA CAR, DETAIL OF COMMUNICATION EQUIPMENT INSIDE CAMERA CAR WITH CAMERA MOUNT IN FOREGROUND. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  2. Inner jet kinematics and the viewing angle towards the γ-ray narrow-line Seyfert 1 galaxy 1H 0323+342

    NASA Astrophysics Data System (ADS)

    Fuhrmann, Lars; Karamanavis, Vassilis; Komossa, Stefanie; Angelakis, Emmanouil; Krichbaum, Thomas P.; Schulz, Robert; Kreikenbohm, Annika; Kadler, Matthias; Myserlis, Ioannis; Ros, Eduardo; Nestoras, Ioannis; Zensus, J. Anton

    2016-11-01

    Near-Eddington accretion rates onto low-mass black holes are thought to be a prime driver of the multi-wavelength properties of the narrow-line Seyfert 1 (NLS1) population of active galactic nuclei (AGNs). Orientation effects have repeatedly been considered as another important factor involved, but detailed studies have been hampered by the lack of measured viewing angles towards this type of AGN. Here we present multi-epoch, 15 GHz VLBA images (MOJAVE program) of the radio-loud and Fermi/LAT-detected NLS1 galaxy 1H 0323+342. These are combined with single-dish, multi-frequency radio monitoring of the source's variability, obtained with the Effelsberg 100-m and IRAM 30-m telescopes, in the course of the F-GAMMA program. The VLBA images reveal six components with apparent speeds of ˜ 1-7 c, and one quasi-stationary feature. Combining the obtained apparent jet speed (β app) and variability Doppler factor (D var) estimates together with other methods, we constrain the viewing angle θ towards 1H 0323+342 to θ ≤ 4°-13°. Using literature values of βapp and D var, we also deduce a viewing angle of ≤ 8°-9° towards another radio- and γ-ray-loud NLS1, namely SBS 0846+513.
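Going from apparent speed and Doppler factor to a viewing angle uses the standard beaming relations Gamma = (beta_app^2 + D^2 + 1)/(2D) and tan(theta) = 2*beta_app/(beta_app^2 + D^2 - 1). A quick numerical sketch; the Doppler factors are illustrative, since the measured D_var is not quoted in this abstract:

```python
import math

# Standard relativistic beaming relations (textbook results, not taken
# from this paper): apparent speed beta_app (in units of c) and Doppler
# factor D jointly fix the bulk Lorentz factor and the viewing angle.
def jet_geometry(beta_app, D):
    gamma = (beta_app**2 + D**2 + 1.0) / (2.0 * D)
    theta = math.degrees(math.atan2(2.0 * beta_app,
                                    beta_app**2 + D**2 - 1.0))
    return gamma, theta

# beta_app ~ 7 is the fastest apparent component speed reported above;
# the D values below are hypothetical.
for D in (3.0, 5.0, 10.0):
    gamma, theta = jet_geometry(7.0, D)
    print(f"D={D:4.1f}  Gamma={gamma:5.2f}  theta={theta:5.2f} deg")
```

Larger Doppler factors push the jet closer to the line of sight; for beta_app ~ 7 these illustrative D values give theta of roughly 5-14 degrees, the same order as the paper's 4-13 degree constraint.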

  3. Combined ab interno trabeculotomy and lens extraction: a novel management option for combined uveitic and chronic narrow angle raised intraocular pressure.

    PubMed

    Lin, Siying; Gupta, Bhaskar; Rossiter, Jonathan

    2016-02-01

    Minimally invasive glaucoma surgery is a developing area that has the potential to replace traditional glaucoma surgery, with its known risk profile, but at present there are no randomised controlled data to validate its use. We report on a case where sequential bilateral combined ab interno trabeculotomy and lens extraction surgery was performed on a 45-year-old woman with combined uveitic and chronic narrow angle raised intraocular pressure. Maximal medical management alone could not control the intraocular pressure. At 12-month follow-up, the patient had achieved stable intraocular pressure in both eyes on a combination of topical ocular antiglaucomatous and steroid therapies. This case demonstrates the effectiveness of trabecular meshwork ablation via ab interno trabeculotomy in a case of complex mixed mechanism glaucoma.

  4. The rate and causes of lunar space weathering: Insights from Lunar Reconnaissance Orbiter Wide Angle Camera ultraviolet observations

    NASA Astrophysics Data System (ADS)

    Denevi, B. W.; Robinson, M. S.; Sato, H.; Hapke, B. W.; McEwen, A. S.; Hawke, B. R.

    2011-12-01

    Lunar Reconnaissance Orbiter Wide Angle Camera global ultraviolet and visible imaging provides a unique opportunity to examine the rate and causes of space weathering on the Moon. Silicates typically have a strong decrease in reflectance toward UV wavelengths (<~450 nm) due to strong bands at 250 nm and in the far UV. Metallic iron is relatively spectrally neutral, and laboratory spectra suggest that its addition to mature soils in the form of submicroscopic iron (also known as nanophase iron) flattens silicate spectra, significantly reducing spectral slope in the ultraviolet. Reflectance at ultraviolet wavelengths may be especially sensitive to the surface coatings that form due to exposure to space weathering because scattering from the surfaces of grains contributes a larger fraction to the reflectance spectrum at short wavelengths. We find that the UV slope (as measured by the 320/415 nm ratio) is a more sensitive measure of maturity than indexes based on visible and near-infrared wavelengths. Only the youngest features (less than ~100 Ma) retain a UV slope that is distinct from mature soils of the same composition. No craters >20 km have UV slopes that approach those observed in laboratory spectra of fresh lunar materials (powdered lunar rocks). While the 320/415 nm ratio increases by ~18% from powdered rocks to mature soils in laboratory samples, Giordano Bruno, the freshest large crater, only shows a 3% difference between fresh and mature materials. At the resolution of our UV data (400 m/pixel), we observe some small (<5 km) craters that show a ~14% difference in 320/415 nm ratio from their mature surroundings. UV observations show that Reiner Gamma has had significantly lower levels of space weathering than any of the Copernican craters we examined, and was the only region we found with a UV slope that approached laboratory values for fresh powdered rock samples. 
This is consistent with the hypothesis that its high albedo is due to magnetic shielding from
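The maturity metric above reduces to simple ratio arithmetic: the fractional change of the 320/415 nm reflectance ratio between fresh and mature material. A sketch with made-up reflectance values chosen only to reproduce the quoted laboratory-scale change:

```python
# Fractional change in the 320/415 nm reflectance ratio, fresh -> mature.
def uv_ratio_change_pct(fresh_320, fresh_415, mature_320, mature_415):
    r_fresh = fresh_320 / fresh_415
    r_mature = mature_320 / mature_415
    return 100.0 * (r_mature - r_fresh) / r_fresh

# Illustrative (made-up) reflectances: weathering adds submicroscopic
# iron that flattens the UV dropoff, raising reflectance at 320 nm
# relative to 415 nm by ~18%, the laboratory powdered-rock case above.
print(uv_ratio_change_pct(0.050, 0.100, 0.059, 0.100))
```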

  5. Use of Multiple-Angle Snow Camera (MASC) Observations as a Constraint on Radar-Based Retrievals of Snowfall Rate

    NASA Astrophysics Data System (ADS)

    Cooper, S.; Garrett, T. J.; Wood, N.; L'Ecuyer, T. S.

    2015-12-01

    We use a combination of Ka-band Zenith Radar (KaZR) and Multiple-Angle Snow Camera (MASC) observations at the ARM North Slope Alaska Climate Facility Site at Barrow to quantify snowfall. The optimal-estimation framework is used to combine information from the KaZR and MASC into a common retrieval scheme, where retrieved estimates of snowfall are compared to observations at a nearby NWS measurement site for evaluation. Modified from the operational CloudSat algorithm, the retrieval scheme returns estimates of the vertical profile of exponential PSD slope parameter with a constant number density. These values, in turn, can be used to calculate surface snowrate (liquid equivalent) given knowledge of snowflake microphysical properties and fallspeeds. We exploit scattering models for a variety of ice crystal shapes including aggregates developed specifically from observations of snowfall properties at high-latitudes, as well as more pristine crystal shapes involving sector plates, bullet rosettes, and hexagonal columns. As expected, initial retrievals suggest large differences (300% for some events) in estimated snowfall accumulations given the use of the different ice crystal assumptions. The complex problem of how we can more quantitatively link MASC snowflake images to specific radar scattering properties is an ongoing line of research. Here, however, we do quantify the use of MASC observations of fallspeed and PSD parameters as constraint on our optimal-estimation retrieval approach. In terms of fallspeed, we find differences in estimated snowfall of nearly 50% arising from the use of MASC observed fallspeeds relative to those derived from traditional fallspeed parameterizations. In terms of snowflake PSD, we find differences of nearly 25% arising from the use of MASC observed slope parameters relative to those derived from field campaign observations of high-altitude snow events. 
Of course, these different sources of error conspire to make the estimate of snowfall
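The optimal-estimation machinery referred to above iterates a Gauss-Newton update that blends a prior state with the measurements. A generic sketch of that update, with a toy linear forward model standing in for the actual radar scattering model (all matrices illustrative):

```python
import numpy as np

# Generic optimal-estimation (Rodgers-style Gauss-Newton) retrieval.
# The forward model below is a toy linear stand-in, NOT the radar
# scattering model used in the study.
def oe_retrieve(y, forward, jacobian, x_a, S_a, S_e, n_iter=10):
    x = np.array(x_a, dtype=float)
    Sa_inv = np.linalg.inv(S_a)
    Se_inv = np.linalg.inv(S_e)
    for _ in range(n_iter):
        K = jacobian(x)
        S_hat = np.linalg.inv(Sa_inv + K.T @ Se_inv @ K)  # posterior cov
        x = x_a + S_hat @ K.T @ Se_inv @ (y - forward(x) + K @ (x - x_a))
    return x, S_hat

# toy example: 2-element state, 3 noise-free linear "measurements"
A = np.array([[1.0, 0.5], [0.2, 1.0], [0.7, 0.7]])
truth = np.array([2.0, -1.0])
y = A @ truth
x_hat, S_hat = oe_retrieve(y, lambda x: A @ x, lambda x: A,
                           x_a=np.zeros(2),
                           S_a=np.eye(2) * 10.0,   # loose prior
                           S_e=np.eye(3) * 1e-3)   # tight measurement noise
```

Adding MASC-observed fallspeeds or PSD slopes would enter this framework either through the prior (x_a, S_a) or as extra rows in y and K, which is how the observations act as a constraint on the radar retrieval.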

  6. Reliability of sagittal plane hip, knee, and ankle joint angles from a single frame of video data using the GAITRite camera system.

    PubMed

    Ross, Sandy A; Rice, Clinton; Von Behren, Kristyn; Meyer, April; Alexander, Rachel; Murfin, Scott

    2015-01-01

    The purpose of this study was to establish intra-rater, intra-session, and inter-rater, reliability of sagittal plane hip, knee, and ankle angles with and without reflective markers using the GAITRite walkway and single video camera between student physical therapists and an experienced physical therapist. This study included thirty-two healthy participants age 20-59, stratified by age and gender. Participants performed three successful walks with and without markers applied to anatomical landmarks. GAITRite software was used to digitize sagittal hip, knee, and ankle angles at two phases of gait: (1) initial contact; and (2) mid-stance. Intra-rater reliability was more consistent for the experienced physical therapist, regardless of joint or phase of gait. Intra-session reliability was variable, the experienced physical therapist showed moderate to high reliability (intra-class correlation coefficient (ICC) = 0.50-0.89) and the student physical therapist showed very poor to high reliability (ICC = 0.07-0.85). Inter-rater reliability was highest during mid-stance at the knee with markers (ICC = 0.86) and lowest during mid-stance at the hip without markers (ICC = 0.25). Reliability of a single camera system, especially at the knee joint shows promise. Depending on the specific type of reliability, error can be attributed to the testers (e.g. lack of digitization practice and marker placement), participants (e.g. loose fitting clothing) and camera systems (e.g. frame rate and resolution). However, until the camera technology can be upgraded to a higher frame rate and resolution, and the software can be linked to the GAITRite walkway, the clinical utility for pre/post measures is limited.

  7. Why do I sometimes see bright speckles in an image of the Terrain product, particularly at the oblique camera angles?

    Atmospheric Science Data Center

    2014-12-08

    MISR Level 1B2 data products use various high data values to signify fill, and one of the fill values (16377) in the 14 MSB's of the scaled radiances signifies that this location on the SOM grid was obscured from the camera's view by...

  8. Quasi-null lens optical system for the fabrication of an oblate convex ellipsoidal mirror: application to the Wide Angle Camera of the Rosetta space mission.

    PubMed

    Pelizzo, Maria-Guglielmina; Da Deppo, Vania; Naletto, Giampiero; Ragazzoni, Roberto; Novi, Andrea

    2006-08-20

The design of a quasi-null lens system for the fabrication of an aspheric oblate convex ellipsoidal mirror is presented. The performance and tolerances of the system have been analyzed. The system has been applied successfully for the fabrication of the primary mirror of the Wide Angle Camera (WAC), the imaging system onboard Rosetta, the European Space Agency cornerstone mission dedicated to the exploration of a comet. The WAC is based on an off-axis two-mirror configuration, in which the primary mirror is an oblate convex ellipsoid with a significant conic constant.

  9. 7. VAL CAMERA STATION, INTERIOR VIEW OF CAMERA MOUNT, COMMUNICATION ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. VAL CAMERA STATION, INTERIOR VIEW OF CAMERA MOUNT, COMMUNICATION EQUIPMENT AND STORAGE CABINET. - Variable Angle Launcher Complex, Camera Stations, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  10. 3. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH THE VAL TO THE RIGHT, LOOKING NORTHEAST. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

1. VARIABLE-ANGLE LAUNCHER CAMERA CAR, VIEW OF CAMERA CAR AND ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    1. VARIABLE-ANGLE LAUNCHER CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH CAMERA STATION ABOVE LOOKING NORTH TAKEN FROM RESERVOIR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  12. A point-focusing small angle x-ray scattering camera using a doubly curved monochromator of a W/Si multilayer

    NASA Astrophysics Data System (ADS)

    Sasanuma, Yuji; Law, Robert V.; Kobayashi, Yuji

    1996-03-01

A point-focusing small angle x-ray scattering (SAXS) camera using a doubly curved monochromator of a W/Si multilayer has been designed, constructed, and tested. The two radii of curvature of the monochromator are 20 400 and 7.6 mm. The reflectivity of its first-order Bragg reflection for CuKα radiation was calculated to be 0.82, comparable to that (0.81) of its total reflection. With only a 10 s x-ray exposure, scattering from a high-density polyethylene film was detected on an imaging plate (IP). A rotating-anode x-ray generator operated at 40 kV and 30 mA was used. Diffraction from rat-tail collagen showed that the optical arrangement resolves Bragg spacings up to at least 30 nm for CuKα radiation. Combined with IPs, the camera may permit time-resolved SAXS measurements of phase behavior in liquid crystals, lipids, polymer alloys, etc., on conventional x-ray generators available in laboratories.
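The 30 nm figure is a direct consequence of Bragg's law, d = n*lambda/(2 sin(theta)); a quick check of the scattering angle that spacing implies for CuKα radiation:

```python
import math

CU_KALPHA_NM = 0.15418  # CuKα wavelength in nm

def bragg_spacing(two_theta_deg, wavelength_nm=CU_KALPHA_NM, order=1):
    """d = n * lambda / (2 sin(theta)), with scattering angle 2*theta."""
    theta = math.radians(two_theta_deg) / 2.0
    return order * wavelength_nm / (2.0 * math.sin(theta))

# scattering angle corresponding to the quoted 30 nm collagen spacing
d_nm = 30.0
theta = math.degrees(math.asin(CU_KALPHA_NM / (2.0 * d_nm)))
print(f"2-theta = {2.0 * theta:.3f} deg for d = {d_nm} nm")
```

Reaching 30 nm spacings requires resolving scattering angles of only a few tenths of a degree, which is why the point-focusing monochromator geometry matters for this camera.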

  13. The limits of narrow and wide-angle AVA inversions for high Vp/Vs ratios: An application to elastic seabed characterization

    NASA Astrophysics Data System (ADS)

    Aleardi, Mattia; Tognarelli, Andrea

    2016-08-01

    Since its introduction in the oil and gas industry, amplitude versus angle (AVA) inversion has become a standard tool in deep hydrocarbon exploration. However, with the intensification of offshore construction activity, applications of this method have been extended to evaluate the elastic properties of seabed sediments and of the shallowest part of the subsurface. These regions are often characterized by undercompacted sediments with very low S-wave velocities (Vs) and high P-wave velocity to S-wave velocity (Vp/Vs) ratios. Yet the importance of the Vp/Vs ratio is usually underrated in AVA inversion. In this study, we analyse the limits of the AVA method in cases of high Vp/Vs ratios and the benefits introduced by wide-angle reflections in constraining the inversion results. A simplified seabed model characterized by a high Vp/Vs ratio is used to study the influence of the elastic and viscoelastic parameters on the P-wave reflection coefficients and to compute the error function of the AVA inversion. In addition, a synthetic AVA inversion is performed on this simplified model, which enables us to apply sensitivity-analysis tools to the inversion kernel. These theoretical analyses show that in the case of high Vp/Vs ratios, the Vs contrast at the reflecting interface plays a very minor role in determining the P-wave reflection coefficients and that the viscoelastic effects can be neglected when only pre-critical angles are considered in the inversion. In addition, wide-angle reflections are essential to reducing both the cross-talk between the inverted elastic parameters and the uncertainties in the Vp and density estimations, but they are not sufficient to better constrain the Vs estimation. As an application to field data, we derive the elastic properties of the seabed interface by applying AVA inversion to a 2D seismic dataset from a well-site survey acquisition.
The limited water depth, the maximum available source-to-receiver offset, and the
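The claim that the Vs contrast plays a minor role at high Vp/Vs ratios can be illustrated with the standard three-term Aki-Richards approximation to the P-wave reflection coefficient. The layer values below are hypothetical soft-sediment numbers chosen for illustration, not the authors' model.

```python
import math

def aki_richards_rpp(theta_deg, m1, m2):
    """Three-term Aki-Richards approximation to the P-wave reflection coefficient.
    m1 and m2 are (vp, vs, rho) for the upper and lower layer."""
    vp1, vs1, rho1 = m1
    vp2, vs2, rho2 = m2
    th = math.radians(theta_deg)
    vp, vs, rho = (vp1 + vp2) / 2, (vs1 + vs2) / 2, (rho1 + rho2) / 2
    dvp, dvs, drho = vp2 - vp1, vs2 - vs1, rho2 - rho1
    k = (vs / vp) ** 2  # (Vs/Vp)^2 -- tiny when the Vp/Vs ratio is high
    return (0.5 * (1.0 - 4.0 * k * math.sin(th) ** 2) * drho / rho
            + dvp / (2.0 * vp * math.cos(th) ** 2)
            - 4.0 * k * math.sin(th) ** 2 * dvs / vs)

# Hypothetical model with a high Vp/Vs ratio below the interface.
upper = (1600.0, 180.0, 1850.0)
lower = (1700.0, 260.0, 1900.0)
r_with = aki_richards_rpp(30.0, upper, lower)
r_without = aki_richards_rpp(30.0, upper, (1700.0, 180.0, 1900.0))  # zero the Vs contrast
print(f"R(30 deg) = {r_with:.4f}; ignoring the Vs contrast: {r_without:.4f}")
```

Because the ΔVs term is scaled by (Vs/Vp)² sin²θ, removing the Vs contrast barely changes the pre-critical reflection coefficient, which is exactly the insensitivity the abstract describes.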

  14. Miniaturized fundus camera

    NASA Astrophysics Data System (ADS)

    Gliss, Christine; Parel, Jean-Marie A.; Flynn, John T.; Pratisto, Hans S.; Niederer, Peter F.

    2003-07-01

    We present a miniaturized version of a fundus camera. The camera is designed for use in screening for retinopathy of prematurity (ROP). There, but also in other applications, a small, lightweight, digital camera system can be extremely useful. We present a small wide-angle digital camera system. The handpiece is significantly smaller and lighter than in all other systems. The electronics are truly portable, fitting in a standard boardcase. The camera is designed to be offered at a competitive price. Data from tests on young rabbits' eyes are presented. The development of the camera system is part of a telemedicine project on screening for ROP. Telemedicine is a perfect application for this camera system, exploiting both of its advantages: portability and digital imaging.

  15. Pre-flight and On-orbit Geometric Calibration of the Lunar Reconnaissance Orbiter Camera

    NASA Astrophysics Data System (ADS)

    Speyerer, E. J.; Wagner, R. V.; Robinson, M. S.; Licht, A.; Thomas, P. C.; Becker, K.; Anderson, J.; Brylow, S. M.; Humm, D. C.; Tschimmel, M.

    2016-04-01

    The Lunar Reconnaissance Orbiter Camera (LROC) consists of two imaging systems that provide multispectral and high resolution imaging of the lunar surface. The Wide Angle Camera (WAC) is a seven-color push-frame imager with a 90° field of view in monochrome mode and a 60° field of view in color mode. From the nominal 50 km polar orbit, the WAC acquires images with a nadir ground sampling distance of 75 m for each of the five visible bands and 384 m for the two ultraviolet bands. The Narrow Angle Camera (NAC) consists of two identical cameras capable of acquiring images with a ground sampling distance of 0.5 m from an altitude of 50 km. The LROC team geometrically calibrated each camera before launch at Malin Space Science Systems in San Diego, California, and the resulting measurements enabled the generation of a detailed camera model for all three cameras. The cameras were mounted and subsequently launched on the Lunar Reconnaissance Orbiter (LRO) on 18 June 2009. Using a subset of the over 793,000 NAC and 207,000 WAC images of illuminated terrain collected between 30 June 2009 and 15 December 2013, we improved the interior and exterior orientation parameters for each camera, including the addition of a wavelength-dependent radial distortion model for the multispectral WAC. These geometric refinements, along with refined ephemeris, enable seamless projections of NAC image pairs with a geodetic accuracy better than 20 meters and sub-pixel precision and accuracy when orthorectifying WAC images.
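A wavelength-dependent radial distortion model of the kind mentioned above can be sketched as follows. The one-term polynomial form and the per-band coefficients here are illustrative assumptions; the actual LROC coefficients are given in the instrument calibration literature, not reproduced here.

```python
# Hypothetical per-band radial distortion coefficients (mm^-2). A wavelength-dependent
# model simply keeps a separate coefficient for each filter band.
K1 = {"415nm": 1.8e-5, "566nm": 1.9e-5, "689nm": 2.1e-5}

def distort(r_focal_plane_mm: float, band: str) -> float:
    """Apply a one-term radial distortion model r_d = r * (1 + k1 * r**2),
    mapping an ideal focal-plane radius to its distorted position."""
    k1 = K1[band]
    return r_focal_plane_mm * (1.0 + k1 * r_focal_plane_mm ** 2)

for band in K1:
    print(band, f"{distort(10.0, band):.5f} mm")
```

Inverting such a model per band (numerically, since the cubic has no convenient closed form) is what lets multispectral frames from different filters be co-registered before projection.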

  16. Wide Angle Movie

    NASA Technical Reports Server (NTRS)

    1999-01-01

    This brief movie illustrates the passage of the Moon through the Saturn-bound Cassini spacecraft's wide-angle camera field of view as the spacecraft passed by the Moon on the way to its closest approach with Earth on August 17, 1999. From beginning to end of the sequence, 25 wide-angle images, with a spatial image scale of about 14 miles (about 23 kilometers) per pixel, were taken over the course of 7 and 1/2 minutes through a series of narrow and broadband spectral filters and polarizers, ranging from the violet to the near-infrared regions of the spectrum, to calibrate the spectral response of the wide-angle camera. The exposure times range from 5 milliseconds to 1.5 seconds. Two of the exposures were smeared and have been discarded and replaced with nearby images to make a smooth movie sequence. All images were scaled so that the brightness of Crisium basin, the dark circular region in the upper right, is approximately the same in every image. The imaging data were processed and released by the Cassini Imaging Central Laboratory for Operations (CICLOPS) at the University of Arizona's Lunar and Planetary Laboratory, Tucson, AZ.

    Photo Credit: NASA/JPL/Cassini Imaging Team/University of Arizona

    Cassini, launched in 1997, is a joint mission of NASA, the European Space Agency and Italian Space Agency. The mission is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Space Science, Washington DC. JPL is a division of the California Institute of Technology, Pasadena, CA.

  17. HIGH SPEED CAMERA

    DOEpatents

    Rogers, B.T. Jr.; Davis, W.C.

    1957-12-17

    This patent relates to high speed cameras having resolution times of less than one-tenth of a microsecond, suitable for filming distinct sequences of a very fast event such as an explosion. This camera consists of a rotating mirror with reflecting surfaces on both sides, a narrow mirror acting as a slit in a focal plane shutter, various other mirror and lens systems, as well as an image recording surface. The combination of the rotating mirror and the slit mirror causes discrete, narrow, separate pictures to fall upon the film plane, thereby forming a moving image increment of the photographed event. Placing a reflecting surface on each side of the rotating mirror cancels the image velocity that one side of the rotating mirror would impart, so that a camera having this short a resolution time is thereby possible.
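The timing argument can be sketched numerically: a beam reflected off a rotating mirror sweeps at twice the mirror's angular rate, so the dwell time on one frame is the frame's angular width divided by that sweep rate. All figures below are hypothetical, chosen only to land in the sub-tenth-microsecond regime the patent describes.

```python
import math

# Hypothetical figures: a double-sided mirror spinning at 3000 rev/s.
rev_per_s = 3000.0
beam_rate = 2.0 * (2.0 * math.pi * rev_per_s)   # reflected beam turns at twice the mirror rate (rad/s)
frame_angle = 0.002                              # rad subtended by one picture at the film plane
dwell = frame_angle / beam_rate
print(f"time per frame: {dwell * 1e6:.3f} microseconds")
```

With these numbers the dwell time is about 0.05 microseconds, comfortably under the one-tenth-microsecond resolution the patent claims.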

  18. The DSLR Camera

    NASA Astrophysics Data System (ADS)

    Berkó, Ernő; Argyle, R. W.

    Cameras have developed significantly in the past decade; in particular, digital Single-Lens Reflex Cameras (DSLRs) have appeared. As a consequence we can buy cameras with higher and higher pixel counts, and mass production has resulted in a great reduction in prices. CMOS sensors used for imaging are increasingly sensitive, and the electronics in the cameras allows images to be taken with much less noise. The software background is developing in a similar way: intelligent programs are created for post-processing and other supplementary tasks. Nowadays we can find a digital camera in almost every household, and most of these cameras are DSLRs. These can be used very well for astronomical imaging, which is nicely demonstrated by the amount and quality of the spectacular astrophotos appearing in different publications. These examples also show how much post-processing software contributes to the rise in the standard of the pictures. To sum up, the DSLR camera serves as a cheap alternative to the CCD camera, with somewhat weaker technical characteristics. In the following, I will introduce how we can measure the main parameters (position angle and separation) of double stars, based on the methods, software and equipment I use. Others can easily apply these to their own circumstances.
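A minimal sketch of the measurement described above: deriving position angle and separation from two stellar centroids. The pixel scale and the north-up, east-toward-+x orientation are assumptions that must be calibrated for a real camera (for example with a drift or a known calibration pair).

```python
import math

def pa_and_sep(dx_px: float, dy_px: float, scale_arcsec_per_px: float):
    """Position angle (degrees, north through east) and separation (arcsec) of a
    companion relative to the primary. Assumes the frame is derotated so that
    +y points north and +x points east -- a calibration step in practice."""
    sep = scale_arcsec_per_px * math.hypot(dx_px, dy_px)
    pa = math.degrees(math.atan2(dx_px, dy_px)) % 360.0  # 0 deg = north (+y), 90 deg = east (+x)
    return pa, sep

# Companion 10 px east and 10 px north of the primary, at 0.5 arcsec/px.
pa, sep = pa_and_sep(10.0, 10.0, 0.5)
print(f"PA = {pa:.1f} deg, sep = {sep:.2f} arcsec")
```

In practice one would average over many frames and centroid sub-pixel positions, but the geometry reduces to exactly this calculation.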

  19. 8. VAL CAMERA CAR, CLOSE-UP VIEW OF 'FLARE' OR TRAJECTORY ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    8. VAL CAMERA CAR, CLOSE-UP VIEW OF 'FLARE' OR TRAJECTORY CAMERA ON SLIDING MOUNT. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  20. Camera Optics.

    ERIC Educational Resources Information Center

    Ruiz, Michael J.

    1982-01-01

    The camera presents an excellent way to illustrate principles of geometrical optics. Basic camera optics of the single-lens reflex camera are discussed, including interchangeable lenses and accessories available to most owners. Several experiments are described and results compared with theoretical predictions or manufacturer specifications.

  1. 9. WEST SIDE OF EAST TOWER CAMERA POINTED STRAIGHT DOWN ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    9. WEST SIDE OF EAST TOWER CAMERA POINTED STRAIGHT DOWN FOR THE BENEFIT OF ACROPHOBICS; NOTE EXTREMELY TURBULENT WATER FROM CURRENT - Tacoma Narrows Bridge, Spanning Narrows at State Route 16, Tacoma, Pierce County, WA

  2. The Uses of a Polarimetric Camera

    DTIC Science & Technology

    2008-09-01

    Figures from the report include: Figure 18, image of angle of polarization (from Bossa Nova Tech, 2007); Figure 19, the Salsa camera (from Bossa Nova Tech); Figure 21, diagram of the inner workings of the SALSA camera (from Bossa Nova Tech, 2007); and Figure 22, the Salsa camera with computer setup looking south toward California Pacific Highway 1.

  3. The PAU Camera

    NASA Astrophysics Data System (ADS)

    Casas, R.; Ballester, O.; Cardiel-Sas, L.; Carretero, J.; Castander, F. J.; Castilla, J.; Crocce, M.; de Vicente, J.; Delfino, M.; Fernández, E.; Fosalba, P.; García-Bellido, J.; Gaztañaga, E.; Grañena, F.; Jiménez, J.; Madrid, F.; Maiorino, M.; Martí, P.; Miquel, R.; Neissner, C.; Ponce, R.; Sánchez, E.; Serrano, S.; Sevilla, I.; Tonello, N.; Troyano, I.

    2011-11-01

    The PAU Camera (PAUCam) is a wide-field camera designed to be mounted at the William Herschel Telescope (WHT) prime focus, located at the Observatorio del Roque de los Muchachos on the island of La Palma (Canary Islands). Its primary function is to carry out a cosmological survey, the PAU Survey, covering an area of several hundred square degrees of sky. Its purpose is to determine positions and distances using photometric redshift techniques. To achieve accurate photo-z's, PAUCam will be equipped with 40 narrow-band filters covering the range from 450 to 850 nm, and six broad-band filters, those of the SDSS system plus the Y band. To fully cover the focal plane delivered by the telescope optics, 18 CCDs of 2k x 4k pixels are needed. The pixels are square, of 15 μm size. The optical characteristics of the prime focus corrector deliver a field of view where eight of these CCDs will have an illumination of more than 95%, covering a field of 40 arcminutes. The rest of the CCDs will occupy the vignetted region, extending the field diameter to one degree. Two of the CCDs will be devoted to auto-guiding. This camera has some innovative features. Firstly, both the broad-band and the narrow-band filters will be placed in mobile trays, hosting 16 such filters at most. These are located inside the cryostat, a few millimeters in front of the CCDs, when observing. Secondly, a pressurized liquid nitrogen tank outside the camera will feed a boiler inside the cryostat with a controlled mass flow. The read-out electronics will use the Monsoon architecture, originally developed by NOAO, modified and manufactured by our team in the frame of the DECam project (the camera used in the DES Survey). PAUCam will also be available to the astronomical community of the WHT.

  4. Automatic inference of geometric camera parameters and inter-camera topology in uncalibrated disjoint surveillance cameras

    NASA Astrophysics Data System (ADS)

    den Hollander, Richard J. M.; Bouma, Henri; Baan, Jan; Eendebak, Pieter T.; van Rest, Jeroen H. C.

    2015-10-01

    Person tracking across non-overlapping cameras and other types of video analytics benefit from spatial calibration information that allows an estimation of the distance between cameras and a relation between pixel coordinates and world coordinates within a camera. In a large environment with many cameras, or for frequent ad-hoc deployments of cameras, the cost of this calibration is high. This creates a barrier for the use of video analytics. Automating the calibration allows for a short configuration time and the use of video analytics in a wider range of scenarios, including ad-hoc crisis situations and large-scale surveillance systems. We show an autocalibration method based entirely on pedestrian detections in surveillance video from multiple non-overlapping cameras. In this paper, we present the two main components of automatic calibration. The first is intra-camera geometry estimation, which yields estimates of the tilt angle, focal length and camera height; these are important for the conversion from pixels to meters and vice versa. The second is inter-camera topology inference, which yields an estimate of the distance between cameras; this is important for spatio-temporal analysis of multi-camera tracking. This paper describes each of these methods and provides results on realistic video data.

  5. LROC - Lunar Reconnaissance Orbiter Camera

    NASA Astrophysics Data System (ADS)

    Robinson, M. S.; Eliason, E.; Hiesinger, H.; Jolliff, B. L.; McEwen, A.; Malin, M. C.; Ravine, M. A.; Thomas, P. C.; Turtle, E. P.

    2009-12-01

    The Lunar Reconnaissance Orbiter (LRO) went into lunar orbit on 23 June 2009. The LRO Camera (LROC) acquired its first lunar images on June 30 and commenced full scale testing and commissioning on July 10. The LROC consists of two narrow-angle cameras (NACs) that provide 0.5 m scale panchromatic images over a combined 5 km swath, and a wide-angle camera (WAC) to provide images at a scale of 100 m per pixel in five visible wavelength bands (415, 566, 604, 643, and 689 nm) and 400 m per pixel in two ultraviolet bands (321 nm and 360 nm) from the nominal 50 km orbit. Early operations were designed to test the performance of the cameras under all nominal operating conditions and provided a baseline for future calibrations. Test sequences included off-nadir slews to image stars and the Earth, 90° yaw sequences to collect flat field calibration data, night imaging for background characterization, and systematic mapping to test performance. LRO initially was placed into a terminator orbit resulting in images acquired under low signal conditions. Over the next three months the incidence angle at the spacecraft’s equator crossing gradually decreased towards high noon, providing a range of illumination conditions. Several hundred south polar images were collected in support of impact site selection for the LCROSS mission; details can be seen in many of the shadows. Commissioning phase images not only proved the instruments’ overall performance was nominal, but also that many geologic features of the lunar surface are well preserved at the meter-scale. Of particular note is the variety of impact-induced morphologies preserved in a near pristine state in and around kilometer-scale and larger young Copernican age impact craters that include: abundant evidence of impact melt of a variety of rheological properties, including coherent flows with surface textures and planimetric properties reflecting supersolidus (e.g., liquid melt) emplacement, blocks delicately perched on

  6. Cardiac cameras.

    PubMed

    Travin, Mark I

    2011-05-01

    Cardiac imaging with radiotracers plays an important role in patient evaluation, and the development of suitable imaging instruments has been crucial. While initially performed with the rectilinear scanner that slowly transmitted, in a row-by-row fashion, cardiac count distributions onto various printing media, the Anger scintillation camera allowed electronic determination of tracer energies and of the distribution of radioactive counts in 2D space. Increased sophistication of cardiac cameras and the development of powerful computers to analyze, display, and quantify data have been essential to making radionuclide cardiac imaging a key component of the cardiac work-up. Newer processing algorithms and solid-state cameras, fundamentally different from the Anger camera, show promise to provide higher counting efficiency and resolution, leading to better image quality, more patient comfort, and potentially lower radiation exposure. While the focus has been on myocardial perfusion imaging with single-photon emission computed tomography, increased use of positron emission tomography is broadening the field to include molecular imaging of the myocardium and of the coronary vasculature. Further advances may require integrating cardiac nuclear cameras with other imaging devices, i.e., hybrid imaging cameras. The goal is to image the heart and its physiological processes as accurately as possible, to prevent and cure disease processes.

  7. Electronic Still Camera

    NASA Technical Reports Server (NTRS)

    Holland, S. Douglas (Inventor)

    1992-01-01

    A handheld, programmable, digital camera is disclosed that supports a variety of sensors and has program control over the system components to provide versatility. The camera uses a high performance design which produces near film quality images from an electronic system. The optical system of the camera incorporates a conventional camera body that was slightly modified, thus permitting the use of conventional camera accessories, such as telephoto lenses, wide-angle lenses, auto-focusing circuitry, auto-exposure circuitry, flash units, and the like. An image sensor, such as a charge coupled device ('CCD') collects the photons that pass through the camera aperture when the shutter is opened, and produces an analog electrical signal indicative of the image. The analog image signal is read out of the CCD and is processed by preamplifier circuitry, a correlated double sampler, and a sample and hold circuit before it is converted to a digital signal. The analog-to-digital converter has an accuracy of eight bits to insure accuracy during the conversion. Two types of data ports are included for two different data transfer needs. One data port comprises a general purpose industrial standard port and the other a high speed/high performance application specific port. The system uses removable hard disks as its permanent storage media. The hard disk receives the digital image signal from the memory buffer and correlates the image signal with other sensed parameters, such as longitudinal or other information. When the storage capacity of the hard disk has been filled, the disk can be replaced with a new disk.

  8. Electronic still camera

    NASA Astrophysics Data System (ADS)

    Holland, S. Douglas

    1992-09-01

    A handheld, programmable, digital camera is disclosed that supports a variety of sensors and has program control over the system components to provide versatility. The camera uses a high performance design which produces near film quality images from an electronic system. The optical system of the camera incorporates a conventional camera body that was slightly modified, thus permitting the use of conventional camera accessories, such as telephoto lenses, wide-angle lenses, auto-focusing circuitry, auto-exposure circuitry, flash units, and the like. An image sensor, such as a charge coupled device ('CCD') collects the photons that pass through the camera aperture when the shutter is opened, and produces an analog electrical signal indicative of the image. The analog image signal is read out of the CCD and is processed by preamplifier circuitry, a correlated double sampler, and a sample and hold circuit before it is converted to a digital signal. The analog-to-digital converter has an accuracy of eight bits to insure accuracy during the conversion. Two types of data ports are included for two different data transfer needs. One data port comprises a general purpose industrial standard port and the other a high speed/high performance application specific port. The system uses removable hard disks as its permanent storage media. The hard disk receives the digital image signal from the memory buffer and correlates the image signal with other sensed parameters, such as longitudinal or other information. When the storage capacity of the hard disk has been filled, the disk can be replaced with a new disk.

  9. 5. VAL CAMERA CAR, DETAIL OF HOIST AT SIDE OF ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    5. VAL CAMERA CAR, DETAIL OF HOIST AT SIDE OF BRIDGE AND ENGINE CAR ON TRACKS, LOOKING NORTHEAST. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  10. Faint Object Camera imaging and spectroscopy of NGC 4151

    NASA Technical Reports Server (NTRS)

    Boksenberg, A.; Catchpole, R. M.; Macchetto, F.; Albrecht, R.; Barbieri, C.; Blades, J. C.; Crane, P.; Deharveng, J. M.; Disney, M. J.; Jakobsen, P.

    1995-01-01

    We describe ultraviolet and optical imaging and spectroscopy within the central few arcseconds of the Seyfert galaxy NGC 4151, obtained with the Faint Object Camera on the Hubble Space Telescope. A narrowband image including [O III] λ5007 shows a bright nucleus centered on a complex biconical structure having apparent opening angle approximately 65° and axis at a position angle along 65°-245°; images in bands including Lyman-alpha and C IV λ1550 and in the optical continuum near 5500 Å show only the bright nucleus. In an off-nuclear optical long-slit spectrum we find a high and a low radial velocity component within the narrow emission lines. We identify the low-velocity component with the bright, extended, knotty structure within the cones, and the high-velocity component with more confined diffuse emission. Also present are strong continuum emission and broad Balmer emission line components, which we attribute to the extended point spread function arising from the intense nuclear emission. Adopting the geometry pointed out by Pedlar et al. (1993) to explain the observed misalignment of the radio jets and the main optical structure, we model an ionizing radiation bicone, originating within a galactic disk, with apex at the active nucleus and axis centered on the extended radio jets. We confirm that through density bounding the gross spatial structure of the emission line region can be reproduced with a wide opening angle that includes the line of sight, consistent with the presence of a simple opaque torus allowing direct view of the nucleus. In particular, our modelling reproduces the observed decrease in position angle with distance from the nucleus, progressing initially from the direction of the extended radio jet, through our optical structure, and on to the extended narrow-line region. We explore the kinematics of the narrow-line low- and high-velocity components on the basis of our spectroscopy and adopted model structure.

  11. CCD Camera

    DOEpatents

    Roth, Roger R.

    1983-01-01

    A CCD camera capable of observing a moving object which has varying intensities of radiation emanating therefrom and which may move at varying speeds is shown wherein there is substantially no overlapping of successive images and wherein the exposure times and scan times may be varied independently of each other.

  12. CCD Camera

    DOEpatents

    Roth, R.R.

    1983-08-02

    A CCD camera capable of observing a moving object which has varying intensities of radiation emanating therefrom and which may move at varying speeds is shown wherein there is substantially no overlapping of successive images and wherein the exposure times and scan times may be varied independently of each other. 7 figs.

  13. Wide Angle Michelson Doppler Imaging Interferometer (WAMDII)

    NASA Technical Reports Server (NTRS)

    Roberts, W. T.

    1985-01-01

    The wide angle Michelson Doppler imaging interferometer (WAMDII) is a specialized type of optical Michelson interferometer working at sufficiently long path difference to measure Doppler shifts and to infer Doppler line widths of naturally occurring upper atmospheric Gaussian line emissions. The instrument is intended to measure vertical profiles of atmospheric winds and temperatures within the altitude range of 85 km to 300 km. The WAMDII consists of a Michelson interferometer followed by a camera lens and an 85 x 106 charge coupled device photodiode array. Narrow band filters in a filter wheel are used to isolate individual line emissions and the lens forms an image of the emitting region on the charge coupled device array.

  14. Wide Angle Michelson Doppler Imaging Interferometer (WAMDII)

    NASA Technical Reports Server (NTRS)

    Roberts, B.

    1986-01-01

    The wide angle Michelson Doppler imaging interferometer (WAMDII) is a specialized type of optical Michelson interferometer working at sufficiently long path difference to measure Doppler shifts and to infer Doppler line widths of naturally occurring upper atmospheric Gaussian line emissions. The instrument is intended to measure vertical profiles of atmospheric winds and temperatures within the altitude range of 85 km to 300 km. The WAMDII consists of a Michelson interferometer followed by a camera lens and an 85 x 106 charge coupled device photodiode array. Narrow band filters in a filter wheel are used to isolate individual line emissions and the lens forms an image of the emitting region on the charge coupled device array.

  15. The DRAGO gamma camera

    SciTech Connect

    Fiorini, C.; Gola, A.; Peloso, R.; Longoni, A.; Lechner, P.; Soltau, H.; Strueder, L.; Ottobrini, L.; Martelli, C.; Lui, R.; Madaschi, L.; Belloli, S.

    2010-04-15

    In this work, we present the results of the experimental characterization of the DRAGO (DRift detector Array-based Gamma camera for Oncology), a detection system developed for high-spatial-resolution gamma-ray imaging. This camera is based on a monolithic array of 77 silicon drift detectors (SDDs), with a total active area of 6.7 cm², coupled to a single 5-mm-thick CsI(Tl) scintillator crystal. The use of an array of SDDs provides a high quantum efficiency for the detection of the scintillation light together with a very low electronics noise. A very compact detection module based on the use of integrated readout circuits was developed. The performance achieved in gamma-ray imaging using this camera is reported here. When imaging a 0.2 mm collimated ⁵⁷Co source (122 keV) over different points of the active area, a spatial resolution ranging from 0.25 to 0.5 mm was measured. The depth-of-interaction capability of the detector, thanks to the use of a Maximum Likelihood reconstruction algorithm, was also investigated by imaging a collimated beam tilted at an angle of 45° with respect to the scintillator surface. Finally, the imager was characterized with in vivo measurements on mice, in a real preclinical environment.

  16. High-power, narrow-band, high-repetition-rate, 5.9 eV coherent light source using passive optical cavity for laser-based angle-resolved photoelectron spectroscopy.

    PubMed

    Omachi, J; Yoshioka, K; Kuwata-Gonokami, M

    2012-10-08

    We demonstrate a scheme for efficient generation of a 5.9 eV coherent light source with an average power of 23 mW, 0.34 meV linewidth, and 73 MHz repetition rate from a Ti:sapphire picosecond mode-locked laser with an output power of 1 W. Second-harmonic light is generated in a passive optical cavity by a BiB₃O₆ crystal with a conversion efficiency as high as 67%. By focusing the second-harmonic light transmitted from the cavity into a β-BaB₂O₄ crystal, we obtain fourth-harmonic light at 5.9 eV. This light source offers stable operation for at least a week. We discuss the suitability of the laser light source for high-resolution angle-resolved photoelectron spectroscopy by comparing it with other sources (synchrotron radiation facilities and gas discharge lamps).
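The photon-energy bookkeeping behind the two doubling stages can be checked with E = hc/λ. The 840 nm fundamental below is an assumed Ti:sapphire operating wavelength consistent with the quoted 5.9 eV output, not a figure stated in the abstract.

```python
HC_EV_NM = 1239.842  # h*c in eV*nm

def photon_energy_ev(wavelength_nm: float) -> float:
    """Photon energy in eV for a given vacuum wavelength in nm."""
    return HC_EV_NM / wavelength_nm

fundamental_nm = 840.0                    # assumed Ti:sapphire operating wavelength
e1 = photon_energy_ev(fundamental_nm)     # fundamental photon energy
e4 = 4 * e1                               # two doubling stages: SHG, then SHG of the SHG
print(f"fundamental {e1:.2f} eV -> fourth harmonic {e4:.2f} eV ({HC_EV_NM / e4:.0f} nm)")
```

Each second-harmonic stage doubles the photon energy, so the fourth harmonic of a ~1.48 eV fundamental lands at ~5.9 eV, i.e. a deep-ultraviolet wavelength around 210 nm.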

  17. Calibration of the Lunar Reconnaissance Orbiter Camera

    NASA Astrophysics Data System (ADS)

    Tschimmel, M.; Robinson, M. S.; Humm, D. C.; Denevi, B. W.; Lawrence, S. J.; Brylow, S.; Ravine, M.; Ghaemi, T.

    2008-12-01

    The Lunar Reconnaissance Orbiter Camera (LROC) onboard the NASA Lunar Reconnaissance Orbiter (LRO) spacecraft consists of three cameras: the Wide-Angle Camera (WAC) and two identical Narrow Angle Cameras (NAC-L, NAC-R). The WAC is a push-frame imager with 5 visible wavelength filters (415 to 680 nm) at a spatial resolution of 100 m/pixel and 2 UV filters (315 and 360 nm) with a resolution of 400 m/pixel. In addition to the multicolor imaging, the WAC can operate in monochrome mode to provide a global large-incidence-angle basemap and a time-lapse movie of the illumination conditions at both poles. The WAC has a highly linear response, a read noise of 72 e- and a full well capacity of 47,200 e-. The signal-to-noise ratio in each band is 140 in the worst case. There are no out-of-band leaks and the spectral response of each filter is well characterized. Each NAC is a monochrome pushbroom scanner, providing images with a resolution of 50 cm/pixel from a 50-km orbit. A single NAC image has a swath width of 2.5 km and a length of up to 26 km. The NACs are mounted to acquire side-by-side imaging for a combined swath width of 5 km. The NAC is designed to fully characterize future human and robotic landing sites in terms of topography and hazard risks. The North and South poles will be mapped at 1-meter scale poleward of 85.5° latitude. Stereo coverage can be provided by pointing the NACs off-nadir. The NACs are also highly linear. Read noise is 71 e- for NAC-L and 74 e- for NAC-R and the full well capacity is 248,500 e- for NAC-L and 262,500 e- for NAC-R. The focal lengths are 699.6 mm for NAC-L and 701.6 mm for NAC-R; the system MTF is 28% for NAC-L and 26% for NAC-R. The signal-to-noise ratio is at least 46 (terminator scene) and can be higher than 200 (high sun scene). Both NACs exhibit a stray-light feature, which is caused by out-of-field sources and has a magnitude of 1-3%. However, as this feature is well understood, it can be greatly reduced during ground
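Using the read-noise figure quoted above, a shot-noise-plus-read-noise SNR estimate can be sketched as follows. The signal levels are illustrative (the abstract's own SNR figures depend on scene radiance and exposure, which are not given here), and dark current is neglected.

```python
import math

def snr(signal_e: float, read_noise_e: float) -> float:
    """Shot-noise-plus-read-noise SNR for a CCD pixel: S / sqrt(S + r^2)."""
    return signal_e / math.sqrt(signal_e + read_noise_e ** 2)

# NAC-L read noise from the abstract is 71 e-; the signal levels are illustrative,
# except the last, which is the quoted NAC-L full well capacity.
for s in (2_000, 50_000, 248_500):
    print(f"{s:>7} e- -> SNR {snr(s, 71.0):.0f}")
```

At low signal the read noise dominates and SNR grows almost linearly with signal; near full well the shot-noise limit takes over and SNR approaches sqrt(S), which is why terminator scenes sit at the low end of the quoted SNR range.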

  18. Pinhole Cameras: For Science, Art, and Fun!

    ERIC Educational Resources Information Center

    Button, Clare

    2007-01-01

    A pinhole camera is a camera without a lens. A tiny hole replaces the lens, and light is allowed to come in for a short amount of time by means of a hand-operated shutter. The pinhole allows only a very narrow beam of light to enter, which reduces confusion due to scattered light on the film. This results in an image that is focused, reversed, and…

  19. Omnidirectional underwater camera design and calibration.

    PubMed

    Bosch, Josep; Gracias, Nuno; Ridao, Pere; Ribas, David

    2015-03-12

    This paper presents the development of an underwater omnidirectional multi-camera system (OMS) based on a commercially available six-camera system, originally designed for land applications. A full calibration method is presented for the estimation of both the intrinsic and extrinsic parameters, which is able to cope with wide-angle lenses and non-overlapping cameras simultaneously. This method is valid for any OMS in both land and water applications. For underwater use, a customized housing is required, which often leads to strong image distortion due to refraction among the different media. This phenomenon makes the basic pinhole camera model invalid for underwater cameras, especially when using wide-angle lenses, and requires the explicit modeling of the individual optical rays. To address this problem, a ray tracing approach has been adopted to create a field-of-view (FOV) simulator for underwater cameras. The simulator allows for the testing of different housing geometries and optics for the cameras to ensure complete hemisphere coverage in underwater operation. This paper describes the design and testing of a compact custom housing for a commercial off-the-shelf OMS camera (Ladybug 3) and presents the first results of its use. A proposed three-stage calibration process allows for the estimation of all of the relevant camera parameters. Experimental results are presented, which illustrate the performance of the calibration method and validate the approach.
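    The ray tracing described in this abstract rests on Snell's law applied ray by ray at each housing interface. As a rough, hypothetical sketch (not the authors' simulator), the vector form of refraction at a flat port could look like this; the function name and parameters are illustrative:

    ```python
    import math
    import numpy as np

    def refract(d, n, n1, n2):
        """Refract unit direction d at a surface with unit normal n
        (pointing back into the incident medium), from refractive
        index n1 into n2. Returns None on total internal reflection."""
        d = d / np.linalg.norm(d)
        n = n / np.linalg.norm(n)
        cos_i = -np.dot(n, d)          # cosine of the incidence angle
        r = n1 / n2
        k = 1.0 - r * r * (1.0 - cos_i * cos_i)
        if k < 0.0:
            return None                # total internal reflection
        # Vector form of Snell's law: bend the ray toward/away from the normal
        return r * d + (r * cos_i - math.sqrt(k)) * n
    ```

    A FOV simulator in this spirit would trace many such rays through the air-glass-water interfaces of a candidate housing and record which rays reach the sensor, revealing gaps in the hemispheric coverage.
    
    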

  20. Omnidirectional Underwater Camera Design and Calibration

    PubMed Central

    Bosch, Josep; Gracias, Nuno; Ridao, Pere; Ribas, David

    2015-01-01

    This paper presents the development of an underwater omnidirectional multi-camera system (OMS) based on a commercially available six-camera system, originally designed for land applications. A full calibration method is presented for the estimation of both the intrinsic and extrinsic parameters, which is able to cope with wide-angle lenses and non-overlapping cameras simultaneously. This method is valid for any OMS in both land and water applications. For underwater use, a customized housing is required, which often leads to strong image distortion due to refraction among the different media. This phenomenon makes the basic pinhole camera model invalid for underwater cameras, especially when using wide-angle lenses, and requires the explicit modeling of the individual optical rays. To address this problem, a ray tracing approach has been adopted to create a field-of-view (FOV) simulator for underwater cameras. The simulator allows for the testing of different housing geometries and optics for the cameras to ensure complete hemisphere coverage in underwater operation. This paper describes the design and testing of a compact custom housing for a commercial off-the-shelf OMS camera (Ladybug 3) and presents the first results of its use. A proposed three-stage calibration process allows for the estimation of all of the relevant camera parameters. Experimental results are presented, which illustrate the performance of the calibration method and validate the approach. PMID:25774707

  1. Viewing angle changeable display

    NASA Astrophysics Data System (ADS)

    Leng, Jinbi; Huang, Ziqiang; Yang, Wenjun; Chen, Xiaoxi

    2010-10-01

    A viewing-angle-changeable display can change its viewing angle as needed: in public places the display can have a narrow viewing angle for privacy, while in private places it can have a wide viewing angle for convenient operation and a better viewing experience. This article proposes a novel adjustable optical transmission device that realizes viewing-angle changes for LCDs using the guest-host effect of liquid crystals. The key technology is a special device inserted between the backlight and the LCD, through which the backlight becomes either collimated or scattered, yielding either a narrow or a wide viewing angle. The device is an adjustable transmission cell (ATC), which is in effect a black guest-host LC cell; this ATC is the main focus of the invention. The ATC consists of a polarizer sheet and a special guest-host liquid crystal device filled with a two-phase dye (referred to as GH-LC in this report) to achieve the viewing-angle change in the LCD. When an electric field is applied to the ATC, only the near-axis rays pass through the ATC within a relatively small angle, while the scattered rays are absorbed sequentially by the GH-LC and the polarizer sheet. When no electric field is applied, the cell behaves like a normal polarizer, and scattered light passes through the cell and polarizer in the normal way. This paper describes the principle and structure of the device, applies an electric field to a sample to observe its electro-optical properties, and combines theoretical and experimental research to obtain the viewing-angle effects of the display.

  2. Caught on Camera.

    ERIC Educational Resources Information Center

    Milshtein, Amy

    2002-01-01

    Describes the benefits of and rules to be followed when using surveillance cameras for school security. Discusses various camera models, including indoor and outdoor fixed position cameras, pan-tilt zoom cameras, and pinhole-lens cameras for covert surveillance. (EV)

  3. Laboratory Calibration and Characterization of Video Cameras

    NASA Technical Reports Server (NTRS)

    Burner, A. W.; Snow, W. L.; Shortis, M. R.; Goad, W. K.

    1989-01-01

    Some techniques for laboratory calibration and characterization of video cameras used with frame grabber boards are presented. A laser-illuminated displaced reticle technique (with camera lens removed) is used to determine the camera/grabber effective horizontal and vertical pixel spacing as well as the angle of non-perpendicularity of the axes. The principal point of autocollimation and point of symmetry are found by illuminating the camera with an unexpanded laser beam, either aligned with the sensor or lens. Lens distortion and the principal distance are determined from images of a calibration plate suitably aligned with the camera. Calibration and characterization results for several video cameras are presented. Differences between these laboratory techniques and test range and plumb line calibration are noted.

  4. Laboratory calibration and characterization of video cameras

    NASA Astrophysics Data System (ADS)

    Burner, A. W.; Snow, W. L.; Shortis, M. R.; Goad, W. K.

    1990-08-01

    Some techniques for laboratory calibration and characterization of video cameras used with frame grabber boards are presented. A laser-illuminated displaced reticle technique (with camera lens removed) is used to determine the camera/grabber effective horizontal and vertical pixel spacing as well as the angle of nonperpendicularity of the axes. The principal point of autocollimation and point of symmetry are found by illuminating the camera with an unexpanded laser beam, either aligned with the sensor or lens. Lens distortion and the principal distance are determined from images of a calibration plate suitably aligned with the camera. Calibration and characterization results for several video cameras are presented. Differences between these laboratory techniques and test range and plumb line calibration are noted.

  5. Bacterial motion in narrow capillaries

    PubMed Central

    Ping, Liyan; Wasnik, Vaibhav; Emberly, Eldon

    2014-01-01

    Motile bacteria often have to pass through small tortuous pores in soil or in the tissue of higher organisms. However, their motion in this prevalent type of niche is not fully understood. Here, we modeled it with narrow glass capillaries and identified a critical radius (Rc) for bacterial motion. Near the surface of capillaries narrower than this, the swimming trajectories are helices; in larger capillaries, the bacteria swim in distorted circles. Under the non-slip condition, the peritrichous Escherichia coli swam in left-handed helices with an Rc of ∼10 μm near a glass surface. However, slipping could occur in the fast monotrichous Pseudomonas fluorescens when a speed threshold was exceeded, and thus both left-handed and right-handed helices were executed in glass capillaries. In natural non-cylindrical pores, the near-surface trajectories would be spirals and twisted loops. Engaging in such motions reduces the bacterial migration rate. For a given pore size, the run length and the tumbling angle of the bacterium determine the probability and duration of their near-surface motion. Shear flow and chemotaxis potentially enhance it. Based on these observations, puzzling previous observations on bacterial migration in porous environments can be interpreted. PMID:25764548

  6. Toward a miniaturized fundus camera.

    PubMed

    Gliss, Christine; Parel, Jean-Marie; Flynn, John T; Pratisto, Hans; Niederer, Peter

    2004-01-01

    Retinopathy of prematurity (ROP) describes a pathological development of the retina in prematurely born children. In order to prevent severe permanent damage to the eye and enable timely treatment, the fundus of the eye in such children has to be examined according to established procedures. For these examinations, our miniaturized fundus camera is intended to allow the acquisition of wide-angle digital pictures of the fundus for on-line or off-line diagnosis and documentation. We designed two prototypes of a miniaturized fundus camera, one with graded refractive index (GRIN)-based optics, the other with conventional optics. Two different modes of illumination were compared: transscleral and transpupillary. In both systems, the size and weight of the camera were minimized. The prototypes were tested on young rabbits. The experiments led to the conclusion that the combination of conventional optics with transpupillary illumination yields the best results in terms of overall image quality.

  7. System Synchronizes Recordings from Separated Video Cameras

    NASA Technical Reports Server (NTRS)

    Nail, William; Nail, William L.; Nail, Jasper M.; Le, Doung T.

    2009-01-01

    A system of electronic hardware and software for synchronizing recordings from multiple, physically separated video cameras is being developed, primarily for use in multiple-look-angle video production. The system, the time code used in the system, and the underlying method of synchronization upon which the design of the system is based are denoted generally by the term "Geo-TimeCode(TradeMark)." The system is embodied mostly in compact, lightweight, portable units (see figure) denoted video time-code units (VTUs) - one VTU for each video camera. The system is scalable in that any number of camera recordings can be synchronized. The estimated retail price per unit would be about $350 (in 2006 dollars). The need for this or another synchronization system external to video cameras arises because most video cameras do not include internal means for maintaining synchronization with other video cameras. Unlike prior video-camera-synchronization systems, this system does not depend on continuous cable or radio links between cameras (however, it does depend on occasional cable links lasting a few seconds). Also, whereas the time codes used in prior video-camera-synchronization systems typically repeat after 24 hours, the time code used in this system does not repeat for slightly more than 136 years; hence, this system is much better suited for long-term deployment of multiple cameras.

  8. Narrowness and Liberality

    ERIC Educational Resources Information Center

    Agresto, John

    2003-01-01

    John Agresto, whose task has been to rebuild the war-ravaged infrastructure of a Middle-Eastern university system, is discouraged to see that narrow expertise is the only goal of education there, to the utter exclusion of intellectual breadth. He comments that, although it is not that bad in the U.S., he feels that doctoral programs as currently…

  9. Cydonia: Wide Angle Color Image

    NASA Technical Reports Server (NTRS)

    2000-01-01

    [figure removed for brevity, see original site]

    Although the resolution of the MOC wide angle cameras is too low to tell much about the geomorphology of the Cydonia region, the images from the red and blue wide angle cameras provide us with two types of information that are of interest in their own right: color and stereoscopic data. Above are a color view and a stereoscopic anaglyph rendition of Geodesy Campaign images acquired by MGS MOC in May 1999. To view the stereo image, you need red/blue '3-D' glasses.

  10. Characterization of previously unidentified lunar pyroclastic deposits using Lunar Reconnaissance Orbiter Camera (LROC) data

    USGS Publications Warehouse

    Gustafson, J. Olaf; Bell, James F.; Gaddis, Lisa R.R.; Hawke, B. Ray Ray; Giguere, Thomas A.

    2012-01-01

    We used a Lunar Reconnaissance Orbiter Camera (LROC) global monochrome Wide-angle Camera (WAC) mosaic to conduct a survey of the Moon to search for previously unidentified pyroclastic deposits. Promising locations were examined in detail using LROC multispectral WAC mosaics, high-resolution LROC Narrow Angle Camera (NAC) images, and Clementine multispectral (ultraviolet-visible or UVVIS) data. Out of 47 potential deposits chosen for closer examination, 12 were selected as probable newly identified pyroclastic deposits. Potential pyroclastic deposits were generally found in settings similar to previously identified deposits, including areas within or near mare deposits adjacent to highlands, within floor-fractured craters, and along fissures in mare deposits. However, a significant new finding is the discovery of localized pyroclastic deposits within floor-fractured craters Anderson E and F on the lunar farside, isolated from other known similar deposits. Our search confirms that most major regional and localized low-albedo pyroclastic deposits have been identified on the Moon down to ~100 m/pix resolution, and that additional newly identified deposits are likely to be either isolated small deposits or additional portions of discontinuous, patchy deposits.

  11. Pre-hibernation performances of the OSIRIS cameras onboard the Rosetta spacecraft

    NASA Astrophysics Data System (ADS)

    Magrin, S.; La Forgia, F.; Da Deppo, V.; Lazzarin, M.; Bertini, I.; Ferri, F.; Pajola, M.; Barbieri, M.; Naletto, G.; Barbieri, C.; Tubiana, C.; Küppers, M.; Fornasier, S.; Jorda, L.; Sierks, H.

    2015-02-01

    Context. The ESA cometary mission Rosetta was launched in 2004. In the following years, and until the spacecraft hibernation in June 2011, the two cameras of the OSIRIS imaging system (Narrow Angle and Wide Angle Camera, NAC and WAC) observed many different sources. On 20 January 2014 the spacecraft successfully exited hibernation to start observing the primary scientific target of the mission, comet 67P/Churyumov-Gerasimenko. Aims: A study of the past performance of the cameras is now mandatory to determine whether the system has been stable over time and to derive, if necessary, additional analysis methods for the future precise calibration of the cometary data. Methods: The instrumental responses and filter passbands were used to estimate the efficiency of the system. A comparison with acquired images of specific calibration stars was made, and a refined photometric calibration was computed, both for the absolute flux and for the reflectivity of small bodies of the solar system. Results: We found that the instrumental performance was stable to within ±1.5% from 2007 to 2010, with no evidence of an aging effect on the optics or detectors. The efficiency of the instrumentation is as expected in the visible range, but lower than expected in the UV and IR ranges. A photometric calibration implementation is discussed for the two cameras. Conclusions: The calibration derived from the pre-hibernation phases of the mission will be checked as soon as possible after the awakening of OSIRIS and will be continuously monitored until the end of the mission in December 2015. A list of additional calibration sources has been determined; these are to be observed during the forthcoming phases of the mission to ensure better coverage across the wavelength range of the cameras and to study the possible dust contamination of the optics.

  12. [Narrow lumbar canal].

    PubMed

    Deshayes, P; Louvel, J P

    1992-03-01

    The diagnosis of spinal stenosis can be strongly suspected when the following symptoms are present: limb neuralgias with a poorly defined location, paresthesias in several dermatomes, and neurogenic intermittent claudication. Myelography coupled with CT scanning yields the best information about morphology, levels of stenosis, and narrowing factors: bone bridges, ligaments, and discal structures. If surgery is decided upon after medical treatment fails to improve the patient's condition, the choice will be best guided by the myeloscan analysis.

  13. Experience with duplex bearings in narrow angle oscillating applications

    NASA Technical Reports Server (NTRS)

    Phinney, D. D.; Pollard, C. L.; Hinricks, J. T.

    1988-01-01

    Duplex ball bearings are matched pairs on which the abutting faces of the rings have been accurately ground so that when the rings are clamped together, a controlled amount of interference (preload) exists across the balls. These bearings are vulnerable to radial temperature gradients, blocking in oscillation and increased sensitivity to contamination. These conditions decrease the service life of these bearings. It was decided that an accelerated thermal vacuum life test should be conducted. The test apparatus and results are described and the rationale is presented for reducing a multiyear life test on oil lubricated bearings to less than a year.

  14. Determining Camera Gain in Room Temperature Cameras

    SciTech Connect

    Joshua Cogliati

    2010-12-01

    James R. Janesick provides a method for determining the amplification of a CCD or CMOS camera when only access to the raw images is provided. However, the equation that is provided ignores the contribution of dark current. For CCD or CMOS cameras that are cooled well below room temperature this is not a problem; however, the technique needs adjustment for use with room-temperature cameras. This article describes the adjustment made to the equation and a test of the method.
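    Janesick's photon-transfer approach exploits the fact that photoelectron counts are Poisson-distributed, so the ratio of signal mean to temporal variance (in DN) gives the gain in e-/DN; the dark-current adjustment amounts to subtracting matched dark frames from both the mean and the variance terms. A minimal sketch of this idea (function and variable names are illustrative, not taken from the article):

    ```python
    import numpy as np

    def estimate_gain(flat1, flat2, dark1, dark2):
        """Estimate camera gain (e-/DN) from two identically exposed flat
        frames and two matching dark frames (photon-transfer method with
        a dark-current correction)."""
        # Dark-corrected mean signal in DN.
        signal = flat1.mean() + flat2.mean() - dark1.mean() - dark2.mean()
        # Differencing each frame pair cancels fixed-pattern noise; the
        # variance of the difference is twice the single-frame temporal
        # variance, matching the doubled mean in `signal`.
        noise = np.var(flat1 - flat2) - np.var(dark1 - dark2)
        return signal / noise
    ```

    On synthetic frames with a known gain, Poisson shot noise, read noise, and a dark-current pedestal, the estimator recovers the gain because the dark terms cancel in both numerator and denominator.
    
    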

  15. Two-Camera Acquisition and Tracking of a Flying Target

    NASA Technical Reports Server (NTRS)

    Biswas, Abhijit; Assad, Christopher; Kovalik, Joseph M.; Pain, Bedabrata; Wrigley, Chris J.; Twiss, Peter

    2008-01-01

    A method and apparatus have been developed to solve the problem of automated acquisition and tracking, from a location on the ground, of a luminous moving target in the sky. The method involves the use of two electronic cameras: (1) a stationary camera having a wide field of view, positioned and oriented to image the entire sky; and (2) a camera that has a much narrower field of view (a few degrees wide) and is mounted on a two-axis gimbal. The wide-field-of-view stationary camera is used to initially identify the target against the background sky. So that the approximate position of the target can be determined, pixel locations on the image-detector plane in the stationary camera are calibrated with respect to azimuth and elevation. The approximate target position is used to initially aim the gimballed narrow-field-of-view camera in the approximate direction of the target. Next, the narrow-field-of-view camera locks onto the target image, and thereafter the gimbals are actuated as needed to maintain lock and thereby track the target with precision greater than that attainable by use of the stationary camera.
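    As an illustration of the pixel-to-angle calibration step, assuming (hypothetically) an equidistant fisheye model for the all-sky camera, a pixel location maps to azimuth and elevation as sketched below; the model, parameter names, and image orientation are assumptions, not details from the abstract:

    ```python
    import numpy as np

    def pixel_to_azel(x, y, cx, cy, deg_per_px):
        """Map a pixel (x, y) in an all-sky image to (azimuth, elevation)
        in degrees, assuming an equidistant fisheye projection centered
        at (cx, cy) with zenith at the image center and north at image top."""
        dx, dy = x - cx, y - cy
        r = np.hypot(dx, dy)                      # radial distance from zenith pixel
        elevation = 90.0 - r * deg_per_px         # equidistant: angle grows linearly with r
        azimuth = np.degrees(np.arctan2(dx, -dy)) % 360.0
        return azimuth, elevation
    ```

    The gimballed narrow-field camera would then be slewed to roughly this (azimuth, elevation) before its own tracking loop takes over.
    
    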

  16. Full Stokes polarization imaging camera

    NASA Astrophysics Data System (ADS)

    Vedel, M.; Breugnot, S.; Lechocinski, N.

    2011-10-01

    Objective and background: We present a new version of Bossa Nova Technologies' passive polarization imaging camera. The previous version performed live measurement of the linear Stokes parameters (S0, S1, S2) and their derivatives. The new version presented in this paper performs live measurement of the full Stokes parameters, i.e., including the fourth parameter S3, related to the amount of circular polarization. Dedicated software was developed to provide live images of any Stokes-related parameters such as the Degree Of Linear Polarization (DOLP), the Degree Of Circular Polarization (DOCP), and the Angle Of Polarization (AOP). Results: We first give a brief description of the camera and its technology. It is a division-of-time polarimeter using a custom ferroelectric liquid crystal cell. A description is given of the method used to calculate the Data Reduction Matrix (DRM)5,9 linking intensity measurements and the Stokes parameters. The calibration was developed to maximize the condition number of the DRM; it also allows very efficient post-processing of the acquired images. A complete evaluation of the precision of standard polarization parameters is described. We further present the standard features of the dedicated software that was developed to operate the camera; it provides live images of the Stokes vector components and the usual associated parameters. Finally, some tests already conducted are presented, including indoor laboratory and outdoor measurements. This new camera will be a useful tool for many applications such as biomedical, remote sensing, metrology, material studies, and others.
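    The DRM idea can be sketched generically: each analyzer state contributes one row of a measurement matrix W, and the Stokes vector is recovered by (pseudo-)inverting W; DOLP, DOCP, and AOP then follow directly. The four analyzer states below (horizontal, vertical, +45° linear, right circular) are a textbook choice for illustration, not necessarily the states of this camera:

    ```python
    import numpy as np

    # Measured intensity through an ideal analyzer a is 0.5 * (a . S),
    # so each row of W is half the analyzer's own Stokes vector.
    W = 0.5 * np.array([[1.0,  1.0, 0.0, 0.0],   # horizontal linear
                        [1.0, -1.0, 0.0, 0.0],   # vertical linear
                        [1.0,  0.0, 1.0, 0.0],   # +45 deg linear
                        [1.0,  0.0, 0.0, 1.0]])  # right circular

    def stokes_from_intensities(I, W=W):
        """Recover the Stokes vector from measured intensities via the
        pseudoinverse of the data reduction matrix, then derive DOLP,
        DOCP, and AOP (degrees)."""
        S = np.linalg.pinv(W) @ np.asarray(I, float)
        s0, s1, s2, s3 = S
        dolp = np.hypot(s1, s2) / s0
        docp = abs(s3) / s0
        aop = 0.5 * np.degrees(np.arctan2(s2, s1))
        return S, dolp, docp, aop
    ```

    In a calibrated instrument, W is measured rather than assumed, and maximizing its condition number (as the abstract describes) keeps the pseudoinverse well behaved against intensity noise.
    
    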

  17. Perceptual narrowing: retrospect and prospect.

    PubMed

    Flom, Ross

    2014-11-01

    Research is reviewed demonstrating perceptual narrowing across a variety of domains. Research is also reviewed showing that the temporal window of perceptual narrowing can be extended and, in some cases, perceptual narrowing can be reversed. Research is also reviewed highlighting the neurophysiological correlates of perceptual narrowing as well as some of the individual neurophysiological differences associated with perceptual narrowing. Various methodological issues associated with perceptual narrowing are also discussed. The broader purpose of this paper, however, is to argue that the term perceptual narrowing fails to capture the dynamic nature of this perceptual process. Finally, it is argued that just as other concepts associated with experience and development are refined and modified as new evidence emerges, likewise we need to evaluate and refine how we conceptualize perceptual narrowing.

  18. Constrained space camera assembly

    DOEpatents

    Heckendorn, Frank M.; Anderson, Erin K.; Robinson, Casandra W.; Haynes, Harriet B.

    1999-01-01

    A constrained space camera assembly which is intended to be lowered through a hole into a tank, a borehole or another cavity. The assembly includes a generally cylindrical chamber comprising a head and a body and a wiring-carrying conduit extending from the chamber. Means are included in the chamber for rotating the body about the head without breaking an airtight seal formed therebetween. The assembly may be pressurized and accompanied with a pressure sensing means for sensing if a breach has occurred in the assembly. In one embodiment, two cameras, separated from their respective lenses, are installed on a mounting apparatus disposed in the chamber. The mounting apparatus includes means allowing both longitudinal and lateral movement of the cameras. Moving the cameras longitudinally focuses the cameras, and moving the cameras laterally away from one another effectively converges the cameras so that close objects can be viewed. The assembly further includes means for moving lenses of different magnification forward of the cameras.

  19. Novel fundus camera design

    NASA Astrophysics Data System (ADS)

    Dehoog, Edward A.

    A fundus camera is a complex optical system that makes use of the principle of reflex-free indirect ophthalmoscopy to image the retina. Despite being in existence since the early 1900s, little has changed in the design of the fundus camera, and there is minimal information about the design principles utilized. Parameters and specifications involved in the design of a fundus camera are determined, and their effect on system performance is discussed. Fundus cameras incorporating different design methods are modeled, and a performance evaluation based on design parameters is used to determine the effectiveness of each design strategy. By determining the design principles involved in the fundus camera, new cameras can be designed to include specific imaging modalities such as optical coherence tomography, imaging spectroscopy, and imaging polarimetry to gather additional information about the properties and structure of the retina. Design principles utilized to incorporate such modalities into fundus camera systems are discussed. Design, implementation, and testing of a snapshot polarimeter fundus camera are demonstrated.

  20. Making Ceramic Cameras

    ERIC Educational Resources Information Center

    Squibb, Matt

    2009-01-01

    This article describes how to make a clay camera. This idea of creating functional cameras from clay allows students to experience ceramics, photography, and painting all in one unit. (Contains 1 resource and 3 online resources.)

  1. Vacuum Camera Cooler

    NASA Technical Reports Server (NTRS)

    Laugen, Geoffrey A.

    2011-01-01

    Acquiring cheap, moving video was impossible in a vacuum environment, due to camera overheating. This overheating is brought on by the lack of cooling media in vacuum. A water-jacketed camera cooler enclosure machined and assembled from copper plate and tube has been developed. The camera cooler (see figure) is cup-shaped and cooled by circulating water or nitrogen gas through copper tubing. The camera, a store-bought "spy type," is not designed to work in a vacuum. With some modifications the unit can be thermally connected when mounted in the cup portion of the camera cooler. The thermal conductivity is provided by copper tape between parts of the camera and the cooled enclosure. During initial testing of the demonstration unit, the camera cooler kept the CPU (central processing unit) of this video camera at operating temperature. This development allowed video recording of an in-progress test, within a vacuum environment.

  2. Angle performance on optima MDxt

    SciTech Connect

    David, Jonathan; Kamenitsa, Dennis

    2012-11-06

    Angle control on medium current implanters is important due to the high angle-sensitivity of typical medium current implants, such as halo implants. On the Optima MDxt, beam-to-wafer angles are controlled in both the horizontal and vertical directions. In the horizontal direction, the beam angle is measured through six narrow slits, and any angle adjustment is made by electrostatically steering the beam, while cross-wafer beam parallelism is adjusted by changing the focus of the electrostatic parallelizing lens (P-lens). In the vertical direction, the beam angle is measured through a high aspect ratio mask, and any angle adjustment is made by slightly tilting the wafer platen prior to implant. A variety of tests were run to measure the accuracy and repeatability of Optima MDxt's angle control. SIMS profiles of a high energy, channeling sensitive condition show both the cross-wafer angle uniformity, along with the small-angle resolution of the system. Angle repeatability was quantified by running a channeling sensitive implant as a regular monitor over a seven month period and measuring the sheet resistance-to-angle sensitivity. Even though crystal cut error was not controlled for in this case, when attributing all Rs variation to angle changes, the overall angle repeatability was measured as 0.16° (1σ). A separate angle repeatability test involved running a series of V-curves tests over a four month period using low crystal cut wafers selected from the same boule. The results of this test showed the angle repeatability to be <0.1° (1σ).

  3. Those Nifty Digital Cameras!

    ERIC Educational Resources Information Center

    Ekhaml, Leticia

    1996-01-01

    Describes digital photography--an electronic imaging technology that merges computer capabilities with traditional photography--and its uses in education. Discusses how a filmless camera works, types of filmless cameras, advantages and disadvantages, and educational applications of the consumer digital cameras. (AEF)

  4. Digital Pinhole Camera

    ERIC Educational Resources Information Center

    Lancor, Rachael; Lancor, Brian

    2014-01-01

    In this article we describe how the classic pinhole camera demonstration can be adapted for use with digital cameras. Students can easily explore the effects of the size of the pinhole and its distance from the sensor on exposure time, magnification, and image quality. Instructions for constructing a digital pinhole camera and our method for…
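    The trade-offs such a demonstration explores follow from a few simple relations: a diffraction-based "optimal" pinhole diameter, the resulting f-number, and exposure time scaling with the square of the f-number. A small sketch of these relations, noting that the constant 2.44 in the diameter formula is one common convention among several and is an assumption here:

    ```python
    import math

    def optimal_pinhole_diameter(focal_length_m, wavelength_m=550e-9):
        """One common diffraction-based choice: d = sqrt(2.44 * lambda * f).
        Smaller pinholes blur by diffraction, larger ones by geometry."""
        return math.sqrt(2.44 * wavelength_m * focal_length_m)

    def f_number(focal_length_m, pinhole_d_m):
        """f-number N = f / d; pinhole cameras are typically very 'slow'."""
        return focal_length_m / pinhole_d_m

    def exposure_scale(n_pinhole, n_reference=16.0):
        """Exposure time relative to a reference aperture scales as the
        square of the f-number ratio."""
        return (n_pinhole / n_reference) ** 2
    ```

    For a 50 mm pinhole-to-sensor distance and green light this gives a pinhole of roughly a quarter millimeter and an f-number near f/190, which is why students see exposure times orders of magnitude longer than with a lens.
    
    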

  5. Study on airflow characteristics in the semi-closed irregular narrow flow channel

    NASA Astrophysics Data System (ADS)

    Jin, Yuzhen; Hu, Xiaodong; Zhu, Linhang; Hu, Xudong; Jin, Yingzi

    2016-04-01

    The air-jet loom is widely used in the textile industry. The interaction mechanism of airflow and yarn in such a narrow flow channel is not well understood: the gas consumption is relatively large, the yarn motion is unstable, and the weft insertion is often interrupted during operation. In order to study the characteristics of the semi-closed flow field in profiled dents, the momentum conservation equation is modified and the model parameters and boundary conditions are set. Comparing different values of r, the ratio of the profiled dent's thickness to its gap, the results show that the smaller r is, the smaller the velocity fluctuations of the airflow are. When the angle of the profiled dents α is close to zero, the diffusion of the airflow is reduced. An experiment was also conducted to verify the simulation results, using a high-speed camera and a pressure sensor at the profiled dents. The airflow characteristics of the semi-closed irregular narrow flow channel presented in this paper provide a theoretical basis for optimizing the weft insertion process of the air-jet loom.

  6. Prediction of Viking lander camera image quality

    NASA Technical Reports Server (NTRS)

    Huck, F. O.; Burcher, E. E.; Jobson, D. J.; Wall, S. D.

    1976-01-01

    Formulations are presented that permit prediction of image quality as a function of camera performance, surface radiance properties, and lighting and viewing geometry. Predictions made for a wide range of surface radiance properties reveal that image quality depends strongly on proper camera dynamic range command and on favorable lighting and viewing geometry. Proper camera dynamic range commands depend mostly on the surface albedo that will be encountered. Favorable lighting and viewing geometries depend mostly on lander orientation with respect to the diurnal sun path over the landing site, and tend to be independent of surface albedo and illumination scattering function. Side lighting with low sun elevation angles (10 to 30 deg) is generally favorable for imaging spatial details and slopes, whereas high sun elevation angles are favorable for measuring spectral reflectances.

  7. Tower Camera Handbook

    SciTech Connect

    Moudry, D

    2005-01-01

    The tower camera in Barrow provides hourly images of ground surrounding the tower. These images may be used to determine fractional snow cover as winter arrives, for comparison with the albedo that can be calculated from downward-looking radiometers, as well as some indication of present weather. Similarly, during spring time, the camera images show the changes in the ground albedo as the snow melts. The tower images are saved in hourly intervals. In addition, two other cameras, the skydeck camera in Barrow and the piling camera in Atqasuk, show the current conditions at those sites.

  8. 10. CONSTRUCTION OF FIXED CAMERA STATION 1100 (BUILDING NO. 42020) ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    10. CONSTRUCTION OF FIXED CAMERA STATION 1100 (BUILDING NO. 42020) LOOKING NORTHEAST SHOWING CONCRETE FOUNDATION, WOOD FORMWORK AND STEEL REINFORCING, March 26, 1948. (Original photograph in possession of Dave Willis, San Diego, California.) - Variable Angle Launcher Complex, Camera Stations, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  9. 9. COMPLETED ROLLING CAMERA CAR ON RAILROAD TRACK AND BRIDGE ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    9. COMPLETED ROLLING CAMERA CAR ON RAILROAD TRACK AND BRIDGE LOOKING WEST, APRIL 26, 1948. (ORIGINAL PHOTOGRAPH IN POSSESSION OF DAVE WILLIS, SAN DIEGO, CALIFORNIA.) - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  10. Design Versatility Of The Prism Panoramic Camera: The KS-116 And KA-95 Cameras

    NASA Astrophysics Data System (ADS)

    Ruck, Richard

    1981-12-01

    The increased need for smaller and more versatile aerial reconnaissance equipment has led to the use of the KS-116 camera in the RF-4B aircraft, and the KA-95 in the RF-5E. Both cameras use a 12-inch focal length, f/4.0 lens and a 5-inch film transport. The main difference between these two cameras is their configuration. This paper describes the features of the two cameras, including: selectable scan angle, forward motion compensation (FMC), roll stabilization, exposure control, unique packaging differences and focus correction. The inherent packaging flexibility of the prism pan camera and the availability of key modules have led to multiple configurations, of which the KS-116 and KA-95 are two examples.

  11. Traffic camera system development

    NASA Astrophysics Data System (ADS)

    Hori, Toshi

    1997-04-01

    The intelligent transportation system has generated a strong need for the development of intelligent camera systems to meet the requirements of sophisticated applications, such as electronic toll collection (ETC), traffic violation detection and automatic parking lot control. In order to achieve the highest levels of accuracy in detection, these cameras must have high speed electronic shutters, high resolution, high frame rate, and communication capabilities. A progressive scan interline transfer CCD camera, with its high speed electronic shutter and resolution capabilities, provides the basic functions to meet the requirements of a traffic camera system. Unlike most industrial video imaging applications, traffic cameras must deal with harsh environmental conditions and an extremely wide range of lighting. Optical character recognition is a critical function of a modern traffic camera system, with detection and accuracy heavily dependent on the camera function. In order to operate under demanding conditions, communication and functional optimization is implemented to control cameras from a roadside computer. The camera operates with a shutter speed faster than 1/2000 sec. to capture highway traffic both day and night. Consequently, camera gain, pedestal level, shutter speed and gamma functions are controlled by a look-up table containing various parameters based on environmental conditions, particularly lighting. Lighting conditions are studied carefully, to focus only on the critical license plate surface. A unique light sensor permits accurate reading under a variety of conditions, such as a sunny day, evening, twilight, storms, etc. These camera systems are being deployed successfully in major ETC projects throughout the world.

  12. Sky camera geometric calibration using solar observations

    NASA Astrophysics Data System (ADS)

    Urquhart, Bryan; Kurtz, Ben; Kleissl, Jan

    2016-09-01

    A camera model and associated automated calibration procedure for stationary daytime sky imaging cameras is presented. The specific modeling and calibration needs are motivated by remotely deployed cameras used to forecast solar power production where cameras point skyward and use 180° fisheye lenses. Sun position in the sky and on the image plane provides a simple and automated approach to calibration; special equipment or calibration patterns are not required. Sun position in the sky is modeled using a solar position algorithm (requiring latitude, longitude, altitude and time as inputs). Sun position on the image plane is detected using a simple image processing algorithm. The performance evaluation focuses on the calibration of a camera employing a fisheye lens with an equisolid angle projection, but the camera model is general enough to treat most fixed focal length, central, dioptric camera systems with a photo objective lens. Calibration errors scale with the noise level of the sun position measurement in the image plane, but the calibration is robust across a large range of noise in the sun position. Calibration performance on clear days ranged from 0.94 to 1.24 pixels root mean square error.
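The mapping the paper calibrates can be sketched in a few lines. The following is a minimal, idealized model, assuming a perfect equisolid-angle fisheye (radial distance r = 2·f·sin(θ/2) from the principal point, with θ the sun's zenith angle); the function name, focal length in pixels, and principal-point coordinates are illustrative, and a real calibration would add distortion and misalignment terms estimated from many sun detections:

```python
import math

def sun_to_pixel(zenith_deg, azimuth_deg, f_px, cx, cy):
    """Project a sun direction (zenith and azimuth, in degrees) onto the
    image plane of an ideal equisolid-angle fisheye lens, where the radial
    distance from the principal point is r = 2*f*sin(theta/2).
    f_px is the focal length in pixels; (cx, cy) is the principal point."""
    theta = math.radians(zenith_deg)
    phi = math.radians(azimuth_deg)
    r = 2.0 * f_px * math.sin(theta / 2.0)
    return cx + r * math.cos(phi), cy + r * math.sin(phi)
```

Comparing such predicted sun positions against the detected sun positions over a day of images is what drives the automated parameter fit; a sun at zenith lands on the principal point, while the horizon (θ = 90°) lands at r ≈ 1.414·f from the center.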

  13. Single-Camera Panoramic-Imaging Systems

    NASA Technical Reports Server (NTRS)

    Lindner, Jeffrey L.; Gilbert, John

    2007-01-01

    Panoramic detection systems (PDSs) are developmental video monitoring and image-data processing systems that, as their name indicates, acquire panoramic views. More specifically, a PDS acquires images from an approximately cylindrical field of view that surrounds an observation platform. The main subsystems and components of a basic PDS are a charge-coupled- device (CCD) video camera and lens, transfer optics, a panoramic imaging optic, a mounting cylinder, and an image-data-processing computer. The panoramic imaging optic is what makes it possible for the single video camera to image the complete cylindrical field of view; in order to image the same scene without the benefit of the panoramic imaging optic, it would be necessary to use multiple conventional video cameras, which have relatively narrow fields of view.

  14. Ultra-fast framing camera tube

    DOEpatents

    Kalibjian, Ralph

    1981-01-01

    An electronic framing camera tube features focal plane image dissection and synchronized restoration of the dissected electron line images to form two-dimensional framed images. Ultra-fast framing is performed by first streaking a two-dimensional electron image across a narrow slit, thereby dissecting the two-dimensional electron image into sequential electron line images. The dissected electron line images are then restored into a framed image by a restorer deflector operated synchronously with the dissector deflector. The number of framed images on the tube's viewing screen is equal to the number of dissecting slits in the tube. The distinguishing features of this ultra-fast framing camera tube are the focal plane dissecting slits, and the synchronously-operated restorer deflector which restores the dissected electron line images into a two-dimensional framed image. The framing camera tube can produce image frames having high spatial resolution of optical events in the sub-100 picosecond range.

  15. Microchannel plate streak camera

    DOEpatents

    Wang, Ching L.

    1989-01-01

    An improved streak camera in which a microchannel plate electron multiplier is used in place of or in combination with the photocathode used in prior streak cameras. The improved streak camera is far more sensitive to photons (UV to gamma-rays) than the conventional x-ray streak camera which uses a photocathode. The improved streak camera offers gamma-ray detection with high temporal resolution. It also offers low-energy x-ray detection without attenuation inside the cathode. Using the microchannel plate in the improved camera has resulted in a time resolution of about 150 ps, and has provided a sensitivity sufficient for 1000 keV x-rays.

  16. GRACE star camera noise

    NASA Astrophysics Data System (ADS)

    Harvey, Nate

    2016-08-01

    Extending results from previous work by Bandikova et al. (2012) and Inacio et al. (2015), this paper analyzes Gravity Recovery and Climate Experiment (GRACE) star camera attitude measurement noise by processing inter-camera quaternions from 2003 to 2015. We describe a correction to star camera data, which will eliminate a several-arcsec twice-per-rev error with daily modulation, currently visible in the auto-covariance function of the inter-camera quaternion, from future GRACE Level-1B product releases. We also present evidence supporting the argument that thermal conditions/settings affect long-term inter-camera attitude biases by at least tens-of-arcsecs, and that several-to-tens-of-arcsecs per-rev star camera errors depend largely on field-of-view.

  17. Microchannel plate streak camera

    DOEpatents

    Wang, C.L.

    1984-09-28

    An improved streak camera in which a microchannel plate electron multiplier is used in place of or in combination with the photocathode used in prior streak cameras. The improved streak camera is far more sensitive to photons (uv to gamma-rays) than the conventional x-ray streak camera which uses a photocathode. The improved streak camera offers gamma-ray detection with high temporal resolution. It also offers low-energy x-ray detection without attenuation inside the cathode. Using the microchannel plate in the improved camera has resulted in a time resolution of about 150 ps, and has provided a sensitivity sufficient for 1000 keV x-rays.

  18. Microchannel plate streak camera

    DOEpatents

    Wang, C.L.

    1989-03-21

    An improved streak camera in which a microchannel plate electron multiplier is used in place of or in combination with the photocathode used in prior streak cameras is disclosed. The improved streak camera is far more sensitive to photons (UV to gamma-rays) than the conventional x-ray streak camera which uses a photocathode. The improved streak camera offers gamma-ray detection with high temporal resolution. It also offers low-energy x-ray detection without attenuation inside the cathode. Using the microchannel plate in the improved camera has resulted in a time resolution of about 150 ps, and has provided a sensitivity sufficient for 1,000 keV x-rays. 3 figs.

  19. Angle Performance on Optima XE

    SciTech Connect

    David, Jonathan; Satoh, Shu

    2011-01-07

    Angle control on high energy implanters is important due to shrinking device dimensions, and sensitivity to channeling at high beam energies. On Optima XE, beam-to-wafer angles are controlled in both the horizontal and vertical directions. In the horizontal direction, the beam angle is measured through a series of narrow slits, and any angle adjustment is made by steering the beam with the corrector magnet. In the vertical direction, the beam angle is measured through a high aspect ratio mask, and any angle adjustment is made by slightly tilting the wafer platen during implant. Using a sensitive channeling condition, we were able to quantify the angle repeatability of Optima XE. By quantifying the sheet resistance sensitivity to both horizontal and vertical angle variation, the total angle variation was calculated as 0.04 deg. (1σ). Implants were run over a five week period, with all of the wafers selected from a single boule, in order to control for any crystal cut variation.

  20. Analytical multicollimator camera calibration

    USGS Publications Warehouse

    Tayman, W.P.

    1978-01-01

    Calibration with the U.S. Geological Survey multicollimator determines the calibrated focal length, the point of symmetry, the radial distortion referred to the point of symmetry, and the asymmetric characteristics of the camera lens. For this project, two cameras were calibrated, a Zeiss RMK A 15/23 and a Wild RC 8. Four test exposures were made with each camera. Results are tabulated for each exposure and averaged for each set. Copies of the standard USGS calibration reports are included. © 1978.

  1. Streak camera meeting summary

    SciTech Connect

    Dolan, Daniel H.; Bliss, David E.

    2014-09-01

    Streak cameras are important for high-speed data acquisition in single event experiments, where the total recorded information (I) is shared between the number of measurements (M) and the number of samples (S). Topics of this meeting included: streak camera use at the national laboratories; current streak camera production; new tube developments and alternative technologies; and future planning. Each topic is summarized in the following sections.

  2. Ringfield lithographic camera

    DOEpatents

    Sweatt, William C.

    1998-01-01

    A projection lithography camera is presented with a wide ringfield optimized so as to make efficient use of extreme ultraviolet radiation from a large area radiation source (e.g., D_source ≈ 0.5 mm). The camera comprises four aspheric mirrors optically arranged on a common axis of symmetry with an increased etendue for the camera system. The camera includes an aperture stop that is accessible through a plurality of partial aperture stops to synthesize the theoretical aperture stop. Radiation from a mask is focused to form a reduced image on a wafer, relative to the mask, by reflection from the four aspheric mirrors.

  3. Digital Electronic Still Camera

    NASA Technical Reports Server (NTRS)

    Holland, Samuel D.; Yeates, Herbert D.

    1993-01-01

    Digital electronic still camera part of electronic recording, processing, transmitting, and displaying system. Removable hard-disk drive in camera serves as digital electronic equivalent of photographic film. Images viewed, analyzed, or transmitted quickly. Camera takes images of nearly photographic quality and stores them in digital form. Portable, hand-held, battery-powered unit designed for scientific use. Camera used in conjunction with playback unit also serving as transmitting unit if images sent to remote station. Remote station equipped to store, process, and display images. Digital image data encoded with error-correcting code at playback/transmitting unit for error-free transmission to remote station.

  4. LSST Camera Optics Design

    SciTech Connect

    Riot, V J; Olivier, S; Bauman, B; Pratuch, S; Seppala, L; Gilmore, D; Ku, J; Nordby, M; Foss, M; Antilogus, P; Morgado, N

    2012-05-24

    The Large Synoptic Survey Telescope (LSST) uses a novel, three-mirror, telescope design feeding a camera system that includes a set of broad-band filters and three refractive corrector lenses to produce a flat field at the focal plane with a wide field of view. Optical design of the camera lenses and filters is integrated in with the optical design of telescope mirrors to optimize performance. We discuss the rationale for the LSST camera optics design, describe the methodology for fabricating, coating, mounting and testing the lenses and filters, and present the results of detailed analyses demonstrating that the camera optics will meet their performance goals.

  5. Digital camera simulation.

    PubMed

    Farrell, Joyce E; Catrysse, Peter B; Wandell, Brian A

    2012-02-01

    We describe a simulation of the complete image processing pipeline of a digital camera, beginning with a radiometric description of the scene captured by the camera and ending with a radiometric description of the image rendered on a display. We show that there is a good correspondence between measured and simulated sensor performance. Through the use of simulation, we can quantify the effects of individual digital camera components on system performance and image quality. This computational approach can be helpful for both camera design and image quality assessment.

  6. The PAU camera at the WHT

    NASA Astrophysics Data System (ADS)

    Padilla, Cristóbal; Ballester, Otger; Cardiel-Sas, Laia; Carretero, Jorge; Casas, Ricard; Castilla, Javier; Croce, Martin; Delfino, Manuel; Eriksen, Martin; Fernández, Enrique; Fosalba, Pablo; García-Bellido, Juan; Gaztañaga, Enrique; Grañena, Ferràn; Hernández, Cales; Jiménez, Jorge; Lopez, Luis; Martí, Pol; Miquel, Ramon; Niessner, Christian; Pío, Cristóbal; Ponce, Rafael; Sánchez, Eusebio; Serrano, Santiago; Sevilla, Ignacio; Tonello, Nadia; de Vicente, Juan

    2016-08-01

    The PAU (Physics of the Accelerating Universe) project goal is the study of dark energy with a new photometric technique aiming at obtaining photo-z resolution for Luminous Red Galaxies (LRGs) roughly one order of magnitude better than current photometric surveys. To accomplish this, a new large field of view camera (PAUCam) has been built and commissioned at the William Herschel Telescope (WHT). With the current WHT corrector, the camera covers a 1 degree diameter Field of View (FoV). The focal plane consists of 18 2k x 4k Hamamatsu fully depleted CCDs, with high quantum efficiency up to 1 μm. To maximize the detector coverage within the FoV, filters are placed in front of the CCDs inside the camera cryostat (made of carbon fiber) using a challenging movable tray system. The camera uses a set of 40 narrow-band filters ranging from 4400 to 8600 angstroms, complemented with six standard broad-band filters, ugrizY. Here, we describe the camera and its first commissioning results. The PAU project aims to cover roughly 100 square degrees and to obtain accurate photometric redshifts for galaxies down to i_AB ≈ 22.5, also detecting galaxies down to i_AB ≈ 24 with less precision in redshift. With this data set we will obtain competitive constraints on cosmological parameters using both weak lensing and galaxy clustering as the main observational probes.

  7. Improved Tracking of Targets by Cameras on a Mars Rover

    NASA Technical Reports Server (NTRS)

    Kim, Won; Ansar, Adnan; Steele, Robert

    2007-01-01

    A paper describes a method devised to increase the robustness and accuracy of tracking of targets by means of three stereoscopic pairs of video cameras on a Mars-rover-type exploratory robotic vehicle. Two of the camera pairs are mounted on a mast that can be adjusted in pan and tilt; the third camera pair is mounted on the main vehicle body. Elements of the method include a mast calibration, a camera-pointing algorithm, and a purely geometric technique for handing off tracking between different camera pairs at critical distances as the rover approaches a target of interest. The mast calibration is an extension of camera calibration in which the camera images of calibration targets at known positions are collected at various pan and tilt angles. In the camera-pointing algorithm, pan and tilt angles are computed by a closed-form, non-iterative solution of inverse kinematics of the mast combined with mathematical models of the cameras. The purely geometric camera-handoff technique involves the use of stereoscopic views of a target of interest in conjunction with the mast calibration.
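The closed-form pan/tilt solution has a familiar core. Below is a minimal sketch, assuming an idealized pan-tilt unit with no joint offsets and a target expressed in the unit's own frame; the function name and frame conventions are illustrative, and the actual mast kinematics in the paper additionally incorporate the calibrated offsets and camera models:

```python
import math

def pan_tilt_to_target(x, y, z):
    """Closed-form pan and tilt angles (radians) that point a camera at a
    target (x, y, z) in the pan-tilt unit's own frame: pan rotates about
    the vertical axis, tilt elevates above the horizontal plane.
    Simplified sketch; real mast kinematics include calibrated offsets."""
    pan = math.atan2(y, x)
    tilt = math.atan2(z, math.hypot(x, y))
    return pan, tilt
```

Because the solution is non-iterative, it is cheap enough to re-evaluate at every tracking update as the stereo-derived target position changes.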

  8. CCD Luminescence Camera

    NASA Technical Reports Server (NTRS)

    Janesick, James R.; Elliott, Tom

    1987-01-01

    New diagnostic tool used to understand performance and failures of microelectronic devices. Microscope integrated with low-noise charge-coupled-device (CCD) camera to produce new instrument for analyzing performance and failures of microelectronic devices that emit infrared light during operation. CCD camera also used to identify very clearly failed parts where luminescence is typically found.

  9. Constrained space camera assembly

    DOEpatents

    Heckendorn, F.M.; Anderson, E.K.; Robinson, C.W.; Haynes, H.B.

    1999-05-11

    A constrained space camera assembly which is intended to be lowered through a hole into a tank, a borehole or another cavity is disclosed. The assembly includes a generally cylindrical chamber comprising a head and a body and a wiring-carrying conduit extending from the chamber. Means are included in the chamber for rotating the body about the head without breaking an airtight seal formed therebetween. The assembly may be pressurized and accompanied with a pressure sensing means for sensing if a breach has occurred in the assembly. In one embodiment, two cameras, separated from their respective lenses, are installed on a mounting apparatus disposed in the chamber. The mounting apparatus includes means allowing both longitudinal and lateral movement of the cameras. Moving the cameras longitudinally focuses the cameras, and moving the cameras laterally away from one another effectively converges the cameras so that close objects can be viewed. The assembly further includes means for moving lenses of different magnification forward of the cameras. 17 figs.

  10. Compact Solar Camera.

    ERIC Educational Resources Information Center

    Juergens, Albert

    1980-01-01

    Describes a compact solar camera built as a one-semester student project. This camera is used for taking pictures of the sun and moon and for direct observation of the image of the sun on a screen. (Author/HM)

  11. Camera Operator and Videographer

    ERIC Educational Resources Information Center

    Moore, Pam

    2007-01-01

    Television, video, and motion picture camera operators produce images that tell a story, inform or entertain an audience, or record an event. They use various cameras to shoot a wide range of material, including television series, news and sporting events, music videos, motion pictures, documentaries, and training sessions. Those who film or…

  12. Dry imaging cameras.

    PubMed

    Indrajit, Ik; Alam, Aftab; Sahni, Hirdesh; Bhatia, Mukul; Sahu, Samaresh

    2011-04-01

    Dry imaging cameras are important hard copy devices in radiology. Using a dry imaging camera, multiformat images of digital modalities in radiology are created from a sealed unit of unexposed films. The functioning of a modern dry camera involves a blend of concurrent processes drawn from diverse sciences such as computing, mechanics, thermal physics, optics, electricity and radiography. Broadly, hard copy devices are classified as laser-based and non-laser-based technology. Compared with the working knowledge and technical awareness of other modalities in radiology, the understanding of a dry imaging camera is often superficial and neglected. To fill this void, this article outlines the key features of a modern dry camera and the important issues that impact radiology workflow.

  13. Impact of CCD camera SNR on polarimetric accuracy.

    PubMed

    Chen, Zhenyue; Wang, Xia; Pacheco, Shaun; Liang, Rongguang

    2014-11-10

    A comprehensive charge-coupled device (CCD) camera noise model is employed to study the impact of CCD camera signal-to-noise ratio (SNR) on polarimetric accuracy. The study shows that the standard deviations of the measured degree of linear polarization (DoLP) and angle of linear polarization (AoLP) depend mainly on the camera SNR. As the camera SNR increases, both the measurement errors and the standard deviations caused by the CCD camera noise decrease. When the DoLP of the incident light is smaller than 0.1, the camera SNR should be at least 75 to achieve a measurement error of less than 0.01. When the input DoLP is larger than 0.5, an SNR of 15 is sufficient to achieve the same measurement accuracy. An experiment is carried out to verify the simulation results.
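The quantities whose noise sensitivity is studied above come from the standard four-measurement linear Stokes reduction. The sketch below shows that reduction, assuming intensities recorded behind a linear polarizer at 0°, 45°, 90° and 135° (the function name is illustrative); camera noise in each intensity propagates through these formulas into DoLP and AoLP, which is what ties the result to sensor SNR:

```python
import math

def linear_stokes(i0, i45, i90, i135):
    """Linear Stokes parameters and derived DoLP/AoLP from intensities
    measured behind a linear polarizer at 0, 45, 90 and 135 degrees.
    S0 = (I0+I45+I90+I135)/2, S1 = I0-I90, S2 = I45-I135."""
    s0 = 0.5 * (i0 + i45 + i90 + i135)
    s1 = i0 - i90
    s2 = i45 - i135
    dolp = math.hypot(s1, s2) / s0           # degree of linear polarization
    aolp = 0.5 * math.atan2(s2, s1)          # angle of linear polarization, rad
    return dolp, aolp
```

For fully horizontally polarized light (I0 = 1, I45 = I135 = 0.5, I90 = 0) this yields DoLP = 1 and AoLP = 0, as expected.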

  14. Angle detector

    NASA Technical Reports Server (NTRS)

    Parra, G. T. (Inventor)

    1978-01-01

    An angle detector for determining a transducer's angular disposition to a capacitive pickup element is described. The transducer comprises a pendulum-mounted inductive element moving past the capacitive pickup element. The capacitive pickup element divides the inductive element into two parts, L1 and L2, which form the arms of one side of an a-c bridge. Two networks, R1 and R2, each having a plurality of binary-weighted resistors and an equal number of digitally controlled switches for removing resistors from the networks, form the arms of the other side of the a-c bridge. A binary counter, controlled by a phase detector, balances the bridge by adjusting the resistance of R1 and R2. The binary output of the counter is representative of the angle.
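The balancing loop described above can be sketched as a successive-approximation search over the binary-weighted network. This is a hedged illustration, not the patent's circuit: the function name, bit width and the idealized assumption that the bridge null occurs where the digital fraction equals the inductive divider ratio L1/(L1+L2) are all mine; the real device reaches the same code with a phase detector driving a binary counter:

```python
def balance_bridge(ratio, bits=8):
    """Find the digital code whose fraction code/2**bits best matches the
    inductive divider ratio L1/(L1+L2), trying bits from MSB to LSB the way
    a successive-approximation loop settles a binary-weighted network."""
    code = 0
    for bit in reversed(range(bits)):
        trial = code | (1 << bit)
        if trial / (1 << bits) <= ratio:   # bridge still on the low side
            code = trial                    # keep this bit switched in
    return code
```

The returned code plays the role of the counter's binary output, which the patent states is representative of the angle.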

  15. Compact and robust linear Stokes polarization camera

    NASA Astrophysics Data System (ADS)

    Vedel, M.; Lechocinski, N.; Breugnot, S.

    2010-06-01

    We present novel applications of the Bossa Nova Technologies linear Stokes polarization camera. The SALSA camera performs live measurement of the linear Stokes parameters, the usual polarization parameters (Degree Of Linear Polarization and Angle Of Polarization) and other polarization-based parameters (polarized image, depolarized image, virtual polarizer, polarization difference). First, a brief description of the SALSA camera and its calibration is given. Then we present and discuss several results of target detection and contrast enhancement experiments. We also introduce a novel polarization-based metrological method of 3D shape measurement for in-line control of optical surfaces, including highly aspheric optical surfaces. The architecture of the hardware and the calibration results are presented. A new algorithm based on polarization imaging leading to the construction of the gradient field is described. Finally, experimental results and observations as well as possible further steps are discussed.

  16. Ringfield lithographic camera

    DOEpatents

    Sweatt, W.C.

    1998-09-08

    A projection lithography camera is presented with a wide ringfield optimized so as to make efficient use of extreme ultraviolet radiation from a large area radiation source (e.g., D_source ≈ 0.5 mm). The camera comprises four aspheric mirrors optically arranged on a common axis of symmetry. The camera includes an aperture stop that is accessible through a plurality of partial aperture stops to synthesize the theoretical aperture stop. Radiation from a mask is focused to form a reduced image on a wafer, relative to the mask, by reflection from the four aspheric mirrors. 11 figs.

  17. Night Vision Camera

    NASA Technical Reports Server (NTRS)

    1996-01-01

    PixelVision, Inc. developed the Night Video NV652 Back-illuminated CCD Camera, based on the expertise of a former Jet Propulsion Laboratory employee and a former employee of Scientific Imaging Technologies, Inc. The camera operates without an image intensifier, using back-illuminated and thinned CCD technology to achieve extremely low light level imaging performance. The advantages of PixelVision's system over conventional cameras include greater resolution and better target identification under low light conditions, lower cost and a longer lifetime. It is used commercially for research and aviation.

  18. Kitt Peak speckle camera

    NASA Technical Reports Server (NTRS)

    Breckinridge, J. B.; Mcalister, H. A.; Robinson, W. G.

    1979-01-01

    The speckle camera in regular use at Kitt Peak National Observatory since 1974 is described in detail. The design of the atmospheric dispersion compensation prisms, the use of film as a recording medium, the accuracy of double star measurements, and the next generation speckle camera are discussed. Photographs of double star speckle patterns with separations from 1.4 sec of arc to 4.7 sec of arc are shown to illustrate the quality of image formation with this camera, the effects of seeing on the patterns, and to illustrate the isoplanatic patch of the atmosphere.

  19. Do speed cameras reduce collisions?

    PubMed

    Skubic, Jeffrey; Johnson, Steven B; Salvino, Chris; Vanhoy, Steven; Hu, Chengcheng

    2013-01-01

    We investigated the effects of speed cameras along a 26 mile segment in metropolitan Phoenix, Arizona. Motor vehicle collisions were retrospectively identified according to three time periods - before cameras were placed, while cameras were in place, and after cameras were removed. A 14 mile segment in the same area without cameras was used for control purposes. Five confounding variables were eliminated. In this study, the placement or removal of interstate highway speed cameras did not independently affect the incidence of motor vehicle collisions.

  20. Status of the Los Alamos Anger camera

    NASA Astrophysics Data System (ADS)

    Seeger, P. A.; Nutter, M. J.

    Results of preliminary tests of the neutron Anger camera being developed at Los Alamos are presented. This detector uses a unique encoding scheme involving parallel processing of multiple receptive fields. Design goals have not yet been met, but the results are very encouraging and improvements in the test procedures are expected to show that the detector will be ready for use on a small-angle scattering instrument next year.

  1. Advanced CCD camera developments

    SciTech Connect

    Condor, A.

    1994-11-15

    Two charge coupled device (CCD) camera systems are introduced and discussed, briefly describing the hardware involved and the data obtained in their various applications. The Advanced Development Group of the Defense Sciences Engineering Division has been actively designing, manufacturing, and fielding state-of-the-art CCD camera systems for over a decade. These systems were originally developed for the nuclear test program to record data from underground nuclear tests. Today, new and interesting applications for these systems have surfaced, and development is continuing in the area of advanced CCD camera systems, including a new CCD camera that will allow experimenters to replace film for x-ray imaging at the JANUS, USP, and NOVA laser facilities.

  2. The MKID Camera

    NASA Astrophysics Data System (ADS)

    Maloney, P. R.; Czakon, N. G.; Day, P. K.; Duan, R.; Gao, J.; Glenn, J.; Golwala, S.; Hollister, M.; LeDuc, H. G.; Mazin, B.; Noroozian, O.; Nguyen, H. T.; Sayers, J.; Schlaerth, J.; Vaillancourt, J. E.; Vayonakis, A.; Wilson, P.; Zmuidzinas, J.

    2009-12-01

    The MKID Camera project is a collaborative effort of Caltech, JPL, the University of Colorado, and UC Santa Barbara to develop a large-format, multi-color millimeter and submillimeter-wavelength camera for astronomy using microwave kinetic inductance detectors (MKIDs). These are superconducting, micro-resonators fabricated from thin aluminum and niobium films. We couple the MKIDs to multi-slot antennas and measure the change in surface impedance produced by photon-induced breaking of Cooper pairs. The readout is almost entirely at room temperature and can be highly multiplexed; in principle hundreds or even thousands of resonators could be read out on a single feedline. The camera will have 576 spatial pixels that image simultaneously in four bands at 750, 850, 1100 and 1300 microns. It is scheduled for deployment at the Caltech Submillimeter Observatory in the summer of 2010. We present an overview of the camera design and readout and describe the current status of testing and fabrication.

  3. The Complementary Pinhole Camera.

    ERIC Educational Resources Information Center

    Bissonnette, D.; And Others

    1991-01-01

    Presents an experiment based on the principles of rectilinear motion of light operating in a pinhole camera that projects the image of an illuminated object through a small hole in a sheet to an image screen. (MDH)

  4. Calibration of action cameras for photogrammetric purposes.

    PubMed

    Balletti, Caterina; Guerra, Francesco; Tsioukas, Vassilios; Vernier, Paolo

    2014-09-18

    The use of action cameras for photogrammetry purposes is not widespread due to the fact that until recently the images provided by the sensors, using either still or video capture mode, were not big enough to perform and provide the appropriate analysis with the necessary photogrammetric accuracy. However, several manufacturers have recently produced and released new lightweight devices which are: (a) easy to handle, (b) capable of performing under extreme conditions and more importantly (c) able to provide both still images and video sequences of high resolution. In order to be able to use the sensor of action cameras we must apply a careful and reliable self-calibration prior to the use of any photogrammetric procedure, a relatively difficult scenario because of the short focal length of the camera and its wide angle lens that is used to obtain the maximum possible resolution of images. Special software, using functions of the OpenCV library, has been created to perform both the calibration and the production of undistorted scenes for each one of the still and video image capturing modes of a novel action camera, the GoPro Hero 3 camera, which can provide still images up to 12 Mp and video up to 8 Mp resolution.
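The undistortion step such a pipeline performs can be illustrated without the library itself. Below is a pure-Python sketch of iteratively inverting the Brown-Conrady distortion model (radial k1, k2 and tangential p1, p2 coefficients of the kind OpenCV-style calibration estimates); the function name, coefficient values and iteration count are illustrative, and in practice one would simply call the library's own calibration and undistortion routines:

```python
import math

def undistort_normalized(xd, yd, k1, k2, p1, p2, iters=5):
    """Map a distorted normalized image point (xd, yd) back to its
    undistorted position by fixed-point iteration of the Brown-Conrady
    model: distorted = undistorted*radial + tangential terms."""
    x, y = xd, yd
    for _ in range(iters):
        r2 = x * x + y * y
        radial = 1.0 + k1 * r2 + k2 * r2 * r2
        dx = 2.0 * p1 * x * y + p2 * (r2 + 2.0 * x * x)
        dy = p1 * (r2 + 2.0 * y * y) + 2.0 * p2 * x * y
        x = (xd - dx) / radial
        y = (yd - dy) / radial
    return x, y
```

For the moderate distortion of a wide-angle action camera near the image center, a handful of iterations recovers the undistorted point to sub-pixel accuracy; strong fisheye distortion at the edges needs a dedicated fisheye model instead.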

  5. Calibration of Action Cameras for Photogrammetric Purposes

    PubMed Central

    Balletti, Caterina; Guerra, Francesco; Tsioukas, Vassilios; Vernier, Paolo

    2014-01-01

    The use of action cameras for photogrammetry purposes is not widespread because, until recently, the images provided by the sensors, in either still or video capture mode, were not large enough to support analysis with the necessary photogrammetric accuracy. However, several manufacturers have recently produced and released new lightweight devices which are: (a) easy to handle, (b) capable of performing under extreme conditions and, more importantly, (c) able to provide both still images and video sequences of high resolution. In order to use the sensor of an action camera, a careful and reliable self-calibration must be applied prior to any photogrammetric procedure, a relatively difficult scenario because of the camera's short focal length and the wide-angle lens used to obtain the maximum possible image resolution. Special software, using functions of the OpenCV library, has been created to perform both the calibration and the production of undistorted scenes for each of the still and video image capture modes of a novel action camera, the GoPro Hero 3, which can provide still images up to 12 Mp and video up to 8 Mp resolution. PMID:25237898

  6. Calibration Procedures on Oblique Camera Setups

    NASA Astrophysics Data System (ADS)

    Kemper, G.; Melykuti, B.; Yu, C.

    2016-06-01

    the nadir camera and the GPS/IMU data, an initial orientation correction and radial correction were calculated. With this approach, the whole project was calculated and calibrated in one step. During the iteration process the radial and tangential parameters were switched on individually for the camera heads, and after that the camera constants and principal point positions were checked and finally calibrated. Besides that, the boresight calibration can be performed either on the basis of the nadir camera and its offsets, or independently for each camera without correlation to the others. This must be performed on a complete mission anyway to obtain stability between the single camera heads. Determining the lever arms from the nodal points to the IMU centre needs more caution than for a single camera, especially due to the strong tilt angle. With all these preparatory steps complete, the result is a highly accurate sensor that enables fully automated data extraction and rapid updates of existing data. Frequent monitoring of urban dynamics is then possible in a fully 3D environment.
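    The lever-arm caution above comes from the mount tilt: an IMU-to-camera offset measured in the platform frame must be rotated through the oblique camera's tilt angle before it is applied. A hedged sketch of that one step (the lever-arm vector and 45° tilt are invented for illustration):

```python
import math

def rotate_about_y(v, angle_deg):
    """Rotate vector v = (x, y, z) about the y (pitch) axis."""
    a = math.radians(angle_deg)
    x, y, z = v
    return (x * math.cos(a) + z * math.sin(a),
            y,
            -x * math.sin(a) + z * math.cos(a))

# Hypothetical lever arm (metres) from the IMU centre to an oblique
# camera head's nodal point, and a 45-degree mount tilt.
lever_body = (0.10, 0.05, 0.30)
lever_tilted = rotate_about_y(lever_body, 45.0)
print(lever_tilted)
```

    With a strong tilt, a small error in the measured offset projects into both the along-track and vertical components at once, which is why the abstract flags this step as needing extra care.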

  7. Spacecraft camera image registration

    NASA Technical Reports Server (NTRS)

    Kamel, Ahmed A. (Inventor); Graul, Donald W. (Inventor); Chan, Fred N. T. (Inventor); Gamble, Donald W. (Inventor)

    1987-01-01

    A system for achieving spacecraft camera (1, 2) image registration comprises a portion external to the spacecraft and an image motion compensation system (IMCS) portion onboard the spacecraft. Within the IMCS, a computer (38) calculates an image registration compensation signal (60) which is sent to the scan control loops (84, 88, 94, 98) of the onboard cameras (1, 2). At the location external to the spacecraft, the long-term orbital and attitude perturbations on the spacecraft are modeled. Coefficients (K, A) from this model are periodically sent to the onboard computer (38) by means of a command unit (39). The coefficients (K, A) take into account observations of stars and landmarks made by the spacecraft cameras (1, 2) themselves. The computer (38) takes as inputs the updated coefficients (K, A) plus synchronization information indicating the mirror position (AZ, EL) of each of the spacecraft cameras (1, 2), operating mode, and starting and stopping status of the scan lines generated by these cameras (1, 2), and generates in response thereto the image registration compensation signal (60). The sources of periodic thermal errors on the spacecraft are discussed. The system is checked by calculating measurement residuals, the difference between the landmark and star locations predicted at the external location and the landmark and star locations as measured by the spacecraft cameras (1, 2).

  8. Neutron cameras for ITER

    SciTech Connect

    Johnson, L.C.; Barnes, C.W.; Batistoni, P.

    1998-12-31

    Neutron cameras with horizontal and vertical views have been designed for ITER, based on systems used on JET and TFTR. The cameras consist of fan-shaped arrays of collimated flight tubes, with suitably chosen detectors situated outside the biological shield. The sight lines view the ITER plasma through slots in the shield blanket and penetrate the vacuum vessel, cryostat, and biological shield through stainless steel windows. This paper analyzes the expected performance of several neutron camera arrangements for ITER. In addition to the reference designs, the authors examine proposed compact cameras, in which neutron fluxes are inferred from ¹⁶N decay gammas in dedicated flowing water loops, and conventional cameras with fewer sight lines and more limited fields of view than in the reference designs. It is shown that the spatial sampling provided by the reference designs is sufficient to satisfy target measurement requirements and that some reduction in field of view may be permissible. The accuracy of measurements with ¹⁶N-based compact cameras is not yet established, and they fail to satisfy requirements for parameter range and time resolution by large margins.

  9. Cartography of the Luna-21 landing site and Lunokhod-2 traverse area based on Lunar Reconnaissance Orbiter Camera images and surface archive TV-panoramas

    NASA Astrophysics Data System (ADS)

    Karachevtseva, I. P.; Kozlova, N. A.; Kokhanov, A. A.; Zubarev, A. E.; Nadezhdina, I. E.; Patratiy, V. D.; Konopikhin, A. A.; Basilevsky, A. T.; Abdrakhimov, A. M.; Oberst, J.; Haase, I.; Jolliff, B. L.; Plescia, J. B.; Robinson, M. S.

    2017-02-01

    The Lunar Reconnaissance Orbiter Camera (LROC) system consists of a Wide Angle Camera (WAC) and Narrow Angle Camera (NAC). NAC images (∼0.5 to 1.7 m/pixel) reveal details of the Luna-21 landing site and Lunokhod-2 traverse area. We derived a Digital Elevation Model (DEM) and an orthomosaic for the study region using photogrammetric stereo processing techniques with NAC images. The DEM and mosaic allowed us to analyze the topography and morphology of the landing site area and to map the Lunokhod-2 rover route. The total range of topographic elevation along the traverse was found to be less than 144 m, and the rover encountered slopes of up to 20°. With the orthomosaic tied to the lunar reference frame, we derived coordinates of the Lunokhod-2 landing module and overnight stop points. We identified the exact rover route by following its tracks and determined its total length to be 39.16 km, more than was estimated during the mission (37 km). Until recently this stood as a distance record for planetary robotic rovers, held for more than 40 years.

  10. Early Experience & Multisensory Perceptual Narrowing

    PubMed Central

    Lewkowicz, David J.

    2014-01-01

    Perceptual narrowing is a reflection of early experience and contributes in key ways to perceptual and cognitive development. In general, findings have shown that unisensory perceptual sensitivity in early infancy is broadly tuned such that young infants respond to, and discriminate, native as well as non-native sensory inputs, whereas older infants only respond to native inputs. Recently, my colleagues and I discovered that perceptual narrowing occurs at the multisensory processing level as well. The present article reviews this new evidence and puts it in the larger context of multisensory perceptual development and the role that perceptual experience plays in it. Together, the evidence on unisensory and multisensory narrowing shows that early experience shapes the emergence of perceptual specialization and expertise. PMID:24435505

  11. Deployable Wireless Camera Penetrators

    NASA Technical Reports Server (NTRS)

    Badescu, Mircea; Jones, Jack; Sherrit, Stewart; Wu, Jiunn Jeng

    2008-01-01

    A lightweight, low-power camera dart has been designed and tested for context imaging of sampling sites and ground surveys from an aerobot or an orbiting spacecraft in a microgravity environment. The camera penetrators can also be used to image any line-of-sight surface, such as cliff walls, that is difficult to access. Tethered cameras used to inspect the surfaces of planetary bodies require both power and signal transmission lines to operate. A tether adds the possibility of inadvertently anchoring the aerobot, and requires some form of station-keeping capability of the aerobot if extended examination time is required. The new camera penetrators are deployed without a tether, weigh less than 30 grams, and are disposable. They are designed to drop from any altitude, with the transmitting power currently demonstrated at approximately 100-m line-of-sight. The penetrators can also be deployed to monitor lander or rover operations from a distance, and can be used for surface surveys or for context information gathering from a touch-and-go sampling site. Thanks to wireless operation, the complexity of the sampling or survey mechanisms may be reduced. The penetrators may be battery powered for short-duration missions, or have solar panels for longer or intermittent-duration missions. The imaging device is embedded in the penetrator, which is dropped or projected at the surface of a study site at 90° to the surface. Mirrors can be used in the design to image the ground or the horizon. Some of the camera features were tested using commercial "nanny" or "spy" camera components with the charge-coupled device (CCD) looking in a direction parallel to the ground. Figure 1 shows components of one camera that weighs less than 8 g and occupies a volume of 11 cm3. This camera could transmit a standard television signal, including sound, up to 100 m. Figure 2 shows the CAD models of a version of the penetrator.
A low-volume array of such penetrator cameras could be deployed from an

  12. Stellar Occultations in the Coma of Comet 67P/Churyumov-Gerasimenko Observed by the OSIRIS Camera System

    NASA Astrophysics Data System (ADS)

    Moissl, Richard; Kueppers, Michael

    2016-10-01

    In this paper we present the results of an analysis of a large part of the existing image data from the OSIRIS camera system onboard the Rosetta spacecraft, in which stars of sufficient brightness (down to a limiting magnitude of 6) have been observed through the coma of Comet 67P/Churyumov-Gerasimenko ("C-G"). Over the course of the Rosetta main mission the coma of the comet underwent large changes in density and structure, owing to the changing insolation along the orbit of C-G. We report on the changes of the stellar signals in the wavelength ranges covered by the filters of the OSIRIS Narrow-Angle (NAC) and Wide-Angle (WAC) cameras. Acknowledgements: OSIRIS was built by a consortium led by the Max-Planck-Institut für Sonnensystemforschung, Göttingen, Germany, in collaboration with CISAS, University of Padova, Italy, the Laboratoire d'Astrophysique de Marseille, France, the Instituto de Astrofísica de Andalucia, CSIC, Granada, Spain, the Scientific Support Office of the European Space Agency, Noordwijk, The Netherlands, the Instituto Nacional de Técnica Aeroespacial, Madrid, Spain, the Universidad Politéchnica de Madrid, Spain, the Department of Physics and Astronomy of Uppsala University, Sweden, and the Institut für Datentechnik und Kommunikationsnetze der Technischen Universität Braunschweig, Germany.

  13. Auto-converging stereo cameras for 3D robotic tele-operation

    NASA Astrophysics Data System (ADS)

    Edmondson, Richard; Aycock, Todd; Chenault, David

    2012-06-01

    Polaris Sensor Technologies has developed a Stereovision Upgrade Kit for the TALON robot to provide enhanced depth perception to the operator. This kit previously required the TALON Operator Control Unit to be equipped with the optional touchscreen interface to allow operator control of the camera convergence angle. This adjustment allowed for optimal camera convergence independent of the distance from the camera to the object being viewed. Polaris has recently improved the performance of the stereo camera by implementing an automatic convergence algorithm in a field-programmable gate array in the camera assembly. This algorithm uses scene content to automatically adjust the camera convergence angle, freeing the operator to focus on the task rather than on adjustment of the vision system. The auto-convergence capability has been demonstrated on both visible zoom cameras and longwave infrared microbolometer stereo pairs.

  14. The VISTA IR camera

    NASA Astrophysics Data System (ADS)

    Dalton, Gavin B.; Caldwell, Martin; Ward, Kim; Whalley, Martin S.; Burke, Kevin; Lucas, John M.; Richards, Tony; Ferlet, Marc; Edeson, Ruben L.; Tye, Daniel; Shaughnessy, Bryan M.; Strachan, Mel; Atad-Ettedgui, Eli; Leclerc, Melanie R.; Gallie, Angus; Bezawada, Nagaraja N.; Clark, Paul; Bissonauth, Nirmal; Luke, Peter; Dipper, Nigel A.; Berry, Paul; Sutherland, Will; Emerson, Jim

    2004-09-01

    The VISTA IR Camera has now completed its detailed design phase and is on schedule for delivery to ESO's Cerro Paranal Observatory in 2006. The camera consists of 16 Raytheon VIRGO 2048x2048 HgCdTe arrays in a sparse focal plane sampling a 1.65 degree field of view. A 1.4m diameter filter wheel provides slots for 7 distinct science filters, each comprising 16 individual filter panes. The camera also provides autoguiding and curvature sensing information for the VISTA telescope, and relies on tight tolerancing to meet the demanding requirements of the f/1 telescope design. The VISTA IR camera is unusual in that it contains no cold pupil-stop, but rather relies on a series of nested cold baffles to constrain the light reaching the focal plane to the science beam. In this paper we present a complete overview of the status of the final IR Camera design, its interaction with the VISTA telescope, and a summary of the predicted performance of the system.

  15. THE DARK ENERGY CAMERA

    SciTech Connect

    Flaugher, B.; Diehl, H. T.; Alvarez, O.; Angstadt, R.; Annis, J. T.; Buckley-Geer, E. J.; Honscheid, K.; Abbott, T. M. C.; Bonati, M.; Antonik, M.; Brooks, D.; Ballester, O.; Cardiel-Sas, L.; Beaufore, L.; Bernstein, G. M.; Bernstein, R. A.; Bigelow, B.; Boprie, D.; Campa, J.; Castander, F. J.; Collaboration: DES Collaboration; and others

    2015-11-15

    The Dark Energy Camera is a new imager with a 2.2° diameter field of view mounted at the prime focus of the Victor M. Blanco 4 m telescope on Cerro Tololo near La Serena, Chile. The camera was designed and constructed by the Dark Energy Survey Collaboration and meets or exceeds the stringent requirements designed for the wide-field and supernova surveys for which the collaboration uses it. The camera consists of a five-element optical corrector, seven filters, a shutter with a 60 cm aperture, and a charge-coupled device (CCD) focal plane of 250 μm thick fully depleted CCDs cooled inside a vacuum Dewar. The 570 megapixel focal plane comprises 62 2k × 4k CCDs for imaging and 12 2k × 2k CCDs for guiding and focus. The CCDs have 15 μm × 15 μm pixels with a plate scale of 0.263″ pixel⁻¹. A hexapod system provides state-of-the-art focus and alignment capability. The camera is read out in 20 s with 6–9 electron readout noise. This paper provides a technical description of the camera's engineering, construction, installation, and current status.
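    The quoted plate scale fixes the sky footprint of each CCD. As a quick worked check, using only the numbers from the abstract:

```python
ARCSEC_PER_PIXEL = 0.263          # DECam plate scale, arcsec per pixel
CCD_COLS, CCD_ROWS = 2048, 4096   # one imaging CCD ("2k x 4k")

# Sky footprint of a single imaging CCD, converted to degrees.
width_deg = CCD_COLS * ARCSEC_PER_PIXEL / 3600.0
height_deg = CCD_ROWS * ARCSEC_PER_PIXEL / 3600.0
print(width_deg, height_deg)  # roughly 0.15 x 0.30 degrees per CCD
```

    Sixty-two such CCDs tile (sparsely) the 2.2° diameter field delivered by the optical corrector.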

  16. The Dark Energy Camera

    SciTech Connect

    Flaugher, B.

    2015-04-11

    The Dark Energy Camera is a new imager with a 2.2-degree diameter field of view mounted at the prime focus of the Victor M. Blanco 4-meter telescope on Cerro Tololo near La Serena, Chile. The camera was designed and constructed by the Dark Energy Survey Collaboration, and meets or exceeds the stringent requirements designed for the wide-field and supernova surveys for which the collaboration uses it. The camera consists of a five element optical corrector, seven filters, a shutter with a 60 cm aperture, and a CCD focal plane of 250-μm thick fully depleted CCDs cooled inside a vacuum Dewar. The 570 Mpixel focal plane comprises 62 2k x 4k CCDs for imaging and 12 2k x 2k CCDs for guiding and focus. The CCDs have 15μm x 15μm pixels with a plate scale of 0.263" per pixel. A hexapod system provides state-of-the-art focus and alignment capability. The camera is read out in 20 seconds with 6-9 electrons readout noise. This paper provides a technical description of the camera's engineering, construction, installation, and current status.

  17. Satellite camera image navigation

    NASA Technical Reports Server (NTRS)

    Kamel, Ahmed A. (Inventor); Graul, Donald W. (Inventor); Savides, John (Inventor); Hanson, Charles W. (Inventor)

    1987-01-01

    Pixels within a satellite camera (1, 2) image are precisely located in terms of latitude and longitude on a celestial body, such as the earth, being imaged. A computer (60) on the earth generates models (40, 50) of the satellite's orbit and attitude, respectively. The orbit model (40) is generated from measurements of stars and landmarks taken by the camera (1, 2), and by range data. The orbit model (40) is an expression of the satellite's latitude and longitude at the subsatellite point, and of the altitude of the satellite, as a function of time, using as coefficients (K) the six Keplerian elements at epoch. The attitude model (50) is based upon star measurements taken by each camera (1, 2). The attitude model (50) is a set of expressions for the deviations in a set of mutually orthogonal reference optical axes (x, y, z) as a function of time, for each camera (1, 2). Measured data is fit into the models (40, 50) using a walking least squares fit algorithm. A transformation computer (66) transforms pixel coordinates as telemetered by the camera (1, 2) into earth latitude and longitude coordinates, using the orbit and attitude models (40, 50).

  18. The Dark Energy Camera

    DOE PAGES

    Flaugher, B.

    2015-04-11

    The Dark Energy Camera is a new imager with a 2.2-degree diameter field of view mounted at the prime focus of the Victor M. Blanco 4-meter telescope on Cerro Tololo near La Serena, Chile. The camera was designed and constructed by the Dark Energy Survey Collaboration, and meets or exceeds the stringent requirements designed for the wide-field and supernova surveys for which the collaboration uses it. The camera consists of a five element optical corrector, seven filters, a shutter with a 60 cm aperture, and a CCD focal plane of 250-μm thick fully depleted CCDs cooled inside a vacuum Dewar. The 570 Mpixel focal plane comprises 62 2k x 4k CCDs for imaging and 12 2k x 2k CCDs for guiding and focus. The CCDs have 15μm x 15μm pixels with a plate scale of 0.263" per pixel. A hexapod system provides state-of-the-art focus and alignment capability. The camera is read out in 20 seconds with 6-9 electrons readout noise. This paper provides a technical description of the camera's engineering, construction, installation, and current status.

  19. Neutron counting with cameras

    SciTech Connect

    Van Esch, Patrick; Crisanti, Marta; Mutti, Paolo

    2015-07-01

    A research project is presented in which we aim at counting individual neutrons with CCD-like cameras. We explore theoretically a technique that allows us to use imaging detectors as counting detectors at lower counting rates, and transits smoothly to continuous imaging at higher counting rates. As such, the hope is to combine the good background rejection properties of standard neutron counting detectors with the absence of dead time of integrating neutron imaging cameras as well as their very good spatial resolution. Compared to X-ray detection, the essence of thermal neutron detection is the nuclear conversion reaction. The released energies involved are of the order of a few MeV, while X-ray detection releases energies of the order of the photon energy, which is in the 10 keV range. Thanks to advances in camera technology which have resulted in increased quantum efficiency, lower noise, as well as increased frame rates up to 100 fps for CMOS-type cameras, this more than 100-fold higher available detection energy implies that the individual neutron detection light signal can be significantly above the noise level, allowing for discrimination and individual counting, which is hard to achieve with X-rays. The time scale of CMOS-type cameras does not allow one to consider time-of-flight measurements, but kinetic experiments in the 10 ms range are possible. The theory is next confronted with the first experimental results. (authors)
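    The counting mode the authors aim for amounts to thresholding each frame well above the read noise, which the ~MeV conversion energy makes feasible. A toy sketch on a synthetic frame (all amplitudes, event positions, and the 10-sigma threshold are invented for illustration; the fixed seed makes the run reproducible):

```python
import random

random.seed(0)

# Synthetic 64x64 frame: Gaussian read noise everywhere, plus three
# bright single-neutron events whose amplitude stands in for the large
# conversion-reaction light yield dwarfing the noise floor.
NOISE_SIGMA = 5.0
EVENT_AMPLITUDE = 500.0
events = {(10, 12), (30, 45), (50, 7)}

frame = [[random.gauss(0.0, NOISE_SIGMA) +
          (EVENT_AMPLITUDE if (r, c) in events else 0.0)
          for c in range(64)] for r in range(64)]

# Counting mode: discriminate pixels far above the noise (here 10 sigma),
# so noise is rejected and each neutron is counted individually.
threshold = 10.0 * NOISE_SIGMA
counted = sum(1 for row in frame for px in row if px > threshold)
print(counted)  # 3
```

    At higher rates, where events start to overlap, the same frames can simply be summed, which is the smooth transition to integrating mode the abstract describes.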

  20. CAOS-CMOS camera.

    PubMed

    Riza, Nabeel A; La Torre, Juan Pablo; Amin, M Junaid

    2016-06-13

    Proposed and experimentally demonstrated is the CAOS-CMOS camera design that combines the coded access optical sensor (CAOS) imager platform with the CMOS multi-pixel optical sensor. The unique CAOS-CMOS camera engages the classic CMOS sensor light staring mode with the time-frequency-space agile pixel CAOS imager mode within one programmable optical unit to realize a high dynamic range imager for extreme light contrast conditions. The experimentally demonstrated CAOS-CMOS camera is built using a digital micromirror device, a silicon point-photo-detector with a variable gain amplifier, and a silicon CMOS sensor with a maximum rated 51.3 dB dynamic range. White-light imaging of three simultaneously viewed targets of different brightness, which is not possible with the CMOS sensor alone, is achieved by the CAOS-CMOS camera, demonstrating an 82.06 dB dynamic range. Applications for the camera include industrial machine vision, welding, laser analysis, automotive, night vision, surveillance and multispectral military systems.
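    The dynamic range figures quoted here are decibel ratios of the brightest to dimmest resolvable light level, DR_dB = 20·log10(Imax/Imin). Converting them back to plain contrast ratios (straightforward arithmetic, just to give a feel for the numbers):

```python
import math

def db_to_ratio(db):
    """Convert an optical dynamic range in dB to a max/min light ratio,
    inverting DR_dB = 20 * log10(Imax / Imin)."""
    return 10.0 ** (db / 20.0)

cmos_only = db_to_ratio(51.3)   # CMOS sensor alone: ~367:1
caos_cmos = db_to_ratio(82.06)  # CAOS-CMOS mode: ~12,700:1
print(cmos_only, caos_cmos)
```

    The roughly 30 dB gain thus corresponds to handling scenes with about 35 times more contrast than the CMOS sensor can manage on its own.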

  1. Binocular Camera for cockpit visibility of general aviation aircraft

    NASA Astrophysics Data System (ADS)

    Barile, A. J.

    1981-04-01

    A history of cockpit visibility studies and requirements with regard to aircraft safety, human factors, collision avoidance, and accident investigations is presented. The Federal Aviation Administration's development of the Binocular Camera is reviewed, and the technical details of a new and improved camera are discussed. The Binocular Camera uses two 65 mm wide-angle f/6.8 lenses and covers an 88.5° field of vision. The camera produces images representative of what the human eyes see before the brain integrates them into one, thus making it possible to analyze the effect of obstruction to vision. The improvements, applications, and uses of the camera in the research, development, and operations of general aviation aircraft are discussed.

  2. Seasonal and vertical changes in leaf angle distribution for selected deciduous broadleaf tree species common to Europe

    NASA Astrophysics Data System (ADS)

    Raabe, Kairi; Pisek, Jan; Sonnentag, Oliver; Annuk, Kalju

    2014-05-01

    Leaf inclination angle distribution is a key parameter in determining the transmission and reflection of radiation by vegetation canopies. It has been previously observed that leaf inclination angle might change gradually from more vertical in the upper canopy and in high light habitats to more horizontal in the lower canopy and in low light habitats [1]. Despite its importance, relatively few measurements of actual leaf angle distributions have been reported for different tree species. An even smaller number of studies has dealt with the possible seasonal changes in leaf angle distribution [2]. In this study the variation of leaf inclination angle distributions was examined both temporally throughout the growing season and vertically at different heights of trees. We report on leaf inclination angle distributions for five deciduous broadleaf species found commonly in several parts of Europe: grey alder (Alnus incana), Silver birch (Betula pendula Roth), chestnut (Castanea), Norway maple (Acer platanoides), and aspen (Populus tremula). The angles were measured using the leveled camera method [3], with the data collected at several separate heights and four times during the period of May-September 2013. The results generally indicate the greatest change in leaf inclination angles in spring, with the changes usually being the most pronounced at the top of the canopy. It should also be noted, however, that whereas the temporal variation proved to be rather consistent for different species, the vertical variation differed more between species. The leveled camera method was additionally tested in terms of sensitivity to different users. Ten people were asked to measure the leaf angles for four different species. The results indicate the method is quite robust in providing coinciding distributions irrespective of the user and level of previous experience with the method. However, certain caution must be exercised when measuring long narrow leaves. References: [1] G.G. Mc

  3. Streak Camera 101: Visualizing Charged-Particle Beam Dynamics

    NASA Astrophysics Data System (ADS)

    Yang, Bingxin

    2006-11-01

    Radiation generated by high-energy particle beams such as optical transition radiation (OTR) and optical synchrotron radiation (OSR) is widely used to characterize the beam properties. It has enabled dual-sweep streak cameras to visualize charged-particle beam dynamics. In this introductory paper, we will discuss properties of dual-sweep streak cameras and basic optics systems for imaging charged-particle bunches at different projection angles to produce top view, side view, and front view. Examples will be used to illustrate the beam dynamics that streak images reveal. We will also discuss practical considerations of optical transport line design to preserve spatial and temporal resolution of the camera system.

  4. Selective-imaging camera

    NASA Astrophysics Data System (ADS)

    Szu, Harold; Hsu, Charles; Landa, Joseph; Cha, Jae H.; Krapels, Keith A.

    2015-05-01

    How can we design cameras that image selectively in Full Electro-Magnetic (FEM) spectra? Without selective imaging, we cannot use, for example, ordinary tourist cameras to see through fire, smoke, or other obscurants contributing to creating a Visually Degraded Environment (VDE). This paper addresses a possible new design of selective-imaging cameras at firmware level. The design is consistent with physics of the irreversible thermodynamics of Boltzmann's molecular entropy. It enables imaging in appropriate FEM spectra for sensing through the VDE, and displaying in color spectra for Human Visual System (HVS). We sense within the spectra the largest entropy value of obscurants such as fire, smoke, etc. Then we apply a smart firmware implementation of Blind Sources Separation (BSS) to separate all entropy sources associated with specific Kelvin temperatures. Finally, we recompose the scene using specific RGB colors constrained by the HVS, by up/down shifting Planck spectra at each pixel and time.

  5. ROSAT wide field camera mirrors.

    PubMed

    Willingale, R

    1988-04-15

    The ROSAT wide field camera (WFC) is an XUV telescope operating in the 12-250-eV energy band. The mirror system utilizes Wolter-Schwarzschild type I (WS I) grazing incidence optics with a focal length of 525 mm, comprised of three nested aluminum shells with an outermost diameter of 576 mm providing a geometric aperture area of 456 cm(2). The reflecting surfaces are electroless nickel plated and coated with gold to enhance their reflectivity in the XUV. The mirrors have undergone full aperture optical testing, narrow beam XUV testing, and full aperture XUV testing. Measurements of the reflectivity are compared to theoretical values derived from the optical constants of gold in the XUV range. Analysis of the focused distribution is used to estimate the surface roughness and figuring errors of the polished surfaces. The results are compared to the mechanical metrology data collected during manufacture of the shells and the power spectral density of the reflecting surfaces is found to have a power-law form.

  6. ROSAT Wide Field Camera Mirrors

    NASA Astrophysics Data System (ADS)

    Willingale, R.

    1988-08-01

    The ROSAT wide field camera (WFC) is an XUV telescope operating in the 12-250-eV energy band. The mirror system utilizes Wolter-Schwarzschild type I (WSI) grazing incidence optics with a focal length of 525 mm, comprised of three nested aluminum shells with an outermost diameter of 576 mm providing a geometric aperture area of 456 cm2. The reflecting surfaces are electroless nickel plated and coated with gold to enhance their reflectivity in the XUV. The mirrors have undergone full aperture optical testing, narrow beam XUV testing, and full aperture XUV testing. Measurements of the reflectivity are compared to theoretical values derived from the optical constants of gold in the XUV range. Analysis of the focused distribution is used to estimate the surface roughness and figuring errors of the polished surfaces. The results are compared to the mechanical metrology data collected during manufacture of the shells and the power spectral density of the reflecting surfaces is found to have a power-law form.

  7. Artificial human vision camera

    NASA Astrophysics Data System (ADS)

    Goudou, J.-F.; Maggio, S.; Fagno, M.

    2014-10-01

    In this paper we present a real-time vision system modeling the human vision system. Our purpose is to draw inspiration from the biomechanics of human vision to improve robotic capabilities for tasks such as object detection and tracking. This work first describes the biomechanical discrepancies between human vision and classic cameras, and the retinal processing stage that takes place in the eye before the optic nerve. The second part describes our implementation of these principles in a 3-camera optical, mechanical and software model of the human eyes and an associated bio-inspired attention model.

  8. Laser Range Camera Modeling

    SciTech Connect

    Storjohann, K.

    1990-01-01

    This paper describes an imaging model that was derived for use with a laser range camera (LRC) developed by the Advanced Intelligent Machines Division of Odetics. However, this model could be applied to any comparable imaging system. Both the derivation of the model and the determination of the LRC's intrinsic parameters are explained. For the purpose of evaluating the LRC's extrinsic parameters, i.e., its external orientation, a transformation of the LRC's imaging model into a standard camera's (SC) pinhole model is derived. By virtue of this transformation, the evaluation of the LRC's external orientation can be found by applying any SC calibration technique.
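    Transforming the LRC model into a standard pinhole model means the range camera's pixels can then be handled with ordinary projective geometry and any SC calibration technique. A hedged sketch of that target pinhole model (the intrinsics below are invented, not the Odetics LRC's actual parameters):

```python
def pinhole_project(point_3d, fx, fy, cx, cy):
    """Project a 3-D camera-frame point with the standard pinhole model.
    fx, fy: focal lengths in pixels; cx, cy: principal point (pixels)."""
    x, y, z = point_3d
    if z <= 0.0:
        raise ValueError("point must lie in front of the camera")
    u = fx * x / z + cx
    v = fy * y / z + cy
    return u, v

# Illustrative intrinsics: 800 px focal length, 640x480-style center.
u, v = pinhole_project((0.5, -0.25, 2.0), fx=800.0, fy=800.0,
                       cx=320.0, cy=240.0)
print(u, v)  # 520.0 140.0
```

    The extrinsic (external orientation) calibration the paper evaluates then amounts to estimating the rigid transform applied to `point_3d` before this projection.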

  9. Autonomous pedestrian localization technique using CMOS camera sensors

    NASA Astrophysics Data System (ADS)

    Chun, Chanwoo

    2014-09-01

    We present a pedestrian localization technique that does not need infrastructure. The proposed angle-only measurement method needs specially manufactured shoes. Each shoe has two CMOS cameras and two markers, such as LEDs, attached on the inward side. The line-of-sight (LOS) angles towards the two markers on the forward shoe are measured using the two cameras on the rear shoe. Our simulation results show that a pedestrian wearing this device while walking through a shopping mall can be accurately guided to the front of a destination store located 100 m away, if the floor plan of the mall is available.
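    Angle-only localization of this kind reduces, for a single marker, to intersecting two rays from a known camera baseline. A minimal planar sketch (the baseline and angles are invented for illustration; angles are measured from the baseline axis):

```python
import math

def triangulate(baseline_m, angle1_deg, angle2_deg):
    """Locate a point from two line-of-sight angles measured at two
    cameras a known baseline apart: camera 1 at the origin, camera 2
    at (baseline_m, 0), both angles taken from the baseline axis."""
    t1 = math.tan(math.radians(angle1_deg))
    t2 = math.tan(math.radians(angle2_deg))
    # Intersect y = x*t1 with y = (x - baseline)*t2.
    x = baseline_m * t2 / (t2 - t1)
    y = x * t1
    return x, y

# Hypothetical shoe geometry: cameras 8 cm apart, marker ahead of them.
x, y = triangulate(0.08, 80.0, 100.0)
print(x, y)  # marker ~0.23 m ahead, centred on the baseline
```

    With two markers per shoe, the rear shoe recovers both the position and the heading of the forward shoe, which is what lets successive steps be chained into a walked track.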

  10. Underwater camera with depth measurement

    NASA Astrophysics Data System (ADS)

    Wang, Wei-Chih; Lin, Keng-Ren; Tsui, Chi L.; Schipf, David; Leang, Jonathan

    2016-04-01

    The objective of this study is to develop an RGB-D (video + depth) camera that provides three-dimensional image data for use in the haptic feedback of a robotic underwater ordnance recovery system. Two camera systems were developed and studied. The first depth camera relies on structured light (as used by the Microsoft Kinect), where the displacement of an object is determined from variations in the geometry of a projected pattern. The other camera system is based on a time-of-flight (ToF) depth camera. The results for the structured-light camera system show that it requires a stronger light source with a similar operating wavelength and bandwidth to achieve a desirable working distance in water. This approach might not be robust enough for our proposed underwater RGB-D camera system, as it would require a complete redesign of the light source component. The ToF camera system, instead, allows arbitrary placement of the light source and camera. The intensity output of the broadband LED light source in the ToF camera system can be increased by arranging the LEDs in an array configuration, and the LEDs can be modulated comfortably with any waveform and frequency required by the ToF camera. In this paper, both cameras were evaluated, and experiments were conducted to demonstrate the versatility of the ToF camera.
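A continuous-wave ToF camera of the kind described recovers distance from the phase shift of the modulated light, halved because the light travels out and back. A minimal sketch of that relation (the modulation frequency is an illustrative value, not the one used in the study; underwater, the refractive index of water must be applied):

```python
import math

C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_depth(phase_shift_rad, mod_freq_hz, refractive_index=1.0):
    """Distance from the measured phase shift of a modulated light source:
    d = c * dphi / (4 * pi * f_mod); the factor 4*pi (not 2*pi) accounts
    for the round trip. Pass refractive_index ~1.33 for water."""
    c = C / refractive_index
    return c * phase_shift_rad / (4.0 * math.pi * mod_freq_hz)

# A pi/2 phase shift at 20 MHz modulation in air corresponds to ~1.87 m:
d = tof_depth(math.pi / 2, 20e6)
```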

  11. Photogrammetric camera calibration

    USGS Publications Warehouse

    Tayman, W.P.; Ziemann, H.

    1984-01-01

    Section 2 (Calibration) of the document "Recommended Procedures for Calibrating Photogrammetric Cameras and Related Optical Tests" from the International Archives of Photogrammetry, Vol. XIII, Part 4, is reviewed in the light of recent practical work, and suggestions for changes are made. These suggestions are intended as a basis for further discussion. © 1984.

  12. Make a Pinhole Camera

    ERIC Educational Resources Information Center

    Fisher, Diane K.; Novati, Alexander

    2009-01-01

    On Earth, using ordinary visible light, one can create a single image of light recorded over time. Of course a movie or video is light recorded over time, but it is a series of instantaneous snapshots, rather than light and time both recorded on the same medium. A pinhole camera, which is simple to make out of ordinary materials and using ordinary…

  13. Anger Camera Firmware

    SciTech Connect

    2010-11-19

    The firmware is responsible for the operation of the Anger camera electronics, calculation of position and time of flight, and digital communications. It provides a first-stage analysis of 48 analog signals that have been converted to digital values using A/D converters.
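Position calculation in an Anger camera is classically a signal-weighted centroid over the detector elements. A minimal sketch of that calculation (the element layout and signal values are illustrative; the firmware's actual algorithm is not specified in the abstract):

```python
def anger_position(signals, coords):
    """Estimate an event position as the signal-weighted centroid of the
    detector-element coordinates (classic Anger logic)."""
    total = sum(signals)
    if total <= 0:
        raise ValueError("no signal")
    x = sum(s * cx for s, (cx, _) in zip(signals, coords)) / total
    y = sum(s * cy for s, (_, cy) in zip(signals, coords)) / total
    return x, y

# Four elements at the corners of a unit square; equal signals place the
# event at the center:
pos = anger_position([10, 10, 10, 10], [(0, 0), (1, 0), (0, 1), (1, 1)])
```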

  14. Snapshot polarimeter fundus camera.

    PubMed

    DeHoog, Edward; Luo, Haitao; Oka, Kazuhiko; Dereniak, Eustace; Schwiegerling, James

    2009-03-20

    A snapshot imaging polarimeter utilizing Savart plates is integrated into a fundus camera for retinal imaging. Acquired retinal images can be processed to reconstruct Stokes vector images, giving insight into the polarization properties of the retina. Results for images from a normal healthy retina and retinas with pathology are examined and compared.
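The Stokes vector reconstructed from such images is conventionally defined through intensity measurements behind different analyzers; the snapshot instrument obtains the equivalent information in one frame via Savart-plate fringe demodulation, which is not reproduced here. A minimal sketch of the standard six-measurement Stokes definition only (toy values, not retinal data):

```python
def stokes_from_intensities(i0, i45, i90, i135, i_rcp, i_lcp):
    """Stokes vector (S0..S3) from six analyzed intensities: linear
    polarizers at 0/45/90/135 degrees plus right/left circular."""
    s0 = i0 + i90          # total intensity
    s1 = i0 - i90          # horizontal vs vertical linear
    s2 = i45 - i135        # +45 vs -45 linear
    s3 = i_rcp - i_lcp     # right vs left circular
    return s0, s1, s2, s3

# Fully horizontally polarized light:
s = stokes_from_intensities(1.0, 0.5, 0.0, 0.5, 0.5, 0.5)
```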

  15. Jack & the Video Camera

    ERIC Educational Resources Information Center

    Charlan, Nathan

    2010-01-01

    This article narrates how the use of video camera has transformed the life of Jack Williams, a 10-year-old boy from Colorado Springs, Colorado, who has autism. The way autism affected Jack was unique. For the first nine years of his life, Jack remained in his world, alone. Functionally non-verbal and with motor skill problems that affected his…

  16. Spas color camera

    NASA Technical Reports Server (NTRS)

    Toffales, C.

    1983-01-01

    The procedures to be followed in assessing the performance of the MOS color camera are defined. Aspects considered include: horizontal and vertical resolution; value of the video signal; gray scale rendition; environmental (vibration and temperature) tests; signal to noise ratios; and white balance correction.

  17. Advanced Virgo phase cameras

    NASA Astrophysics Data System (ADS)

    van der Schaaf, L.; Agatsuma, K.; van Beuzekom, M.; Gebyehu, M.; van den Brand, J.

    2016-05-01

    A century after the prediction of gravitational waves, detectors have reached the sensitivity needed to prove their existence. One of them, the Virgo interferometer in Pisa, is presently being upgraded to Advanced Virgo (AdV) and will come into operation in 2016. The power stored in the interferometer arms rises from 20 to 700 kW. This increase is expected to introduce higher-order modes in the beam, which could reduce the circulating power in the interferometer, limiting the sensitivity of the instrument. To suppress these higher-order modes, the core optics of Advanced Virgo is equipped with a thermal compensation system. Phase cameras, monitoring the real-time status of the beam, constitute a critical component of this compensation system. These cameras measure the phases and amplitudes of the laser-light fields at the frequencies selected to control the interferometer. The measurement combines heterodyne detection with a scan of the wavefront over a photodetector with a pin-hole aperture. Three cameras observe the phase front of these laser sidebands. Two of them monitor the input and output of the interferometer arms, and the third is used in the control of the aberrations introduced by the power recycling cavity. In this paper the working principle of the phase cameras is explained and some characteristic parameters are described.
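The heterodyne measurement at each scan position amounts to recovering the amplitude and phase of a beat signal at a known frequency. A minimal I/Q (lock-in style) demodulation sketch (the sampling rate, tone, and duration are illustrative, not Advanced Virgo parameters):

```python
import math

def demodulate(samples, sample_rate, freq):
    """Recover amplitude and phase of a sinusoid of known frequency by
    I/Q demodulation over an integer number of periods."""
    n = len(samples)
    i_sum = q_sum = 0.0
    for k, s in enumerate(samples):
        t = k / sample_rate
        i_sum += s * math.cos(2 * math.pi * freq * t)
        q_sum += s * math.sin(2 * math.pi * freq * t)
    i_avg, q_avg = 2 * i_sum / n, 2 * q_sum / n
    return math.hypot(i_avg, q_avg), math.atan2(-q_avg, i_avg)

# A 1 kHz tone (amplitude 0.5, phase 0.3 rad) sampled at 100 kHz for
# exactly 10 periods:
fs, f = 100_000.0, 1_000.0
sig = [0.5 * math.cos(2 * math.pi * f * k / fs + 0.3) for k in range(1000)]
amp, phase = demodulate(sig, fs, f)
```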

  18. Communities, Cameras, and Conservation

    ERIC Educational Resources Information Center

    Patterson, Barbara

    2012-01-01

    Communities, Cameras, and Conservation (CCC) is the most exciting and valuable program the author has seen in her 30 years of teaching field science courses. In this citizen science project, students and community volunteers collect data on mountain lions ("Puma concolor") at four natural areas and public parks along the Front Range of Colorado.…

  19. The LSST Camera Overview

    SciTech Connect

    Gilmore, Kirk; Kahn, Steven A.; Nordby, Martin; Burke, David; O'Connor, Paul; Oliver, John; Radeka, Veljko; Schalk, Terry; Schindler, Rafe; /SLAC

    2007-01-10

    The LSST camera is a wide-field optical (0.35-1 um) imager designed to provide a 3.5 degree FOV with better than 0.2 arcsecond sampling. The detector format will be a circular mosaic providing approximately 3.2 Gigapixels per image. The camera includes a filter mechanism and shuttering capability. It is positioned in the middle of the telescope, where cross-sectional area is constrained by optical vignetting and heat dissipation must be controlled to limit thermal gradients in the optical beam. The fast, f/1.2 beam will require tight tolerances on the focal plane mechanical assembly. The focal plane array operates at a temperature of approximately -100 C to achieve desired detector performance. The focal plane array is contained within an evacuated cryostat, which incorporates detector front-end electronics and thermal control. The cryostat lens serves as an entrance window and vacuum seal for the cryostat. Similarly, the camera body lens serves as an entrance window and gas seal for the camera housing, which is filled with a suitable gas to provide the operating environment for the shutter and filter change mechanisms. The filter carousel can accommodate 5 filters, each 75 cm in diameter, for rapid exchange without external intervention.

  20. Ultraminiature television camera

    NASA Technical Reports Server (NTRS)

    Deterville, R. J.; Drago, N.

    1967-01-01

    Ultraminiature television camera with a total volume of 20.25 cubic inches, requires 28 vdc power, operates on UHF and accommodates standard 8-mm optics. It uses microelectronic assembly packaging techniques and contains a magnetically deflected and electrostatically focused vidicon, automatic gain control circuit, power supply, and transmitter.

  1. Wide-Angle Quasar Feedback

    NASA Astrophysics Data System (ADS)

    Chartas, George; Strickland, Sarah

    We present results from the detection of relativistic winds launched near the innermost stable circular orbits of supermassive black holes. A recent detection of a powerful wind in the X-ray-bright narrow absorption line (NAL) z=1.51 quasar HS 0810+2554 strengthens the case that quasars play a significant role in feedback. In both deep Chandra and XMM-Newton observations of HS 0810 we detected blueshifted absorption lines implying outflowing velocities ranging from 0.1c to 0.4c. The presence of both an emission line at 6.8 keV and an absorption line at 7.8 keV in the spectral line profile of HS 0810 is a characteristic feature of a P-Cygni profile, supporting the presence of an expanding, outflowing, highly ionized Fe absorber. A hard excess component is detected in the XMM-Newton observation of HS 0810, possibly originating from reflection off the disk. Modelling of the XMM-Newton spectrum constrains the inclination angle to be < 35° (68% confidence). The presence of relativistic winds in both low inclination angle NAL quasars as well as in high inclination angle BAL quasars implies that the solid angle of quasar winds may be quite large. The larger solid angle of quasar winds would also indicate that their contribution to the regulation of the host galaxy may be more important than previously thought.

  2. Image Sensors Enhance Camera Technologies

    NASA Technical Reports Server (NTRS)

    2010-01-01

    In the 1990s, a Jet Propulsion Laboratory team led by Eric Fossum researched ways of improving complementary metal-oxide semiconductor (CMOS) image sensors in order to miniaturize cameras on spacecraft while maintaining scientific image quality. Fossum's team founded a company to commercialize the resulting CMOS active pixel sensor. Now called the Aptina Imaging Corporation, based in San Jose, California, the company has shipped over 1 billion sensors for use in applications such as digital cameras, camera phones, Web cameras, and automotive cameras. Today, one of every three cell phone cameras on the planet features Aptina's sensor technology.

  3. Do Speed Cameras Reduce Collisions?

    PubMed Central

    Skubic, Jeffrey; Johnson, Steven B.; Salvino, Chris; Vanhoy, Steven; Hu, Chengcheng

    2013-01-01

    We investigated the effects of speed cameras along a 26 mile segment in metropolitan Phoenix, Arizona. Motor vehicle collisions were retrospectively identified according to three time periods – before cameras were placed, while cameras were in place and after cameras were removed. A 14 mile segment in the same area without cameras was used for control purposes. Five confounding variables were eliminated. In this study, the placement or removal of interstate highway speed cameras did not independently affect the incidence of motor vehicle collisions. PMID:24406979

  4. What convention is used for the illumination and view angles?

    Atmospheric Science Data Center

    2014-12-08

    ... Azimuth angles are measured clockwise from the direction of travel to local north. For both the Sun and cameras, azimuth describes the ... to the equator, because of its morning equator crossing time. Additionally, the difference in view and solar azimuth angle will be near ...

  5. Quality criterion for digital still camera

    NASA Astrophysics Data System (ADS)

    Bezryadin, Sergey

    2007-02-01

    The main quality requirements for a digital still camera are color capturing accuracy, low noise level, and quantum efficiency. Different consumers assign different priorities to the listed parameters, and camera designers need clearly formulated methods for their evaluation. While there are procedures providing noise level and quantum efficiency estimation, there are no effective means for estimating color capturing accuracy. The criterion introduced in this paper fills this gap. The Luther-Ives condition for a correct color reproduction system became known at the beginning of the last century. However, since no detector system satisfies the Luther-Ives condition, there are always stimuli that are distinctly different for an observer but which detectors are unable to distinguish. To estimate the conformity of a detector set with the Luther-Ives condition and calculate a measure of discrepancy, the angle between the detector sensitivity and Cohen's Fundamental Color Space may be used. In this paper, the divergence angle is calculated for some typical CCD sensors, and a demonstration is provided of how this angle might be reduced with a corrective filter. In addition, it is shown that with a specific corrective filter, Foveon sensors turn into a detector system with good Luther-Ives condition compliance.
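The divergence angle between a sensitivity vector and a subspace such as Cohen's Fundamental Color Space can be computed by orthogonal projection: project the vector onto the subspace and measure the residual angle. A minimal sketch (the vectors below are toy values, not real sensor sensitivities):

```python
import math

def divergence_angle(v, basis):
    """Angle between vector v and the subspace spanned by an orthonormal
    basis: acos(|proj(v)| / |v|)."""
    proj = [0.0] * len(v)
    for b in basis:
        coef = sum(vi * bi for vi, bi in zip(v, b))  # projection coefficient
        proj = [p + coef * bi for p, bi in zip(proj, b)]
    norm_v = math.sqrt(sum(x * x for x in v))
    norm_p = math.sqrt(sum(x * x for x in proj))
    return math.acos(min(1.0, norm_p / norm_v))

# A vector tilted 45 degrees out of the xy-plane:
angle = divergence_angle([1.0, 0.0, 1.0],
                         [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
```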

  6. Dual cameras acquisition and display system of retina-like sensor camera and rectangular sensor camera

    NASA Astrophysics Data System (ADS)

    Cao, Nan; Cao, Fengmei; Lin, Yabin; Bai, Tingzhu; Song, Shengyu

    2015-04-01

    For a new kind of retina-like sensor camera and a traditional rectangular sensor camera, a dual-camera acquisition and display system needs to be built. We introduce the principle and development of the retina-like sensor. Image coordinate transformation and interpolation based on sub-pixel interpolation must be realized for the retina-like sensor's special pixel distribution. The hardware platform is composed of the retina-like sensor camera, the rectangular sensor camera, an image grabber, and a PC. Combining the MIL and OpenCV libraries, the software is written in VC++ on VS 2010. Experimental results show that the system realizes acquisition and display for both cameras.
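Retina-like sensors typically use a log-polar pixel distribution: ring radii grow geometrically away from the fovea while sectors divide the circle evenly, which is what makes the coordinate transformation to a rectangular image non-trivial. A minimal sketch of mapping such a pixel index to Cartesian coordinates (the growth factor and sector count are illustrative assumptions, not this sensor's actual layout):

```python
import math

def logpolar_to_cartesian(ring, sector, n_sectors, r0=1.0, growth=1.1):
    """Map a retina-like (log-polar) pixel index to Cartesian image
    coordinates: ring radius grows geometrically, sectors divide 2*pi."""
    r = r0 * growth ** ring
    theta = 2 * math.pi * sector / n_sectors
    return r * math.cos(theta), r * math.sin(theta)

# Ring 0, sector 0 lies on the x-axis at the fovea radius r0:
x, y = logpolar_to_cartesian(0, 0, 64)
```

Interpolating the rectangular image at the resulting (x, y), which generally falls between pixel centers, is where the sub-pixel interpolation mentioned above comes in.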

  7. Non-uniform sampling knife-edge method for camera modulation transfer function measurement

    NASA Astrophysics Data System (ADS)

    Duan, Yaxuan; Xue, Xun; Chen, Yongquan; Tian, Liude; Zhao, Jianke; Gao, Limin

    2016-11-01

    The traditional slanted knife-edge method suffers large errors in the measured camera modulation transfer function (MTF) when tilt angle error in the knife edge results in non-uniform sampling of the edge spread function. To resolve this problem, a non-uniform sampling knife-edge method for camera MTF measurement is proposed. By applying a simple direct calculation of the Fourier transform of the derivative of the non-uniformly sampled data, super-sampled camera MTF results are obtained. Theoretical simulations for images with and without noise under different tilt angle errors are run using the proposed method. They demonstrate that the MTF results are insensitive to tilt angle errors. To verify the accuracy of the proposed method, an experimental setup for camera MTF measurement is established. Measurement results show that the proposed method is superior to traditional methods and improves the universality of the slanted knife-edge method for camera MTF measurement.
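The core of such a method is a direct Fourier transform of the derivative of a non-uniformly sampled edge spread function (ESF). A minimal sketch of that computation (the sample positions below are toy values; the paper's exact weighting scheme is not reproduced):

```python
import cmath

def mtf_from_edge(x, esf, freqs):
    """MTF via a direct (non-uniform) Fourier transform of the line spread
    function, obtained by differencing a non-uniformly sampled ESF."""
    xm, lsf, w = [], [], []
    for k in range(len(x) - 1):
        dx = x[k + 1] - x[k]
        xm.append(0.5 * (x[k] + x[k + 1]))       # midpoint of the interval
        lsf.append((esf[k + 1] - esf[k]) / dx)   # finite-difference LSF
        w.append(dx)                             # quadrature weight

    def ft(f):
        return abs(sum(l * wk * cmath.exp(-2j * cmath.pi * f * xk)
                       for l, wk, xk in zip(lsf, w, xm)))

    norm = ft(0.0)
    return [ft(f) / norm for f in freqs]

# A perfectly sharp edge, sampled at non-uniform positions, has unit MTF:
mtf = mtf_from_edge([0.0, 0.4, 1.0, 1.7], [0.0, 0.0, 1.0, 1.0],
                    [0.5, 1.0, 2.0])
```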

  8. Depth estimation and camera calibration of a focused plenoptic camera for visual odometry

    NASA Astrophysics Data System (ADS)

    Zeller, Niclas; Quint, Franz; Stilla, Uwe

    2016-08-01

    This paper presents new and improved methods of depth estimation and camera calibration for visual odometry with a focused plenoptic camera. For depth estimation we adapt an algorithm previously used in structure-from-motion approaches to work with images of a focused plenoptic camera. In the raw image of a plenoptic camera, scene patches are recorded in several micro-images under slightly different angles. This leads to a multi-view stereo problem. To reduce the complexity, we divide this into multiple binocular stereo problems. For each pixel with sufficient gradient we estimate a virtual (uncalibrated) depth based on local intensity error minimization. The estimated depth is characterized by the variance of the estimate and is subsequently updated with the estimates from other micro-images. Updating is performed in a Kalman-like fashion. The result of depth estimation in a single image of the plenoptic camera is a probabilistic depth map, where each depth pixel consists of an estimated virtual depth and a corresponding variance. Since the resulting image of the plenoptic camera contains two planes, the optical image and the depth map, camera calibration is divided into two separate sub-problems. The optical path is calibrated based on a traditional calibration method. For calibrating the depth map we introduce two novel model-based methods, which define the relation between the virtual depth, estimated from the light-field image, and the metric object distance. These two methods are compared to a well-known curve fitting approach. Both model-based methods show significant advantages compared to the curve fitting method. For visual odometry we fuse the probabilistic depth map gained from one shot of the plenoptic camera with the depth data gained by finding stereo correspondences between subsequent synthesized intensity images of the plenoptic camera. These images can be synthesized totally focused, and thus finding stereo correspondences is enhanced.
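A Kalman-like update of a depth pixel is, in its simplest form, an inverse-variance fusion of two estimates. A minimal sketch (the variable names and values are illustrative, not the paper's):

```python
def fuse_depth(d1, var1, d2, var2):
    """Inverse-variance (Kalman-like) fusion of two depth estimates;
    the gain weights the second estimate by its relative confidence."""
    k = var1 / (var1 + var2)     # gain: trust d2 more when var2 is small
    d = d1 + k * (d2 - d1)       # fused depth
    var = (1.0 - k) * var1       # fused variance, <= min(var1, var2)
    return d, var

# Fuse a depth of 2.0 (variance 0.2) with a more confident 2.6 (variance 0.1):
d, v = fuse_depth(2.0, 0.2, 2.6, 0.1)
```

The fused variance always shrinks, which is why repeated updates from additional micro-images sharpen the probabilistic depth map.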

  9. Automated Camera Array Fine Calibration

    NASA Technical Reports Server (NTRS)

    Clouse, Daniel; Padgett, Curtis; Ansar, Adnan; Cheng, Yang

    2008-01-01

    Using aerial imagery, the JPL FineCalibration (JPL FineCal) software automatically tunes a set of existing CAHVOR camera models for an array of cameras. The software finds matching features in the overlap region between images from adjacent cameras, and uses these features to refine the camera models. It is not necessary to take special imagery of a known target and no surveying is required. JPL FineCal was developed for use with an aerial, persistent surveillance platform.

  10. Streak camera receiver definition study

    NASA Technical Reports Server (NTRS)

    Johnson, C. B.; Hunkler, L. T., Sr.; Letzring, S. A.; Jaanimagi, P.

    1990-01-01

    Detailed streak camera definition studies were made as a first step toward full flight qualification of a dual channel picosecond resolution streak camera receiver for the Geoscience Laser Altimeter and Ranging System (GLRS). The streak camera receiver requirements are discussed as they pertain specifically to the GLRS system, and estimates of the characteristics of the streak camera are given, based upon existing and near-term technological capabilities. Important problem areas are highlighted, and possible corresponding solutions are discussed.

  11. LSST Camera Optics

    SciTech Connect

    Olivier, S S; Seppala, L; Gilmore, K; Hale, L; Whistler, W

    2006-06-05

    The Large Synoptic Survey Telescope (LSST) is a unique, three-mirror, modified Paul-Baker design with an 8.4m primary, a 3.4m secondary, and a 5.0m tertiary feeding a camera system that includes corrector optics to produce a 3.5 degree field of view with excellent image quality (<0.3 arcsecond 80% encircled diffracted energy) over the entire field from blue to near infra-red wavelengths. We describe the design of the LSST camera optics, consisting of three refractive lenses with diameters of 1.6m, 1.0m and 0.7m, along with a set of interchangeable, broad-band, interference filters with diameters of 0.75m. We also describe current plans for fabricating, coating, mounting and testing these lenses and filters.

  12. Combustion pinhole camera system

    DOEpatents

    Witte, Arvel B.

    1984-02-21

    A pinhole camera system utilizing a sealed optical-purge assembly which provides optical access into a coal combustor or other energy conversion reactors. The camera system basically consists of a focused-purge pinhole optical port assembly, a conventional TV vidicon receiver, an external, variable density light filter which is coupled electronically to the vidicon automatic gain control (agc). The key component of this system is the focused-purge pinhole optical port assembly which utilizes a purging inert gas to keep debris from entering the port and a lens arrangement which transfers the pinhole to the outside of the port assembly. One additional feature of the port assembly is that it is not flush with the interior of the combustor.

  13. Combustion pinhole camera system

    DOEpatents

    Witte, A.B.

    1984-02-21

    A pinhole camera system is described utilizing a sealed optical-purge assembly which provides optical access into a coal combustor or other energy conversion reactors. The camera system basically consists of a focused-purge pinhole optical port assembly, a conventional TV vidicon receiver, an external, variable density light filter which is coupled electronically to the vidicon automatic gain control (agc). The key component of this system is the focused-purge pinhole optical port assembly which utilizes a purging inert gas to keep debris from entering the port and a lens arrangement which transfers the pinhole to the outside of the port assembly. One additional feature of the port assembly is that it is not flush with the interior of the combustor. 2 figs.

  14. Hemispherical Laue camera

    DOEpatents

    Li, James C. M.; Chu, Sungnee G.

    1980-01-01

    A hemispherical Laue camera comprises a crystal sample mount for positioning a sample to be analyzed at the center of sphere of a hemispherical, X-radiation sensitive film cassette, a collimator, a stationary or rotating sample mount and a set of standard spherical projection spheres. X-radiation generated from an external source is directed through the collimator to impinge onto the single crystal sample on the stationary mount. The diffracted beam is recorded on the hemispherical X-radiation sensitive film mounted inside the hemispherical film cassette in either transmission or back-reflection geometry. The distances travelled by X-radiation diffracted from the crystal to the hemispherical film are the same for all crystal planes which satisfy Bragg's Law. The recorded diffraction spots or Laue spots on the film thereby preserve both the symmetry information of the crystal structure and the relative intensities which are directly related to the relative structure factors of the crystal orientations. The diffraction pattern on the exposed film is compared with the known diffraction pattern on one of the standard spherical projection spheres for a specific crystal structure to determine the orientation of the crystal sample. By replacing the stationary sample support with a rotating sample mount, the hemispherical Laue camera can be used for crystal structure determination in a manner previously provided in conventional Debye-Scherrer cameras.
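The equal-path-length geometry described above rests on Bragg's law, n·λ = 2d·sin θ. A minimal sketch of solving it for the diffraction angle (the Cu Kα wavelength and Si(111) spacing below are standard textbook values used only as an example):

```python
import math

def bragg_angle(wavelength, d_spacing, order=1):
    """Solve Bragg's law n*lambda = 2*d*sin(theta) for the diffraction
    angle theta (radians); returns None if no solution exists."""
    s = order * wavelength / (2.0 * d_spacing)
    if s > 1.0:
        return None  # this order cannot diffract at this wavelength
    return math.asin(s)

# Cu K-alpha radiation (1.5406 A) on the Si(111) plane (d = 3.1356 A):
theta = bragg_angle(1.5406, 3.1356)
```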

  15. Gamma ray camera

    DOEpatents

    Perez-Mendez, V.

    1997-01-21

    A gamma ray camera is disclosed for detecting rays emanating from a radiation source such as an isotope. The gamma ray camera includes a sensor array formed of a visible light crystal for converting incident gamma rays to a plurality of corresponding visible light photons, and a photosensor array responsive to the visible light photons in order to form an electronic image of the radiation therefrom. The photosensor array is adapted to record an integrated amount of charge proportional to the incident gamma rays closest to it, and includes a transparent metallic layer and a photodiode consisting of a p-i-n structure formed on one side of the transparent metallic layer, comprising an upper p-type layer, an intermediate layer and a lower n-type layer. In the preferred mode, the scintillator crystal is composed essentially of a cesium iodide (CsI) crystal preferably doped with a predetermined amount of impurity, and the p-type upper layer, the intermediate layer and the n-type layer are essentially composed of hydrogenated amorphous silicon (a-Si:H). The gamma ray camera further includes a collimator interposed between the radiation source and the sensor array, and a readout circuit formed on one side of the photosensor array. 6 figs.

  16. Gamma ray camera

    DOEpatents

    Perez-Mendez, Victor

    1997-01-01

    A gamma ray camera for detecting rays emanating from a radiation source such as an isotope. The gamma ray camera includes a sensor array formed of a visible light crystal for converting incident gamma rays to a plurality of corresponding visible light photons, and a photosensor array responsive to the visible light photons in order to form an electronic image of the radiation therefrom. The photosensor array is adapted to record an integrated amount of charge proportional to the incident gamma rays closest to it, and includes a transparent metallic layer and a photodiode consisting of a p-i-n structure formed on one side of the transparent metallic layer, comprising an upper p-type layer, an intermediate layer and a lower n-type layer. In the preferred mode, the scintillator crystal is composed essentially of a cesium iodide (CsI) crystal preferably doped with a predetermined amount of impurity, and the p-type upper layer, the intermediate layer and the n-type layer are essentially composed of hydrogenated amorphous silicon (a-Si:H). The gamma ray camera further includes a collimator interposed between the radiation source and the sensor array, and a readout circuit formed on one side of the photosensor array.

  17. Orbiter Camera Payload System

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Components for an orbiting camera payload system (OCPS) include the large format camera (LFC), a gas supply assembly, and ground test, handling, and calibration hardware. The LFC, a high resolution large format photogrammetric camera for use in the cargo bay of the space transport system, is also adaptable to use on an RB-57 aircraft or on a free flyer satellite. Carrying 4000 feet of film, the LFC is usable over the visible to near IR, at V/h rates of from 11 to 41 milliradians per second, overlap of 10, 60, 70 or 80 percent and exposure times of from 4 to 32 milliseconds. With a 12 inch focal length it produces a 9 by 18 inch format (long dimension in line of flight) with full format low contrast resolution of 88 lines per millimeter (AWAR), full format distortion of less than 14 microns and a complement of 45 Reseau marks and 12 fiducial marks. Weight of the OCPS as supplied, fully loaded is 944 pounds and power dissipation is 273 watts average when in operation, 95 watts in standby. The LFC contains an internal exposure sensor, or will respond to external command. It is able to photograph starfields for inflight calibration upon command.

  18. Driven tracers in narrow channels

    NASA Astrophysics Data System (ADS)

    Cividini, J.; Mukamel, D.; Posch, H. A.

    2017-01-01

    Steady-state properties of a driven tracer moving in a narrow two-dimensional (2D) channel of quiescent medium are studied. The tracer drives the system out of equilibrium, perturbs the density and pressure fields, and gives the bath particles a nonzero average velocity, creating a current in the channel. Three models in which the confining effect of the channel is probed are analyzed and compared in this study: the first is the simple symmetric exclusion process (SSEP), for which the stationary density profile and the pressure on the walls in the frame of the tracer are computed. We show that the tracer acts like a dipolar source in an average velocity field. The spatial structure of this 2D strip is then simplified to a one-dimensional (1D) SSEP, in which exchanges of position between the tracer and the bath particles are allowed. Using a combination of mean-field theory and exact solution in the limit where no exchange is allowed gives good predictions of the velocity of the tracer and the density field. Finally, we show that results obtained for the 1D SSEP with exchanges also apply to a gas of overdamped hard disks in a narrow channel. The correspondence between the parameters of the SSEP and of the gas of hard disks is systematic and follows from simple intuitive arguments. Our analytical results are checked numerically.
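The 1D SSEP with a driven tracer can be explored with a few lines of Monte Carlo. A minimal sketch under simplified dynamics (random-sequential updates and a right-biased tracer; the rates, sizes, and seed are illustrative, not those analyzed in the paper):

```python
import random

def ssep_tracer_displacement(n_sites, n_bath, p_drive, steps, seed=42):
    """Monte Carlo for a 1D exclusion process on a ring with one driven
    tracer. Each step a random particle attempts a hop to a neighboring
    site; the hop succeeds only if that site is empty. Bath particles hop
    symmetrically, the tracer hops right with probability p_drive."""
    rng = random.Random(seed)
    positions = list(range(n_bath + 1))   # particle 0 is the tracer
    occupied = set(positions)
    displacement = 0
    for _ in range(steps):
        i = rng.randrange(n_bath + 1)
        bias = p_drive if i == 0 else 0.5
        step = 1 if rng.random() < bias else -1
        target = (positions[i] + step) % n_sites
        if target not in occupied:            # exclusion constraint
            occupied.remove(positions[i])
            occupied.add(target)
            positions[i] = target
            if i == 0:
                displacement += step
    return displacement

# A strongly right-driven tracer in a half-filled ring drifts forward:
disp = ssep_tracer_displacement(n_sites=50, n_bath=24, p_drive=0.95,
                                steps=20000)
```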

  19. Polarization and spectral action of narrow slit

    NASA Astrophysics Data System (ADS)

    Oleksyuk, M. V.; Felde, Ch. V.; Polyanskii, P. V.

    2012-01-01

    An experimental study is made of the diffraction of coherent (laser, completely polarized) and incoherent (thermal, polychromatic, unpolarized) light at slits whose width is restricted to a few wavelengths. It is shown that for diffraction at the edge of a metallic half-plane screen, the angular dependences of the diffracted field intensity differ considerably for orientations of the electric field vector parallel and perpendicular to the screen edge, so that the metallic screen exerts a polarizing action on the probing beam. It is further shown that when the width of a slit formed by two metallic half-planes becomes less than ten wavelengths (while remaining larger than a wavelength), the polarization effect is considerable even in the forward direction, i.e., at zero diffraction angle. It also causes a spectral effect when polychromatic radiation diffracts at a narrow slit in a metallic screen. Namely, one observes a pronounced blue shift of the maximum of the spectral distribution of the forward-diffracted polychromatic ('white-light') beam. We apply for the first time Berry's chromascopic technique for experimental data processing to elucidate such diffraction-induced spectral changes.

  20. Polarization and spectral action of narrow slit

    NASA Astrophysics Data System (ADS)

    Oleksyuk, M. V.; Felde, Ch. V.; Polyanskii, P. V.

    2011-09-01

    An experimental study is made of the diffraction of coherent (laser, completely polarized) and incoherent (thermal, polychromatic, unpolarized) light at slits whose width is restricted to a few wavelengths. It is shown that for diffraction at the edge of a metallic half-plane screen, the angular dependences of the diffracted field intensity differ considerably for orientations of the electric field vector parallel and perpendicular to the screen edge, so that the metallic screen exerts a polarizing action on the probing beam. It is further shown that when the width of a slit formed by two metallic half-planes becomes less than ten wavelengths (while remaining larger than a wavelength), the polarization effect is considerable even in the forward direction, i.e., at zero diffraction angle. It also causes a spectral effect when polychromatic radiation diffracts at a narrow slit in a metallic screen. Namely, one observes a pronounced blue shift of the maximum of the spectral distribution of the forward-diffracted polychromatic ('white-light') beam. We apply for the first time Berry's chromascopic technique for experimental data processing to elucidate such diffraction-induced spectral changes.

  1. Efficacy of novel robotic camera vs a standard laparoscopic camera.

    PubMed

    Strong, Vivian E M; Hogle, Nancy J; Fowler, Dennis L

    2005-12-01

    To improve visualization during minimal access surgery, a novel robotic camera has been developed. The prototype camera is totally insertable, has 5 degrees of freedom, and is remotely controlled. This study compared the performance of laparoscopic surgeons using both a laparoscope and the robotic camera. The MISTELS (McGill Inanimate System for the Training and Evaluation of Laparoscopic Skill) tasks were used to test six laparoscopic fellows and attending surgeons. Half the surgeons used the laparoscope first and half used the robotic camera first. Total scores from the MISTELS sessions in which the laparoscope was used were compared with the sessions in which the robotic camera was used and then analyzed with a paired t test (P < .05 was considered significant). All six surgeons tested showed no significant difference in their MISTELS task performance on the robotic camera compared with the standard laparoscopic camera. The mean MISTELS score of 963 for all subjects who used a laparoscope and camera was not significantly different than the mean score of 904 for the robotic camera (P = .17). This new robotic camera prototype allows for equivalent performance on a validated laparoscopic assessment tool when compared with performance using a standard laparoscope.
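The comparison described uses a paired t test, appropriate because each surgeon was scored under both camera conditions. A minimal sketch of the paired t statistic (the scores below are made up for illustration, not the study's data):

```python
import math

def paired_t(scores_a, scores_b):
    """Paired t statistic for matched samples: t = mean(d) / (s_d / sqrt(n)),
    where d are the per-subject score differences."""
    diffs = [a - b for a, b in zip(scores_a, scores_b)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    return mean / math.sqrt(var / n)

# Hypothetical MISTELS totals for six surgeons under two camera conditions:
t = paired_t([960, 970, 950, 980, 940, 978],
             [900, 915, 890, 930, 880, 909])
```

The statistic is then compared against the t distribution with n - 1 degrees of freedom to obtain the P value.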

  2. Lens and Camera Arrays for Sky Surveys and Space Surveillance

    NASA Astrophysics Data System (ADS)

    Ackermann, M.; Cox, D.; McGraw, J.; Zimmer, P.

    2016-09-01

    In recent years, a number of sky survey projects have chosen to use arrays of commercial cameras coupled with commercial photographic lenses to enable low-cost, wide-area observation. Projects such as SuperWASP, FAVOR, RAPTOR, Lotis, PANOPTES, and DragonFly rely on multiple cameras with commercial lenses to image wide areas of the sky each night. The sensors are usually commercial astronomical charge-coupled devices (CCDs) or digital single-lens reflex (DSLR) cameras, while the lenses are large-aperture, high-end consumer items intended for general photography. While much of this equipment is very capable and relatively inexpensive, this approach comes with a number of significant limitations that reduce the sensitivity and overall utility of the image data. The most frequently encountered limitations include lens vignetting, narrow spectral bandpass, and a relatively large point spread function. Understanding these limits helps to assess the utility of the data, and identify areas where advanced optical designs could significantly improve survey performance.
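
    The vignetting limitation mentioned above is often approximated by the textbook cos⁴ law (a simplified model introduced here for illustration, not taken from the paper): natural illumination falloff goes as the fourth power of the cosine of the field angle.

```python
import math

def cos4_falloff(field_angle_deg):
    """Relative illumination at a given field angle under the cos^4 law."""
    return math.cos(math.radians(field_angle_deg)) ** 4

# A photographic lens covering a 20-degree half-field loses roughly 22%
# of its illumination at the field edge under this model:
edge = cos4_falloff(20.0)
```

    Real photographic lenses add mechanical vignetting on top of this, so measured falloff in survey flat fields is usually steeper than the cos⁴ prediction.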

  3. Universal ICT Picosecond Camera

    NASA Astrophysics Data System (ADS)

    Lebedev, Vitaly B.; Syrtzev, V. N.; Tolmachyov, A. M.; Feldman, Gregory G.; Chernyshov, N. A.

    1989-06-01

    The paper reports on the design of an ICT camera operating in the mode of linear or three-frame image scan. The camera incorporates two tubes: the time-analyzing ICT PIM-107 [1] with an S-11 cathode, and the brightness amplifier PMU-2V (gain about 10^4) for the image shaped by the first tube. The camera is designed on the basis of the streak camera AGAT-SF3 [2] with almost the same power sources, but substantially modified pulse electronics. Schematically, the design of tube PIM-107 is depicted in the figure. The tube consists of cermet housing 1 and photocathode 2, made in a separate vacuum volume and introduced into the housing by means of a manipulator. In the direct vicinity of the photocathode, accelerating electrode 3 is located, made of a fine-structure grid. An electrostatic lens formed by focusing electrode 4 and anode diaphragm 5 produces a beam of electrons with a "remote crossover". The authors have suggested this term for an electron beam whose crossover lies 40 to 60 mm away from the anode diaphragm plane, which guarantees high sensitivity of scan plates 6 with respect to multiaperture framing diaphragm 7. Beyond every diaphragm aperture, a pair of deflecting plates 8 is located, shielded from compensation plates 10 by diaphragm 9. The electronic image produced by the photocathode is focused on luminescent screen 11. The tube is controlled with the help of two saw-tooth voltages applied in antiphase across plates 6 and 10. Plates 6 serve for sweeping the electron beam over the surface of diaphragm 7. The beam is either allowed toward the screen or delayed by the diaphragm walls. In this manner, three frames are obtained, the number corresponding to that of the diaphragm apertures. Plates 10 serve for compensating (stopping) the streak sweep of the image on the screen. To avoid overlapping of frames, plates 8 receive static potentials responsible for shifting the frames on the screen. 
Changing the potentials applied to plates 8, one can control the spacing between frames and partially or

  4. Neutron Imaging Camera

    NASA Technical Reports Server (NTRS)

    Hunter, Stanley; deNolfo, G. A.; Barbier, L. M.; Link, J. T.; Son, S.; Floyd, S. R.; Guardala, N.; Skopec, M.; Stark, B.

    2008-01-01

    The Neutron Imaging Camera (NIC) is based on the Three-dimensional Track Imager (3-DTI) technology developed at GSFC for gamma-ray astrophysics applications. The 3-DTI, a large volume time-projection chamber, provides accurate, approximately 0.4 mm resolution, 3-D tracking of charged particles. The incident direction of fast neutrons, E_n > 0.5 MeV, is reconstructed from the momenta and energies of the proton and triton fragments resulting from ³He(n,p)³H interactions in the 3-DTI volume. The performance of the NIC from laboratory and accelerator tests is presented.
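
    The reconstruction principle can be sketched as follows (a simplified illustration, not the NIC analysis code): by momentum conservation, the incident neutron momentum is the vector sum of the proton and triton momenta measured in the ³He(n,p)³H reaction.

```python
import math

def neutron_direction(p_proton, p_triton):
    """Unit vector of the incident neutron from fragment momenta (MeV/c)."""
    p = [pp + pt for pp, pt in zip(p_proton, p_triton)]   # momentum conservation
    norm = math.sqrt(sum(c * c for c in p))
    return [c / norm for c in p]

# Hypothetical fragment momenta from a reconstructed track pair:
direction = neutron_direction([30.0, 5.0, 10.0], [20.0, -5.0, 40.0])
```

    The neutron energy follows similarly from the fragment energies and the reaction Q value.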

  5. Mars Science Laboratory Engineering Cameras

    NASA Technical Reports Server (NTRS)

    Maki, Justin N.; Thiessen, David L.; Pourangi, Ali M.; Kobzeff, Peter A.; Lee, Steven W.; Dingizian, Arsham; Schwochert, Mark A.

    2012-01-01

    NASA's Mars Science Laboratory (MSL) Rover, which launched to Mars in 2011, is equipped with a set of 12 engineering cameras. These cameras are build-to-print copies of the Mars Exploration Rover (MER) cameras, which were sent to Mars in 2003. The engineering cameras weigh less than 300 grams each and use less than 3 W of power. Images returned from the engineering cameras are used to navigate the rover on the Martian surface, deploy the rover robotic arm, and ingest samples into the rover sample processing system. The navigation cameras (Navcams) are mounted to a pan/tilt mast and have a 45-degree square field of view (FOV) with a pixel scale of 0.82 mrad/pixel. The hazard avoidance cameras (Hazcams) are body-mounted to the rover chassis in the front and rear of the vehicle and have a 124-degree square FOV with a pixel scale of 2.1 mrad/pixel. All of the cameras utilize a frame-transfer CCD (charge-coupled device) with a 1024x1024 imaging region and red/near IR bandpass filters centered at 650 nm. The MSL engineering cameras are grouped into two sets of six: one set of cameras is connected to rover computer A and the other set is connected to rover computer B. The MSL rover carries 8 Hazcams and 4 Navcams.
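
    The quoted pixel scales translate directly into an approximate footprint on the terrain at a given range (a back-of-the-envelope small-angle sketch using the numbers above, not a mission tool):

```python
def pixel_footprint_m(pixel_scale_mrad, range_m):
    """Approximate linear size covered by one pixel at the given range."""
    return pixel_scale_mrad * 1e-3 * range_m

# Navcam (0.82 mrad/pixel) vs Hazcam (2.1 mrad/pixel) at 10 m range:
navcam = pixel_footprint_m(0.82, 10.0)   # ~8 mm per pixel
hazcam = pixel_footprint_m(2.1, 10.0)    # ~21 mm per pixel
```

    This is why the wider-FOV Hazcams are used for near-field hazard detection while the Navcams resolve finer detail for path planning.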

  6. PAU camera: detectors characterization

    NASA Astrophysics Data System (ADS)

    Casas, Ricard; Ballester, Otger; Cardiel-Sas, Laia; Castilla, Javier; Jiménez, Jorge; Maiorino, Marino; Pío, Cristóbal; Sevilla, Ignacio; de Vicente, Juan

    2012-07-01

    The PAU Camera (PAUCam) [1,2] is a wide-field camera that will be mounted at the corrected prime focus of the William Herschel Telescope (Observatorio del Roque de los Muchachos, Canary Islands, Spain) in the coming months. The focal plane of PAUCam is composed of a mosaic of 18 CCD detectors of 2,048 x 4,176 pixels each, with a pixel size of 15 microns, manufactured by Hamamatsu Photonics K. K. This mosaic covers a field of view (FoV) of 60 arcmin, of which 40 arcmin are unvignetted. The behaviour of these 18 devices, plus four spares, and their electronic response must be characterized and optimized for use in PAUCam. This work is being carried out in the laboratories of the ICE/IFAE and the CIEMAT. The electronic optimization of the CCD detectors is performed by means of an OG (Output Gate) scan, maximizing the CTE (Charge Transfer Efficiency) while minimizing the read-out noise. The device characterization itself is obtained with different tests: the photon transfer curve (PTC), which yields the electronic gain, the linearity versus light stimulus, the full-well capacity and the cosmetic defects; and measurements of the read-out noise, the dark current, the stability versus temperature and the light remanence.
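
    The photon transfer method mentioned above can be sketched as follows (an illustrative reduction, not the PAUCam pipeline): in the shot-noise-limited regime the signal variance in ADU grows linearly with the mean, and the inverse of that slope is the gain in e-/ADU.

```python
def ptc_gain(means_adu, variances_adu2):
    """Gain (e-/ADU) from a photon transfer curve: inverse of the
    least-squares slope of variance vs mean, fit through the origin."""
    num = sum(m * v for m, v in zip(means_adu, variances_adu2))
    den = sum(m * m for m in means_adu)
    slope = num / den
    return 1.0 / slope

# Synthetic flat-field levels for a detector with true gain 2 e-/ADU
# (in the shot-noise limit, variance = mean / gain):
means = [100.0, 500.0, 1000.0, 5000.0]
variances = [m / 2.0 for m in means]
gain = ptc_gain(means, variances)   # -> 2.0
```

    A real PTC fit would first subtract the read-noise floor and exclude levels near full well, where the curve rolls over.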

  7. Stereoscopic camera design

    NASA Astrophysics Data System (ADS)

    Montgomery, David J.; Jones, Christopher K.; Stewart, James N.; Smith, Alan

    2002-05-01

    It is clear from the literature that the majority of work in stereoscopic imaging is directed towards the development of modern stereoscopic displays. As costs come down, wider public interest in this technology is expected to increase. This new technology would require new methods of image formation. Advances in stereo computer graphics will of course lead to the creation of new stereo computer games, graphics in films, etc. However, the consumer would also like to see real-world stereoscopic images: pictures of family, holiday snaps, etc. Such scenery would have wide ranges of depth to accommodate and would also need to cope with moving objects, such as cars, and in particular other people. Thus, the consumer acceptance of auto/stereoscopic displays and 3D in general would be greatly enhanced by the existence of a quality stereoscopic camera. This paper will cover an analysis of existing stereoscopic camera designs and show that they can be categorized into four different types, with inherent advantages and disadvantages. A recommendation is then made with regard to 3D consumer still and video photography. The paper will go on to discuss this recommendation and describe its advantages and how it can be realized in practice.

  8. HONEY -- The Honeywell Camera

    NASA Astrophysics Data System (ADS)

    Clayton, C. A.; Wilkins, T. N.

    The Honeywell model 3000 colour graphic recorder system (hereafter referred to simply as Honeywell) has been bought by Starlink for producing publishable quality photographic hardcopy from the IKON image displays. Full colour and black & white images can be recorded on positive or negative 35mm film. The Honeywell consists of a built-in high resolution flat-faced monochrome video monitor, a red/green/blue colour filter mechanism and a 35mm camera. The device works on the direct video signals from the IKON. This means that changing the brightness or contrast on the IKON monitor will not affect any photographs that you take. The video signals from the IKON consist of separate red, green and blue signals. When you take a picture, the Honeywell takes the red, green and blue signals in turn and displays three pictures consecutively on its internal monitor. It takes an exposure through each of three filters (red, green and blue) onto the film in the camera. This builds up the complete colour picture on the film. Honeywell systems are installed at nine Starlink sites, namely Belfast (locally funded), Birmingham, Cambridge, Durham, Leicester, Manchester, Rutherford, ROE and UCL.

  9. Cryogenic Detectors (Narrow Field Instruments)

    NASA Astrophysics Data System (ADS)

    Hoevers, H.; Verhoeve, P.

    Two cryogenic imaging spectrometer arrays are currently considered as focal plane instruments for XEUS. The narrow field imager 1 (NFI 1) will cover the energy range from 0.05 to 3 keV with an energy resolution of 2 eV, or better, at 500 eV. A second narrow field imager (NFI 2) covers the energy range from 1 to 15 keV with an energy resolution of 2 eV (at 1 keV) and 5 eV (at 7 keV), creating some overlap with part of the NFI 1 energy window. Both narrow field imagers have a 0.5 arcmin field of view. Their imaging capabilities are matched to the XEUS optics of 2 to 5 arcsec, leading to 1 arcsec pixels. The detector arrays will be cooled by a closed cycle system comprising a mechanical cooler with a base temperature of 2.5 K and either a low temperature ³He sorption pump providing the very low temperature stage and/or an Adiabatic Demagnetization Refrigerator (ADR). The ADR cooler is explicitly needed to cool the NFI 2 array. The narrow field imager 1: currently a 48 × 48 element array of superconducting tunnel junctions (STJs) is envisaged. Its operating temperature is in the range between 30 and 350 mK. Small, single Ta STJs (20-50 μm on a side) have shown 3.5 eV (FWHM) resolution at E = 525 eV, and small arrays have been successfully demonstrated (6 × 6 pixels) or are currently being tested (10 × 12 pixels). Alternatively, a prototype Distributed Read-Out Imaging Device (DROID), consisting of a linear superconducting Ta absorber of 20 × 100 μm², including a 20 × 20 μm STJ for readout at either end, has shown a measured energy resolution of 2.4 eV (FWHM) at E = 500 eV. Simulations involving the diffusion properties as well as loss and tunnel rates have shown that the performance can be further improved by slight modifications in the geometry, and that the size of the DROIDs can be increased to 0.5-1.0 mm without loss in energy resolution. 
The relatively large areas and good energy resolution compared to single STJs make DROIDs good candidates for the

  10. 100-ps framing-camera tube.

    PubMed

    Kalibjian, R

    1978-07-01

    The optoelectronic framing-camera tube described is capable of recording two-dimensional image frames with high spatial resolution in the <100-ps range. Framing is performed by streaking a two-dimensional electron image across narrow slits. The resulting dissected electron line images from the slits are restored into framed images by a restorer deflector operating synchronously with the dissector deflector. The number of framed images on the tube's viewing screen equals the number of dissecting slits in the tube. Performance has been demonstrated in a prototype tube by recording 135-ps-duration framed images of 2.5-mm patterns at the cathode. The limitation in the framing speed is in the external drivers for the deflectors and not in the tube design characteristics. Faster frame speeds in the <100-ps range can be obtained by use of faster deflection drivers.

  11. The Orbiter camera payload system's large-format camera and attitude reference system

    NASA Technical Reports Server (NTRS)

    Schardt, B. B.; Mollberg, B. H.

    1985-01-01

    The Orbiter camera payload system (OCPS) is an integrated photographic system carried into earth orbit as a payload in the Space Transportation System (STS) Orbiter vehicle's cargo bay. The major component of the OCPS is a large-format camera (LFC), a precision wide-angle cartographic instrument capable of producing high-resolution stereophotography of great geometric fidelity in multiple base-to-height ratios. A secondary and supporting system to the LFC is the attitude reference system (ARS), a dual-lens stellar camera array (SCA) and camera support structure. The SCA is a 70 mm film system that is rigidly mounted to the LFC lens support structure and, through the simultaneous acquisition of two star fields with each earth viewing LFC frame, makes it possible to precisely determine the pointing of the LFC optical axis with reference to the earth nadir point. Other components complete the current OCPS configuration as a high-precision cartographic data acquisition system. The primary design objective for the OCPS was to maximize system performance characteristics while maintaining a high level of reliability compatible with rocket launch conditions and the on-orbit environment. The full OCPS configuration was launched on a highly successful maiden voyage aboard the STS Orbiter vehicle Challenger on Oct. 5, 1984, as a major payload aboard the STS-41G mission.

  12. Wide and narrow dual image guidance system for ground vehicle on fast focusing and stereo matching operation

    NASA Astrophysics Data System (ADS)

    Akiyama, Akira; Kobayashi, Nobuaki; Mutoh, Eiichiro; Kumagai, Hideo; Ishii, Hiromitsu

    2011-10-01

    We have developed a wide and narrow dual-image guidance system for a ground vehicle based on fast focusing and stereo matching. The fast focusing captures distance information about the outside world. Stereo matching on the two focused wide images finds characteristic positions, and fine distance information is then obtained by fast focusing of the narrow images from a camera with a long focal length. Our fast focusing algorithm works precisely on differential images such as the Daubechies-wavelet-transformed high-pass image, the Roberts image, the Prewitt image, the Sobel image and the Laplacian image. After stereo matching on the focused wide images, the two cameras perform the narrow-image focusing operation. This procedure establishes reliable detection of the object and gives fine image information about it. The pointing of the long-focal-length narrow-image camera uses the pixel address information from the stereo matching and a two-axis gimbal of fine resolution. We carried out experiments on detecting objects by stereo matching and on ranging fine distances by narrow-image focusing. The experiments demonstrate detection and pointing of the narrow-image focusing adequate for the guidance of the ground vehicle.
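
    The differential-image focus measures listed above (Laplacian, Sobel, etc.) share one idea: a well-focused image maximizes the energy of its high-pass response. A minimal pure-Python sketch of a Laplacian focus measure (illustrative, not the authors' algorithm):

```python
def laplacian_energy(img):
    """Sum of squared 4-neighbour Laplacian responses; higher = sharper."""
    h, w = len(img), len(img[0])
    total = 0.0
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            lap = (img[y - 1][x] + img[y + 1][x] + img[y][x - 1] + img[y][x + 1]
                   - 4 * img[y][x])
            total += lap * lap
    return total

# A sharp edge scores higher than the same edge after blurring:
sharp   = [[0, 0, 9, 9]] * 4
blurred = [[0, 3, 6, 9]] * 4
```

    An autofocus loop would sweep the lens and keep the position that maximizes this score.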

  13. Transmission electron microscope CCD camera

    DOEpatents

    Downing, Kenneth H.

    1999-01-01

    In order to improve the performance of a CCD camera on a high voltage electron microscope, an electron decelerator is inserted between the microscope column and the CCD. This arrangement optimizes the interaction of the electron beam with the scintillator of the CCD camera while retaining optimization of the microscope optics and of the interaction of the beam with the specimen. Changing the electron beam energy between the specimen and camera allows both to be optimized.

  14. Tunable polarization plasma channel undulator for narrow bandwidth photon emission

    NASA Astrophysics Data System (ADS)

    Rykovanov, S. G.; Wang, J. W.; Kharin, V. Yu.; Lei, B.; Schroeder, C. B.; Geddes, C. G. R.; Esarey, E.; Leemans, W. P.

    2016-09-01

    The theory of a plasma undulator excited by a short intense laser pulse in a parabolic plasma channel is presented. The undulator fields are generated by a laser pulse incident off-axis and/or at an angle with respect to the channel axis. Linear plasma theory is used to derive the wakefield structure. It is shown that the electrons injected into the plasma wakefields experience betatron motion and undulator oscillations. Optimal electron beam injection conditions are derived for minimizing the amplitude of the betatron motion, producing narrow-bandwidth undulator radiation. Polarization control is readily achieved by varying the laser pulse injection conditions.

  15. Initiation of granular surface flows in a narrow channel

    NASA Astrophysics Data System (ADS)

    Jop, Pierre; Forterre, Yoël; Pouliquen, Olivier

    2007-08-01

    We experimentally investigate how a long granular pile confined in a narrow channel destabilizes when it is inclined above the angle of repose. A uniform flow then develops, which is localized at the free surface. It first accelerates before reaching a steady uniform regime. During this process, an apparent erosion is observed and the thickness of the flowing layer increases. We precisely study the evolution of the vertical velocity profile in this transient regime. The measurements are compared with the prediction of a viscoplastic model [Jop et al., Nature 441, 727 (2006)].

  16. On 3D radar data visualization and merging with camera images

    NASA Astrophysics Data System (ADS)

    Kjellgren, J.

    2008-10-01

    The possibilities of supporting the visual interpretation of spatial 3D radar data, both with and without camera images, are studied. Radar measurements and camera pictures of a person are analyzed. First, the received signal amplitudes, distributed in three dimensions (spherical range and two angles), are fed to a selection procedure based on amplitude and the scene volume of interest. A number of resolution cells then form images based on a volume representation depending upon amplitude and location. Projecting the images of all the cells onto an imaging plane then forms the total image. Different images of a radar data set are generated for different projection planes. The images were studied to find efficient aspect angles for extracting the target information of most interest; such a search may be performed by rotating the target data around a suitable axis. In addition, a visualization method for presenting radar data merged with a camera picture has been developed. An aim in this part of the work has been to preserve the high information content of the camera image in the merged image. From the 3D radar measurements, the radar data may be projected onto the imaging plane of a camera with an arbitrary viewing center. This possibility is presented in examples with one camera looking at the target scene from the radar location and another camera looking from an aspect angle differing by 45° from that of the radar.
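
    The merging step described above amounts to converting each radar cell from spherical coordinates to Cartesian and projecting it through a pinhole camera model. A minimal sketch, assuming a camera co-located with the radar with its boresight along +x (the coordinate and sign conventions are this sketch's assumptions, not the author's implementation):

```python
import math

def radar_cell_to_pixel(rng, az, el, focal_px, cx, cy):
    """Project a radar cell (range m, azimuth/elevation rad) onto the image
    plane of a pinhole camera at the radar, boresight along +x."""
    x = rng * math.cos(el) * math.cos(az)    # forward
    y = rng * math.cos(el) * math.sin(az)    # left
    z = rng * math.sin(el)                   # up
    if x <= 0:
        return None                          # cell behind the camera
    u = cx - focal_px * y / x                # image column
    v = cy - focal_px * z / x                # image row
    return u, v

# A cell straight ahead lands on the principal point:
center = radar_cell_to_pixel(10.0, 0.0, 0.0, 800.0, 320.0, 240.0)
```

    For the 45°-offset camera in the paper, the same projection applies after rotating the radar points into that camera's frame.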

  17. Wide-angle imaging system with fiberoptic components providing angle-dependent virtual material stops

    NASA Technical Reports Server (NTRS)

    Vaughan, Arthur H. (Inventor)

    1993-01-01

    A strip imaging wide angle optical system is provided. The optical system is provided with a 'virtual' material stop to avoid aberrational effects inherent in wide angle optical systems. The optical system includes a spherical mirror section for receiving light from a 180 deg strip or arc of a target image. Light received by the spherical mirror section is reflected to a frustoconical mirror section for subsequent rereflection to a row of optical fibers. Each optical fiber transmits a portion of the received light to a detector. The optical system exploits the narrow cone of acceptance associated with optical fibers to substantially eliminate vignetting effects inherent in wide angle systems. Further, the optical system exploits the narrow cone of acceptance of the optical fibers to substantially limit spherical aberration. The optical system is ideally suited for any application wherein a 180 deg strip image need be detected, and is particularly well adapted for use in hostile environments such as in planetary exploration.
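
    The "narrow cone of acceptance" exploited above is set by the fiber's numerical aperture. As a standard textbook relation (not taken from the patent text), NA = sqrt(n_core² − n_clad²) for a step-index fiber, and the acceptance half-angle in air is asin(NA):

```python
import math

def acceptance_half_angle_deg(n_core, n_clad):
    """Half-angle (degrees) of the acceptance cone, in air, of a step-index fiber."""
    na = math.sqrt(n_core ** 2 - n_clad ** 2)
    return math.degrees(math.asin(na))

# Typical step-index indices give a cone of only ~14 degrees half-angle,
# which is what limits vignetting and spherical aberration in this design:
angle = acceptance_half_angle_deg(1.48, 1.46)
```

    Rays arriving outside this cone are simply not guided, so each fiber acts as its own angle-dependent stop.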

  18. Neutron Imaging Camera

    NASA Technical Reports Server (NTRS)

    Hunter, Stanley D.; DeNolfo, Georgia; Floyd, Sam; Krizmanic, John; Link, Jason; Son, Seunghee; Guardala, Noel; Skopec, Marlene; Stark, Robert

    2008-01-01

    We describe the Neutron Imaging Camera (NIC) being developed for DTRA applications by NASA/GSFC and NSWC/Carderock. The NIC is based on the Three-dimensional Track Imager (3-DTI) technology developed at GSFC for gamma-ray astrophysics applications. The 3-DTI, a large volume time-projection chamber, provides accurate, approximately 0.4 mm resolution, 3-D tracking of charged particles. The incident direction of fast neutrons, E_n > 0.5 MeV, is reconstructed from the momenta and energies of the proton and triton fragments resulting from ³He(n,p)³H interactions in the 3-DTI volume. We present angular and energy resolution performance of the NIC derived from accelerator tests.

  19. LRO Camera Imaging of Potential Landing Sites in the South Pole-Aitken Basin

    NASA Astrophysics Data System (ADS)

    Jolliff, B. L.; Wiseman, S. M.; Gibson, K. E.; Lauber, C.; Robinson, M.; Gaddis, L. R.; Scholten, F.; Oberst, J.; LROC Science; Operations Team

    2010-12-01

    We show results of WAC (Wide Angle Camera) and NAC (Narrow Angle Camera) imaging of candidate landing sites within the South Pole-Aitken (SPA) basin of the Moon obtained by the Lunar Reconnaissance Orbiter during the first full year of operation. These images enable a greatly improved delineation of geologic units, determination of unit thicknesses and stratigraphy, and detailed surface characterization that has not been possible with previous data. WAC imaging encompasses the entire SPA basin, located within an area ranging from ~ 130-250 degrees east longitude and ~15 degrees south latitude to the South Pole, at different incidence angles, with the specific range of incidence dependent on latitude. The WAC images show morphology and surface detail at better than 100 m per pixel, with spatial coverage and quality unmatched by previous data sets. NAC images reveal details at the sub-meter pixel scale that enable new ways to evaluate the origins and stratigraphy of deposits. Key among new results is the capability to discern extents of ancient volcanic deposits that are covered by later crater ejecta (cryptomare) [see Petro et al., this conference] using new, complementary color data from Kaguya and Chandrayaan-1. Digital topographic models derived from WAC and NAC geometric stereo coverage show broad intercrater-plains areas where slopes are acceptably low for high-probability safe landing [see Archinal et al., this conference]. NAC images allow mapping and measurement of small, fresh craters that excavated boulders and thus provide information on surface roughness and depth to bedrock beneath regolith and plains deposits. We use these data to estimate deposit thickness in areas of interest for landing and potential sample collection to better understand the possible provenance of samples. 
Also, small regions marked by fresh impact craters and their associated boulder fields are readily identified by their bright ejecta patterns and marked as lander keep-out zones
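
    The landing-slope assessment mentioned above reduces to estimating the gradient of a digital topographic model. A simplified sketch on a toy elevation grid (not the LROC stereo pipeline): slope = atan(|∇z|) from central finite differences.

```python
import math

def slope_deg(dtm, y, x, post_m):
    """Surface slope (degrees) at grid cell (y, x) from central differences
    of an elevation grid with the given post spacing in meters."""
    dzdx = (dtm[y][x + 1] - dtm[y][x - 1]) / (2 * post_m)
    dzdy = (dtm[y + 1][x] - dtm[y - 1][x]) / (2 * post_m)
    return math.degrees(math.atan(math.hypot(dzdx, dzdy)))

# Toy 3x3 DTM (elevations in meters) with a uniform tilt in x:
dtm = [[0.0, 1.0, 2.0],
       [0.0, 1.0, 2.0],
       [0.0, 1.0, 2.0]]
s = slope_deg(dtm, 1, 1, 10.0)   # 1 m rise per 10 m post -> ~5.7 degrees
```

    A landing-site map would apply this at every interior cell and threshold against the lander's slope tolerance.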

  20. Narrow gap electronegative capacitive discharges

    SciTech Connect

    Kawamura, E.; Lieberman, M. A.; Lichtenberg, A. J.

    2013-10-15

    Narrow gap electronegative (EN) capacitive discharges are widely used in industry and have unique features not found in conventional discharges. In this paper, plasma parameters are determined over a range of decreasing gap length L from values for which an electropositive (EP) edge exists (2-region case) to smaller L-values for which the EN region connects directly to the sheath (1-region case). Parametric studies are performed at applied voltage V_rf = 500 V for pressures of 10, 25, 50, and 100 mTorr, and additionally at 50 mTorr for 1000 and 2000 V. Numerical results are given for a parallel plate oxygen discharge using a planar 1D3v (1 spatial dimension, 3 velocity components) particle-in-cell (PIC) code. New interesting phenomena are found for the case in which an EP edge does not exist. This 1-region case has not previously been investigated in detail, either numerically or analytically. In particular, attachment in the sheaths is important, and the central electron density n_e0 is depressed below the density n_esh at the sheath edge. The sheath oscillations also extend into the EN core, creating an edge region lying within the sheath and not characterized by the standard diffusion in an EN plasma. An analytical model is developed using minimal inputs from the PIC results, and compared to the PIC results for a base case at V_rf = 500 V and 50 mTorr, showing good agreement. Selected comparisons are made at the other voltages and pressures. A self-consistent model is also developed and compared to the PIC results, giving reasonable agreement.

  1. A view of the ET camera on STS-112

    NASA Technical Reports Server (NTRS)

    2002-01-01

    KENNEDY SPACE CENTER, FLA. - A view of the camera mounted on the external tank of Space Shuttle Atlantis. The color video camera mounted to the top of Atlantis' external tank will provide a view of the front and belly of the orbiter and a portion of the solid rocket boosters (SRBs) and external tank during the launch of Atlantis on mission STS-112. It will offer the STS-112 team an opportunity to monitor the shuttle's performance from a new angle. The camera will be turned on fifteen minutes prior to launch and will show the orbiter and solid rocket boosters on the launch pad. The video will be downlinked from the external tank during flight to several NASA data-receiving sites and then relayed to the live television broadcast. The camera is expected to operate for about 15 minutes following liftoff. At liftoff, viewers will see the shuttle clearing the launch tower and, at two minutes after liftoff, see the right SRB separate from the external tank. When the external tank separates from Atlantis about eight minutes into the flight, the camera is expected to continue its live feed for about six more minutes although NASA may be unable to pick up the camera's signal because the tank may have moved out of range.

  2. A view of the ET camera on STS-112

    NASA Technical Reports Server (NTRS)

    2002-01-01

    KENNEDY SPACE CENTER, FLA. - A closeup view of the camera mounted on the external tank of Space Shuttle Atlantis. The color video camera mounted to the top of Atlantis' external tank will provide a view of the front and belly of the orbiter and a portion of the solid rocket boosters (SRBs) and external tank during the launch of Atlantis on mission STS-112. It will offer the STS-112 team an opportunity to monitor the shuttle's performance from a new angle. The camera will be turned on fifteen minutes prior to launch and will show the orbiter and solid rocket boosters on the launch pad. The video will be downlinked from the external tank during flight to several NASA data-receiving sites and then relayed to the live television broadcast. The camera is expected to operate for about 15 minutes following liftoff. At liftoff, viewers will see the shuttle clearing the launch tower and, at two minutes after liftoff, see the right SRB separate from the external tank. When the external tank separates from Atlantis about eight minutes into the flight, the camera is expected to continue its live feed for about six more minutes although NASA may be unable to pick up the camera's signal because the tank may have moved out of range.

  3. Narrow band 3 × 3 Mueller polarimetric endoscopy

    PubMed Central

    Qi, Ji; Ye, Menglong; Singh, Mohan; Clancy, Neil T.; Elson, Daniel S.

    2013-01-01

    Mueller matrix polarimetric imaging has shown potential in tissue diagnosis but is challenging to implement endoscopically. In this work, a narrow band 3 × 3 Mueller matrix polarimetric endoscope was designed by rotating the endoscope to generate 0°, 45° and 90° linearly polarized illumination and positioning a rotating filter wheel in front of the camera containing three polarisers to permit polarization state analysis for backscattered light. The system was validated with a rotating linear polarizer and a diffuse reflection target. Initial measurements of 3 × 3 Mueller matrices on a rat are demonstrated, followed by matrix decomposition into the depolarization and retardance matrices for further analysis. Our work shows the feasibility of implementing polarimetric imaging in a rigid endoscope conveniently and economically in order to reveal diagnostic information. PMID:24298405
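
    The 3 × 3 measurement scheme described above (generator at 0°, 45° and 90°; analyzer at 0°, 45° and 90°) determines all nine elements of the linear Mueller matrix. A minimal sketch of the inversion, assuming ideal polarizers and the standard Stokes convention (illustrative, not the authors' reconstruction code): the analyzer at angle φ measures ½(S0 + S1·cos2φ + S2·sin2φ).

```python
def mueller_3x3(I):
    """Recover the linear 3x3 Mueller matrix from nine intensities I[g][a],
    with generator and analyzer each at 0, 45 and 90 degrees."""
    # For each generator, recover the exiting Stokes vector t_g = M @ s_g
    # from the three analyzer readings (I0, I45, I90):
    t = []
    for I0, I45, I90 in I:
        t0 = I0 + I90          # S0 of the exiting beam
        t1 = I0 - I90          # S1
        t2 = 2 * I45 - t0      # S2
        t.append((t0, t1, t2))
    # Generator Stokes vectors are s(0)=(1,1,0), s(45)=(1,0,1), s(90)=(1,-1,0),
    # so the Mueller columns follow by linear combination:
    c0 = [(a + b) / 2 for a, b in zip(t[0], t[2])]
    c1 = [(a - b) / 2 for a, b in zip(t[0], t[2])]
    c2 = [a - b for a, b in zip(t[1], c0)]
    return [[c0[i], c1[i], c2[i]] for i in range(3)]

# Check against an ideal horizontal polarizer, M = 0.5*[[1,1,0],[1,1,0],[0,0,0]]:
I = [[1.0, 0.5, 0.0],     # generator at 0 deg
     [0.5, 0.25, 0.0],    # generator at 45 deg
     [0.0, 0.0, 0.0]]     # generator at 90 deg
M = mueller_3x3(I)
```

    The recovered matrix would then feed the depolarization/retardance decomposition mentioned in the abstract.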

  4. CPAPIR: a wide-field infrared camera for the Observatoire du Mont Megantic

    NASA Astrophysics Data System (ADS)

    Artigau, Etienne; Doyon, Rene; Vallee, Philippe; Riopel, Martin; Nadeau, Daniel

    2004-09-01

    CPAPIR is a wide-field infrared camera for use at the Observatoire du mont Megantic and CTIO 1.5 m telescopes. The camera will be primarily a survey instrument with a half-degree field of view, making it one of the most efficient of its kind. CPAPIR will provide broad and narrow band filters within its 0.8 to 2.5 μm bandpass. The camera is based on a Hawaii-2 2048x2048 HgCdTe detector.

  5. Mars Exploration Rover engineering cameras

    USGS Publications Warehouse

    Maki, J.N.; Bell, J.F.; Herkenhoff, K. E.; Squyres, S. W.; Kiely, A.; Klimesh, M.; Schwochert, M.; Litwin, T.; Willson, R.; Johnson, Aaron H.; Maimone, M.; Baumgartner, E.; Collins, A.; Wadsworth, M.; Elliot, S.T.; Dingizian, A.; Brown, D.; Hagerott, E.C.; Scherr, L.; Deen, R.; Alexander, D.; Lorre, J.

    2003-01-01

    NASA's Mars Exploration Rover (MER) Mission will place a total of 20 cameras (10 per rover) onto the surface of Mars in early 2004. Fourteen of the 20 cameras are designated as engineering cameras and will support the operation of the vehicles on the Martian surface. Images returned from the engineering cameras will also be of significant importance to the scientific community for investigative studies of rock and soil morphology. The Navigation cameras (Navcams, two per rover) are a mast-mounted stereo pair, each with a 45° square field of view (FOV) and an angular resolution of 0.82 milliradians per pixel (mrad/pixel). The Hazard Avoidance cameras (Hazcams, four per rover) are a body-mounted, front- and rear-facing set of stereo pairs, each with a 124° square FOV and an angular resolution of 2.1 mrad/pixel. The Descent camera (one per rover), mounted to the lander, has a 45° square FOV and will return images with spatial resolutions of ~4 m/pixel. All of the engineering cameras utilize broadband visible filters and 1024 x 1024 pixel detectors. Copyright 2003 by the American Geophysical Union.

  6. Camera artifacts in IUE spectra

    NASA Technical Reports Server (NTRS)

    Bruegman, O. W.; Crenshaw, D. M.

    1994-01-01

    This study of emission-line-mimicking features in the IUE cameras has produced an atlas of artifacts in high-dispersion images, with an accompanying table of prominent artifacts, a table of prominent artifacts in the raw images, and a median image of the sky background for each IUE camera.

  7. An Educational PET Camera Model

    ERIC Educational Resources Information Center

    Johansson, K. E.; Nilsson, Ch.; Tegner, P. E.

    2006-01-01

    Positron emission tomography (PET) cameras are now in widespread use in hospitals. A model of a PET camera has been installed in Stockholm House of Science and is used to explain the principles of PET to school pupils as described here.

  8. The "All Sky Camera Network"

    ERIC Educational Resources Information Center

    Caldwell, Andy

    2005-01-01

    In 2001, the "All Sky Camera Network" came to life as an outreach program to connect the Denver Museum of Nature and Science (DMNS) exhibit "Space Odyssey" with Colorado schools. The network is comprised of cameras placed strategically at schools throughout Colorado to capture fireballs--rare events that produce meteorites.…

  9. SEOS frame camera applications study

    NASA Technical Reports Server (NTRS)

    1974-01-01

    A research and development satellite is discussed that will provide opportunities for observing transient phenomena falling within the fixed viewing circle of the spacecraft. Possible applications of frame cameras for SEOS are evaluated, and the computed lens characteristics for each camera are listed.

  10. Radiation camera motion correction system

    DOEpatents

    Hoffer, P.B.

    1973-12-18

    The device determines the ratio of the intensity of radiation received by a radiation camera from two separate portions of the object. A correction signal is developed to maintain this ratio at a substantially constant value and this correction signal is combined with the camera signal to correct for object motion. (Official Gazette)

  11. Narrow-band radiation wavelength measurement by processing digital photographs in RAW format

    SciTech Connect

    Kraiskii, A V; Mironova, T V; Sultanov, T T

    2012-12-31

    The technique of measuring the mean wavelength of narrow-band radiation in the 455-625 nm range using the image of the emitting surface is presented. The data from the camera array unprocessed by the built-in processor (RAW format) are used. The method is applied for determining the parameters of response of holographic sensors. Depending on the wavelength and brightness of the image fragment, the mean square deviation of the wavelength amounts to 0.3-3 nm. (experimental techniques)
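The abstract does not spell out the algorithm; as an illustration of the general channel-ratio idea behind such measurements, here is a heavily hedged sketch in which the linear calibration curve is entirely made up — a real sensor would need a measured, likely nonlinear, per-camera calibration:

```python
import numpy as np

# Hypothetical calibration: wavelength (nm) as a function of the red/green
# channel ratio, fitted beforehand on known narrow-band sources.
# These coefficients are invented for illustration only.
CAL = np.poly1d([120.0, 530.0])   # wavelength ≈ 120 * (R/G) + 530


def mean_wavelength(raw_r: np.ndarray, raw_g: np.ndarray) -> float:
    """Estimate the mean wavelength of a narrow-band source from the
    dark-subtracted RAW red and green channel sums of an image fragment."""
    ratio = raw_r.sum() / raw_g.sum()
    return float(CAL(ratio))
```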

  12. Coherent infrared imaging camera (CIRIC)

    SciTech Connect

    Hutchinson, D.P.; Simpson, M.L.; Bennett, C.A.; Richards, R.K.; Emery, M.S.; Crutcher, R.I.; Sitter, D.N. Jr.; Wachter, E.A.; Huston, M.A.

    1995-07-01

    New developments in 2-D, wide-bandwidth HgCdTe (MCT) and GaAs quantum-well infrared photodetectors (QWIP) coupled with Monolithic Microwave Integrated Circuit (MMIC) technology are now making focal plane array coherent infrared (IR) cameras viable. Unlike conventional IR cameras which provide only thermal data about a scene or target, a coherent camera based on optical heterodyne interferometry will also provide spectral and range information. Each pixel of the camera, consisting of a single photo-sensitive heterodyne mixer followed by an intermediate frequency amplifier and illuminated by a separate local oscillator beam, constitutes a complete optical heterodyne receiver. Applications of coherent IR cameras are numerous and include target surveillance, range detection, chemical plume evolution, monitoring stack plume emissions, and wind shear detection.
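The heterodyne principle each pixel implements optically can be illustrated numerically: mixing a signal with a local oscillator on a square-law detector produces a beat at the intermediate frequency |f_sig − f_LO|, which carries the spectral information. A scaled-down simulation (kHz-range frequencies stand in for the optical ones):

```python
import numpy as np

fs = 1e6                        # sample rate (Hz)
n = 10_000
t = np.arange(n) / fs           # 10 ms of samples
f_sig, f_lo = 120e3, 100e3      # stand-ins for the optical frequencies

sig = np.cos(2 * np.pi * f_sig * t)
lo = np.cos(2 * np.pi * f_lo * t)

# A square-law detector responds to total intensity; the cross term
# beats at the intermediate frequency |f_sig - f_lo| = 20 kHz.
detected = (sig + lo) ** 2
spectrum = np.abs(np.fft.rfft(detected))
freqs = np.fft.rfftfreq(n, 1 / fs)

band = (freqs > 1e3) & (freqs < 50e3)          # ignore DC and harmonics
f_if = freqs[band][np.argmax(spectrum[band])]  # recovered IF beat
```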

  13. In-flight calibration of the Dawn Framing Camera II: Flat fields and stray light correction

    NASA Astrophysics Data System (ADS)

    Schröder, S. E.; Mottola, S.; Matz, K.-D.; Roatsch, T.

    2014-05-01

    The NASA Dawn spacecraft acquired thousands of images of asteroid Vesta during its year-long orbital tour, and is now on its way to asteroid Ceres. A method for calibrating images acquired by the onboard Framing Camera was described by Schröder et al. (Schröder et al. [2013]. Icarus 226, 1304). However, their method is only valid for point sources. In this paper we extend the calibration to images of extended sources like Vesta. For this, we devise a first-order correction for in-field stray light, which is known to plague images taken through the narrow-band filters, and revise the flat fields that were acquired in an integrating sphere before launch. We used calibrated images of the Vesta surface to construct simple photometric models for all filters that allow us to study how the spectrum changes with increasing phase angle (phase reddening). In combination with these models, our calibration method can be used to create near-seamless mosaics that are radiometrically accurate to a few percent. Such mosaics are provided in JVesta, the Vesta version of the JMARS geographic information system.
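A toy version of such a calibration chain might look as follows; the uniform stray-light fraction is a deliberate oversimplification of the paper's first-order in-field correction, and is included for illustration only:

```python
import numpy as np

def calibrate(raw: np.ndarray, dark: np.ndarray, flat: np.ndarray,
              stray_frac: float = 0.01) -> np.ndarray:
    """Toy radiometric pipeline: dark subtraction, a crude in-field
    stray-light correction (a uniform fraction of the mean scene signal,
    a simplification of the paper's first-order method), then
    flat-fielding with a unity-normalized flat."""
    img = raw.astype(float) - dark        # remove dark/bias level
    img -= stray_frac * img.mean()        # subtract stray-light estimate
    return img / flat                     # correct pixel-to-pixel response
```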

  14. A survey of Martian dust devil activity using Mars Global Surveyor Mars Orbiter Camera images

    NASA Astrophysics Data System (ADS)

    Fisher, Jenny A.; Richardson, Mark I.; Newman, Claire E.; Szwast, Mark A.; Graf, Chelsea; Basu, Shabari; Ewald, Shawn P.; Toigo, Anthony D.; Wilson, R. John

    2005-03-01

    A survey of dust devils using the Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) wide- and narrow-angle (WA and NA) images has been undertaken. The survey comprises two parts: (1) sampling of nine broad regions from September 1997 to July 2001 and (2) a focused seasonal monitoring of variability in the Amazonis region, an active dust devil site, from March 2001 to April 2004. For part 1, dust devils were identified in NA and WA images, and dust devil tracks were identified in NA images. Great spatial variability in dust devil occurrence is highlighted, with Amazonis Planitia being the most active region examined. Other active regions included Cimmerium, Sinai, and Solis. Numerous dust devil tracks, but very few dust devils, were observed in Casius. This may suggest dust devils here occur at local times other than that of the MGS orbit (~2 pm). Alternatively, variations in surface properties may affect the ability of dust devils to leave visible tracks. The seasonal campaign within Amazonis shows a relatively smooth variation of dust devil activity with season, peaking in mid northern summer and falling to zero in southern spring and summer. This pattern of activity correlates well with the boundary layer maximum depth and hence the vigor of convection. Global maps of boundary layer depth and surface temperature do not predict that Amazonis should be especially active, potentially suggesting a role for mesoscale circulations. Measurement of observed dust devils yields heights of up to 8 km and widths in excess of 0.5 km.

  15. Sublimation of icy aggregates in the coma of comet 67P/Churyumov-Gerasimenko detected with the OSIRIS cameras on board Rosetta

    NASA Astrophysics Data System (ADS)

    Gicquel, A.; Vincent, J.-B.; Agarwal, J.; A'Hearn, M. F.; Bertini, I.; Bodewits, D.; Sierks, H.; Lin, Z.-Y.; Barbieri, C.; Lamy, P. L.; Rodrigo, R.; Koschny, D.; Rickman, H.; Keller, H. U.; Barucci, M. A.; Bertaux, J.-L.; Besse, S.; Cremonese, G.; Da Deppo, V.; Davidsson, B.; Debei, S.; Deller, J.; De Cecco, M.; Frattin, E.; El-Maarry, M. R.; Fornasier, S.; Fulle, M.; Groussin, O.; Gutiérrez, P. J.; Gutiérrez-Marquez, P.; Güttler, C.; Höfner, S.; Hofmann, M.; Hu, X.; Hviid, S. F.; Ip, W.-H.; Jorda, L.; Knollenberg, J.; Kovacs, G.; Kramm, J.-R.; Kührt, E.; Küppers, M.; Lara, L. M.; Lazzarin, M.; Moreno, J. J. Lopez; Lowry, S.; Marzari, F.; Masoumzadeh, N.; Massironi, M.; Moreno, F.; Mottola, S.; Naletto, G.; Oklay, N.; Pajola, M.; Pommerol, A.; Preusker, F.; Scholten, F.; Shi, X.; Thomas, N.; Toth, I.; Tubiana, C.

    2016-11-01

    Beginning in 2014 March, the OSIRIS (Optical, Spectroscopic, and Infrared Remote Imaging System) cameras began capturing images of the nucleus and coma (gas and dust) of comet 67P/Churyumov-Gerasimenko using both the wide angle camera (WAC) and the narrow angle camera (NAC). The many observations taken since July of 2014 have been used to study the morphology, location, and temporal variation of the comet's dust jets. We analysed the dust monitoring observations shortly after the southern vernal equinox on 2015 May 30 and 31 with the WAC at the heliocentric distance Rh = 1.53 AU, where it is possible to observe that the jet rotates with the nucleus. We found that the decline of brightness as a function of the distance of the jet is much steeper than the background coma, which is a first indication of sublimation. We adapted a model of sublimation of icy aggregates and studied the effect as a function of the physical properties of the aggregates (composition and size). The major finding of this paper was that through the sublimation of the aggregates of dirty grains (radius a between 5 and 50 μm) we were able to completely reproduce the radial brightness profile of a jet beyond 4 km from the nucleus. To reproduce the data, we needed to inject a number of aggregates between 8.5 × 10¹³ and 8.5 × 10¹⁰ for a = 5 and 50 μm, respectively, or an initial mass of H2O ice around 22 kg.

  16. Comparison and evaluation of datasets for off-angle iris recognition

    NASA Astrophysics Data System (ADS)

    Kurtuncu, Osman M.; Cerme, Gamze N.; Karakaya, Mahmut

    2016-05-01

    In this paper, we investigated the publicly available iris recognition datasets and their data capture procedures in order to determine whether they are suitable for stand-off iris recognition research. The majority of iris recognition datasets include only frontal iris images. Even where a few datasets include off-angle iris images, the frontal and off-angle images are not captured at the same time. Comparison of frontal and off-angle iris images shows not only differences in gaze angle but also changes in pupil dilation and accommodation. In order to isolate the effect of the gaze angle from other challenging issues, including dilation and accommodation, the frontal and off-angle iris images should be captured at the same time using two different cameras. Therefore, in this work we developed an iris image acquisition platform using two cameras, where one camera captures frontal iris images and the other captures iris images from off-angle. Based on a comparison of the Hamming distances between frontal and off-angle iris images captured with the two-camera and one-camera setups, we observed that the Hamming distance in the two-camera setup is lower than in the one-camera setup, by margins ranging from 0.001 to 0.05. These results show that, for accurate off-angle iris recognition research, a two-camera setup is necessary in order to distinguish the challenging issues from one another.
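The fractional Hamming distance used to compare iris templates is conventionally computed over the bits that are valid in both templates; a standard sketch (not the authors' specific implementation):

```python
import numpy as np

def hamming_distance(code_a: np.ndarray, code_b: np.ndarray,
                     mask: np.ndarray) -> float:
    """Fractional Hamming distance between two binary iris codes,
    counting only bits flagged valid in both templates (mask == True)."""
    disagree = np.logical_xor(code_a, code_b) & mask
    return disagree.sum() / mask.sum()
```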

  17. Camera sensitivity study

    NASA Astrophysics Data System (ADS)

    Schlueter, Jonathan; Murphey, Yi L.; Miller, John W. V.; Shridhar, Malayappan; Luo, Yun; Khairallah, Farid

    2004-12-01

    As the cost/performance ratio of vision systems improves with time, new classes of applications become feasible. One such area, automotive applications, is currently being investigated. Applications include occupant detection, collision avoidance and lane tracking. Interest in occupant detection has been spurred by federal automotive safety rules in response to injuries and fatalities caused by deployment of occupant-side air bags. In principle, a vision system could control airbag deployment to prevent this type of mishap. Employing vision technology here, however, presents a variety of challenges, including controlling costs, the inability to control illumination, developing and training a reliable classification system, and loss of performance caused by production variations arising from manufacturing tolerances and customer options. This paper describes the measures that have been developed to evaluate the sensitivity of an occupant detection system to these types of variations. Two procedures are described for evaluating how sensitive the classifier is to camera variations. The first procedure is based on classification accuracy while the second evaluates feature differences.

  18. Proportional counter radiation camera

    DOEpatents

    Borkowski, C.J.; Kopp, M.K.

    1974-01-15

    A gas-filled proportional counter camera that images photon emitting sources is described. A two-dimensional, position-sensitive proportional multiwire counter is provided as the detector. The counter consists of a high-voltage anode screen sandwiched between orthogonally disposed planar arrays of multiple parallel strung, resistively coupled cathode wires. Two terminals from each of the cathode arrays are connected to separate timing circuitry to obtain separate X and Y coordinate signal values from pulse shape measurements to define the position of an event within the counter arrays which may be recorded by various means for data display. The counter is further provided with a linear drift field which effectively enlarges the active gas volume of the counter and constrains the recoil electrons produced from ionizing radiation entering the counter to drift perpendicularly toward the planar detection arrays. A collimator is interposed between a subject to be imaged and the counter to transmit only the radiation from the subject which has a perpendicular trajectory with respect to the planar cathode arrays of the detector. (Official Gazette)

  19. Critical Heat Flux In Inclined Rectangular Narrow Long Channel

    SciTech Connect

    J. L. Rempe; S. W. Noh; Y. H. Kim; K. Y. Suh; F.B.Cheung; S. B. Kim

    2005-05-01

    In the TMI-2 accident, the lower part of the reactor pressure vessel was overheated and then rather rapidly cooled down, as was later identified in a vessel investigation project. This suggested the feasibility of gap cooling. For this reason, several investigations were performed to determine the critical heat flux (CHF) from the standpoint of in-vessel retention. The experiments were conducted to investigate the general boiling phenomena and the triggering mechanism for the CHF in a narrow gap, using a 5 × 105 mm² crevice-type heater assembly and de-mineralized water. The test parameters include a gap size of 5 mm and surface orientation angles from the downward-facing position (180°) to the vertical position (90°). The orientation angle affects the bubble layer and escape from the narrow gap. The CHF is less than that in a shorter channel, compared with the previous experiments having a heated length of 35 mm in the copper test section.

  20. Infants Experience Perceptual Narrowing for Nonprimate Faces

    ERIC Educational Resources Information Center

    Simpson, Elizabeth A.; Varga, Krisztina; Frick, Janet E.; Fragaszy, Dorothy

    2011-01-01

    Perceptual narrowing--a phenomenon in which perception is broad from birth, but narrows as a function of experience--has previously been tested with primate faces. In the first 6 months of life, infants can discriminate among individual human and monkey faces. Though the ability to discriminate monkey faces is lost after about 9 months, infants…

  1. Narrow band gap amorphous silicon semiconductors

    DOEpatents

    Madan, A.; Mahan, A.H.

    1985-01-10

    Disclosed is a narrow band gap amorphous silicon semiconductor comprising an alloy of amorphous silicon and a band gap narrowing element selected from the group consisting of Sn, Ge, and Pb, with an electron donor dopant selected from the group consisting of P, As, Sb, Bi and N. The process for producing the narrow band gap amorphous silicon semiconductor comprises the steps of forming an alloy comprising amorphous silicon and at least one of the aforesaid band gap narrowing elements in amount sufficient to narrow the band gap of the silicon semiconductor alloy below that of amorphous silicon, and also utilizing sufficient amounts of the aforesaid electron donor dopant to maintain the amorphous silicon alloy as an n-type semiconductor.

  2. Optimal Number of Angle Images for Calculating Anterior Angle Volume and Iris Volume Measurements

    PubMed Central

    Blieden, Lauren S.; Chuang, Alice Z.; Baker, Laura A.; Bell, Nicholas P.; Fuller, Timothy S.; Mankiewicz, Kimberly A.; Feldman, Robert M.

    2015-01-01

    Purpose. We determined the optimal number of angle images required to obtain reliable measurements of trabecular-iris circumferential volume (TICV) and iris volume (IV) using swept-source Fourier domain anterior segment optical coherence tomography (SSFD-ASOCT) scans in narrow angle eyes. Methods. Scleral spur landmarks (SSL) were manually identified on ASOCT angle images from 128 meridians from each of 24 eyes with chronic primary angle closure (PAC) spectrum of disease. The anterior and posterior corneal curves, and the anterior and posterior iris surfaces were identified automatically by the anterior chamber analysis and interpretation (ACAI) software, then manually examined and edited by the reader if required. Trabecular-iris circumferential volume at 750 μm from SSL (TICV750) and IV were subsequently calculated using varying numbers of angle images. Threshold error was determined to be less than the lower 95% confidence limit of mean absolute percent error (MAPE) of the change in TICV or IV resulting from laser peripheral iridotomy, which would be 17% for TICV and 5% for IV, based on previous studies. The optimal number of angle images was the smallest number of images where MAPE was less than this threshold for TICV and IV. Results. A total of 32 equally-spaced angle images (16 meridians) was required to estimate TICV750 and 16 angle images (8 meridians) to estimate IV. The corresponding MAPE values were 4.6% and 1.6%, respectively. Conclusions. It is possible to determine TICV and IV parameters reliably in narrow angles without evaluating all 128 meridians obtained with SSFD-ASOCT. PMID:25829412
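The MAPE criterion used to pick the optimal image count can be sketched with a generic definition, taking the full 128-meridian measurement as the reference value (an illustration, not the study's code):

```python
import numpy as np

def mape(estimates: np.ndarray, reference: float) -> float:
    """Mean absolute percent error of subsampled-volume estimates
    against the full 128-meridian reference value."""
    return float(np.mean(np.abs(estimates - reference) / reference) * 100)
```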

  3. Camera evidence: visibility analysis through a multicamera viewpoint

    NASA Astrophysics Data System (ADS)

    Bajuk, Mark

    1992-06-01

    A major criterion in the design of backhoes (and other heavy machinery) is the ability of the operator to see all critical portions of the vehicle and the surrounding environment. Computer graphics provides a method for analyzing this ability prior to the building of full-scale wooden models. By placing the computer graphic camera at the operator's eyepoint, designers can detect poor placement of supports, blind spots, etc. In this type of analysis, the camera becomes an active, yet somewhat imperfect, participant in our understanding of what an operator of the backhoe 'sees'. In order to simulate a backhoe operator's vision from within a cab, one needs to expand the angle of view of the camera to mimic unfocused, peripheral vision. A traditional wide-angle lens creates extreme distortions that are not present in 'natural' vision, and is therefore hardly an adequate representation. The solution we arrived at uses seven cameras fanned out horizontally in order to capture a relatively undistorted 155 degree angle of view. In addition, another camera displays and numerically analyzes the percentage of the loader bucket visible and blocked. These two views are presented simultaneously in order to address both the 'naturalistic' and quantitative needs of the designers, as well as to point to the incompleteness of any one representation of a scene. In the next phase of this project we will bring this type of analysis into a machine environment more conducive to interactivity: a backhoe simulator with levers to control the vehicle and bucket positions, viewed through a virtual reality environment.

  4. CCD Camera Observations

    NASA Astrophysics Data System (ADS)

    Buchheim, Bob; Argyle, R. W.

    One night late in 1918, astronomer William Milburn, observing the region of Cassiopeia from Reverend T.H.E.C. Espin's observatory in Tow Law (England), discovered a hitherto unrecorded double star (Wright 1993). He reported it to Rev. Espin, who measured the pair using his 24-in. reflector: the fainter star was 6.0 arcsec from the primary, at position angle 162.4° (i.e. the fainter star was south-by-southeast from the primary) (Espin 1919). Some time later, it was recognized that the astrograph of the Vatican Observatory had taken an image of the same star-field a dozen years earlier, in late 1906. At that earlier epoch, the fainter star had been separated from the brighter one by only 4.8 arcsec, at position angle 186.2° (i.e. almost due south). Were these stars a binary pair, or were they just two unrelated stars sailing past each other? Some additional measurements might have begun to answer this question. If the secondary star was following a curved path, that would be a clue of orbital motion; if it followed a straight-line path, that would be a clue that these are just two stars passing in the night. Unfortunately, nobody took the trouble to re-examine this pair for almost a century, until the 2MASS astrometric/photometric survey recorded it in late 1998. After almost another decade, this amateur astronomer took some CCD images of the field in 2007, and added another data point on the star's trajectory, as shown in Fig. 15.1.
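Given two (separation, position angle) measurements, the secondary's relative motion follows from converting each epoch to Cartesian offsets; a sketch using the two epochs quoted in the text:

```python
import math

def pa_sep_to_xy(sep_arcsec: float, pa_deg: float) -> tuple:
    """Convert (separation, position angle) to (east, north) offsets in
    arcsec. Position angle is measured from north through east."""
    pa = math.radians(pa_deg)
    return sep_arcsec * math.sin(pa), sep_arcsec * math.cos(pa)

# Epochs from the text: 1906 (4.8" at 186.2 deg) and 1918 (6.0" at 162.4 deg)
x1, y1 = pa_sep_to_xy(4.8, 186.2)
x2, y2 = pa_sep_to_xy(6.0, 162.4)
motion = math.hypot(x2 - x1, y2 - y1)   # arcsec moved between the epochs
```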

  5. Plume propagation direction determination with SO2 cameras

    NASA Astrophysics Data System (ADS)

    Klein, Angelika; Lübcke, Peter; Bobrowski, Nicole; Kuhn, Jonas; Platt, Ulrich

    2017-03-01

    SO2 cameras are becoming an established tool for measuring sulfur dioxide (SO2) fluxes in volcanic plumes with good precision and high temporal resolution. The primary results of SO2 camera measurements are time series of two-dimensional SO2 column density distributions (i.e. SO2 column density images). However, it is frequently overlooked that, in order to determine the correct SO2 fluxes, not only the SO2 column density, but also the distance between the camera and the volcanic plume, has to be precisely known. This is because cameras only measure angular extents of objects while flux measurements require knowledge of the spatial plume extent. The distance to the plume may vary within the image array (i.e. the field of view of the SO2 camera) since the plume propagation direction (i.e. the wind direction) might not be parallel to the image plane of the SO2 camera. If the wind direction and thus the camera-plume distance are not well known, this error propagates into the determined SO2 fluxes and can cause errors exceeding 50 %. This is a source of error which is independent of the frequently quoted (approximate) compensation of apparently higher SO2 column densities and apparently lower plume propagation velocities at non-perpendicular plume observation angles. Here, we propose a new method to estimate the propagation direction of the volcanic plume directly from SO2 camera image time series by analysing apparent flux gradients along the image plane. From the plume propagation direction and the known location of the SO2 source (i.e. volcanic vent) and camera position, the camera-plume distance can be determined. Besides being able to determine the plume propagation direction and thus the wind direction in the plume region directly from SO2 camera images, we additionally found that it is possible to detect changes of the propagation direction at a time resolution of the order of minutes. In addition to theoretical studies we applied our method to SO2 flux
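The distance dependence described above can be sketched in two lines: the spatial size of a pixel at the plume grows linearly with the camera-plume distance, so an error in the assumed distance propagates one-to-one into the flux (illustrative helper functions, not the authors' code):

```python
def pixel_size_m(distance_m: float, ifov_rad: float) -> float:
    """Spatial size of one pixel at the plume, from the camera-plume
    distance and the pixel's angular extent (small-angle approximation)."""
    return distance_m * ifov_rad

def flux_error_pct(assumed_distance_m: float, true_distance_m: float) -> float:
    """The SO2 flux scales with the spatial plume extent, i.e. linearly
    with distance, so a distance error maps directly into a flux error."""
    return abs(assumed_distance_m / true_distance_m - 1.0) * 100.0
```

For example, assuming a 9 km camera-plume distance when the true distance is 6 km already gives the >50 % flux error quoted in the abstract.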

  6. Influence analysis of the scroll on the image quality of the satellite camera

    NASA Astrophysics Data System (ADS)

    Fan, Chao; Yi, Hong-wei; Liang, Yi-tao

    2009-07-01

    The object distance of a high-resolution satellite camera changes when the camera performs scroll (roll) imaging, which causes not only an alteration of the image scale but also a variation of the velocity-height ratio (V/H) of the satellite. The change in V/H induces asynchronization between the image motion and the travel of the charge packet on the focal plane, which can seriously deteriorate the image quality of the camera. Thus, the variation of the relative velocity and the height during scroll imaging of the satellite was researched, and an expression for the V/H was deduced. On this basis, the influence of the V/H on the image quality was studied with respect to two variables: the latitude and the scroll angle. To quantify this effect, for a given circular polar orbit, the deterioration of the image quality caused by scroll imaging was calculated for different integration numbers of the camera, and the adjustment interval of the row integration time and the permitted range of the scroll angle were computed. The results showed that, when the integration number of the camera is equal to 32 and 64, the permitted scroll angles are 29.5° and 16°, respectively, for MTF due to image motion > 0.95, which provides a useful engineering reference for how the image quality changes during scroll imaging of the satellite camera.
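One common first-order way to model this effect (an illustration consistent with, but not taken from, the paper) treats the V/H mismatch as image smear accumulated over the TDI integration stages and evaluates the resulting sinc-shaped MTF at the Nyquist frequency:

```python
import math

def mtf_image_motion(n_stages: int, vh_rel_error: float) -> float:
    """First-order TDI smear model: a relative V/H mismatch of
    vh_rel_error smears the image by n_stages * vh_rel_error pixels,
    and the MTF at Nyquist (0.5 cycles/pixel) is sinc of half the smear.
    This is a textbook approximation, not the paper's exact formula."""
    smear_px = n_stages * vh_rel_error
    x = math.pi * smear_px / 2.0
    return abs(math.sin(x) / x) if x else 1.0
```

With 32 TDI stages, a 1 % V/H error still leaves the modeled MTF just above the 0.95 criterion cited in the abstract.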

  7. Dark energy survey and camera

    SciTech Connect

    William Wester

    2004-08-16

    The authors describe the Dark Energy Survey and Camera. The survey will image 5000 sq. deg. in the southern sky to collect 300 million galaxies, 30,000 galaxy clusters and 2000 Type Ia supernovae. They expect to derive a value for the dark energy equation of state parameters, w, to a precision of 5% by combining four distinct measurement techniques. They describe the mosaic camera that will consist of CCDs with enhanced sensitivity in the near infrared. The camera will be mounted at the prime focus of the 4m Blanco telescope.

  8. Vision Sensors and Cameras

    NASA Astrophysics Data System (ADS)

    Hoefflinger, Bernd

    Silicon charge-coupled-device (CCD) imagers have been and are a specialty market ruled by a few companies for decades. Based on CMOS technologies, active-pixel sensors (APS) began to appear in 1990 at the 1 μm technology node. These pixels allow random access, global shutters, and they are compatible with focal-plane imaging systems combining sensing and first-level image processing. The progress towards smaller features and towards ultra-low leakage currents has provided reduced dark currents and μm-size pixels. All chips offer Mega-pixel resolution, and many have very high sensitivities equivalent to ASA 12.800. As a result, HDTV video cameras will become a commodity. Because charge-integration sensors suffer from a limited dynamic range, significant processing effort is spent on multiple exposure and piece-wise analog-digital conversion to reach ranges >10,000:1. The fundamental alternative is log-converting pixels with an eye-like response. This offers a range of almost a million to 1, constant contrast sensitivity and constant colors, important features in professional, technical and medical applications. 3D retino-morphic stacking of sensing and processing on top of each other is being revisited with sub-100 nm CMOS circuits and with TSV technology. With sensor outputs directly on top of neurons, neural focal-plane processing will regain momentum, and new levels of intelligent vision will be achieved. The industry push towards thinned wafers and TSV enables backside-illuminated and other pixels with a 100% fill-factor. 3D vision, which relies on stereo or on time-of-flight, high-speed circuitry, will also benefit from scaled-down CMOS technologies both because of their size as well as their higher speed.

  9. Accuracy in fixing ship's positions by camera survey of bearings

    NASA Astrophysics Data System (ADS)

    Naus, Krzysztof; Wąż, Mariusz

    2011-01-01

    The paper presents the results of research on the possibility of fixing ship position coordinates based on bearings of navigational marks surveyed with a CCD camera. Accuracy of the determination of ship position coordinates, expressed in terms of the mean error, was assumed to be the basic criterion of this estimation. The first part of the paper describes the method for determining the resolution and the mean error of an angle measurement taken with a camera, and also the method for determining the mean error of position coordinates when two or more bearings are measured. Three software applications were developed for producing navigational sea charts with accuracy areas mapped onto them. The second part contains the results of studying the accuracy of fixing ship position coordinates, carried out in the Gulf of Gdansk, using bearings obtained with Rolleiflex and Sony cameras. The results are presented in the form of diagrams of the mean error of angle measurement and of navigational charts with accuracy fields mapped on them. In the final part, based on the results obtained, the applicability of CCD cameras to the automation of coastal navigation is discussed.
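A two-bearing fix reduces to intersecting two lines of position through marks with known coordinates; a minimal planar sketch (a true geodetic computation would work on the ellipsoid, and the paper's accuracy analysis is not reproduced here):

```python
import math

def fix_from_bearings(mark1, brg1_deg, mark2, brg2_deg):
    """Observer position from two bearings (degrees clockwise from north)
    taken on marks with known (east, north) coordinates. Each bearing
    defines a line of position through its mark; the fix is their
    intersection, solved here with Cramer's rule."""
    d1 = (math.sin(math.radians(brg1_deg)), math.cos(math.radians(brg1_deg)))
    d2 = (math.sin(math.radians(brg2_deg)), math.cos(math.radians(brg2_deg)))
    # mark_i = observer + t_i * d_i  =>  t1*d1 - t2*d2 = mark1 - mark2
    cx, cy = mark1[0] - mark2[0], mark1[1] - mark2[1]
    det = d1[0] * (-d2[1]) - d1[1] * (-d2[0])
    t1 = (cx * (-d2[1]) - cy * (-d2[0])) / det
    return (mark1[0] - t1 * d1[0], mark1[1] - t1 * d1[1])

# Mark due north at (0, 10) bears 000°; mark due east at (10, 0) bears 090°:
# the observer must be at the origin.
east, north = fix_from_bearings((0.0, 10.0), 0.0, (10.0, 0.0), 90.0)
```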

  10. Observation of Planetary Motion Using a Digital Camera

    ERIC Educational Resources Information Center

    Meyn, Jan-Peter

    2008-01-01

    A digital SLR camera with a standard lens (50 mm focal length, f/1.4) on a fixed tripod is used to obtain photographs of the sky which contain stars up to 8[superscript m] apparent magnitude. The angle of view is large enough to ensure visual identification of the photograph with a large sky region in a stellar map. The resolution is sufficient to…

  11. Focal Plane Metrology for the LSST Camera

    SciTech Connect

    Rasmussen, Andrew P.; Hale, Layton; Kim, Peter; Lee, Eric; Perl, Martin; Schindler, Rafe; Takacs, Peter; Thurston, Timothy

    2007-01-10

    Meeting the science goals for the Large Synoptic Survey Telescope (LSST) translates into a demanding set of imaging performance requirements for the optical system over a wide (3.5°) field of view. In turn, meeting those imaging requirements necessitates maintaining precise control of the focal plane surface (10 μm P-V) over the entire field of view (640 mm diameter) at the operating temperature (T ≈ -100 °C) and over the operational elevation angle range. We briefly describe the hierarchical design approach for the LSST Camera focal plane and the baseline design for assembling the flat focal plane at room temperature. Preliminary results of gravity load and thermal distortion calculations are provided, and early metrological verification of candidate materials under cold thermal conditions are presented. A detailed, generalized method for stitching together sparse metrology data originating from differential, non-contact metrological data acquisition spanning multiple (non-continuous) sensor surfaces making up the focal plane, is described and demonstrated. Finally, we describe some in situ alignment verification alternatives, some of which may be integrated into the camera's focal plane.

  12. X-ray Pinhole Camera Measurements

    SciTech Connect

    Nelson, D. S.; Berninger, M. J.; Flores, P. A.; Good, D. E.; Henderson, D. J.; Hogge, K. W.; Huber, S. R.; Lutz, S. S.; Mitchell, S. E.; Howe, R. A.; Mitton, C. V.; Molina, I.; Bozman, D. R.; Cordova, S. R.; Mitchell, D. R.; Oliver, B. V.; Ormond, E. C.

    2013-07-01

    The development of the rod pinch diode [1] has led to high-resolution radiography for dynamic events such as explosive tests. Rod pinch diodes use a small diameter anode rod, which extends through the aperture of a cathode plate. Electrons borne off the aperture surface can self-insulate and pinch onto the tip of the rod, creating an intense, small x-ray source (Primary Pinch). This source has been utilized as the main diagnostic on numerous experiments that include high-value, single-shot events. In such applications there is an emphasis on machine reliability, x-ray reproducibility, and x-ray quality [2]. In tests with the baseline rod pinch diode, we have observed that an additional pinch (Secondary Pinch) occurs at the interface near the anode rod and the rod holder. This suggests that stray electrons exist that are not associated with the Primary Pinch. In this paper we present measurements on both pinches using an x-ray pinhole camera. The camera is placed downstream of the Primary Pinch at an angle of 60° with respect to the diode centerline. This diagnostic will be employed to diagnose x-ray reproducibility and quality. In addition, we will investigate the performance of hybrid diodes relating to the formation of the Primary and Secondary Pinches.

  13. Mass movement slope streaks imaged by the Mars Orbiter Camera

    NASA Astrophysics Data System (ADS)

    Sullivan, Robert; Thomas, Peter; Veverka, Joseph; Malin, Michael; Edgett, Kenneth S.

    2001-10-01

    Narrow, fan-shaped dark streaks on steep Martian slopes were originally observed in Viking Orbiter images, but a definitive explanation was not possible because of resolution limitations. Pictures acquired by the Mars Orbiter Camera (MOC) aboard the Mars Global Surveyor (MGS) spacecraft show innumerable examples of dark slope streaks distributed widely, but not uniformly, across the brighter equatorial regions, as well as individual details of these features that were not visible in Viking Orbiter data. Dark slope streaks (as well as much rarer bright slope streaks) represent one of the most widespread and easily recognized styles of mass movement currently affecting the Martian surface. New dark streaks have formed since Viking and even during the MGS mission, confirming earlier suppositions that higher contrast dark streaks are younger, and fade (brighten) with time. The darkest slope streaks represent ~10% contrast with surrounding slope materials. No small outcrops supplying dark material (or bright material, for bright streaks) have been found at streak apexes. Digitate downslope ends indicate slope streak formation involves a ground-hugging flow subject to deflection by minor topographic obstacles. The model we favor explains most dark slope streaks as scars from dust avalanches following oversteepening of air fall deposits. This process is analogous to terrestrial avalanches of oversteepened dry, loose snow which produce shallow avalanche scars with similar morphologies. Low angles of internal friction typically 10-30° for terrestrial loess and clay materials suggest that mass movement of (low-cohesion) Martian dusty air fall is possible on a wide range of gradients. Martian gravity, presumed low density of the air fall deposits, and thin (unresolved by MOC) failed layer depths imply extremely low cohesive strength at time of failure, consistent with expectations for an air fall deposit of dust particles. As speed increases during a dust avalanche, a

  14. Auto-preview camera orientation for environment perception on a mobile robot

    NASA Astrophysics Data System (ADS)

    Radovnikovich, Micho; Vempaty, Pavan K.; Cheok, Ka C.

    2010-01-01

    Using wide-angle or omnidirectional camera lenses to increase a mobile robot's field of view introduces nonlinearity in the image due to the 'fish-eye' effect. This complicates distance perception and increases image processing overhead. Using multiple cameras avoids the fish-eye complications, but requires more electrical and processing power to interface them to a computer. By controlling the orientation of a single camera, both of these disadvantages are minimized while still allowing the robot to preview a wider area. In addition, controlling the orientation allows the robot to optimize its environment perception by looking only where the most useful information can be discovered. In this paper, a technique is presented that creates a two-dimensional map of objects of interest surrounding a mobile robot equipped with a panning camera on a telescoping shaft. Before attempting to negotiate a difficult path planning situation, the robot takes snapshots at different camera heights and pan angles and then produces a single map of the surrounding area. Distance perception is performed by making calibration measurements of the camera and applying coordinate transformations to project the camera's findings into the vehicle's coordinate frame. To test the system, obstacles and lines were placed to form a chicane. Several snapshots were taken with different camera orientations, and the information from each was stitched together to yield a very useful map of the surrounding area for the robot to use to plan a path through the chicane.
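    The projection of camera findings into the vehicle frame reduces to a rotation by the pan angle plus the camera mount offset. A minimal planar sketch (the function name and simplified 2-D geometry are illustrative assumptions, not the paper's implementation):

```python
import math

def camera_to_vehicle(point_cam, pan_rad, cam_height):
    """Rotate a camera-frame ground point (x forward, y left) by the pan
    angle and attach the camera mount height to get vehicle-frame
    coordinates. A minimal planar sketch; a real system would apply the
    full calibrated extrinsic matrix."""
    x, y = point_cam
    c, s = math.cos(pan_rad), math.sin(pan_rad)
    xv = c * x - s * y          # 2-D rotation into the vehicle frame
    yv = s * x + c * y
    return (xv, yv, -cam_height)  # ground lies cam_height below the camera

# An obstacle 3 m ahead of a camera panned 90 degrees left appears 3 m to
# the vehicle's left.
print(camera_to_vehicle((3.0, 0.0), math.pi / 2, 1.5))
```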

  15. SMART-1/AMIE Camera System

    NASA Astrophysics Data System (ADS)

    Josset, J.-L.; Beauvivre, S.; Cerroni, P.; de Sanctis, M. C.; Pinet, P.; Chevrel, S.; Langevin, Y.; Barucci, M. A.; Plancke, P.; Koschny, D.; Almeida, M.; Sodnik, Z.; Mancuso, S.; Hofmann, B. A.; Muinonen, K.; Shevchenko, V.; Shkuratov, Y.; Ehrenfreund, P.; Foing, B. H.

    2006-03-01

    The Advanced Moon micro-Imager Experiment (AMIE), on board ESA SMART-1, the first European mission to the Moon (launched on 27th September 2003), is a camera system with scientific, technical and public outreach oriented objectives.

  16. An Inexpensive Digital Infrared Camera

    ERIC Educational Resources Information Center

    Mills, Allan

    2012-01-01

    Details are given for the conversion of an inexpensive webcam to a camera specifically sensitive to the near infrared (700-1000 nm). Some experiments and practical applications are suggested and illustrated. (Contains 9 figures.)

  17. Multilayer dielectric narrow band mangin mirror

    NASA Astrophysics Data System (ADS)

    Ahmed, K.; Khan, A. N.; Rauf, A.; Gul, A.

    2014-06-01

    The design of a multilayer stack of dielectric films for a narrow band mirror is developed using thin film coating software. The proposed design is realized by employing the thin film coating (PVD) method, and reflectance in a narrow spectral band is achieved. The thicknesses of the high and low refractive index materials are controlled to nanometer precision. The curved coated substrate is cemented with another matching K9 substrate, forming a Mangin mirror for a wavelength of 650 nm. Narrow band mirrors with reflectivity of more than 90% have been produced by proper stacking of 21 layers, and the advantage of using this type of mirror as an interference filter is discussed.
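    The reflectance of such an alternating quarter-wave stack can be sketched with the standard characteristic (transfer) matrix method. The indices below (TiO2-like nH = 2.30, MgF2-like nL = 1.38 on n = 1.52 glass) are illustrative assumptions, not the paper's materials:

```python
import cmath, math

def stack_reflectance(wavelength, design_wl, n_h, n_l, n_sub, n_layers):
    """Normal-incidence reflectance of a quarter-wave H(LH)... stack via
    the characteristic matrix method (incident medium: air, n0 = 1)."""
    m = [[1, 0], [0, 1]]
    for k in range(n_layers):
        n = n_h if k % 2 == 0 else n_l        # alternate high/low index
        d = design_wl / (4 * n)               # quarter-wave thickness
        delta = 2 * math.pi * n * d / wavelength
        layer = [[cmath.cos(delta), 1j * cmath.sin(delta) / n],
                 [1j * n * cmath.sin(delta), cmath.cos(delta)]]
        m = [[m[0][0]*layer[0][0] + m[0][1]*layer[1][0],
              m[0][0]*layer[0][1] + m[0][1]*layer[1][1]],
             [m[1][0]*layer[0][0] + m[1][1]*layer[1][0],
              m[1][0]*layer[0][1] + m[1][1]*layer[1][1]]]
    b = m[0][0] + m[0][1] * n_sub
    c = m[1][0] + m[1][1] * n_sub
    r = (b - c) / (b + c)
    return abs(r) ** 2

# 21 alternating layers centred at 650 nm: reflectance well above 90%.
print(stack_reflectance(650e-9, 650e-9, 2.30, 1.38, 1.52, 21))
```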

  18. Solid State Television Camera (CID)

    NASA Technical Reports Server (NTRS)

    Steele, D. W.; Green, W. T.

    1976-01-01

    The design, development and test are described of a charge injection device (CID) camera using a 244x248 element array. A number of video signal processing functions are included which maximize the output video dynamic range while retaining the inherently good resolution response of the CID. Some of the unique features of the camera are: low light level performance, high S/N ratio, antiblooming, low geometric distortion, sequential scanning and AGC.

  19. The future of consumer cameras

    NASA Astrophysics Data System (ADS)

    Battiato, Sebastiano; Moltisanti, Marco

    2015-03-01

    In the last two decades multimedia, and in particular imaging devices (camcorders, tablets, mobile phones, etc.), have diffused dramatically. Moreover, the increase in their computational performance, combined with higher storage capability, allows them to process large amounts of data. In this paper an overview of the current trends of the consumer camera market and technology is given, providing also some details about the recent past (from the Digital Still Camera up to today) and forthcoming key issues.

  20. A Three-Line Stereo Camera Concept for Planetary Exploration

    NASA Technical Reports Server (NTRS)

    Sandau, Rainer; Hilbert, Stefan; Venus, Holger; Walter, Ingo; Fang, Wai-Chi; Alkalai, Leon

    1997-01-01

    This paper presents a low-weight stereo camera concept for planetary exploration. The camera uses three CCD lines within the image plane of one single objective. Some of the main features of the camera include: focal length-90 mm, FOV-18.5 deg, IFOV-78 (mu)rad, convergence angles-(+/-)10 deg, radiometric dynamics-14 bit, weight-2 kg, and power consumption-12.5 Watts. From an orbit altitude of 250 km the ground pixel size is 20m x 20m and the swath width is 82 km. The CCD line data is buffered in the camera internal mass memory of 1 Gbit. After performing radiometric correction and application-dependent preprocessing the data is compressed and ready for downlink. Due to the aggressive application of advanced technologies in the area of microelectronics and innovative optics, the low mass and power budgets of 2 kg and 12.5 Watts are achieved, while still maintaining high performance. The design of the proposed light-weight camera is also general purpose enough to be applicable to other planetary missions such as the exploration of Mars, Mercury, and the Moon. Moreover, it is an example of excellent international collaboration on advanced technology concepts developed at DLR, Germany, and NASA's Jet Propulsion Laboratory, USA.
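    The stated geometry is easy to verify: the ground sample distance is the IFOV times the orbit altitude, and the swath width follows from the full field of view. A quick check:

```python
import math

ifov = 78e-6          # instantaneous field of view, rad
altitude = 250e3      # orbit altitude, m
fov_deg = 18.5        # full field of view

gsd = ifov * altitude                                   # ground pixel size
swath = 2 * altitude * math.tan(math.radians(fov_deg / 2))
print(round(gsd, 1), round(swath / 1000, 1))  # 19.5 81.4
```

    This reproduces the quoted ~20 m pixel and ~82 km swath.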

  1. Bio-inspired fluidic lens surgical camera for MIS.

    PubMed

    Tsai, Frank S; Johnson, Daniel; Cho, Sung Hwan; Qiao, Wen; Arianpour, Ashkan; Lo, Yu-Hwa

    2009-01-01

    We report a new type of surgical camera that will greatly improve minimally invasive surgery (MIS). The key enabling technology for this camera is a unique type of lens: the bio-inspired fluidic lens, a biomimetic lens that can change its curvature, just as the human crystalline lens accommodates. Because of this curvature-changing capability, it is now possible to design a new regime of optical systems in which auto-focusing and optical zoom are performed without moving the lens positions, as is done in typical cameras. Hence, a miniaturized imaging system with high functionality can be achieved with this technology. MIS is a surgical technique in which small incisions are made in the abdominal wall, as opposed to a large cut in open surgery. This type of surgery ensures faster patient recovery. The key tool for MIS is its surgical camera, or laparoscope. The traditional laparoscope is long and rigid and limits the field of view. To further advance MIS technology, we utilized the bio-inspired fluidic lens to design a highly versatile imager that is small, can change its field of view or zoom optically, works in low-light conditions, and varies the viewing angle. The surgical camera prototype is small (total track < 17 mm), possesses 3X optical zoom, and operates with light emitting diode (LED) lighting, among many other unique features.

  2. Two-Step Camera Calibration Method Developed for Micro UAV'S

    NASA Astrophysics Data System (ADS)

    Gašparović, M.; Gajski, D.

    2016-06-01

    The development of unmanned aerial vehicles (UAVs) and the continuous price reduction of unmanned systems attracted us to this research. Professional measuring systems are dozens of times more expensive and often heavier than "amateur", non-metric UAVs. For this reason, we tested the DJI Phantom 2 Vision Plus UAV. The Phantom's smaller mass and velocity give it less kinetic energy than professional measurement platforms, which makes it potentially less dangerous for use in populated areas. In this research, we wanted to investigate the capability of such a non-metric UAV and find the procedures under which this kind of UAV may be used for photogrammetric survey. It is important to emphasize that the UAV is equipped with an ultra-wide-angle camera with a 14 MP sensor. Calibration of such cameras is a complex process. In this research, a new two-step process is presented and developed, and the results are compared with the standard one-step camera calibration procedure. The two-step process first removes distortion from all images, and then uses these images in phototriangulation with self-calibration. The paper presents statistical indicators which prove that the proposed two-step process is a better and more accurate procedure for calibrating these types of cameras than standard one-step calibration. We also suggest the two-step calibration process as the standard for ultra-wide-angle cameras on unmanned aircraft.
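    The first step, removing distortion before phototriangulation, amounts to inverting a radial distortion model for every image point. A minimal sketch, assuming a two-coefficient Brown model with illustrative coefficients (not the Phantom's calibration values):

```python
def undistort_point(xd, yd, k1, k2, iterations=20):
    """Invert x_d = x * (1 + k1*r^2 + k2*r^4) by fixed-point iteration,
    for normalized image coordinates; converges for mild distortion."""
    x, y = xd, yd
    for _ in range(iterations):
        r2 = x * x + y * y
        factor = 1 + k1 * r2 + k2 * r2 * r2
        x, y = xd / factor, yd / factor
    return x, y

# Distort a known point with illustrative coefficients, then recover it.
k1, k2 = -0.25, 0.08
x, y = 0.3, -0.2
r2 = x * x + y * y
f = 1 + k1 * r2 + k2 * r2 * r2
xd, yd = x * f, y * f
xu, yu = undistort_point(xd, yd, k1, k2)
print(round(xu, 6), round(yu, 6))  # 0.3 -0.2
```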

  3. Fast camera objective designs for spectrograph of Mont Megantique telescope

    NASA Astrophysics Data System (ADS)

    Thibault, Simon; Wang, Min

    2004-02-01

    All-reflective optics is conventionally required for extended spectral region observations in astronomical spectrographs, but the spatial resolution is usually not good enough when a large-size CCD is used with all-reflective optics. In this paper, all-refractive optics has been investigated to design a fast (F/1.55), wide-angle camera objective for large spectral coverage, from the UV through the VIS and up to the NIR, when a large-size CCD is used on the focal plane of the spectrograph of the Mont Megantique telescope. Achromatic and apochromatic conditions have been investigated for axial and lateral color control. The new proposed solutions have been optimized with two to three different glass combinations in order to obtain higher throughput over the large spectral coverage, especially in the UV region. The number of components has been minimized to reduce the inherent light loss. The monochromatic aberrations have been corrected and controlled by optimizing lens bendings and shapes so that the camera reaches the CCD pixel resolution. Ray tracing results demonstrated the good optical performance of the camera, covering the 350 nm to 1000 nm spectral region with high resolution. A broadband AR coating, enhanced in the UV region, will be used on each surface of the lenses in the camera. The final throughput of the designed camera has been estimated and is given in the paper.

  4. Design of motion compensation mechanism of satellite remote sensing camera

    NASA Astrophysics Data System (ADS)

    Gu, Song; Yan, Yong; Xu, Kai; Jin, Guang

    2011-08-01

    With the development of aerospace remote sensing technology, the ground resolution of remote sensing cameras is continuously improving. Since there is relative motion between the camera and the ground target when taking pictures, the target image recorded on the recording medium is smeared and blurred. In order to improve the imaging quality and resolution of the camera, the image motion must be compensated, so a compensation method for the image motion of a space camera is researched here. First, the cause of the drift angle and the adjustment principle are analyzed, and the composition and transmission principle of the image motion compensation mechanism are introduced. Second, the system adopts an 80C31 as the drift-angle controller, a stepping motor as the actuator, and an absolute photoelectric encoder as the drift-angle measuring element. The control mathematical model of the image motion compensation mechanism is then deduced, achieving closed-loop control of the drift-angle position. Finally, the transmission precision of the mechanism is analyzed. Through experiment, we measured the actual precision of the image motion compensation mechanism and compared it with the theoretical analysis. There are two major contributions in this paper. First, the traditional image motion compensation mechanism is bulky and heavy, which does not fit the trend toward miniaturized, lightweight space cameras; however, simply reducing the volume and mass of the mechanism would adversely affect its precision and stiffness. To address this problem, this paper designs an image motion compensation mechanism that combines small size and light weight with high precision and stiffness. This mechanism is applicable to small, high-resolution optical cameras. Second

  5. Science, conservation, and camera traps

    USGS Publications Warehouse

    Nichols, James D.; Karanth, K. Ullas; O'Connel, Allan F.; O'Connell, Allan F.; Nichols, James D.; Karanth, K. Ullas

    2011-01-01

    Biologists commonly perceive camera traps as a new tool that enables them to enter the hitherto secret world of wild animals. Camera traps are being used in a wide range of studies dealing with animal ecology, behavior, and conservation. Our intention in this volume is not to simply present the various uses of camera traps, but to focus on their use in the conduct of science and conservation. In this chapter, we provide an overview of these two broad classes of endeavor and sketch the manner in which camera traps are likely to be able to contribute to them. Our main point here is that neither photographs of individual animals, nor detection history data, nor parameter estimates generated from detection histories are the ultimate objective of a camera trap study directed at either science or management. Instead, the ultimate objectives are best viewed as either gaining an understanding of how ecological systems work (science) or trying to make wise decisions that move systems from less desirable to more desirable states (conservation, management). Therefore, we briefly describe here basic approaches to science and management, emphasizing the role of field data and associated analyses in these processes. We provide examples of ways in which camera trap data can inform science and management.

  6. The virtual gamma camera room.

    PubMed

    Penrose, J M; Trowbridge, E A; Tindale, W B

    1996-05-01

    The installation of a gamma camera is time-consuming and costly and, once installed, the camera position is unlikely to be altered during its working life. Poor choice of camera position therefore has long-term consequences. Additional equipment such as collimators and carts, the operator's workstation and wall-mounted display monitors must also be situated to maximize access and ease of use. The layout of a gamma camera room can be optimized prior to installation by creating a virtual environment. Super-Scape VRT software running on an upgraded 486 PC microprocessor was used to create a 'virtual camera room'. The simulation included an operator's viewpoint and a controlled tour of the room. Equipment could be repositioned as required, allowing potential problems to be identified at the design stage. Access for bed-ridden patients, operator ergonomics, operator and patient visibility were addressed. The display can also be used for patient education. Creation of a virtual environment is a valuable tool which allows different camera systems to be compared interactively in terms of dimensions, extent of movement and use of a defined space. Such a system also has applications in radiopharmacy design and simulation.

  7. Streak camera dynamic range optimization

    SciTech Connect

    Wiedwald, J.D.; Lerche, R.A.

    1987-09-01

    The LLNL optical streak camera is used by the Laser Fusion Program in a wide range of applications. Many of these applications require a large recorded dynamic range. Recent work has focused on maximizing the dynamic range of the streak camera recording system. For our streak cameras, image intensifier saturation limits the upper end of the dynamic range. We have developed procedures to set the image intensifier gain such that the system dynamic range is maximized. Specifically, the gain is set such that a single streak tube photoelectron is recorded with an exposure of about five times the recording system noise. This ensures detection of single photoelectrons, while not consuming intensifier or recording system dynamic range through excessive intensifier gain. The optimum intensifier gain has been determined for two types of film and for a lens-coupled CCD camera. We have determined that by recording the streak camera image with a CCD camera, the system is shot-noise limited up to the onset of image intensifier nonlinearity. When recording on film, the film determines the noise at high exposure levels. There is discussion of the effects of slit width and image intensifier saturation on dynamic range. 8 refs.

  8. The camera convergence problem revisited

    NASA Astrophysics Data System (ADS)

    Allison, Robert S.

    2004-05-01

    Convergence of the real or virtual stereoscopic cameras is an important operation in stereoscopic display systems. For example, convergence can shift the range of portrayed depth to improve visual comfort; can adjust the disparity of targets to bring them nearer to the screen and reduce accommodation-vergence conflict; or can bring objects of interest into the binocular field-of-view. Although camera convergence is acknowledged as a useful function, there has been considerable debate over the transformation required. It is well known that rotational camera convergence or 'toe-in' distorts the images in the two cameras producing patterns of horizontal and vertical disparities that can cause problems with fusion of the stereoscopic imagery. Behaviorally, similar retinal vertical disparity patterns are known to correlate with viewing distance and strongly affect perception of stereoscopic shape and depth. There has been little analysis of the implications of recent findings on vertical disparity processing for the design of stereoscopic camera and display systems. We ask how such distortions caused by camera convergence affect the ability to fuse and perceive stereoscopic images.
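    The toe-in distortion can be demonstrated numerically: projecting the same 3-D point through two rotationally converged pinhole cameras yields unequal image y-coordinates, i.e. a vertical disparity. A minimal sketch with assumed geometry (unit focal length, ±0.3 camera baseline, 5° toe-in):

```python
import math

def project(point, cam_x, toe_in_rad, focal=1.0):
    """Pinhole projection into a camera at (cam_x, 0, 0) rotated about the
    vertical axis by the toe-in angle. Toy geometry for illustration."""
    x, y, z = point[0] - cam_x, point[1], point[2]
    c, s = math.cos(toe_in_rad), math.sin(toe_in_rad)
    xc = c * x - s * z      # rotate into the toed-in camera frame
    zc = s * x + c * z
    return (focal * xc / zc, focal * y / zc)

# An off-axis point seen by converged cameras: the two image y-coordinates
# differ, i.e. rotational convergence induces a vertical disparity.
p = (0.5, 0.4, 2.0)
toe = math.radians(5)
left = project(p, -0.3, -toe)
right = project(p, +0.3, +toe)
print(left[1] - right[1])   # nonzero for off-axis points
```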

  9. Mars Orbiter Camera Views the 'Face on Mars' - Best View from Viking

    NASA Technical Reports Server (NTRS)

    1998-01-01

    Shortly after midnight Sunday morning (5 April 1998 12:39 AM PST), the Mars Orbiter Camera (MOC) on the Mars Global Surveyor (MGS) spacecraft successfully acquired a high resolution image of the 'Face on Mars' feature in the Cydonia region. The image was transmitted to Earth on Sunday, and retrieved from the mission computer data base Monday morning (6 April 1998). The image was processed at the Malin Space Science Systems (MSSS) facility 9:15 AM and the raw image immediately transferred to the Jet Propulsion Laboratory (JPL) for release to the Internet. The images shown here were subsequently processed at MSSS.

    The picture was acquired 375 seconds after the spacecraft's 220th close approach to Mars. At that time, the 'Face', located at approximately 40.8° N, 9.6° W, was 275 miles (444 km) from the spacecraft. The 'morning' sun was 25° above the horizon. The picture has a resolution of 14.1 feet (4.3 meters) per pixel, making it ten times higher resolution than the best previous image of the feature, which was taken by the Viking Mission in the mid-1970's. The full image covers an area 2.7 miles (4.4 km) wide and 25.7 miles (41.5 km) long.

    This Viking Orbiter image is one of the best Viking pictures of the area Cydonia where the 'Face' is located. Marked on the image are the 'footprint' of the high resolution (narrow angle) Mars Orbiter Camera image and the area seen in enlarged views (dashed box). See PIA01440-1442 for these images in raw and processed form.

    Malin Space Science Systems and the California Institute of Technology built the MOC using spare hardware from the Mars Observer mission. MSSS operates the camera from its facilities in San Diego, CA. The Jet Propulsion Laboratory's Mars Surveyor Operations Project operates the Mars Global Surveyor spacecraft with its industrial partner, Lockheed Martin Astronautics, from facilities in Pasadena, CA and Denver, CO.

  10. Cylindrical holographic radar camera

    NASA Astrophysics Data System (ADS)

    McMakin, Douglas L.; Sheen, David M.; Hall, Thomas E.; Severtsen, Ronald H.

    1998-12-01

    A novel personnel surveillance system has been developed to rapidly obtain 360 degree, full-body images of humans for the detection and identification of concealed threats. Detectable threats include weapons fabricated with metal, plastic, and ceramic, as well as explosive solids and liquids. This new system uses a cylindrical mechanical scanner to move a seven-foot, 384 element, Ka band (26 - 30 GHz) array circumferentially around a person in four to seven seconds. Low power millimeter-waves, which are nonionizing and not harmful to humans, are employed because they readily penetrate clothing barriers and reflect from concealed threats. The reflected waves provide information that is reconstructed into 3-D cylindrical holographic images with high-speed, digital signal processing (DSP) boards. This system is capable of displaying in an animation format eight, sixteen, thirty-two or sixty-four image frames at various aspect angles around the person under surveillance. This new prototype surveillance system is operational and is presently under laboratory testing and evaluation.

  11. Narrow deeply bound K- atomic states

    NASA Astrophysics Data System (ADS)

    Friedman, E.; Gal, A.

    1999-07-01

    Using optical potentials fitted to a comprehensive set of strong interaction level shifts and widths in K- atoms, we predict that the K- atomic levels which are inaccessible in the atomic cascade process are generally narrow, spanning a range of widths about 50-1500 keV over the entire periodic table. The mechanism for this narrowing is different from the mechanism for narrowing of pionic atom levels. Examples of such `deeply bound' K- atomic states are given, showing that in many cases these states should be reasonably well resolved. Several reactions which could be used to form these `deeply bound' states are mentioned. Narrow deeply bound states are expected also in p̄ (antiproton) atoms.

  12. Person re-identification over camera networks using multi-task distance metric learning.

    PubMed

    Ma, Lianyang; Yang, Xiaokang; Tao, Dacheng

    2014-08-01

    Person reidentification in a camera network is a valuable yet challenging problem to solve. Existing methods learn a common Mahalanobis distance metric by using the data collected from different cameras and then exploit the learned metric for identifying people in the images. However, the cameras in a camera network have different settings and the recorded images are seriously affected by variability in illumination conditions, camera viewing angles, and background clutter. Using a common metric to conduct person reidentification tasks on different camera pairs overlooks the differences in camera settings; however, it is very time-consuming to label people manually in images from surveillance videos. For example, in most existing person reidentification data sets, only one image of a person is collected from each of only two cameras; therefore, directly learning a unique Mahalanobis distance metric for each camera pair is susceptible to over-fitting by using insufficiently labeled data. In this paper, we reformulate person reidentification in a camera network as a multitask distance metric learning problem. The proposed method designs multiple Mahalanobis distance metrics to cope with the complicated conditions that exist in typical camera networks. We address the fact that these Mahalanobis distance metrics are different but related, and learn them by adding joint regularization to alleviate over-fitting. Furthermore, by extending this multitask formulation, we present a novel multitask maximally collapsing metric learning (MtMCML) model for person reidentification in a camera network. Experimental results demonstrate that formulating person reidentification over camera networks as a multitask distance metric learning problem can improve performance, and our proposed MtMCML works substantially better than other current state-of-the-art person reidentification methods.
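    The building block of all these methods is the Mahalanobis distance under a learned metric M. A minimal sketch (toy 2-D features and hand-picked metrics, purely illustrative) of how different per-camera-pair metrics score the same feature pair differently:

```python
def mahalanobis_sq(x, y, m):
    """Squared Mahalanobis distance (x-y)^T M (x-y) for a small metric
    matrix M, written without numpy to stay self-contained. In a multitask
    setting, each camera pair gets its own (related) M."""
    d = [a - b for a, b in zip(x, y)]
    md = [sum(m[i][j] * d[j] for j in range(len(d))) for i in range(len(d))]
    return sum(di * mdi for di, mdi in zip(d, md))

# The same feature pair scored under two per-camera-pair metrics: the
# metric for pair A weights the first feature more heavily than pair B's.
x, y = [1.0, 2.0], [0.0, 0.0]
metric_a = [[4.0, 0.0], [0.0, 1.0]]
metric_b = [[1.0, 0.0], [0.0, 1.0]]
print(mahalanobis_sq(x, y, metric_a), mahalanobis_sq(x, y, metric_b))  # 8.0 5.0
```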

  13. Frequency-narrowed diode array bar.

    PubMed

    Babcock, Earl; Chann, Bien; Nelson, Ian A; Walker, Thad G

    2005-05-20

    We describe a method to frequency narrow multielement high-power diode bars. Using a commercial 60-W, 49-element, 1-cm-long diode array bar at 795 nm running at 45 W, we narrow the linewidth from 1000 to 64 GHz with only a loss of 33% in output power. The resulting laser light is well suited for spin-exchange optical pumping of noble gas nuclei.

  14. Hot Wax Sweeps Debris From Narrow Passages

    NASA Technical Reports Server (NTRS)

    Ricklefs, Steven K.

    1990-01-01

    Safe and effective technique for removal of debris and contaminants from narrow passages involves entrainment of undesired material in thermoplastic casting material. Semisolid wax slightly below melting temperature pushed along passage by pressurized nitrogen to remove debris. Devised to clean out fuel passages in main combustion chamber of Space Shuttle main engine. Also applied to narrow, intricate passages in internal-combustion-engine blocks, carburetors, injection molds, and other complicated parts.

  15. Design, modeling and performance of a hybrid portable gamma camera

    NASA Astrophysics Data System (ADS)

    Smith, Leon Eric

    The combination of a mechanically-collimated gamma-ray camera with an electronically-collimated gamma camera offers both the high efficiency and good angular resolution typical in a mechanically-collimated camera for lower photon energies and the uncoupling of spatial resolution and efficiency provided by an electronically-collimated camera at higher energies. The design, construction, performance modeling and measured performance of the Hybrid Portable Gamma Camera (HPGC) are presented here. Intended for industrial use, the HPGC offers good angular resolution and efficiency over a broad energy range (50 keV to 2 MeV) by combining a MURA coded aperture camera with a Compton scatter camera in a single system. The HPGC consists of two detector modules: (1) a NaI(Tl) scintillator with Anger logic readout and (2) a CsI(Na) pixellated crystal viewed by a position-sensitive photomultiplier tube. Analytical calculations of angular resolution components and efficiency for the HPGC were compared to Monte Carlo calculations of the same quantities. The predicted angular resolution performance for on-axis point sources, a central scattering angle of 45° and a detector separation distance of 35 cm ranges from 3.5-6° FWHM over the sensitive energy range. The mechanical collimation intrinsic efficiency for energies up to 800 keV varies from 0.50 to 0.05 while the electronic collimation intrinsic efficiency for energies above 400 keV is 7.0×10⁻⁴ to 5×10⁻⁵. The experimentally measured angular resolution and efficiency values show good agreement with the modeling predictions for incident energies of 412 keV and 662 keV. Although work has been done on mechanical collimation cameras and electronic collimation cameras operating independently, no truly hybrid imaging system has been constructed that uses the same gamma ray for both mechanical collimation and electronic collimation information. This dissertation compares the relative information per photon for three
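    Electronic collimation relies on Compton kinematics: from the incident and scattered photon energies, the scatter angle follows from cos θ = 1 − m_e c² (1/E′ − 1/E), placing each event on a cone of possible source directions. A minimal sketch of that relation (the energy values below are illustrative):

```python
import math

def compton_scatter_angle(e_in_kev, e_out_kev):
    """Scatter angle (degrees) from the Compton relation
    cos(theta) = 1 - m_e*c^2 * (1/E' - 1/E), which a Compton camera
    inverts to constrain the source direction of each event."""
    me_c2 = 511.0  # electron rest energy, keV
    cos_t = 1.0 - me_c2 * (1.0 / e_out_kev - 1.0 / e_in_kev)
    return math.degrees(math.acos(cos_t))

# A 662 keV photon (Cs-137) leaving the first detector with ~402 keV has
# scattered through roughly 60 degrees.
print(compton_scatter_angle(662.0, 402.0))
```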

  16. The MC and LFC cameras. [metric camera (MC); large format camera (LFC)

    NASA Technical Reports Server (NTRS)

    Norton, Clarice L.; Schroeder, Manfried; Mollberg, Bernard

    1986-01-01

    The characteristics of the shuttle-borne Large Format Camera are listed. The LFC focal plane format was 23 by 46 cm, double the usual size, thereby acquiring approximately double the ground area. Forward motion compensation was employed. With the stable platform (shuttle) it was possible to use the slow exposure, high resolution, Kodak aerial films; 3414 and 3412 black and white, SO-242 color, and SO-131 aerochrome infrared. The camera was designed to maintain stability during varying temperature extremes of space.

  17. Sub-Camera Calibration of a Penta-Camera

    NASA Astrophysics Data System (ADS)

    Jacobsen, K.; Gerke, M.

    2016-03-01

    Penta cameras consisting of a nadir and four inclined cameras are becoming more and more popular, having the advantage of imaging also facades in built up areas from four directions. Such system cameras require a boresight calibration of the geometric relation of the cameras to each other, but also a calibration of the sub-cameras. Based on data sets of the ISPRS/EuroSDR benchmark for multi platform photogrammetry, the inner orientation of the used IGI Penta DigiCAM has been analyzed. The required image coordinates of the blocks Dortmund and Zeche Zollern have been determined by Pix4Dmapper and have been independently adjusted and analyzed by the program system BLUH. With 4.1 million image points in 314 images and 3.9 million image points in 248 images, respectively, dense matching was provided by Pix4Dmapper. With up to 19 and 29 images per object point, respectively, the images are well connected; nevertheless, the high numbers of images per object point are concentrated in the block centres, while the inclined images outside the block centre are satisfactorily but not very strongly connected. This leads to very high values for the Student test (T-test) of the finally used additional parameters, or in other words, the additional parameters are highly significant. The estimated radial symmetric distortion of the nadir sub-camera corresponds to the laboratory calibration of IGI, but there are still radial symmetric distortions also for the inclined cameras with a size exceeding 5 μm, even if mentioned as negligible based on the laboratory calibration. Radial and tangential effects at the image corners are limited but still present. Remarkable angular affine systematic image errors can be seen, especially in the block Zeche Zollern. Such deformations are unusual for digital matrix cameras, but can be caused by the correlation between inner and exterior orientation if only parallel flight lines are used. With exception of the angular affinity the systematic image errors for corresponding

  18. Cross-ratio-based line scan camera calibration using a planar pattern

    NASA Astrophysics Data System (ADS)

    Li, Dongdong; Wen, Gongjian; Qiu, Shaohua

    2016-01-01

    A flexible new technique is proposed to calibrate the geometric model of line scan cameras. In this technique, the line scan camera is rigidly coupled to a calibrated frame camera to establish a pair of stereo cameras. The linear displacements and rotation angles between the two cameras are fixed but unknown. This technique only requires the pair of stereo cameras to observe a specially designed planar pattern shown at a few (at least two) different orientations. At each orientation, a stereo pair is obtained including a linear array image and a frame image. Radial distortion of the line scan camera is modeled. The calibration scheme includes two stages. First, point correspondences are established from the pattern geometry and the projective invariance of cross-ratio. Second, with a two-step calibration procedure, the intrinsic parameters of the line scan camera are recovered from several stereo pairs together with the rigid transform parameters between the pair of stereo cameras. Both computer simulation and real data experiments are conducted to test the precision and robustness of the calibration algorithm, and very good calibration results have been obtained. Compared with classical techniques which use three-dimensional calibration objects or controllable moving platforms, our technique is affordable and flexible in close-range photogrammetric applications.
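The projective invariance of the cross-ratio, which the point-correspondence stage relies on, can be illustrated numerically. This is a generic sketch (not the authors' code) using an arbitrary 1-D homography:

```python
import numpy as np

def cross_ratio(a, b, c, d):
    # cross-ratio of four collinear points given by scalar coordinates
    return ((a - c) * (b - d)) / ((a - d) * (b - c))

def homography_1d(x, h):
    # 1-D projective map x -> (h00*x + h01) / (h10*x + h11)
    return (h[0, 0] * x + h[0, 1]) / (h[1, 0] * x + h[1, 1])

pts = np.array([0.0, 1.0, 2.5, 4.0])        # four collinear points
H = np.array([[2.0, 1.0], [0.5, 3.0]])      # arbitrary nondegenerate map
mapped = homography_1d(pts, H)

# the cross-ratio survives the projective transform
cr_before = cross_ratio(*pts)
cr_after = cross_ratio(*mapped)
```

Because the camera projection is itself a projective map, a cross-ratio measured on the planar pattern equals the cross-ratio measured in the image, which is what lets correspondences be identified without knowing the pose.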

  19. Method for pan-tilt camera calibration using single control point.

    PubMed

    Li, Yunting; Zhang, Jun; Hu, Wenwen; Tian, Jinwen

    2015-01-01

    The pan-tilt (PT) camera is widely used in video surveillance systems due to its rotatable property and low cost. The rough output of a PT camera may not satisfy the demands of practical applications; hence an accurate calibration method for a PT camera is desired. However, high-precision camera calibration methods usually require sufficient control points, which are not guaranteed in some practical cases of a PT camera. In this paper, we present a novel method to calibrate the rotation angles of a PT camera online using only one control point, assuming that the intrinsic parameters and position of the camera are known in advance. More specifically, we first build a nonlinear PT camera model with respect to the two parameters pan and tilt. We then convert the nonlinear model into a linear one in the sine and cosine of tilt, where each element of the augmented coefficient matrix is a function of the single variable pan. A closed-form solution for pan and tilt can then be derived by solving a quadratic equation in the tangent of pan. Our method is noniterative and does not require feature matching, so it is time efficient. We evaluate our calibration method on various synthetic and real data. The quantitative results demonstrate that the proposed method outperforms other state-of-the-art methods when the intrinsic parameters and position of the camera are known in advance.
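The flavor of such a closed-form recovery can be sketched with a toy model (our own simplified version, not the authors' algorithm): assume intrinsics are already removed, so one control point gives a world direction d and a camera-frame direction c related by c = Rx(tilt) Ry(pan) d. Because the tilt rotation about X leaves the x-component untouched, pan satisfies a single trigonometric equation and tilt then follows directly:

```python
import numpy as np

def Rx(t):  # tilt: rotation about the x-axis
    c, s = np.cos(t), np.sin(t)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def Ry(p):  # pan: rotation about the y-axis
    c, s = np.cos(p), np.sin(p)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def recover_pan_tilt(d, c):
    # pan satisfies d_x*cos(p) + d_z*sin(p) = c_x, i.e. r*cos(p - phi) = c_x
    r, phi = np.hypot(d[0], d[2]), np.arctan2(d[2], d[0])
    delta = np.arccos(np.clip(c[0] / r, -1.0, 1.0))
    # two trigonometric candidates; both may reproduce c (a genuine two-fold
    # ambiguity that a real system resolves, e.g. by mechanical limits)
    for p in (phi + delta, phi - delta):
        m = Ry(p) @ d
        # tilt rotates (m_y, m_z) onto (c_y, c_z) in the y-z plane
        t = np.arctan2(c[2], c[1]) - np.arctan2(m[2], m[1])
        if np.allclose(Rx(t) @ m, c, atol=1e-9):
            return p, t
    return None

d = np.array([0.3, 0.2, 0.9]); d /= np.linalg.norm(d)
pan_true, tilt_true = 0.4, -0.25
c = Rx(tilt_true) @ Ry(pan_true) @ d
pan, tilt = recover_pan_tilt(d, c)
```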

  20. Photometric Calibration of Consumer Video Cameras

    NASA Technical Reports Server (NTRS)

    Suggs, Robert; Swift, Wesley, Jr.

    2007-01-01

    Equipment and techniques have been developed to implement a method of photometric calibration of consumer video cameras for imaging of objects that are sufficiently narrow or sufficiently distant to be optically equivalent to point or line sources. Heretofore, it has been difficult to calibrate consumer video cameras, especially in cases of image saturation, because they exhibit nonlinear responses with dynamic ranges much smaller than those of scientific-grade video cameras. The present method not only takes this difficulty in stride but also makes it possible to extend effective dynamic ranges to several powers of ten beyond saturation levels. The method will likely be primarily useful in astronomical photometry. There are also potential commercial applications in medical and industrial imaging of point or line sources in the presence of saturation. This development was prompted by the need to measure brightnesses of debris in amateur video images of the breakup of the Space Shuttle Columbia. The purpose of these measurements is to use the brightness values to estimate relative masses of debris objects. In most of the images, the brightness of the main body of Columbia was found to exceed the dynamic ranges of the cameras. A similar problem arose a few years ago in the analysis of video images of Leonid meteors. The present method is a refined version of the calibration method developed to solve the Leonid calibration problem. In this method, one performs an end-to-end calibration of the entire imaging system, including not only the imaging optics and imaging photodetector array but also analog tape recording and playback equipment (if used) and any frame grabber or other analog-to-digital converter (if used). 
To automatically incorporate the effects of nonlinearity and any other distortions into the calibration, the calibration images are processed in precisely the same manner as are the images of meteors, space-shuttle debris, or other objects that one seeks to

  1. The Clementine longwave infrared camera

    SciTech Connect

    Priest, R.E.; Lewis, I.T.; Sewall, N.R.; Park, H.S.; Shannon, M.J.; Ledebuhr, A.G.; Pleasance, L.D.; Massie, M.A.; Metschuleit, K.

    1995-04-01

    The Clementine mission provided the first ever complete, systematic surface mapping of the moon from the ultraviolet to the near-infrared regions. More than 1.7 million images of the moon, earth and space were returned from this mission. The longwave-infrared (LWIR) camera supplemented the UV/Visible and near-infrared mapping cameras, providing limited strip coverage of the moon and giving insight into the thermal properties of the soils. This camera provided ~100 m spatial resolution at 400 km periselene, and a 7 km across-track swath. This 2.1 kg camera, using a 128 x 128 Mercury-Cadmium-Telluride (MCT) FPA, viewed thermal emission of the lunar surface and lunar horizon in the 8.0 to 9.5 μm wavelength region. A description of this light-weight, low-power LWIR camera along with a summary of lessons learned is presented. Design goals and preliminary on-orbit performance estimates are addressed in terms of meeting the mission's primary objective of flight qualifying the sensors for future Department of Defense flights.

  2. Traditional gamma cameras are preferred.

    PubMed

    DePuey, E Gordon

    2016-08-01

    Although the new solid-state dedicated cardiac cameras provide excellent spatial and energy resolution and allow for markedly reduced SPECT acquisition times and/or injected radiopharmaceutical activity, they have some distinct disadvantages compared to traditional sodium iodide SPECT cameras. They are expensive. Attenuation correction is not available. Cardio-focused collimation, advantageous to increase depth-dependent resolution and myocardial count density, accentuates diaphragmatic attenuation and scatter from subdiaphragmatic structures. Although supplemental prone imaging is therefore routinely advised, many patients cannot tolerate it. Moreover, very large patients cannot be accommodated in the solid-state camera gantries. Since data are acquired simultaneously with an arc of solid-state detectors around the chest, no temporally dependent "rotating" projection images are obtained. Therefore, patient motion can be neither detected nor corrected. In contrast, traditional sodium iodide SPECT cameras provide rotating projection images to allow technologists and physicians to detect and correct patient motion and to accurately detect the position of soft tissue attenuators and to anticipate associated artifacts. Very large patients are easily accommodated. Low-dose x-ray attenuation correction is widely available. Also, relatively inexpensive low-count density software is provided by many vendors, allowing shorter SPECT acquisition times and reduced injected activity approaching that achievable with solid-state cameras.

  3. Overview of Neutrino Mixing Models and Their Mixing Angle Predictions

    SciTech Connect

    Albright, Carl H.

    2009-11-01

    An overview of neutrino-mixing models is presented with emphasis on the types of horizontal flavor and vertical family symmetries that have been invoked. Distributions for the mixing angles of many models are displayed. Ways to differentiate among the models and to narrow the list of viable models are discussed.

  4. Cameras for semiconductor process control

    NASA Technical Reports Server (NTRS)

    Porter, W. A.; Parker, D. L.

    1977-01-01

    The application of X-ray topography to semiconductor process control is described, considering the novel features of the high-speed camera and the difficulties associated with this technique. The most significant results on the effects of material defects on device performance are presented, including results obtained using wafers processed entirely within this institute. Defects were identified using the X-ray camera and correlations made with probe data. Also included are temperature-dependent effects of material defects. Recent applications and improvements of X-ray topographs of silicon-on-sapphire and gallium arsenide are presented, with a description of a real-time TV system prototype and of the most recent vacuum chuck design. The promotion of the camera's use by various semiconductor manufacturers is also discussed.

  5. Camera-on-a-Chip

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Jet Propulsion Laboratory's research on a second-generation, solid-state image sensor technology has resulted in the Complementary Metal-Oxide Semiconductor (CMOS) Active Pixel Sensor, establishing an alternative to the Charge-Coupled Device (CCD). Photobit Corporation, the leading supplier of CMOS image sensors, has commercialized two products of its own based on this technology: the PB-100 and PB-300. These devices are cameras on a chip, combining all camera functions. CMOS "active-pixel" digital image sensors offer several advantages over CCDs, a technology used in video and still-camera applications for 30 years. The CMOS sensors draw less energy, they use the same manufacturing platform as most microprocessors and memory chips, and they allow on-chip programming of frame size, exposure, and other parameters.

  6. Dark Energy Camera for Blanco

    SciTech Connect

    Binder, Gary A.; /Caltech /SLAC

    2010-08-25

    In order to make accurate measurements of dark energy, a system is needed to monitor the focus and alignment of the Dark Energy Camera (DECam) to be located on the Blanco 4m Telescope for the upcoming Dark Energy Survey. One new approach under development is to fit out-of-focus star images to a point spread function from which information about the focus and tilt of the camera can be obtained. As a first test of a new algorithm using this idea, simulated star images produced from a model of DECam in the optics software Zemax were fitted. Then, real images from the Mosaic II imager currently installed on the Blanco telescope were used to investigate the algorithm's capabilities. A number of problems with the algorithm were found, and more work is needed to understand its limitations and improve its capabilities so it can reliably predict camera alignment and focus.
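The idea of inferring focus from out-of-focus star images can be sketched with a toy model (an assumption-laden stand-in for the Zemax-based PSF fitting described above): a defocused star looks roughly like an annulus whose outer radius grows with defocus, so fitting that radius recovers a focus proxy:

```python
import numpy as np

def donut(shape, cx, cy, r_out, r_in, amp):
    # idealized defocused-star image: a uniform annulus
    yy, xx = np.mgrid[0:shape[0], 0:shape[1]]
    rr = np.hypot(xx - cx, yy - cy)
    return amp * ((rr <= r_out) & (rr >= r_in)).astype(float)

# synthetic out-of-focus star: outer radius tracks the (unknown) defocus
truth = donut((64, 64), 31.5, 31.5, r_out=12.0, r_in=5.0, amp=100.0)
obs = truth + np.random.default_rng(0).normal(0.0, 1.0, truth.shape)

# brute-force least-squares over candidate outer radii (defocus proxy)
radii = np.arange(8.0, 16.0, 0.25)
resid = [np.sum((obs - donut(obs.shape, 31.5, 31.5, r, 5.0, 100.0))**2)
         for r in radii]
best_r = radii[int(np.argmin(resid))]
```

A real fit would also solve for centroid, tilt, and amplitude; the point here is only that the residual is sharply minimized at the true annulus radius even in the presence of noise.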

  7. The GISMO-2 Bolometer Camera

    NASA Technical Reports Server (NTRS)

    Staguhn, Johannes G.; Benford, Dominic J.; Fixsen, Dale J.; Hilton, Gene; Irwin, Kent D.; Jhabvala, Christine A.; Kovacs, Attila; Leclercq, Samuel; Maher, Stephen F.; Miller, Timothy M.; Moseley, Samuel H.; Sharp, Elemer H.; Wollack, Edward J.

    2012-01-01

    We present the concept for the GISMO-2 bolometer camera) which we build for background-limited operation at the IRAM 30 m telescope on Pico Veleta, Spain. GISM0-2 will operate Simultaneously in the 1 mm and 2 mm atmospherical windows. The 1 mm channel uses a 32 x 40 TES-based Backshort Under Grid (BUG) bolometer array, the 2 mm channel operates with a 16 x 16 BUG array. The camera utilizes almost the entire full field of view provided by the telescope. The optical design of GISM0-2 was strongly influenced by our experience with the GISMO 2 mm bolometer camera which is successfully operating at the 30m telescope. GISMO is accessible to the astronomical community through the regular IRAM call for proposals.

  8. Reading Angles in Maps

    ERIC Educational Resources Information Center

    Izard, Véronique; O'Donnell, Evan; Spelke, Elizabeth S.

    2014-01-01

    Preschool children can navigate by simple geometric maps of the environment, but the nature of the geometric relations they use in map reading remains unclear. Here, children were tested specifically on their sensitivity to angle. Forty-eight children (age 47:15-53:30 months) were presented with fragments of geometric maps, in which angle sections…

  9. WIDE-FIELD ASTRONOMICAL MULTISCALE CAMERAS

    SciTech Connect

    Marks, Daniel L.; Brady, David J.

    2013-05-15

    In order to produce sufficiently low aberrations with a large aperture, telescopes have a limited field of view. Because of this narrow field, large areas of the sky at a given time are unobserved. We propose several telescopes based on monocentric reflective, catadioptric, and refractive objectives that may be scaled to wide fields of view and achieve 1.1 arcsecond resolution, which in most locations is the practical seeing limit of the atmosphere. The reflective and Schmidt catadioptric objectives have relatively simple configurations and enable large fields to be captured at the expense of the obscuration of the mirror by secondary optics, a defect that may be managed by image plane design. The refractive telescope design does not have an obscuration, but the objective has substantial bulk. The refractive design is a 38 gigapixel camera which consists of a single monocentric objective and 4272 microcameras. Monocentric multiscale telescopes, with their wide fields of view, may observe phenomena that might otherwise go unnoticed, such as supernovae, glint from orbital space debris, and near-earth objects.
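As a rough consistency check on the 38-gigapixel figure (assuming a square field and Nyquist sampling at half the quoted 1.1 arcsec resolution, both assumptions ours):

```python
import math

pixels = 38e9                             # quoted sensor count
plate_scale = 0.55                        # arcsec/pixel, Nyquist at 1.1" (assumed)
side_pix = math.sqrt(pixels)              # pixels per side of a square field
side_deg = side_pix * plate_scale / 3600  # field side length in degrees
```

This gives a field roughly 30 degrees on a side, consistent with a very-wide-field survey instrument rather than a conventional telescope.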

  10. A testbed for wide-field, high-resolution, gigapixel-class cameras

    NASA Astrophysics Data System (ADS)

    Kittle, David S.; Marks, Daniel L.; Son, Hui S.; Kim, Jungsang; Brady, David J.

    2013-05-01

    The high resolution and wide field of view (FOV) of the AWARE (Advanced Wide FOV Architectures for Image Reconstruction and Exploitation) gigapixel class cameras present new challenges in calibration, mechanical testing, and optical performance evaluation. The AWARE system integrates an array of micro-cameras in a multiscale design to achieve gigapixel sampling at video rates. Alignment and optical testing of the micro-cameras is vital in compositing engines, which require pixel-level accurate mappings over the entire array of cameras. A testbed has been developed to automatically calibrate and measure the optical performance of the entire camera array. This testbed utilizes translation and rotation stages to project a ray into any micro-camera of the AWARE system. A spatial light modulator is projected through a telescope to form an arbitrary object space pattern at infinity. This collimated source is then reflected by an elevation stage mirror for pointing through the aperture of the objective into the micro-optics and eventually the detector of the micro-camera. Different targets can be projected with the spatial light modulator for measuring the modulation transfer function (MTF) of the system, fiducials in the overlap regions for registration and compositing, distortion mapping, illumination profiles, thermal stability, and focus calibration. The mathematics of the testbed mechanics are derived for finding the positions of the stages to achieve a particular incident angle into the camera, along with calibration steps for alignment of the camera and testbed coordinate axes. Measurement results for the AWARE-2 gigapixel camera are presented for MTF, focus calibration, illumination profile, fiducial mapping across the micro-camera for registration and distortion correction, thermal stability, and alignment of the camera on the testbed.
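At the core of the stage mathematics is the law of reflection: tilting the elevation-stage mirror by α swings the reflected beam through 2α. A minimal sketch (our own geometry and axis conventions, not the testbed's actual derivation):

```python
import numpy as np

def reflect(d, n):
    # specular reflection of direction d off a mirror with normal n
    n = n / np.linalg.norm(n)
    return d - 2.0 * np.dot(d, n) * n

beam = np.array([1.0, 0.0, 0.0])   # horizontal collimated source along +x

def reflected_angle(alpha_deg):
    a = np.radians(alpha_deg)
    # mirror normal faces the beam (-x) and tilts toward +z by alpha
    n = np.array([-np.cos(a), 0.0, np.sin(a)])
    out = reflect(beam, n)
    # direction of the outgoing beam measured from +x in the x-z plane
    return np.degrees(np.arctan2(out[2], out[0]))
```

With this convention the outgoing beam direction is 180° − 2α, so each degree of mirror tilt steers the beam by two degrees; the same doubling governs how finely the elevation stage must be controlled to hit a given micro-camera aperture.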

  12. Tailored slice selection in solid-state MRI by DANTE under magic-echo line narrowing.

    PubMed

    Matsui, Shigeru; Masumoto, Hidefumi; Hashimoto, Takeyuki

    2007-06-01

    We propose a method of slice selection in solid-state MRI by combining DANTE selective excitation with magic-echo (ME) line narrowing. The DANTE RF pulses applied at the ME peaks practically do not interfere with the ME line narrowing in the combined ME DANTE sequence. This allows straightforward tailoring of the slice profile simply by introducing an appropriate modulation, such as a sinc modulation, into the flip angles of the applied DANTE RF pulses. The utility of the method has been demonstrated by preliminary experiments performed on a test sample of adamantane.
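The slice-tailoring principle can be sketched numerically: in the small-flip-angle regime the excitation profile is approximately the Fourier transform of the DANTE flip-angle envelope, so a sinc modulation yields a roughly rectangular slice (a generic illustration, not the authors' sequence parameters):

```python
import numpy as np

# sinc-modulated flip angles for an N-pulse DANTE train; in the
# small-flip-angle approximation the slice profile is ~ the Fourier
# transform of the envelope, so a sinc envelope gives a ~rectangular slice
N = 256
k = np.arange(N) - N // 2
envelope = np.sinc(k / 16.0)    # zero crossings every 16 pulses (assumed)

profile = np.abs(np.fft.fftshift(np.fft.fft(np.fft.ifftshift(envelope))))
profile /= profile.max()

in_slice = profile[N // 2 - 4 : N // 2 + 4].min()   # inside the passband
out_slice = profile[: N // 2 - 16].max()            # well outside it
```

The profile is near unity across the central band and falls close to zero outside, which is the "tailoring" the abstract describes; other envelopes (Gaussian, Hamming-weighted sinc) trade transition sharpness against ripple in the same way as RF pulse design generally.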

  13. Distribution of illumination from a narrow light beam in a turbid atmosphere

    NASA Astrophysics Data System (ADS)

    Drofa, A. S.; Usachev, A. L.

    An analysis is presented of the illumination distribution in the cross section of an infinitely narrow unidirectional light beam in a polydisperse water aerosol as a function of the microstructure of the disperse medium, its optical thickness, and the distribution of the light-scattering characteristics of the medium along the beam propagation path. The scattered-light field is calculated on the basis of the Monte Carlo method with the use of the small-angle and single-scattering approximations. Attention is given to the possibility of using approximation methods to calculate the characteristics of a scattered field from a narrow beam propagating on an inhomogeneous path in a polydisperse medium.
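A minimal Monte Carlo sketch of the small-angle regime described above (illustrative parameters only; the paper's polydisperse phase functions and inhomogeneous paths are not modeled):

```python
import numpy as np

def beam_spread(tau, sigma_theta=0.05, n_photons=20000, n_steps=40, seed=1):
    # small-angle Monte Carlo over a unit path length: photons scatter with
    # probability tau/n_steps per step, each event adding a small random
    # deflection; transverse displacement accumulates from the running angle
    rng = np.random.default_rng(seed)
    x = np.zeros(n_photons)       # transverse displacement
    th = np.zeros(n_photons)      # transverse angle
    for _ in range(n_steps):
        hit = rng.random(n_photons) < tau / n_steps
        th += np.where(hit, rng.normal(0.0, sigma_theta, n_photons), 0.0)
        x += th / n_steps
    return np.sqrt(np.mean(x ** 2))

spread_thin = beam_spread(tau=1.0)    # optically thin path
spread_thick = beam_spread(tau=5.0)   # optically thicker path
```

As expected, the rms transverse spread of the beam grows with the optical thickness of the medium, which is the qualitative dependence the illumination-distribution analysis quantifies.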

  14. Enabling High Fidelity Measurements of Energy and Pitch Angle for Escaping Energetic Ions with a Fast Ion Loss Detector

    NASA Astrophysics Data System (ADS)

    Chaban, R.; Pace, D. C.; Marcy, G. R.; Taussig, D.

    2016-10-01

    Energetic ion losses must be minimized in burning plasmas to maintain fusion power, and existing tokamaks provide access to energetic ion parameter regimes that are relevant to burning machines. A new Fast Ion Loss Detector (FILD) probe on the DIII-D tokamak has been optimized to resolve beam ion losses across a range of 30 - 90 keV in energy and 40° to 80° in pitch angle, thereby providing valuable measurements during many different experiments. The FILD is a magnetic spectrometer; once inserted into the tokamak, the magnetic field allows energetic ions to pass through a collimating aperture and strike a scintillator plate that is imaged by a wide view camera and narrow view photomultiplier tubes (PMTs). The design involves calculating scintillator strike patterns while varying probe geometry. Calculated scintillator patterns are then used to design an optical system that allows adjustment of the focus regions for the 1 MS/s resolved PMTs. A synthetic diagnostic will be used to determine the energy and pitch angle resolution that can be attained in DIII-D experiments. Work supported in part by US DOE under the Science Undergraduate Laboratory Internship (SULI) program and under DE-FC02-04ER54698.
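Because the FILD is a magnetic spectrometer, the scintillator strike position encodes the ion's gyroradius (hence energy) and pitch angle. A back-of-the-envelope sketch for deuterium beam ions (the 2 T field value and the deuteron species are our assumptions, not values from the abstract):

```python
import numpy as np

m_d = 3.344e-27   # deuteron mass, kg
q = 1.602e-19     # elementary charge, C
B = 2.0           # magnetic field at the probe, T (assumed)

def gyroradius(energy_keV, pitch_deg):
    # r = m * v_perp / (q * B), nonrelativistic beam-ion speeds
    v = np.sqrt(2.0 * energy_keV * 1e3 * q / m_d)
    v_perp = v * np.sin(np.radians(pitch_deg))
    return m_d * v_perp / (q * B)
```

For 30-90 keV deuterons this gives centimeter-scale gyroradii, so energy differences across the measured range map to millimeter-to-centimeter displacements on the scintillator plate, which is what the collimating aperture and imaging optics resolve.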

  15. Precision Narrow-Angle Astrometry of Binary Stars with the Navy Prototype Optical Interferometer

    DTIC Science & Technology

    2004-01-01

    Keywords: optical interferometry, interferometric imaging, NPOI, binary stars, extrasolar planets. [Abstract fragmentary in source] … precision into accuracy … As Michelson realized … , even with interferometry, are equivalent to using ≈ 0.5(d/r0)² baselines, where d is the telescope diameter and r0 is the Fried parameter. Extrasolar planet searches … The Navy Prototype Optical Interferometer (NPOI) group has started an astrometric search for planets in …

  16. 975 nm high power diode lasers with high efficiency and narrow vertical far field enabled by low index quantum barriers

    NASA Astrophysics Data System (ADS)

    Crump, P.; Pietrzak, A.; Bugge, F.; Wenzel, H.; Erbert, G.; Tränkle, G.

    2010-03-01

    For optimal power coupling into a fiber, high-power diode lasers should operate efficiently with the smallest possible vertical far-field emission angle. Although waveguide and cladding layers can be designed to achieve small angles, the refractive index profile of the active region itself restricts the minimum achievable value. We show that the use of low-index quantum barrier layers leads to substantially reduced far-field angles while sustaining high power conversion efficiency. Stripe lasers 90 μm wide that use such designs have narrow vertical far-field angles of 30° (95% power content), a power conversion efficiency of 58%, and operate reliably at 10 W output.

  17. II-VI Narrow-Bandgap Semiconductors for Optoelectronics

    NASA Astrophysics Data System (ADS)

    Baker, Ian

    The field of narrow-gap II-VI materials is dominated by the compound semiconductor mercury cadmium telluride (Hg1-xCdxTe, or MCT), which supports a large industry in infrared detectors, cameras and infrared systems. It is probably true to say that HgCdTe is the third most studied semiconductor after silicon and gallium arsenide. Hg1-xCdxTe is the material most widely used in high-performance infrared detectors at present. By changing the composition x, the spectral response of the detector can be made to cover the range from 1 μm to beyond 17 μm. The advantages of this system arise from a number of features, notably: close lattice matching, high optical absorption coefficient, low carrier generation rate, high electron mobility and readily available doping techniques. These advantages mean that very sensitive infrared detectors can be produced at relatively high operating temperatures. Hg1-xCdxTe multilayers can be readily grown in vapor-phase epitaxial processes. This provides the device engineer with complex doping and composition profiles that can be used to further enhance the electro-optic performance, leading to low-cost, large-area detectors in the future. The main purpose of this chapter is to describe the applications, device physics and technology of II-VI narrow-bandgap devices, focusing on HgCdTe but also including Hg1-xMnxTe and Hg1-xZnxTe. It concludes with a review of the research and development programs into third-generation infrared detector technology (so-called GEN III detectors) being performed in centers around the world.
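The composition-tunable cutoff can be made concrete with Hansen's widely used empirical bandgap relation for Hg1-xCdxTe (the coefficients below are quoted from memory of the 1982 fit, so treat them as indicative):

```python
def bandgap_eV(x, T=77.0):
    # Hansen et al. (1982) empirical fit for Hg(1-x)Cd(x)Te, E_g in eV
    return (-0.302 + 1.93 * x - 0.810 * x**2 + 0.832 * x**3
            + 5.35e-4 * T * (1.0 - 2.0 * x))

def cutoff_um(x, T=77.0):
    # cutoff wavelength in microns: lambda_c = 1.24 / E_g(eV)
    return 1.24 / bandgap_eV(x, T)
```

At 77 K this places x ≈ 0.2 in the long-wave band (cutoff near 15 μm) and x ≈ 0.3 in the mid-wave band (cutoff near 5 μm), illustrating how a single alloy system spans the 1-17 μm range quoted above.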

  18. Microlayer during boiling in narrow slot channels

    NASA Astrophysics Data System (ADS)

    Diev, Mikhail D.; Leontiev, Alexander I.

    1997-01-01

    The International Space Station Alpha will have a two-phase thermal control system, in which boiling of liquid ammonia will be the heat-collection process in evaporative heat exchangers. Unfortunately, little data are available for boiling heat transfer in microgravity. Boiling-channel geometries that work well in normal gravity are not appropriate in microgravity, and special means must be devised to avoid undesired events. From this point of view, narrow slot channels may be regarded as a promising geometry for microgravity operation. During boiling in narrow slots, the vapor bubbles are flattened between the channel walls, and the vapor phase and the channel wall are separated by a thin liquid film known as a microlayer. The paper presents experimental results compared to a theoretical analysis, and shows that narrow slot channels are a promising configuration for microgravity applications.

  19. Congenital narrowing of the cervical spinal canal.

    PubMed Central

    Kessler, J T

    1975-01-01

    The clinical and laboratory findings in six patients with congenital narrowing of the cervical spinal canal and neurological symptoms are described. A variable age of onset and an entirely male occurrence were found. Signs and symptoms of spinal cord dysfunction predominated in all but one patient. Symptoms were produced in five patients by increased physical activity alone. Congenital narrowing of the cervical spinal canal may result in cord compression without a history of injury and occasionally without evidence of significant bony degenerative changes. The clinical features may be distinguishable from those found in cervical spondylosis without congenital narrowing. Intermittent claudication of the cervical spinal cord appears to be an important feature of this syndrome. Surgery improved four out of five patients. PMID:1219087

  20. Does interest broaden or narrow attentional scope?

    PubMed

    Sung, Billy; Yih, Jennifer

    2015-08-10

    Theory proposes that interest is a positive emotion that may either broaden attention to facilitate processing of new information, or narrow attention to preserve engagement with new information. To our knowledge, no research has directly examined the effect of interest on attentional scope. Across four experiments, we show that traits associated with the propensity to experience interest-specifically, trait curiosity and internal boredom proneness-are associated with a narrower scope of attention. We also find that, instead of broadening, interest actually narrows attentional scope in comparison to a neutral state and awe. Challenging the conventional notion that all positive emotions broaden cognition and attention, our findings suggest that specific emotions influence attention in ways that extend beyond a general emotional valence effect.

  1. Discovery of a narrow line quasar

    NASA Technical Reports Server (NTRS)

    Stocke, J.; Liebert, J.; Maccacaro, T.; Griffiths, R. E.; Steiner, J. E.

    1982-01-01

    A stellar object is reported which, while having X-ray and optical luminosities typical of quasars, has narrow permitted and forbidden emission lines over the observed spectral range. The narrow-line spectrum is high-excitation, the Balmer lines seem to be recombinational, and a redder optical spectrum than that of most quasars is exhibited, despite detection as a weak radio source. The object does not conform to the relationships between H-beta parameters and X-ray flux previously claimed for a large sample of the active galactic nuclei. Because reddish quasars with narrow lines, such as the object identified, may not be found by the standard techniques for the discovery of quasars, the object may be a prototype of a new class of quasars analogous to high-luminosity Seyfert type 2 galaxies. It is suggested that these objects cannot comprise more than 10% of all quasars.

  2. Efficient, Narrow-Pass-Band Optical Filters

    NASA Technical Reports Server (NTRS)

    Sandford, Stephen P.

    1996-01-01

    Optical filters with both narrow pass bands and high efficiencies can be fabricated to design specifications, offering tremendous improvements in performance for a number of optical (including infrared) systems. In fiber-optic and free-space communication systems, the precise frequency discrimination afforded by the narrow pass bands of these filters provides higher channel capacities. In active and passive remote sensors like lidar and gas-filter-correlation radiometers, the increased efficiencies afforded by the filters enhance detection of small signals against large background noise. In addition, the sizes, weights, and power requirements of many optical and infrared systems can be reduced by taking advantage of the gains in signal-to-noise ratios delivered by these filters.

  3. Spectrally narrowed edge emission from leaky waveguide modes in organic light-emitting diodes

    SciTech Connect

    Gan, Zhengqing; Tian, Yun; Lynch, David W.; Kang, Ji-hun; Park, Q-Han; and Shinar, Joseph

    2009-11-03

    A dramatic spectral line narrowing of the edge emission at room temperature from tris(quinolinolate) Al (Alq3), N,N′-diphenyl-N,N′-bis(1-naphthylphenyl)-1,1′-biphenyl-4,4′-diamine (NPD), 4,4′-bis(2,2′-diphenylvinyl)-1,1′-biphenyl (DPVBi), and some guest-host small molecular organic light-emitting diodes (OLEDs), fabricated on indium tin oxide (ITO)-coated glass, is described. In all but the DPVBi OLEDs, the narrowed emission band emerges above a threshold thickness of the emitting layer, and narrows down to a full width at half maximum of only 5-10 nm. The results demonstrate that this narrowed emission is due to irregular waveguide modes that leak from the ITO to the glass substrate at a grazing angle. While measurements of variable-stripe-length (l) devices exhibit an apparent weak optical gain 0 ≤ g ≤ 1.86 cm⁻¹, there is no observable threshold current or bias associated with this spectral narrowing. In addition, in the phosphorescent guest-host OLEDs, there is no decrease in the emission decay time of the narrowed edge emission relative to the broad surface emission. It is suspected that the apparent weak optical gain is due to misalignment of the axis of the waveguided mode and the axis of the collection lens of the probe.
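The variable-stripe-length (VSL) analysis mentioned above fits edge-emission intensity versus pumped stripe length l to I(l) = (A/g)(exp(gl) − 1). A self-contained sketch with synthetic data (not the authors' measurements):

```python
import numpy as np

def vsl(l, A, g):
    # variable-stripe-length model: I(l) = (A/g) * (exp(g*l) - 1)
    return (A / g) * np.expm1(g * l)

l = np.linspace(0.05, 0.5, 20)    # stripe lengths, cm
data = vsl(l, A=1.0, g=1.86)      # synthetic data, g in cm^-1

# brute-force fit of g; the amplitude A is refit analytically for each
# candidate g by linear least squares against the basis expm1(g*l)/g
gs = np.linspace(0.1, 5.0, 500)

def sse(g):
    basis = np.expm1(g * l) / g
    A = np.dot(basis, data) / np.dot(basis, basis)
    return np.sum((data - A * basis) ** 2)

best_g = min(gs, key=sse)
```

In the paper's data the recovered gain was at most 1.86 cm⁻¹ with no threshold behavior, which is why it was attributed to collection-geometry misalignment rather than true stimulated amplification.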

  4. The importance of craniovertebral and cervicomedullary angles in cervicogenic headache

    PubMed Central

    Çoban, Gökçen; Çöven, İlker; Çifçi, Bilal Egemen; Yıldırım, Erkan; Yazıcı, Ayşe Canan; Horasanlı, Bahriye

    2014-01-01

    PURPOSE Many studies have indicated that cervicogenic headache may originate from the cervical structures innervated by the upper cervical spinal nerves. To date, no study has investigated whether narrowing of the craniovertebral angle (CVA) or cervicomedullary angle (CMA) affects the three upper cervical spinal nerves. The aim of this study was to investigate the effect of CVA and/or CMA narrowing on the occurrence of cervicogenic headache. MATERIALS AND METHODS Two hundred and five patients diagnosed with cervicogenic headache were included in the study. The pain scores of patients were determined using a visual analog scale. The nonheadache control group consisted of 40 volunteers. CVA and CMA values were measured on sagittal T2-weighted magnetic resonance imaging (MRI), on two occasions by two radiologists. Angle values and categorized pain scores were compared statistically between the groups. RESULTS Intraobserver and interobserver agreement was over 97% for all measurements. Pain scores increased with decreasing CVA and CMA values. Mean angle values were significantly different among the pain categories (P < 0.001). The pain score was negatively correlated with CMA (Spearman correlation coefficient, rs, −0.676; P < 0.001) and CVA values (rs, −0.725; P < 0.001). CONCLUSION CVA or CMA narrowing affects the occurrence of cervicogenic headache. There is an inverse relationship between the angle values and pain scores. PMID:24317332

  5. Development of filter exchangeable 3CCD camera for multispectral imaging acquisition

    NASA Astrophysics Data System (ADS)

    Lee, Hoyoung; Park, Soo Hyun; Kim, Moon S.; Noh, Sang Ha

    2012-05-01

    Several methods exist for acquiring multispectral images, but a dynamically band-selective, area-scan multispectral camera had not yet been developed. This research focused on the development of a filter-exchangeable 3CCD camera modified from a conventional 3CCD camera. The camera consists of an F-mounted lens, an image splitter without dichroic coating, three bandpass filters, three image sensors, a filter-exchangeable frame, and an electric circuit for parallel image signal processing. In addition, firmware and application software were developed. Remarkable improvements over a conventional 3CCD camera are the redesigned image splitter and the filter-exchangeable frame. Computer simulation was required to visualize the ray paths inside the prism when redesigning the image splitter. The dimensions of the splitter were then determined by computer simulation, assuming BK7 glass and non-dichroic coating; these properties were chosen so that rays of all wavelengths reach all film planes. The image splitter was verified with two narrow-waveband line lasers. The filter-exchangeable frame is designed so that bandpass filters can be swapped without displacing the image sensors on the film plane. The developed 3CCD camera was evaluated in an application detecting scab and bruise on Fuji apples. The results show that a filter-exchangeable 3CCD camera can provide meaningful functionality for various multispectral applications that require exchanging bandpass filters.

  6. Stratoscope 2 integrating television camera

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The development, construction, test and delivery of an integrating television camera for use as the primary data sensor on Flight 9 of Stratoscope 2 is described. The system block diagrams are presented along with the performance data, and definition of the interface of the telescope with the power, telemetry, and communication system.

  7. OSIRIS camera barrel optomechanical design

    NASA Astrophysics Data System (ADS)

    Farah, Alejandro; Tejada, Carlos; Gonzalez, Jesus; Cobos, Francisco J.; Sanchez, Beatriz; Fuentes, Javier; Ruiz, Elfego

    2004-09-01

    A Camera Barrel, located in the OSIRIS imager/spectrograph for the Gran Telescopio Canarias (GTC), is described in this article. The barrel design has been developed by the Institute for Astronomy of the University of Mexico (IA-UNAM), in collaboration with the Institute for Astrophysics of Canarias (IAC), Spain. The barrel is being manufactured by the Engineering Center for Industrial Development (CIDESI) at Queretaro, Mexico. The Camera Barrel includes a set of eight lenses (three doublets and two singlets), with their respective supports and cells, as well as two subsystems: the Focusing Unit, which is a mechanism that modifies the first doublet relative position; and the Passive Displacement Unit (PDU), which uses the third doublet as thermal compensator to maintain the camera focal length and image quality when the ambient temperature changes. This article includes a brief description of the scientific instrument; describes the design criteria related with performance justification; and summarizes the specifications related with misalignment errors and generated stresses. The Camera Barrel components are described and analytical calculations, FEA simulations and error budgets are also included.

  8. Measuring Distances Using Digital Cameras

    ERIC Educational Resources Information Center

    Kendal, Dave

    2007-01-01

    This paper presents a generic method of calculating accurate horizontal and vertical object distances from digital images taken with any digital camera and lens combination, where the object plane is parallel to the image plane or tilted in the vertical plane. This method was developed for a project investigating the size, density and spatial…

  9. The Camera Comes to Court.

    ERIC Educational Resources Information Center

    Floren, Leola

    After the Lindbergh kidnapping trial in 1935, the American Bar Association sought to eliminate electronic equipment from courtroom proceedings. Eventually, all but two states adopted regulations applying that ban to some extent, and a 1965 Supreme Court decision encouraged the banning of television cameras at trials as well. Currently, some states…

  10. High speed multiwire photon camera

    NASA Technical Reports Server (NTRS)

    Lacy, Jeffrey L. (Inventor)

    1991-01-01

    An improved multiwire proportional counter camera having particular utility in the field of clinical nuclear medicine imaging. The detector utilizes direct coupled, low impedance, high speed delay lines, the segments of which are capacitor-inductor networks. A pile-up rejection test is provided to reject confused events otherwise caused by multiple ionization events occurring during the readout window.

  11. High speed multiwire photon camera

    NASA Technical Reports Server (NTRS)

    Lacy, Jeffrey L. (Inventor)

    1989-01-01

    An improved multiwire proportional counter camera having particular utility in the field of clinical nuclear medicine imaging. The detector utilizes direct coupled, low impedance, high speed delay lines, the segments of which are capacitor-inductor networks. A pile-up rejection test is provided to reject confused events otherwise caused by multiple ionization events occurring during the readout window.

  12. Directing Performers for the Cameras.

    ERIC Educational Resources Information Center

    Wilson, George P., Jr.

    An excellent way for an undergraduate, novice director of television and film to pick up background experience in directing performers for cameras is by participating in nonbroadcast-film activities, such as theatre, dance, and variety acts, both as performer and as director. This document describes the varieties of activities, including creative,…

  13. Toy Cameras and Color Photographs.

    ERIC Educational Resources Information Center

    Speight, Jerry

    1979-01-01

    The technique of using toy cameras for both black-and-white and color photography in the art class is described. The author suggests that expensive equipment can limit the growth of a beginning photographer by emphasizing technique and equipment instead of in-depth experience with composition fundamentals and ideas. (KC)

  14. A Robotic Wide-Angle H-alpha Survey of the Southern Sky

    NASA Astrophysics Data System (ADS)

    Rosing, W.; Gaustad, J. E.; McCullough, P. R.; Van Buren, D.

    2000-12-01

    We have completed a wide-angle imaging survey of the entire southern sky (δ < +10 degrees) at 656.3 nm wavelength, the Hα emission line of hydrogen. Each image of the survey covers an area of the sky 13 degrees square at an angular resolution of approximately one arcminute, and reaches a sensitivity level of 0.5 Rayleigh (3 × 10⁻¹⁸ erg cm⁻² s⁻¹ arcsec⁻²), corresponding to an emission measure of 1 cm⁻⁶ pc, and to a brightness temperature for microwave free-free emission of 3 μK at 30 GHz. The survey was done with a CCD detector, a commercial camera lens, and narrow-band interference filters, mounted on a robotic telescope platform located at Cerro Tololo Inter-American Observatory in Chile. This work was supported by grants from Las Cumbres Observatory, NSF, NASA, JPL, Dudley Observatory, the Fund for Astrophysical Research, Research Corporation, the University of Illinois at Urbana-Champaign, and Swarthmore College.
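    The quoted equivalence between 0.5 Rayleigh and an energy surface brightness follows from the photon energy at Hα and the definition of the Rayleigh (10⁶/4π photons s⁻¹ cm⁻² sr⁻¹). A short sketch checking the arithmetic:

    ```python
    import math

    # Convert 0.5 Rayleigh at H-alpha (656.3 nm) to erg cm^-2 s^-1 arcsec^-2.
    h = 6.626e-27    # Planck constant, erg s
    c = 2.998e10     # speed of light, cm/s
    lam = 656.3e-7   # H-alpha wavelength, cm
    E_photon = h * c / lam  # erg per photon, ~3.0e-12

    ph_per_sr = 1e6 / (4 * math.pi)               # 1 Rayleigh in ph/s/cm^2/sr
    arcsec2_per_sr = (180 / math.pi * 3600) ** 2  # ~4.25e10 arcsec^2 per sr

    sb = 0.5 * ph_per_sr / arcsec2_per_sr * E_photon
    print(f"{sb:.1e}")  # ~2.8e-18, consistent with the quoted 3e-18
    ```
    
    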

  15. Photoelectric angle converter

    NASA Astrophysics Data System (ADS)

    Podzharenko, Volodymyr A.; Kulakov, Pavlo I.

    2001-06-01

    A photoelectric angle transducer is presented whose output voltage is a linear function of the input rotation angle. The transducer uses a linear photodetector based on a photodiode and operational-amplifier pair, whose output voltage is a linear function of the illuminated area of the photosensitive layer, together with a specially shaped light-stream modulator that makes this area depend linearly on the angle of rotation. The transducer has good frequency response and can be used for dynamic measurements of angular velocity and angle of rotation, in precision drive systems and automatic control systems.

  16. Meteorite incidence angles

    NASA Astrophysics Data System (ADS)

    Hughes, D. W.

    1993-06-01

    Think about an asteroid smashing into the surface of the Moon and excavating a crater; or hitting Earth and scattering meteorite fragments over a strewn field. Imagine a fragment of cometary dust burning out in the Earth's atmosphere and producing a meteor. These bodies have paths that are inclined at some angle to the vertical. But what is the predominant value of this angle of incidence, i? How does the number of incident bodies vary as a function of angle i? And how do both of these affect the prevalence of non-circular lunar craters and the ellipticity of meteorite strewn fields?
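    For an isotropic flux of impactors striking a plane, the standard result is that the incidence angle i (measured from the vertical) follows p(i) ∝ sin i · cos i = sin(2i)/2, peaking at 45°. A short Monte Carlo sketch of that distribution, offered as background rather than the paper's own analysis:

    ```python
    import math, random

    # Isotropic flux onto a plane: p(i) = sin(2i) on [0, pi/2].
    # The CDF is sin^2(i), so inverse-CDF sampling gives i = asin(sqrt(u)).
    random.seed(1)
    N = 200_000
    angles = [math.degrees(math.asin(math.sqrt(random.random())))
              for _ in range(N)]

    mean = sum(angles) / N
    print(round(mean, 1))  # analytic mean (and mode) of this distribution is 45 deg
    ```
    
    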

  17. Characterization of a PET Camera Optimized for ProstateImaging

    SciTech Connect

    Huber, Jennifer S.; Choong, Woon-Seng; Moses, William W.; Qi,Jinyi; Hu, Jicun; Wang, G.C.; Wilson, David; Oh, Sang; Huesman, RonaldH.; Derenzo, Stephen E.

    2005-11-11

    We present the characterization of a positron emission tomograph for prostate imaging that centers a patient between a pair of external curved detector banks (ellipse: 45 cm minor, 70 cm major axis). The distance between detector banks adjusts to allow patient access and to position the detectors as closely as possible for maximum sensitivity with patients of various sizes. Each bank is composed of two axial rows of 20 HR+ block detectors for a total of 80 detectors in the camera. The individual detectors are angled in the transaxial plane to point towards the prostate to reduce resolution degradation in that region. The detectors are read out by modified HRRT data acquisition electronics. Compared to a standard whole-body PET camera, our dedicated prostate camera has the same sensitivity and resolution, less background (fewer randoms and a lower scatter fraction) and a lower cost. We have completed construction of the camera. Characterization data and reconstructed images of several phantoms are shown. Sensitivity for a point source in the center is 946 cps/μCi. Spatial resolution is 4 mm FWHM in the central region.
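    The quoted point-source sensitivity can be read as an absolute efficiency, since 1 μCi corresponds to 3.7 × 10⁴ decays per second. A quick back-of-envelope check:

    ```python
    # Absolute efficiency implied by the quoted sensitivity of 946 cps/uCi.
    decays_per_s = 3.7e4           # decays per second per microcurie
    sensitivity = 946 / decays_per_s
    print(f"{sensitivity:.1%}")    # ~2.6% of decays yield a counted event
    ```
    
    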

  18. Payload topography camera of Chang'e-3

    NASA Astrophysics Data System (ADS)

    Yu, Guo-Bin; Liu, En-Hai; Zhao, Ru-Jin; Zhong, Jie; Zhou, Xiang-Dong; Zhou, Wu-Lin; Wang, Jin; Chen, Yuan-Pei; Hao, Yong-Jie

    2015-11-01

    Chang'e-3 was China's first soft-landing lunar probe that achieved a successful roving exploration on the Moon. A topography camera functioning as the lander's “eye” was one of the main scientific payloads installed on the lander. It was composed of a camera probe, an electronic component that performed image compression, and a cable assembly. Its exploration mission was to obtain optical images of the lunar topography in the landing zone for investigation and research. It also observed rover movement on the lunar surface and finished taking pictures of the lander and rover. After starting up successfully, the topography camera obtained static images and video of rover movement from different directions, 360° panoramic pictures of the lunar surface around the lander from multiple angles, and numerous pictures of the Earth. All images of the rover, lunar surface, and the Earth were clear, and those of the Chinese national flag were recorded in true color. This paper describes the exploration mission, system design, working principle, quality assessment of image compression, and color correction of the topography camera. Finally, test results from the lunar surface are provided to serve as a reference for scientific data processing and application.

  19. Optimization of sharp and viewing-angle-independent structural color.

    PubMed

    Hsu, Chia Wei; Miller, Owen D; Johnson, Steven G; Soljačić, Marin

    2015-04-06

    Structural coloration produces some of the most brilliant colors in nature and has many applications. Motivated by the recently proposed transparent displays that are based on wavelength-selective scattering, here we consider the new problem of transparent structural color, where objects are transparent under omnidirectional broad-band illumination but scatter strongly with a directional narrow-band light source. Transparent structural color requires two competing properties, narrow bandwidth and broad viewing angle, that have not been demonstrated simultaneously previously. Here, we use numerical optimization to discover geometries where a sharp 7% bandwidth in scattering is achieved, yet the peak wavelength varies less than 1%, and the peak height and peak width vary less than 6% over broad viewing angles (0-90°) under a directional illumination. Our model system consists of dipole scatterers arranged into several rings; interference among the scattered waves is optimized to yield the wavelength-selective and angle-insensitive response.

  20. Exploring the Moon at High-Resolution: First Results From the Lunar Reconnaissance Orbiter Camera (LROC)

    NASA Astrophysics Data System (ADS)

    Robinson, Mark; Hiesinger, Harald; McEwen, Alfred; Jolliff, Brad; Thomas, Peter C.; Turtle, Elizabeth; Eliason, Eric; Malin, Mike; Ravine, A.; Bowman-Cisneros, Ernest

    The Lunar Reconnaissance Orbiter (LRO) spacecraft was launched on an Atlas V 401 rocket from the Cape Canaveral Air Force Station Launch Complex 41 on June 18, 2009. After spending four days in Earth-Moon transit, the spacecraft entered a three month commissioning phase in an elliptical 30×200 km orbit. On September 15, 2009, LRO began its planned one-year nominal mapping mission in a quasi-circular 50 km orbit. A multi-year extended mission in a fixed 30×200 km orbit is optional. The Lunar Reconnaissance Orbiter Camera (LROC) consists of a Wide Angle Camera (WAC) and two Narrow Angle Cameras (NACs). The WAC is a 7-color push-frame camera, which images the Moon at 100 and 400 m/pixel in the visible and UV, respectively, while the two NACs are monochrome narrow-angle linescan imagers with 0.5 m/pixel spatial resolution. LROC was specifically designed to address two of the primary LRO mission requirements and six other key science objectives, including 1) assessment of meter- and smaller-scale features in order to select safe sites for potential lunar landings near polar resources and elsewhere on the Moon; 2) acquire multi-temporal synoptic 100 m/pixel images of the poles during every orbit to unambiguously identify regions of permanent shadow and permanent or near permanent illumination; 3) meter-scale mapping of regions with permanent or near-permanent illumination of polar massifs; 4) repeat observations of potential landing sites and other regions to derive high resolution topography; 5) global multispectral observations in seven wavelengths to characterize lunar resources, particularly ilmenite; 6) a global 100-m/pixel basemap with incidence angles (60°-80°) favorable for morphological interpretations; 7) sub-meter imaging of a variety of geologic units to characterize their physical properties, the variability of the regolith, and other key science questions; 8) meter-scale coverage overlapping with Apollo-era panoramic images (1-2 m/pixel) to document

  1. Versatility of the CFR algorithm for limited angle reconstruction

    SciTech Connect

    Fujieda, I.; Heiskanen, K.; Perez-Mendez, V. )

    1990-04-01

    The constrained Fourier reconstruction (CFR) algorithm and the iterative reconstruction-reprojection (IRR) algorithm are evaluated based on their accuracy for three types of limited angle reconstruction problems. The CFR algorithm performs better for problems such as X-ray CT imaging of a nuclear reactor core with one large data gap due to structural blocking of the source and detector pair. For gated heart imaging by X-ray CT, or radioisotope distribution imaging by PET or SPECT using a polygonal array of gamma cameras with insensitive gaps between camera boundaries, the IRR algorithm has a slight advantage over the CFR algorithm, but the difference is not significant.

  2. Narrow band binary phase locked loops

    NASA Technical Reports Server (NTRS)

    Burhans, R. W.

    1976-01-01

    Very high Q digital filtering circuits for audio frequencies in the range of 1 Hz to 15 kHz are implemented in simple CMOS hardware using a binary local reference clock frequency. The circuits have application to VLF navigation receivers and other narrow-band audio-range tracking problems.

  3. Narrow-headed garter snake (Thamnophis rufipunctatus)

    USGS Publications Warehouse

    Nowak, Erika M.

    2006-01-01

    The narrow-headed garter snake is a harmless, nonvenomous snake that is distinguished by its elongated, triangular-shaped head and the red or dark spots on its olive to tan body. Today, the narrow-headed garter snake is a species of special concern in the United States because of its decline over much of its historic range. Arizona's Oak Creek has historically contained the largest population of narrow-headed garter snakes in the United States. The U.S. Geological Survey (USGS) and the Arizona Game and Fish Department jointly funded research by USGS scientists in Oak Creek to shed light on the factors causing declining population numbers. The research resulted in better understanding of the snake's habitat needs, winter and summer range, and dietary habits. Based on the research findings, the U.S. Forest Service has developed recommendations that visitors and local residents can adopt to help slow the decline of the narrow-headed garter snake in Oak Creek.

  4. Narrow-Band Applications of Communications Satellites.

    ERIC Educational Resources Information Center

    Cowlan, Bert; Horowitz, Andrew

    This paper attempts to describe the advantages of "narrow-band" applications of communications satellites for education. It begins by discussing the general controversy surrounding the use of satellites in education, by placing the concern within the larger context of the general debate over the uses of new technologies in education, and by…

  5. Angles, Time, and Proportion

    ERIC Educational Resources Information Center

    Pagni, David L.

    2005-01-01

    This article describes an investigation making connections between the time on an analog clock and the angle between the minute hand and the hour hand. It was posed by a middle school mathematics teacher. (Contains 8 tables and 6 figures.)

  6. a Method for Self-Calibration in Satellite with High Precision of Space Linear Array Camera

    NASA Astrophysics Data System (ADS)

    Liu, Wei; Qian, Fangming; Miao, Yuzhe; Wang, Rongjian

    2016-06-01

    At present, the on-orbit calibration of the geometric parameters of a space surveying camera is usually processed using data from a ground calibration field after capturing the images. The entire process is complicated and lengthy and cannot monitor and calibrate the geometric parameters in real time. On the basis of a large number of on-orbit calibrations, we found that owing to the influence of many factors, e.g., weather, it is often difficult to capture images of the ground calibration field. Thus, regular calibration using field data cannot be ensured. This article proposes a real-time self-calibration method for a space linear array camera on a satellite using the optical autocollimation principle. A collimating light source and small matrix-array CCD devices are installed inside the load system of the satellite; these use the same light path as the linear array camera. The location changes of the cross marks on the matrix-array CCD are extracted to determine the real-time variations in the focal length and angle parameters of the linear array camera. The on-orbit status of the camera is rapidly obtained by this method. On the one hand, the camera's variation pattern can be tracked accurately and its attitude adjusted in a timely manner to ensure optimal photography; on the other hand, self-calibration of the camera aboard the satellite can be performed quickly, which improves the efficiency and reliability of photogrammetric processing.
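    The underlying geometry can be sketched simply: a collimated reference beam entering at angle θ lands on the focal plane at x = f·tan θ, so tracking the centroid of a cross mark ties its displacement to changes in focal length (or, for an on-axis mark, to pointing-angle drift). This is a hedged illustration of the principle, not the paper's algorithm; the numbers are assumptions.

    ```python
    import math

    # A collimated mark at beam angle theta images at x = f * tan(theta).
    def mark_position(f_mm, theta_deg):
        return f_mm * math.tan(math.radians(theta_deg))

    f0, theta = 500.0, 2.0           # illustrative focal length (mm) and beam angle
    x0 = mark_position(f0, theta)

    # Suppose thermal drift changes f by +0.2 mm; the mark moves by dx:
    dx = mark_position(f0 + 0.2, theta) - x0
    df_est = dx / math.tan(math.radians(theta))  # invert the imaging model
    print(round(df_est, 3))  # recovers the 0.2 mm focal-length change
    ```
    
    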

  7. 'Magic Angle Precession'

    SciTech Connect

    Binder, Bernd

    2008-01-21

    An advanced and exact geometric description of nonlinear precession dynamics modeling very accurately natural and artificial couplings showing Lorentz symmetry is derived. In the linear description it is usually ignored that the geometric phase of relativistic motion couples back to the orbital motion providing for a non-linear recursive precession dynamics. The high coupling strength in the nonlinear case is found to be a gravitomagnetic charge proportional to the precession angle and angular velocity generated by geometric phases, which are induced by high-speed relativistic rotations and are relevant to propulsion technologies but also to basic interactions. In the quantum range some magic precession angles indicating strong coupling in a phase-locked chaotic system are identified, emerging from a discrete time dynamical system known as the cosine map showing bifurcations at special precession angles relevant to heavy nuclei stability. The 'Magic Angle Precession' (MAP) dynamics can be simulated and visualized by cones rolling in or on each other, where the apex and precession angles are indexed by spin, charge or precession quantum numbers, and corresponding magic angles. The most extreme relativistic warping and twisting effect is given by the Dirac spinor half spin constellation with 'Hyperdiamond' MAP, which resembles quark confinement.

  8. Depth perception with a rotationally symmetric coded camera

    NASA Astrophysics Data System (ADS)

    Chang, Chuan-Chung; Chen, Yung-Lin; Chang, Chir-Weei; Lee, Cheng-Chung

    2009-08-01

    A novel design of a phase-coded depth-sensing camera is presented. A rotationally symmetric phase mask is designed to discriminate the point spread functions (PSF) from different scene distances. The depth information can then be computationally obtained from a single photograph captured through the phase-coded lens. The PSF must be carefully optimized at off-axis angles in order to create a restored image that is sharp over the required field of view. In this paper, a phase-coded depth camera with a focal length of 10.82 mm, a sensor size of 2 mm, and an F-number of 5 is designed. Simulation data are exchanged between Matlab and Zemax for co-optimization of the optical coding and digital decoding processes. The simulation results show that coarse depth information is recovered for object distances from 513 mm to 1000 mm.
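    The core idea, that a depth-dependent PSF lets depth be inferred by testing which candidate blur best explains the captured image, can be shown in one dimension. The sketch below stands in a Gaussian blur for the paper's phase-coded PSF and assumes the sharp scene is known, both simplifying assumptions for illustration only:

    ```python
    import math

    def gauss_kernel(sigma, radius=6):
        k = [math.exp(-0.5 * (i / sigma) ** 2) for i in range(-radius, radius + 1)]
        s = sum(k)
        return [v / s for v in k]

    def convolve(x, k):
        """1-D convolution with replicated edges."""
        r = len(k) // 2
        return [sum(k[j] * x[min(max(i + j - r, 0), len(x) - 1)]
                    for j in range(len(k))) for i in range(len(x))]

    def mse(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b)) / len(a)

    scene = [0.0] * 30 + [1.0] * 10 + [0.0] * 30       # a bright bar
    depth_to_sigma = {513: 1.0, 700: 2.0, 1000: 3.5}   # illustrative PSF widths

    captured = convolve(scene, gauss_kernel(depth_to_sigma[700]))

    # Pick the depth whose re-blurred prediction best matches the capture.
    est = min(depth_to_sigma,
              key=lambda d: mse(captured, convolve(scene, gauss_kernel(depth_to_sigma[d]))))
    print(est)  # 700
    ```
    
    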

  9. Method for shaping and aiming narrow beams. [sonar mapping and target identification

    NASA Technical Reports Server (NTRS)

    Heyser, R. C. (Inventor)

    1981-01-01

    A sonar method and apparatus is described which utilizes a linear frequency chirp in a transmitter/receiver having a correlator to synthesize a narrow-beamwidth pattern from otherwise broad-beam transducers when there is relative velocity between the transmitter/receiver and the target. The chirp is generated with bandwidth B and duration T such that the time-bandwidth product TB increases for a narrower angle. A replica of the chirp is time delayed and Doppler shifted for use as a reference in the receiver for correlation of received chirps from targets. This reference is Doppler shifted to select targets preferentially, thereby not only synthesizing a narrow beam but also aiming the beam in azimuth and elevation.
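    The benefit of a large time-bandwidth product comes from pulse compression: correlating a received chirp against a replica collapses a pulse of duration T to a correlation peak of width on the order of 1/B. A hedged stdlib sketch of that compression (illustrative parameters, not the patent's values):

    ```python
    import math

    fs = 8000.0          # sample rate, Hz (illustrative)
    T, B = 0.1, 2000.0   # chirp duration (s) and bandwidth (Hz); TB = 200
    n = int(T * fs)
    # Linear chirp: instantaneous frequency sweeps 0 -> B over duration T.
    chirp = [math.cos(math.pi * B / T * (i / fs) ** 2) for i in range(n)]

    # Correlation against a zero-delay replica (matched filter) at a given lag.
    def corr(lag):
        return sum(chirp[i] * chirp[i + lag] for i in range(n - lag))

    peak = corr(0)
    lag = 0
    while corr(lag) > 0.5 * peak:   # count samples until half the peak
        lag += 1
    # Compressed half-width is a few samples, on the order of 1/B,
    # versus the uncompressed duration T = 800 samples.
    print(lag / fs, 1 / B)
    ```
    
    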

  10. H-alpha wide- and narrow-band photometry of R Canis Majoris

    NASA Astrophysics Data System (ADS)

    Edalati, M. T.; Khalesse, B.; Riazi, N.

    1989-01-01

    H-alpha wide- and narrow-band photoelectric observations of the eclipsing binary R CMa were made at the Biruni Observatory between February 18 and March 2, 1983. The light curves obtained from the observations are analyzed for the orbital elements according to Kopal's method, using frequency-domain techniques. It appears likely that an exchange has taken place, affecting the angle of first contact, and is probably continuing at the present time. New geometric and photometric elements are derived, and a new value for the angle of first contact is also given.

  11. Measurement of the surface wavelength distribution of narrow-band radiation by a colorimetric method

    SciTech Connect

    Kraiskii, A V; Mironova, T V; Sultanov, T T

    2010-09-10

    A method is suggested for determining the wavelength of narrow-band light from a digital photograph of a radiating surface, using an appropriately calibrated digital camera. The accuracy of the wavelength measurement is better than 1 nm. The method was tested on the yellow doublet of the mercury spectrum and on the adjacent continuum of an incandescent lamp spectrum. The suggested method was used to study the homogeneity of holographic sensor swelling in stationary and transient cases.
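    A colorimetric wavelength measurement of this kind amounts to calibrating a monotonic map from some color coordinate of the camera response to wavelength, then inverting it for an unknown source. The sketch below uses a hypothetical red/green-ratio coordinate and a hypothetical calibration table (the mercury wavelengths are real; the coordinates are invented for illustration):

    ```python
    # Hypothetical calibration table: (color coordinate, wavelength in nm).
    # 546.1, 577.0, 579.1 nm are the Hg green line and yellow doublet.
    calib = [(0.40, 546.1), (0.70, 577.0), (0.78, 579.1), (1.30, 610.0)]

    def wavelength_from_coord(c):
        """Invert the calibration by piecewise-linear interpolation."""
        for (c0, w0), (c1, w1) in zip(calib, calib[1:]):
            if c0 <= c <= c1:
                return w0 + (w1 - w0) * (c - c0) / (c1 - c0)
        raise ValueError("coordinate outside calibrated range")

    # An unknown source measured at coordinate 0.74 falls between
    # the two yellow doublet lines:
    print(round(wavelength_from_coord(0.74), 1))
    ```
    
    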

  12. Anterior chamber angle in the exfoliation syndrome.

    PubMed Central

    Wishart, P K; Spaeth, G L; Poryzees, E M

    1985-01-01

    The gonioscopic findings of 76 patients with the exfoliation syndrome were reviewed. A high frequency of narrowness of the anterior chamber (AC) angle was found (32%). 18% had angles considered occludable, and 14% had obvious angle-closure glaucoma as shown by the presence of peripheral anterior synechias (PAS). Increased pigmentation of the posterior trabecular meshwork (PTM) was noted in all cases. When this pigmentation was markedly asymmetrical, unilateral exfoliation with glaucoma was common in the more pigmented eye. In addition heavy angle pigmentation in the absence of exfoliation was noted in the fellow eye of patients with characteristic exfoliated material in the other eye. Increased pigmentation of the PTM may be the earliest detectable sign of the exfoliation syndrome (ES). The clinical significance of our estimating PTM pigmentation at the 12 o'clock position is discussed. In view of the accelerated optic nerve damage associated with the development of glaucoma secondary to ES, routine estimation of the pigmentation of the PTM at 12 o'clock is recommended in the hope of early detection of cases of otherwise inapparent ES. Images PMID:3966996

  13. Selecting a digital camera for telemedicine.

    PubMed

    Patricoski, Chris; Ferguson, A Stewart

    2009-06-01

    The digital camera is an essential component of store-and-forward telemedicine (electronic consultation). There are numerous makes and models of digital cameras on the market, and selecting a suitable consumer-grade camera can be complicated. Evaluation of digital cameras includes investigating the features and analyzing image quality. Important features include the camera settings, ease of use, macro capabilities, method of image transfer, and power recharging. Consideration needs to be given to image quality, especially as it relates to color (skin tones) and detail. It is important to know the level of the photographer and the intended application. The goal is to match the characteristics of the camera with the telemedicine program requirements. In the end, selecting a digital camera is a combination of qualitative (subjective) and quantitative (objective) analysis. For the telemedicine program in Alaska in 2008, the camera evaluation and decision process resulted in a specific selection based on the criteria developed for our environment.

  14. The wide field/planetary camera

    NASA Technical Reports Server (NTRS)

    Westphal, J. A.; Baum, W. A.; Code, A. D.; Currie, D. G.; Danielson, G. E.; Gunn, J. E.; Kelsall, T. F.; Kristian, J. A.; Lynds, C. R.; Seidelmann, P. K.

    1982-01-01

    A wide suite of potential astronomical and solar system scientific studies using the Wide Field/Planetary Camera on the Space Telescope is described. The expected performance of the camera as it approaches final assembly and testing is also detailed.

  15. Flash photography by digital still camera

    NASA Astrophysics Data System (ADS)

    Yamamoto, Yoshitaka

    2001-04-01

    Recently, the number of commercially produced digital still cameras has increased rapidly. However, the detailed performance of digital still cameras has not been evaluated. One purpose of this paper is to devise a method for evaluating the performance of a new camera. Another purpose is to show that a high-quality scientific photograph, including a photograph of a high-speed phenomenon, can be taken with a camera on the market.

  16. Electronographic cameras for space astronomy.

    NASA Technical Reports Server (NTRS)

    Carruthers, G. R.; Opal, C. B.

    1972-01-01

    Magnetically-focused electronographic cameras have been under development at the Naval Research Laboratory for use in far-ultraviolet imagery and spectrography, primarily in astronomical and optical-geophysical observations from sounding rockets and space vehicles. Most of this work has been with cameras incorporating internal optics of the Schmidt or wide-field all-reflecting types. More recently, we have begun development of electronographic spectrographs incorporating an internal concave grating, operating at normal or grazing incidence. We also are developing electronographic image tubes of the conventional end-window photocathode type, for far-ultraviolet imagery at the focus of a large space telescope, with image formats up to 120 mm in diameter.

  17. GRAVITY acquisition camera: characterization results

    NASA Astrophysics Data System (ADS)

    Anugu, Narsireddy; Garcia, Paulo; Amorim, Antonio; Wiezorrek, Erich; Wieprecht, Ekkehard; Eisenhauer, Frank; Ott, Thomas; Pfuhl, Oliver; Gordo, Paulo; Perrin, Guy; Brandner, Wolfgang; Straubmeier, Christian; Perraut, Karine

    2016-08-01

    The GRAVITY acquisition camera implements four optical functions to track multiple beams of the Very Large Telescope Interferometer (VLTI): a) pupil tracker: a 2×2 lenslet images four pupil reference lasers mounted on the spiders of the telescope secondary mirror; b) field tracker: images the science object; c) pupil imager: reimages the telescope pupil; d) aberration tracker: images a Shack-Hartmann pattern. The beam stabilization parameters are estimated from the acquisition camera detector image every 0.7 s with dedicated data reduction software. The measured parameters are used in: a) alignment of GRAVITY with the VLTI; b) active pupil and field stabilization; c) defocus correction and engineering purposes. The instrument is now successfully operational on-sky in closed loop. The relevant data reduction and on-sky characterization results are reported.

  18. Combustion pinhole-camera system

    DOEpatents

    Witte, A.B.

    1982-05-19

    A pinhole camera system is described utilizing a sealed optical-purge assembly which provides optical access into a coal combustor or other energy conversion reactor. The camera system basically consists of a focused-purge pinhole optical port assembly, a conventional TV vidicon receiver, and an external variable-density light filter which is coupled electronically to the vidicon automatic gain control (AGC). The key component of this system is the focused-purge pinhole optical port assembly, which utilizes a purging inert gas to keep debris from entering the port and a lens arrangement which transfers the pinhole to the outside of the port assembly. One additional feature of the port assembly is that it is not flush with the interior of the combustor.

  19. A 10-microm infrared camera.

    PubMed

    Arens, J F; Jernigan, J G; Peck, M C; Dobson, C A; Kilk, E; Lacy, J; Gaalema, S

    1987-09-15

    An IR camera has been built at the University of California at Berkeley for astronomical observations. The camera has been used primarily for high-angular-resolution imaging at mid-IR wavelengths. It has been tested at the University of Arizona 61- and 90-in. telescopes near Tucson and at the NASA Infrared Telescope Facility on Mauna Kea, HI. In the observations, the system has been used as an imager with interference-coated and Fabry-Perot filters. These measurements have demonstrated a sensitivity consistent with photon shot noise, showing that the system is limited by the radiation from the telescope and atmosphere. Measurements of read noise, crosstalk, and hysteresis have been made in our laboratory.

  20. Empirical Study on Designing of Gaze Tracking Camera Based on the Information of User's Head Movement.

    PubMed

    Pan, Weiyuan; Jung, Dongwook; Yoon, Hyo Sik; Lee, Dong Eun; Naqvi, Rizwan Ali; Lee, Kwan Woo; Park, Kang Ryoung

    2016-08-31

    Gaze tracking is the technology that identifies the region in space that a user is looking at. Most previous non-wearable gaze tracking systems use a near-infrared (NIR) light camera with an NIR illuminator. Depending on the kind of camera lens used, the viewing angle and depth-of-field (DOF) of a gaze tracking camera can differ, which affects the performance of the gaze tracking system. Nevertheless, to the best of our knowledge, most previous research implemented gaze tracking cameras without ground-truth information for determining the optimal viewing angle and DOF of the camera lens. Eye-tracker manufacturers might also use ground-truth information, but they do not make it public. Therefore, researchers and developers of gaze tracking systems cannot refer to such information when implementing a gaze tracking system. We address this problem by providing an empirical study in which we design an optimal gaze tracking camera based on experimental measurements of the amount and velocity of users' head movements. Based on our results and analyses, researchers and developers should be able to implement an optimal gaze tracking system more easily. Experimental results show that our gaze tracking system achieves high performance in terms of accuracy, user convenience and interest.
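
    The trade-off the study quantifies can be made concrete: given a measured lateral head-movement range and a working distance, the camera's viewing angle must be wide enough to keep the face in frame. A back-of-the-envelope sketch follows; the numbers and the 1.2 safety margin are illustrative assumptions, not measurements from the paper.

```python
import math

def required_hfov_deg(head_travel_m, distance_m, margin=1.2):
    """Horizontal field of view (degrees) needed to keep the user's
    face in frame given the measured lateral head-movement range.

    `margin` widens the view so the face is not clipped at the
    extremes of movement.
    """
    half_width = margin * head_travel_m / 2.0
    return 2.0 * math.degrees(math.atan(half_width / distance_m))

# Illustrative numbers: 0.30 m of lateral head travel at a 0.7 m
# working distance
fov = required_hfov_deg(0.30, 0.7)
```

The same geometry, run in reverse, bounds the DOF the lens must deliver for a given range of viewing distances.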

  1. The OCA CCD Camera Controller

    DTIC Science & Technology

    1996-01-01

    Final Report, December 1996. This is the final report for EOARD contract SPC-93-4007. It contains the following sections: requirements analysis; description of the …; physical implementation of a multi-CCD camera; Appendix 1: controller schematics; Appendix 2: data sheets of the major components; Appendix 3: …

  2. The PS1 Gigapixel Camera

    NASA Astrophysics Data System (ADS)

    Tonry, John L.; Isani, S.; Onaka, P.

    2007-12-01

    The world's largest and most advanced digital camera has been installed on the Pan-STARRS-1 (PS1) telescope on Haleakala, Maui. Built at the University of Hawaii at Manoa's Institute for Astronomy (IfA) in Honolulu, the gigapixel camera will capture images that will be used to scan the skies for killer asteroids, and to create the most comprehensive catalog of stars and galaxies ever produced. The CCD sensors at the heart of the camera were developed in collaboration with Lincoln Laboratory of the Massachusetts Institute of Technology. The image area, which is about 40 cm across, contains 60 identical silicon chips, each of which contains 64 independent imaging circuits. Each of these imaging circuits contains approximately 600 x 600 pixels, for a total of about 1.4 gigapixels in the focal plane. The CCDs themselves employ the innovative technology called "orthogonal transfer." Splitting the image area into about 4,000 separate regions in this way has three advantages: data can be recorded more quickly, saturation of the image by a very bright star is confined to a small region, and any defects in the chips affect only a small part of the image area. The CCD camera is controlled by an ultrafast 480-channel control system developed at the IfA. The individual CCD cells are grouped in 8 x 8 arrays on a single silicon chip called an orthogonal transfer array (OTA), which measures about 5 cm square. There are a total of 60 OTAs in the focal plane of each telescope.

  3. SPEIR: A Ge Compton Camera

    SciTech Connect

    Mihailescu, L; Vetter, K M; Burks, M T; Hull, E L; Craig, W W

    2004-02-11

    The SPEctroscopic Imager for γ-Rays (SPEIR) is a new concept of a compact γ-ray imaging system of high efficiency and spectroscopic resolution with a 4π field-of-view. The system behind this concept employs double-sided segmented planar Ge detectors accompanied by the use of list-mode photon reconstruction methods to create a sensitive, compact Compton scatter camera.

  4. 21 CFR 886.1120 - Opthalmic camera.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Opthalmic camera. 886.1120 Section 886.1120 Food... DEVICES OPHTHALMIC DEVICES Diagnostic Devices § 886.1120 Opthalmic camera. (a) Identification. An ophthalmic camera is an AC-powered device intended to take photographs of the eye and the surrounding...

  5. 16 CFR 501.1 - Camera film.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 16 Commercial Practices 1 2012-01-01 2012-01-01 false Camera film. 501.1 Section 501.1 Commercial... 500 § 501.1 Camera film. Camera film packaged and labeled for retail sale is exempt from the net... should be expressed, provided: (a) The net quantity of contents on packages of movie film and bulk...

  6. 16 CFR 501.1 - Camera film.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 16 Commercial Practices 1 2010-01-01 2010-01-01 false Camera film. 501.1 Section 501.1 Commercial... 500 § 501.1 Camera film. Camera film packaged and labeled for retail sale is exempt from the net... should be expressed, provided: (a) The net quantity of contents on packages of movie film and bulk...

  7. 16 CFR 501.1 - Camera film.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 16 Commercial Practices 1 2014-01-01 2014-01-01 false Camera film. 501.1 Section 501.1 Commercial... 500 § 501.1 Camera film. Camera film packaged and labeled for retail sale is exempt from the net... should be expressed, provided: (a) The net quantity of contents on packages of movie film and bulk...

  8. 16 CFR 501.1 - Camera film.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 16 Commercial Practices 1 2011-01-01 2011-01-01 false Camera film. 501.1 Section 501.1 Commercial... 500 § 501.1 Camera film. Camera film packaged and labeled for retail sale is exempt from the net... should be expressed, provided: (a) The net quantity of contents on packages of movie film and bulk...

  9. 16 CFR 501.1 - Camera film.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 16 Commercial Practices 1 2013-01-01 2013-01-01 false Camera film. 501.1 Section 501.1 Commercial... 500 § 501.1 Camera film. Camera film packaged and labeled for retail sale is exempt from the net... should be expressed, provided: (a) The net quantity of contents on packages of movie film and bulk...

  10. Blind camera fingerprinting and image clustering.

    PubMed

    Bloy, Greg J

    2008-03-01

    Previous studies have shown how to "fingerprint" a digital camera given a set of images known to come from that camera. A clustering technique is proposed to construct such fingerprints from a mixed set of images, enabling identification of each image's source camera without any prior knowledge of the sources.
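
    The fingerprinting idea can be illustrated in miniature: extract a noise residual from each image and correlate residuals, so that images from the same sensor cluster together. The sketch below is a toy stand-in (a box-filter residual instead of the wavelet-based PRNU extraction used in the literature; all names, sizes, and noise levels are invented for the example).

```python
import numpy as np

def noise_residual(img):
    """Crude noise residual: image minus its 3x3 local mean.
    (Published fingerprinting methods use wavelet denoising; a
    box filter is enough to illustrate the principle.)"""
    h, w = img.shape
    padded = np.pad(img, 1, mode='edge')
    smooth = sum(padded[i:i + h, j:j + w]
                 for i in range(3) for j in range(3)) / 9.0
    return img - smooth

def ncc(a, b):
    """Normalised cross-correlation of two residual patterns."""
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(0)
fp_a = rng.normal(0.0, 1.0, (64, 64))   # sensor pattern of camera A
fp_b = rng.normal(0.0, 1.0, (64, 64))   # sensor pattern of camera B
scene = lambda: rng.normal(0.0, 0.2, (64, 64))   # per-shot scene content

img1, img2 = fp_a + scene(), fp_a + scene()      # both from camera A
img3 = fp_b + scene()                            # from camera B

same = ncc(noise_residual(img1), noise_residual(img2))
diff = ncc(noise_residual(img1), noise_residual(img3))
```

Residuals from the same sensor correlate strongly while cross-camera pairs sit near zero, which is exactly the separation a clustering algorithm exploits.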

  11. 21 CFR 886.1120 - Opthalmic camera.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Opthalmic camera. 886.1120 Section 886.1120 Food... DEVICES OPHTHALMIC DEVICES Diagnostic Devices § 886.1120 Opthalmic camera. (a) Identification. An ophthalmic camera is an AC-powered device intended to take photographs of the eye and the surrounding...

  12. Unassisted 3D camera calibration

    NASA Astrophysics Data System (ADS)

    Atanassov, Kalin; Ramachandra, Vikas; Nash, James; Goma, Sergio R.

    2012-03-01

    With the rapid growth of 3D technology, 3D image capture has become a critical part of the 3D feature set on mobile phones. 3D image quality is affected by the scene geometry as well as on-the-device processing. An automatic 3D system usually assumes known camera poses accomplished by factory calibration using a special chart. In real life settings, pose parameters estimated by factory calibration can be negatively impacted by movements of the lens barrel due to shaking, focusing, or camera drop. If any of these factors displaces the optical axes of either or both cameras, vertical disparity might exceed the maximum tolerable margin and the 3D user may experience eye strain or headaches. To make 3D capture more practical, one needs to consider unassisted (on arbitrary scenes) calibration. In this paper, we propose an algorithm that relies on detection and matching of keypoints between left and right images. Frames containing erroneous matches, along with frames with insufficiently rich keypoint constellations, are detected and discarded. Roll, pitch, yaw, and scale differences between left and right frames are then estimated. The algorithm performance is evaluated in terms of the remaining vertical disparity as compared to the maximum tolerable vertical disparity.
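
    Since the quantity being minimised is the residual vertical disparity between matched keypoints, here is a sketch of how that residual can be measured. The helper and the synthetic matches are hypothetical, standing in for the detected keypoint pairs the paper's algorithm would supply.

```python
import numpy as np

def vertical_disparity_stats(pts_left, pts_right):
    """Vertical disparity of matched keypoint pairs.

    pts_* are (N, 2) arrays of (x, y) pixel coordinates. Returns
    the mean and max absolute vertical disparity, the quantity
    that must stay under the viewer-comfort margin.
    """
    dy = pts_right[:, 1] - pts_left[:, 1]
    return float(np.mean(np.abs(dy))), float(np.max(np.abs(dy)))

# Synthetic rig: the right camera is displaced so matched points
# shift 12 px horizontally (normal disparity) and 3 px vertically
left = np.array([[100.0, 50.0], [400.0, 80.0],
                 [250.0, 300.0], [50.0, 220.0]])
right = left + np.array([12.0, 3.0])
mean_dy, max_dy = vertical_disparity_stats(left, right)
```

In a full calibration loop the roll, pitch, yaw, and scale estimates are the parameters adjusted to drive this residual toward zero.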

  13. Coaxial fundus camera for ophthalmology

    NASA Astrophysics Data System (ADS)

    de Matos, Luciana; Castro, Guilherme; Castro Neto, Jarbas C.

    2015-09-01

    A fundus camera for ophthalmology is a high-definition device which must provide low-light illumination of the human retina, high resolution at the retina, and a reflection-free image. Those constraints make its optical design very sophisticated, and the most difficult requirements to comply with are the reflection-free illumination and the final alignment, due to the high number of non-coaxial optical components in the system. Reflections of the illumination, both in the objective and at the cornea, mask image quality, and poor alignment makes the sophisticated optical design useless. In this work we developed a totally coaxial optical system for a non-mydriatic fundus camera. The illumination is performed by an LED ring, coaxial with the optical system and composed of IR and visible LEDs. The illumination ring is projected by the objective lens onto the cornea. The objective, LED illuminator, and CCD lens are coaxial, making the final alignment easy to perform. The CCD + capture lens module is a CCTV camera with built-in autofocus and zoom, added to a 175 mm focal length doublet corrected for infinity, making the system easy to operate and very compact.

  14. Automatic tracking sensor camera system

    NASA Astrophysics Data System (ADS)

    Tsuda, Takao; Kato, Daiichiro; Ishikawa, Akio; Inoue, Seiki

    2001-04-01

    We are developing a sensor camera system for automatically tracking and determining the positions of subjects moving in three dimensions. The system is intended to operate even within areas as large as soccer fields. The system measures the 3D coordinates of the object while driving the pan and tilt movements of the camera heads and the degree of zoom of the lenses. Its principal feature is that it automatically zooms in as the object moves farther away and zooms out as the object moves closer, keeping the object at a fixed apparent size in the image. This makes stable detection by image processing possible. We plan to use the system to detect the position of a soccer ball during a soccer game. In this paper, we describe the configuration of the automatic tracking sensor camera system under development. We then give an analysis of the movements of the ball within images of games, the results of experiments on the image processing method used to detect the ball, and the results of further experiments to verify the accuracy of an experimental system. These results show that the system is sufficiently accurate in terms of obtaining positions in three dimensions.
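
    The zoom law that keeps the object a fixed size in the image falls out of the pinhole model: image size = focal length × object size / distance, so the focal length must scale linearly with distance. A sketch under that assumption; the ball size, pixel pitch, and target size are illustrative numbers, not values from the paper.

```python
def required_focal_length(object_size_m, distance_m, target_px, pixel_pitch_m):
    """Focal length (m) that keeps the object's image a fixed pixel
    size under the pinhole model:
        image_size = f * object_size / distance
    so f must grow linearly with distance."""
    return target_px * pixel_pitch_m * distance_m / object_size_m

# A 0.22 m ball imaged to 50 px on a sensor with 5 um pixel pitch
f_near = required_focal_length(0.22, 20.0, 50, 5e-6)   # ball at 20 m
f_far = required_focal_length(0.22, 40.0, 50, 5e-6)    # ball at 40 m
```

Doubling the distance exactly doubles the required focal length, which is the control law the zoom drive has to follow as the 3D position estimate updates.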

  15. Image quality testing of assembled IR camera modules

    NASA Astrophysics Data System (ADS)

    Winters, Daniel; Erichsen, Patrik

    2013-10-01

    Infrared (IR) camera modules for the LWIR (8-12 µm) that combine IR imaging optics with microbolometer focal plane array (FPA) sensors with readout electronics are becoming more and more of a mass-market product. At the same time, steady improvements in sensor resolution in the higher priced markets raise the requirement for imaging performance of objectives and the proper alignment between objective and FPA. This puts pressure on camera manufacturers and system integrators to assess the image quality of finished camera modules in a cost-efficient and automated way for quality control or during end-of-line testing. In this paper we present recent development work done in the field of image quality testing of IR camera modules. This technology provides a wealth of additional information in contrast to more traditional test methods like minimum resolvable temperature difference (MRTD), which give only a subjective overall test result. Parameters that can be measured are image quality via the modulation transfer function (MTF) for broadband or with various bandpass filters on- and off-axis, and optical parameters such as effective focal length (EFL) and distortion. If the camera module allows for refocusing the optics, additional parameters like best focus plane, image plane tilt, auto-focus quality, chief ray angle etc. can be characterized. Additionally, the homogeneity and response of the sensor with the optics can be characterized in order to calculate the appropriate tables for non-uniformity correction (NUC). The technology can also be used to control active alignment methods during mechanical assembly of optics to high resolution sensors. Other important points that are discussed are the flexibility of the technology to test IR modules with different form factors and electrical interfaces, and last but not least the suitability for fully automated measurements in mass production.
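
    The MTF measurement mentioned above can be illustrated by the textbook route from a line-spread function (LSF) to MTF via a Fourier transform. This is a schematic sketch with a synthetic Gaussian LSF; the test stations described in the paper measure the LSF optically, and the pixel pitch here is an assumed value.

```python
import numpy as np

def mtf_from_lsf(lsf, pixel_pitch_mm):
    """MTF as the normalised Fourier magnitude of a sampled
    line-spread function. Returns (cycles/mm, MTF) arrays."""
    mag = np.abs(np.fft.rfft(lsf))
    mtf = mag / mag[0]
    freqs = np.fft.rfftfreq(len(lsf), d=pixel_pitch_mm)
    return freqs, mtf

# Synthetic Gaussian LSF, sigma = 2 px, on a 17 um (0.017 mm) pitch FPA
x = np.arange(-32, 32)
lsf = np.exp(-x**2 / (2 * 2.0**2))
freqs, mtf = mtf_from_lsf(lsf, 0.017)
```

The MTF starts at 1 at zero frequency and rolls off with spatial frequency; a slanted-edge measurement adds super-resolution sampling of the edge before this same transform step.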

  16. Silicone Contamination Camera Developed for Shuttle Payloads

    NASA Technical Reports Server (NTRS)

    1996-01-01

    On many shuttle missions, silicone contamination from unknown sources from within or external to the shuttle payload bay has been a chronic problem plaguing experiment payloads. There is currently a wide range of silicone usage on the shuttle. Silicones are used to coat the shuttle tiles to enhance their ability to shed rain, and over 100 kg of RTV 560 silicone is used to seal white tiles to the shuttle surfaces. Silicones are also used in electronic components, potting compounds, and thermal control blankets. Efforts to date to identify and eliminate the sources of silicone contamination have not been highly successful and have created much controversy. To identify the sources of silicone contamination on the space shuttle, the NASA Lewis Research Center developed a contamination camera. This specially designed pinhole camera utilizes low-Earth-orbit atomic oxygen to develop a picture that identifies sources of silicone contamination on shuttle-launched payloads. The volatile silicone species travel through the aperture of the pinhole camera, and since volatile silicone species lose their hydrocarbon functionalities under atomic oxygen attack, the silicone adheres to the substrate as SiO_x. This glassy deposit should be spatially arranged in the image of the sources of silicone contamination. To view the contamination image, one can use ultrasensitive thickness measurement techniques, such as scanning variable-angle ellipsometry, to map the surface topography of the camera's substrate. The demonstration of a functional contamination camera would resolve the controversial debate concerning the amount and location of contamination sources, would allow corrective actions to be taken, and would demonstrate a useful tool for contamination documentation on future shuttle payloads, with near negligible effect on cost and weight.

  17. Creep turns linear in narrow ferromagnetic nanostrips

    PubMed Central

    Leliaert, Jonathan; Van de Wiele, Ben; Vansteenkiste, Arne; Laurson, Lasse; Durin, Gianfranco; Dupré, Luc; Van Waeyenberge, Bartel

    2016-01-01

    The motion of domain walls in magnetic materials is a typical example of a creep process, usually characterised by a stretched exponential velocity-force relation. By performing large-scale micromagnetic simulations, and analyzing an extended 1D model which takes the effects of finite temperatures and material defects into account, we show that this creep scaling law breaks down in sufficiently narrow ferromagnetic strips. Our analysis of current-driven transverse domain wall motion in disordered Permalloy nanostrips reveals instead a creep regime with a linear dependence of the domain wall velocity on the applied field or current density. This originates from the essentially point-like nature of domain walls moving in narrow, line-like disordered nanostrips. An analogous linear relation is found also by analyzing existing experimental data on field-driven domain wall motion in perpendicularly magnetised media. PMID:26843125
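
    For reference, the stretched-exponential creep law that this result contrasts with its linear regime is conventionally written as follows. The notation is the standard one from the domain-wall creep literature, not symbols defined in this abstract:

```latex
% Conventional creep law: wall velocity under a driving force f
% (magnetic field or current density) at temperature T,
v \;\propto\; \exp\!\left[-\frac{T_d}{T}\left(\frac{f_c}{f}\right)^{\mu}\right],
% with depinning temperature T_d, critical force f_c, and creep
% exponent \mu (\mu = 1/4 for a 1D elastic interface in a 2D
% medium with random-bond disorder). The regime reported here for
% narrow strips instead follows a linear law,
v \;\propto\; f .
```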

  18. Powerful narrow linewidth random fiber laser

    NASA Astrophysics Data System (ADS)

    Ye, Jun; Xu, Jiangming; Zhang, Hanwei; Zhou, Pu

    2016-11-01

    In this paper, we demonstrate a narrow linewidth random fiber laser, which employs a tunable pump laser to select the operating wavelength for efficiency optimization, a narrow-band fiber Bragg grating (FBG) and a section of single mode fiber to construct a half-open cavity, and a circulator to separate the pump light input from the random lasing output. A spectral linewidth down to 42.31 GHz is achieved through filtering by the FBG. When 8.97 W of pump light centered at the optimized wavelength of 1036.5 nm is launched into the half-open cavity, 1081.4 nm random lasing with a maximum output power of 2.15 W is achieved, which is more powerful than previously reported results.

  19. Creep turns linear in narrow ferromagnetic nanostrips

    NASA Astrophysics Data System (ADS)

    Leliaert, Jonathan; van de Wiele, Ben; Vansteenkiste, Arne; Laurson, Lasse; Durin, Gianfranco; Dupré, Luc; van Waeyenberge, Bartel

    2016-02-01

    The motion of domain walls in magnetic materials is a typical example of a creep process, usually characterised by a stretched exponential velocity-force relation. By performing large-scale micromagnetic simulations, and analyzing an extended 1D model which takes the effects of finite temperatures and material defects into account, we show that this creep scaling law breaks down in sufficiently narrow ferromagnetic strips. Our analysis of current-driven transverse domain wall motion in disordered Permalloy nanostrips reveals instead a creep regime with a linear dependence of the domain wall velocity on the applied field or current density. This originates from the essentially point-like nature of domain walls moving in narrow, line-like disordered nanostrips. An analogous linear relation is found also by analyzing existing experimental data on field-driven domain wall motion in perpendicularly magnetised media.

  20. Exciton absorption in narrow armchair graphene nanoribbons

    NASA Astrophysics Data System (ADS)

    Monozon, B. S.; Schmelcher, P.

    2016-11-01

    We develop an analytical approach to the exciton optical absorption for narrow gap armchair graphene nanoribbons (AGNR). We focus on the regime of dominant size quantization in combination with the attractive electron-hole interaction. An adiabatic separation of slow and fast motions leads via the two-body Dirac equation to the isolated and coupled subband approximations. Discrete and continuous exciton states are in general coupled and form quasi-Rydberg series of purely discrete and resonance type character. The corresponding oscillator strengths and widths are derived. We show that the exciton peaks are blue-shifted, become broader and increase in magnitude upon narrowing the ribbon. At the edge of a subband the singularity related to the 1D density of states is transformed into finite absorption via the presence of the exciton. Our analytical results are in good agreement with those obtained by other methods including numerical approaches. Estimates of the expected experimental values are provided for realistic AGNR.

  1. Creep turns linear in narrow ferromagnetic nanostrips.

    PubMed

    Leliaert, Jonathan; Van de Wiele, Ben; Vansteenkiste, Arne; Laurson, Lasse; Durin, Gianfranco; Dupré, Luc; Van Waeyenberge, Bartel

    2016-02-04

    The motion of domain walls in magnetic materials is a typical example of a creep process, usually characterised by a stretched exponential velocity-force relation. By performing large-scale micromagnetic simulations, and analyzing an extended 1D model which takes the effects of finite temperatures and material defects into account, we show that this creep scaling law breaks down in sufficiently narrow ferromagnetic strips. Our analysis of current-driven transverse domain wall motion in disordered Permalloy nanostrips reveals instead a creep regime with a linear dependence of the domain wall velocity on the applied field or current density. This originates from the essentially point-like nature of domain walls moving in narrow, line-like disordered nanostrips. An analogous linear relation is found also by analyzing existing experimental data on field-driven domain wall motion in perpendicularly magnetised media.

  2. Powerful narrow linewidth random fiber laser

    NASA Astrophysics Data System (ADS)

    Ye, Jun; Xu, Jiangming; Zhang, Hanwei; Zhou, Pu

    2017-03-01

    In this paper, we demonstrate a narrow linewidth random fiber laser, which employs a tunable pump laser to select the operating wavelength for efficiency optimization, a narrow-band fiber Bragg grating (FBG) and a section of single mode fiber to construct a half-open cavity, and a circulator to separate the pump light input from the random lasing output. A spectral linewidth down to 42.31 GHz is achieved through filtering by the FBG. When 8.97 W of pump light centered at the optimized wavelength of 1036.5 nm is launched into the half-open cavity, 1081.4 nm random lasing with a maximum output power of 2.15 W is achieved, which is more powerful than previously reported results.

  3. Bio-inspired hemispherical compound eye camera

    NASA Astrophysics Data System (ADS)

    Xiao, Jianliang; Song, Young Min; Xie, Yizhu; Malyarchuk, Viktor; Jung, Inhwa; Choi, Ki-Joong; Liu, Zhuangjian; Park, Hyunsung; Lu, Chaofeng; Kim, Rak-Hwan; Li, Rui; Crozier, Kenneth B.; Huang, Yonggang; Rogers, John A.

    2014-03-01

    Compound eyes in arthropods demonstrate distinct imaging characteristics from human eyes, with wide angle field of view, low aberrations, high acuity to motion and infinite depth of field. Artificial imaging systems with similar geometries and properties are of great interest for many applications. However, the challenges in building such systems with hemispherical, compound apposition layouts cannot be met through established planar sensor technologies and conventional optics. We present our recent progress in combining optics, materials, mechanics and integration schemes to build fully functional artificial compound eye cameras. Nearly full hemispherical shapes (about 160 degrees) with densely packed artificial ommatidia were realized. The number of ommatidia (180) is comparable to those of the eyes of fire ants and bark beetles. The devices combine elastomeric compound optical elements with deformable arrays of thin silicon photodetectors, which were fabricated in the planar geometries and then integrated and elastically transformed to hemispherical shapes. Imaging results and quantitative ray-tracing-based simulations illustrate key features of operation. These general strategies seem to be applicable to other compound eye devices, such as those inspired by moths and lacewings (refracting superposition eyes), lobster and shrimp (reflecting superposition eyes), and houseflies (neural superposition eyes).

  4. Cooling the dark energy camera instrument

    SciTech Connect

    Schmitt, R.L.; Cease, H.; DePoy, D.; Diehl, H.T.; Estrada, J.; Flaugher, B.; Kuhlmann, S.; Onal, Birce; Stefanik, A.; /Fermilab

    2008-06-01

    DECam, the camera for the Dark Energy Survey (DES), is undergoing general design and component testing. For an overview see DePoy, et al. in these proceedings. For a description of the imager, see Cease, et al. in these proceedings. The CCD instrument will be mounted at the prime focus of the CTIO Blanco 4m telescope. The instrument temperature will be 173 K with a heat load of 113 W. In similar applications, cooling CCD instruments at the prime focus has been accomplished by three general methods. Liquid nitrogen reservoirs have been constructed to operate in any orientation, pulse tube cryocoolers have been used when tilt angles are limited, and Joule-Thompson or Stirling cryocoolers have been used with smaller heat loads. Gifford-McMahon cooling has been used at the Cassegrain focus but not at the prime focus. For DES, the combined requirements of high heat load, temperature stability, low vibration, operation in any orientation, liquid nitrogen cost and limited space available led to the design of a pumped, closed-loop, circulating nitrogen system. At zenith the instrument will be twelve meters above the pump/cryocooler station. This cooling system is expected to have a 10,000 hour maintenance interval. This paper will describe the engineering basis including the thermal model, unbalanced forces, cooldown time, and the single- and two-phase flow models.
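
    As a rough sanity check on the stated heat load, the circulation rate for a single-phase liquid loop follows from Q = ṁ·cp·ΔT. The numbers below (cp ≈ 2.04 kJ/(kg·K) for liquid nitrogen near 77 K, and a 2 K allowed temperature rise) are illustrative assumptions, not values from the DECam design.

```python
def coolant_mass_flow(heat_load_w, cp_j_per_kg_k, delta_t_k):
    """Mass flow (kg/s) that absorbs a heat load Q with a given
    temperature rise in a single-phase coolant loop:
        Q = m_dot * cp * delta_T
    """
    return heat_load_w / (cp_j_per_kg_k * delta_t_k)

# 113 W instrument load, liquid nitrogen cp ~ 2040 J/(kg K), 2 K rise
m_dot = coolant_mass_flow(113.0, 2040.0, 2.0)
```

A few tens of grams per second is a modest flow, consistent with pumping subcooled nitrogen twelve meters up to the prime focus rather than relying on boil-off.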

  5. Speckle Camera Imaging of the Planet Pluto

    NASA Astrophysics Data System (ADS)

    Howell, Steve B.; Horch, Elliott P.; Everett, Mark E.; Ciardi, David R.

    2012-10-01

    We have obtained optical wavelength (692 nm and 880 nm) speckle imaging of the planet Pluto and its largest moon Charon. Using our DSSI speckle camera attached to the Gemini North 8 m telescope, we collected high resolution imaging with an angular resolution of ∼20 mas, a value at the Gemini-N telescope diffraction limit. We have produced for this binary system the first speckle reconstructed images, from which we can measure not only the orbital separation and position angle for Charon, but also the diameters of the two bodies. Our measurements of these parameters agree, within the uncertainties, with the current best values for Pluto and Charon. The Gemini-N speckle observations of Pluto are presented to illustrate the capabilities of our instrument and the robust production of high accuracy, high spatial resolution reconstructed images. We hope our results will suggest additional applications of high resolution speckle imaging for other objects within our solar system and beyond. Based on observations obtained at the Gemini Observatory, which is operated by the Association of Universities for Research in Astronomy, Inc., under a cooperative agreement with the National Science Foundation on behalf of the Gemini partnership: the National Science Foundation (United States), the Science and Technology Facilities Council (United Kingdom), the National Research Council (Canada), CONICYT (Chile), the Australian Research Council (Australia), Ministério da Ciência, Tecnologia e Inovação (Brazil) and Ministerio de Ciencia, Tecnología e Innovación Productiva (Argentina).
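
    Speckle reconstruction builds on the fact that averaging the power spectra of many short exposures preserves diffraction-limited information that a long exposure smears out: per-frame atmospheric tilt shifts the image, but a shift changes only the Fourier phase, not the modulus. A toy demonstration of that principle (a synthetic binary with random per-frame image motion; this is not the DSSI pipeline, and all numbers are invented):

```python
import numpy as np

def mean_power_spectrum(frames):
    """Average power spectrum over many short exposures.

    Averaging |FFT|^2 discards the random per-frame phase and
    keeps diffraction-scale structure such as the fringe pattern
    of a binary star (the classic Labeyrie speckle method).
    """
    return np.mean([np.abs(np.fft.fft2(f)) ** 2 for f in frames], axis=0)

rng = np.random.default_rng(1)
n = 64
frames = []
for _ in range(200):
    img = np.zeros((n, n))
    dy, dx = rng.integers(-5, 6, size=2)     # random per-frame image motion
    img[(20 + dy) % n, (20 + dx) % n] = 1.0  # primary
    img[(20 + dy) % n, (28 + dx) % n] = 0.5  # secondary, 8 px separation
    frames.append(img)

ps = mean_power_spectrum(frames)
```

The fringe period in the averaged power spectrum encodes the 8-pixel separation and the flux ratio regardless of the frame-to-frame motion; image reconstruction then recovers the missing phase, e.g. via bispectrum methods.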

  6. On-Line Point Positioning with Single Frame Camera Data

    DTIC Science & Technology

    1992-03-15

    AD-A248 307. On-line point positioning with single frame camera data. Kersten; K. R. Holm; A. Gruen (1992), Institute of … 2. Introduction: Aerotriangulation is an efficient

  7. Multiwatts narrow linewidth fiber Raman amplifiers.

    PubMed

    Feng, Yan; Taylor, Luke; Bonaccini Calia, Domenico

    2008-07-21

    An 1178 nm laser with up to 4.8 W of output power and a linewidth of approximately 10 MHz is obtained by Raman amplification of a distributed-feedback diode laser in standard single-mode fibers pumped by an 1120 nm Yb fiber laser. More than 10% efficiency and 27 dB of amplification are achieved, limited by the onset of stimulated Brillouin scattering. The ratio of the Raman to Brillouin gain coefficients of a fiber is identified as a figure of merit for building a narrow linewidth fiber Raman amplifier.

  8. Acoustical Imaging Cameras for the Inspection and Condition Assessment of Hydraulic Structures

    DTIC Science & Technology

    2010-08-01

    feasibility of using acoustical imaging for underwater inspection of structures. INTRODUCTION: Visibility in clear water for the human eye and optical ...but higher resolution than sidescan or multibeam acoustical images • Nonhomogeneity of returned signal caused by variation in angles of signals...acoustical imaging. To obtain higher resolutions than other acoustical imaging technologies such as multibeam and sidescan systems, acoustical camera

  9. Future Planetary Surface Imager Development by the Beagle 2 Stereo Camera System Team

    NASA Astrophysics Data System (ADS)

    Griffiths, A. D.; Coates, A. J.; Josset, J.-L.; Paar, G.

    2004-03-01

    The Stereo Camera System provided Beagle 2 with wide-angle multi-spectral stereo imaging (IFOV=0.043°). The SCS team plans to build on this design heritage to provide improved stereo capabilities to the Pasteur payload of the Aurora ExoMars rover.

  10. Ortho-Rectification of Narrow Band Multi-Spectral Imagery Assisted by Dslr RGB Imagery Acquired by a Fixed-Wing Uas

    NASA Astrophysics Data System (ADS)

    Rau, J.-Y.; Jhan, J.-P.; Huang, C.-Y.

    2015-08-01

    The Miniature Multiple Camera Array (MiniMCA-12) is a frame-based multilens/multispectral sensor composed of 12 lenses with narrow-band filters. Due to its small size and light weight, it is suitable for mounting on an Unmanned Aerial System (UAS) to acquire imagery of high spectral, spatial and temporal resolution for various remote sensing applications. However, because each band covers a wavelength range of only 10 nm, the images have low resolution and signal-to-noise ratio, which makes them unsuitable for image matching and digital surface model (DSM) generation. At the same time, the spectral correlation among the 12 bands of MiniMCA images is low, so it is difficult to perform tie-point matching and aerial triangulation across all bands at once. In this study, we therefore propose the use of a DSLR camera to assist automatic aerial triangulation of MiniMCA-12 imagery and to produce a higher-spatial-resolution DSM for MiniMCA-12 ortho-image generation. Depending on the maximum payload weight of the UAS used, the two sensors can be flown together or individually. In this study, we adopt a fixed-wing UAS carrying a Canon EOS 5D Mark II DSLR camera and a MiniMCA-12 multispectral camera. To perform automatic aerial triangulation between the DSLR camera and the MiniMCA-12, we choose one master band from the MiniMCA-12 whose spectral range overlaps that of the DSLR camera. However, because the 12 lenses of the MiniMCA-12 have different perspective centers and viewing angles, the original 12 channels exhibit a significant band misregistration effect. Thus, the first issue encountered is reducing this band misregistration. Since all 12 MiniMCA lenses are frame-based, their spatial offsets are smaller than 15 cm and the images overlap by almost 98%; we thus propose a modified projective transformation (MPT) method, together with two systematic error correction procedures, to register all 12 bands of imagery in the same image space.
It means that those 12 bands of images acquired at
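    The registration step lends itself to a short sketch. Assuming a known 3×3 homography between a slave band and the master band (the paper's MPT and its systematic-error corrections are not reproduced here; the matrix and test image below are hypothetical), a minimal inverse warp could look like:

```python
import numpy as np

def warp_projective(img, H, out_shape):
    """Inverse-warp img with a 3x3 homography H (output->input mapping),
    nearest-neighbor sampling; pixels mapping outside img become 0."""
    h, w = out_shape
    ys, xs = np.mgrid[0:h, 0:w]
    pts = np.stack([xs, ys, np.ones_like(xs)]).reshape(3, -1).astype(float)
    src = H @ pts
    src /= src[2]                       # back to inhomogeneous coordinates
    sx = np.round(src[0]).astype(int)
    sy = np.round(src[1]).astype(int)
    valid = (sx >= 0) & (sx < img.shape[1]) & (sy >= 0) & (sy < img.shape[0])
    out = np.zeros(out_shape, dtype=img.dtype)
    out.flat[np.flatnonzero(valid)] = img[sy[valid], sx[valid]]
    return out

# Hypothetical example: a pure 2-pixel translation expressed as a homography.
band = np.arange(25).reshape(5, 5)
H = np.array([[1.0, 0.0, 2.0],          # x_src = x_out + 2
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])
registered = warp_projective(band, H, band.shape)
```

A real MPT pipeline would estimate one such transform per lens from matched features and then refine it with the systematic corrections the paper describes.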

  11. Reductions in Injury Crashes Associated With Red Light Camera Enforcement in Oxnard, California

    PubMed Central

    Retting, Richard A.; Kyrychenko, Sergey Y.

    2002-01-01

    Objectives. This study estimated the impact of red light camera enforcement on motor vehicle crashes in one of the first US communities to employ such cameras—Oxnard, California. Methods. Crash data were analyzed for Oxnard and for 3 comparison cities. Changes in crash frequencies were compared for Oxnard and control cities and for signalized and nonsignalized intersections by means of a generalized linear regression model. Results. Overall, crashes at signalized intersections throughout Oxnard were reduced by 7% and injury crashes were reduced by 29%. Right-angle crashes, those most associated with red light violations, were reduced by 32%; right-angle crashes involving injuries were reduced by 68%. Conclusions. Because red light cameras can be a permanent component of the transportation infrastructure, crash reductions attributed to camera enforcement should be sustainable. (Am J Public Health. 2002;92:1822–1825) PMID:12406815
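    The before/after comparison against control cities can be illustrated with a toy calculation; the counts below are hypothetical, not the Oxnard data, and the study itself used a generalized linear regression model rather than this simple ratio-of-ratios:

```python
def ratio_of_ratios(treat_before, treat_after, ctrl_before, ctrl_after):
    """Change in the treatment city's crash count relative to the change
    in the comparison cities; values < 1 suggest a reduction associated
    with the intervention."""
    return (treat_after / treat_before) / (ctrl_after / ctrl_before)

# Hypothetical counts: treatment city drops 100 -> 70 while the
# comparison cities stay flat at 200.
rr = ratio_of_ratios(100, 70, 200, 200)
reduction_pct = (1 - rr) * 100
```

With these made-up numbers the estimate would be a 30% reduction; the GLM approach additionally adjusts for trends and exposure.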

  12. Spinal canal narrowing during simulated frontal impact.

    PubMed

    Ivancic, Paul C; Panjabi, Manohar M; Tominaga, Yasuhiro; Pearson, Adam M; Elena Gimenez, S; Maak, Travis G

    2006-06-01

    Between 23 and 70% of occupants involved in frontal impacts sustain cervical spine injuries, many with neurological involvement. It has been hypothesized that cervical spinal cord compression and injury may explain the variable neurological profile described by frontal impact victims. The goals of the present study, using a biofidelic whole cervical spine model with muscle force replication, were to quantify canal pinch diameter (CPD) narrowing during frontal impact and to evaluate the potential for cord compression. The biofidelic model and a sled apparatus were used to simulate frontal impacts at 4, 6, 8, and 10 g horizontal accelerations of the T1 vertebra. The CPD was measured in the intact specimen in the neutral posture (neutral posture CPD), under static sagittal pure moments of 1.5 Nm (pre-impact CPD), during dynamic frontal impact (dynamic impact CPD), and again under static pure moments following each impact (post-impact CPD). Frontal impact caused significant (P<0.05) dynamic CPD narrowing at C0-dens, C2-C3, and C6-C7. The narrowest dynamic CPD was observed at C0-dens during the 10 g impact and was 25.9% narrower than the corresponding neutral posture CPD. Interpretation of the present results indicates that the neurological symptomatology reported by frontal impact victims is most likely not due to cervical spinal cord compression. Cord compression due to residual spinal instability is also not likely.

  13. Pinhole Camera For Viewing Electron Beam Materials Processing

    NASA Astrophysics Data System (ADS)

    Rushford, M. C.; Kuzmenko, P. J.

    1986-10-01

    surface intensity variation is contrast-reduced if the observation wavelength is a narrow band as far into the red as high-intensity blooming will allow an observable picture. A three-eyepiece camera allows an image plane where photo-gray glass functions as a neutral density filter only over the high-intensity portion of the image, thus reducing blooming. The system is enclosed in a water-cooled housing that can dissipate 15 W/cm², keeping the camera below 40 degrees Celsius. Single frames of video output are acquired for feature enhancement and location by a Data Translation DT2803 image-processing board housed in an IBM PC.

  14. Spectrometry with consumer-quality CMOS cameras.

    PubMed

    Scheeline, Alexander

    2015-01-01

    Many modern spectrometric instruments use diode arrays, charge-coupled arrays, or CMOS cameras for detection and measurement. As portable or point-of-use instruments are desirable, one would expect that instruments using the cameras in cellular telephones and tablet computers would be the basis of numerous instruments. However, no mass market for such devices has yet developed. The difficulties in using megapixel CMOS cameras for scientific measurements are discussed, and promising avenues for instrument development reviewed. Inexpensive alternatives to use of the built-in camera are also mentioned, as the long-term question is whether it is better to overcome the constraints of CMOS cameras or to bypass them.

  15. Passive Millimeter Wave Camera (PMMWC) at TRW

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Engineers at TRW, Redondo Beach, California, inspect the Passive Millimeter Wave Camera, a weather-piercing camera designed to see through fog, clouds, smoke and dust. Operating in the millimeter wave portion of the electromagnetic spectrum, the camera creates visual-like video images of objects, people, runways, obstacles and the horizon. A demonstration camera (shown in photo) has been completed and is scheduled for checkout tests and flight demonstration. Engineer (left) holds a compact, lightweight circuit board containing 40 complete radiometers, including antenna, monolithic millimeter wave integrated circuit (MMIC) receivers and signal processing and readout electronics that forms the basis for the camera's 1040-element focal plane array.

  17. Ultra-narrow laser linewidth measurement

    NASA Astrophysics Data System (ADS)

    Chen, Xiaopei

    In this report, we present a deeper investigation of the loss-compensated recirculating delayed self-heterodyne interferometer (LC-RDSHI) for ultra-narrow linewidth measurement, including theoretical analysis, experimental implementation, further modifications of the system and additional applications. Fiber lasers with linewidths below 1 kHz have recently been commercialized, but even their manufacturers face a challenge in accurately measuring the linewidth of such lasers, so more accurate methods are needed to characterize ultra-narrow laser linewidths and frequency noise. Compared with other currently available linewidth measurement techniques, the LC-RDSHI technique is the most promising. It overcomes the bottleneck of the high resolution required of a delayed self-heterodyne interferometer (DSHI) by using a short fiber delay line. The method does not need a narrower, more stable reference laser, which is a necessary component in heterodyne detection, and the laser spectral lineshape can be observed directly instead of requiring the complicated interpretation of frequency-discriminator techniques. The theoretical analysis of the LC-RDSHI guides the choice of optimal system parameters and assists in interpreting the recorded spectral lineshape. Laser linewidths as narrow as 700 Hz have been shown to be measurable with the LC-RDSHI method. Non-linear curve fitting of a Voigt lineshape to separate the Lorentzian and Gaussian components was investigated; the Voigt fitting results give a clear view of laser frequency noise and the nature of the laser linewidth. It is also shown that, for an ultra-narrow linewidth laser, simply taking the beat-spectrum width 20 dB down from its maximum and dividing by 2√99 overestimates the laser linewidth and coherence length. Besides laser linewidth measurement in the frequency domain, we also implemented time
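    The naive rule the abstract critiques applies to a purely Lorentzian line: the self-heterodyne beat spectrum is then Lorentzian with twice the laser FWHM, and its full width 20 dB below the peak is √99 times its own FWHM. A minimal sketch (the 139.3 kHz input is a hypothetical reading, not a value from the report):

```python
import math

def lorentzian_fwhm_from_20db(beat_width_20db_hz):
    """For a Lorentzian laser line, the DSHI beat spectrum has FWHM equal
    to twice the laser linewidth, and its full width at -20 dB is
    sqrt(99) times that FWHM; hence linewidth = W20 / (2*sqrt(99))."""
    return beat_width_20db_hz / (2 * math.sqrt(99))

# Hypothetical example: a 139.3 kHz beat width at -20 dB implies a
# linewidth of roughly 7 kHz under the Lorentzian assumption.
lw = lorentzian_fwhm_from_20db(139.3e3)
```

For ultra-narrow lasers the Gaussian (jitter) component dominates the wings, which is why this shortcut overestimates the true linewidth.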

  18. Mini gamma camera, camera system and method of use

    DOEpatents

    Majewski, Stanislaw; Weisenberger, Andrew G.; Wojcik, Randolph F.

    2001-01-01

    A gamma camera comprising, essentially and in order from the front outer or gamma-ray-impinging surface: 1) a collimator, 2) a scintillator layer, 3) a light guide, 4) an array of position-sensitive, high-resolution photomultiplier tubes, and 5) printed circuitry for receipt of the output of the photomultipliers. Also described is a system wherein the output supplied by the high-resolution, position-sensitive photomultiplier tubes is communicated to: a) a digitizer and b) a computer, where it is processed using advanced image processing techniques and a specific algorithm to calculate the center of gravity of any abnormality observed during imaging, and c) optional image display and telecommunications ports.
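    The center-of-gravity step can be sketched in a few lines as an intensity-weighted centroid; the 4×4 frame below is a hypothetical stand-in for camera output, not data from the patent:

```python
def center_of_gravity(image):
    """Intensity-weighted centroid (row, col) of a 2-D list of counts,
    a minimal stand-in for the patent's center-of-gravity calculation."""
    total = wr = wc = 0.0
    for r, row in enumerate(image):
        for c, v in enumerate(row):
            total += v
            wr += r * v
            wc += c * v
    return wr / total, wc / total

# Hypothetical 4x4 frame with most counts concentrated at (1, 2).
frame = [[0, 0, 0, 0],
         [0, 0, 9, 0],
         [0, 0, 1, 0],
         [0, 0, 0, 0]]
centroid = center_of_gravity(frame)
```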

  19. Phenology cameras observing boreal ecosystems of Finland

    NASA Astrophysics Data System (ADS)

    Peltoniemi, Mikko; Böttcher, Kristin; Aurela, Mika; Kolari, Pasi; Tanis, Cemal Melih; Linkosalmi, Maiju; Loehr, John; Metsämäki, Sari; Nadir Arslan, Ali

    2016-04-01

    Cameras have become useful tools for monitoring the seasonality of ecosystems. Low-cost cameras facilitate validation of other measurements and allow key ecological features and moments to be extracted from image time series. We installed a network of phenology cameras at selected ecosystem research sites in Finland, with cameras mounted above, at, and/or below the canopies. The current network hosts cameras taking time-lapse images in coniferous and deciduous forests as well as at open wetlands, thus offering possibilities to monitor various phenological and time-associated events and elements. In this poster, we present our camera network and give examples of the use of image series for research. We show results on the stability of camera-derived color signals and, based on these, discuss the applicability of cameras for monitoring time-dependent phenomena. We also present comparisons between camera-derived color signal time series and daily satellite-derived time series (NDVI, NDWI, and fractional snow cover) from the Moderate Resolution Imaging Spectroradiometer (MODIS) at selected spruce and pine forests and in a wetland. We discuss the applicability of cameras in supporting satellite-derived phenological observations, considering the ability of cameras to monitor both above- and below-canopy phenology and snow.
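    The abstract does not specify which color signals are used; a common choice in phenology-camera networks is the green chromatic coordinate, shown here purely as an assumed example:

```python
def green_chromatic_coordinate(r, g, b):
    """Green chromatic coordinate, a widely used camera 'greenness'
    signal: gcc = G / (R + G + B). Assumed example only; the poster's
    actual color signals are not specified in the abstract."""
    return g / (r + g + b)

# Hypothetical mean RGB digital numbers for a canopy region of interest.
gcc = green_chromatic_coordinate(80, 120, 40)
```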

  20. Characterization of the Series 1000 Camera System

    SciTech Connect

    Kimbrough, J; Moody, J; Bell, P; Landen, O

    2004-04-07

    The National Ignition Facility requires a compact, network-addressable, scientific-grade CCD camera for use in diagnostics ranging from streak cameras to gated x-ray imaging cameras. Due to the limited space inside the diagnostic, an analog and digital input/output option in the camera controller permits control of both the camera and the diagnostic over a single Ethernet link. The system consists of a Spectral Instruments Series 1000 camera, a PC104+ controller, and a power supply. The 4k by 4k CCD camera has a dynamic range of 70 dB with less than 14 electrons of read noise at a 1 MHz readout rate. The PC104+ controller includes 16 analog inputs, 4 analog outputs and 16 digital input/output lines for interfacing to diagnostic instrumentation. A description of the system and its performance characterization is reported.
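    The quoted figures are linked by the usual definition of CCD dynamic range, DR(dB) = 20·log10(full-well / read noise); a quick check of what 70 dB with a 14-electron read noise would imply for the full-well capacity (an inference, not a number stated in the abstract):

```python
import math

def dynamic_range_db(full_well_e, read_noise_e):
    """CCD dynamic range in dB: 20 * log10(full-well / read noise)."""
    return 20 * math.log10(full_well_e / read_noise_e)

# 70 dB with 14 e- read noise implies a full well of
# 14 * 10**(70/20), i.e. roughly 44,000 electrons.
full_well = 14 * 10 ** (70 / 20)
dr = dynamic_range_db(full_well, 14)
```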

  1. Automatic calibration method for plenoptic camera

    NASA Astrophysics Data System (ADS)

    Luan, Yinsen; He, Xing; Xu, Bing; Yang, Ping; Tang, Guomao

    2016-04-01

    An automatic calibration method is proposed for a microlens-based plenoptic camera. First, all microlens images on the white image are searched and recognized automatically based on digital morphology. Then, the center points of the microlens images are rearranged according to their relative positions. Consequently, the microlens images are located, i.e., the plenoptic camera is calibrated, without prior knowledge of the camera parameters. Furthermore, this method is appropriate for all types of microlens-based plenoptic cameras, including the multifocus plenoptic camera, the plenoptic camera with arbitrarily arranged microlenses, and the plenoptic camera with different sizes of microlenses. Finally, we verify our method on raw data from a Lytro camera. The experiments show that our method offers a higher degree of automation than previously published methods.
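    The paper's morphology-based recognition is not reproduced here, but the idea of locating microlens-image centers on a white image can be sketched as thresholding followed by connected-component centroids; the 5×8 "white image" below is a hypothetical toy input:

```python
def microlens_centres(white, threshold):
    """Threshold a white image and take the centroid of each bright
    4-connected component as a microlens-image centre (a minimal
    stand-in for the paper's morphology-based recognition)."""
    h, w = len(white), len(white[0])
    seen = [[False] * w for _ in range(h)]
    centres = []
    for r0 in range(h):
        for c0 in range(w):
            if white[r0][c0] <= threshold or seen[r0][c0]:
                continue
            stack, pix = [(r0, c0)], []
            seen[r0][c0] = True
            while stack:                      # flood fill one component
                r, c = stack.pop()
                pix.append((r, c))
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < h and 0 <= cc < w and not seen[rr][cc] \
                            and white[rr][cc] > threshold:
                        seen[rr][cc] = True
                        stack.append((rr, cc))
            centres.append((sum(p[0] for p in pix) / len(pix),
                            sum(p[1] for p in pix) / len(pix)))
    return centres

# Hypothetical toy white image with two bright 2x2 "microlens" spots.
white = [[0] * 8 for _ in range(5)]
for r in (1, 2):
    for c in (1, 2, 5, 6):
        white[r][c] = 9
centres = microlens_centres(white, 4)
```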

  2. Research on evaluation method of CMOS camera

    NASA Astrophysics Data System (ADS)

    Zhang, Shaoqiang; Han, Weiqiang; Cui, Lanfang

    2014-09-01

    In some professional imaging applications, we need to test key parameters of a CMOS camera and evaluate the performance of the device. Aiming at this requirement, this paper proposes a complete test method for evaluating CMOS cameras. Considering that CMOS cameras have large fixed-pattern noise, the method applies a per-pixel `photon transfer curve method' to measure the gain and the read noise of the camera; its advantage is that it effectively eliminates the error introduced by response nonlinearity. The cause of the photoelectric response nonlinearity of CMOS cameras is then analyzed theoretically, and a formula for calculating CMOS camera response nonlinearity is derived. Finally, we use the proposed method to test a CMOS camera of 2560 × 2048 pixels and analyze the validity and feasibility of the method.
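    The per-pixel photon transfer curve idea can be sketched as follows: for a shot-noise-limited signal, the variance measured at one pixel over repeated frames grows linearly with the mean, var = mean/K + σ_read², so a line fit per pixel yields the gain K and read noise without fixed-pattern-noise contamination. A sketch with hypothetical, noise-free data:

```python
def ptc_gain_and_read_noise(means_dn, vars_dn2):
    """Per-pixel photon transfer curve: fit var[DN^2] = mean[DN]/K +
    sigma_read^2 by least squares; returns (gain K in e-/DN,
    read noise in DN)."""
    n = len(means_dn)
    mx = sum(means_dn) / n
    my = sum(vars_dn2) / n
    sxx = sum((x - mx) ** 2 for x in means_dn)
    sxy = sum((x - mx) * (y - my) for x, y in zip(means_dn, vars_dn2))
    slope = sxy / sxx
    intercept = my - slope * mx
    return 1.0 / slope, intercept ** 0.5

# Hypothetical pixel with K = 2 e-/DN and 3 DN read noise (9 DN^2 offset).
means = [100, 400, 900, 1600]
varis = [m / 2.0 + 9.0 for m in means]
gain, read_noise = ptc_gain_and_read_noise(means, varis)
```

In practice the mean and variance at each illumination level are computed from pairs or stacks of frames, and the fit is repeated for every pixel.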

  3. Automatic Kappa Angle Estimation for Air Photos Based on Phase Only Correlation

    NASA Astrophysics Data System (ADS)

    Xiong, Z.; Stanley, D.; Xin, Y.

    2016-06-01

    Approximate values of the exterior orientation parameters are needed for air photo bundle adjustment. Usually the airborne GPS/IMU can provide initial values for the camera position and attitude angles. However, in some cases the camera's attitude angles are not available, due to the lack of an IMU or for other reasons, and the kappa angle then needs to be estimated for each photo before bundle adjustment. The kappa angle can be obtained from Ground Control Points (GCPs) in the photo; unfortunately, enough GCPs are not always available. To overcome this problem, an algorithm is developed to automatically estimate the kappa angle for air photos based on the phase only correlation technique. This function has been embedded in PCI software. Extensive experiments show that the algorithm is fast, reliable, and stable.
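    Phase-only correlation itself is easy to sketch: normalize the cross-power spectrum to unit magnitude and inverse-transform, and the correlation peak gives the translation between two images. The paper builds kappa estimation on such matching, which is not reproduced here; the images below are hypothetical random data:

```python
import numpy as np

def poc_shift(a, b):
    """Phase-only correlation: the peak of IFFT(conj(FA)*FB / |.|) is the
    integer translation taking a to b (here b = np.roll(a, shift))."""
    cross = np.conj(np.fft.fft2(a)) * np.fft.fft2(b)
    cross /= np.abs(cross) + 1e-12      # keep phase, discard magnitude
    corr = np.fft.ifft2(cross).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Peaks past the midpoint correspond to negative (wrapped) shifts.
    return tuple(p - s if p > s // 2 else p for p, s in zip(peak, corr.shape))

# Hypothetical pair: b is a circularly shifted copy of a by (3, 5).
rng = np.random.default_rng(0)
a = rng.random((32, 32))
b = np.roll(a, shift=(3, 5), axis=(0, 1))
shift = poc_shift(a, b)
```

For rotation (kappa), the same machinery is typically applied after a log-polar resampling so that rotation appears as a translation.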

  4. Contact Angle Measurements Using a Simplified Experimental Setup

    ERIC Educational Resources Information Center

    Lamour, Guillaume; Hamraoui, Ahmed; Buvailo, Andrii; Xing, Yangjun; Keuleyan, Sean; Prakash, Vivek; Eftekhari-Bafrooei, Ali; Borguet, Eric

    2010-01-01

    A basic and affordable experimental apparatus is described that measures the static contact angle of a liquid drop in contact with a solid. The image of the drop is made with a simple digital camera by taking a picture that is magnified by an optical lens. The profile of the drop is then processed with ImageJ free software. The ImageJ contact…

  5. Casting and Angling.

    ERIC Educational Resources Information Center

    Smith, Julian W.

    As part of a series of books and pamphlets on outdoor education, this manual consists of easy-to-follow instructions for fishing activities dealing with casting and angling. The manual may be used as a part of the regular physical education program in schools and colleges or as a club activity for the accomplished weekend fisherman or the…

  6. Casting and Angling.

    ERIC Educational Resources Information Center

    Little, Mildred J.; Bunting, Camille

    The self-contained packet contains background information, lesson plans, 15 transparency and student handout masters, drills and games, 2 objective examinations, and references for teaching a 15-day unit on casting and angling to junior high and senior high school students, either as part of a regular physical education program or as a club…

  7. Rover mast calibration, exact camera pointing, and camera handoff for visual target tracking

    NASA Technical Reports Server (NTRS)

    Kim, Won S.; Ansar, Adnan I.; Steele, Robert D.

    2005-01-01

    This paper presents three technical elements that we have developed to improve the accuracy of the visual target tracking for single-sol approach-and-instrument placement in future Mars rover missions. An accurate, straightforward method of rover mast calibration is achieved by using a total station, a camera calibration target, and four prism targets mounted on the rover. The method was applied to Rocky8 rover mast calibration and yielded a 1.1-pixel rms residual error. Camera pointing requires inverse kinematic solutions for mast pan and tilt angles such that the target image appears right at the center of the camera image. Two issues were raised. Mast camera frames are in general not parallel to the masthead base frame. Further, the optical axis of the camera model in general does not pass through the center of the image. Despite these issues, we managed to derive non-iterative closed-form exact solutions, which were verified with Matlab routines. Actual camera pointing experiments over 50 random target image points yielded less than 1.3-pixel rms pointing error. Finally, a purely geometric method for camera handoff using stereo views of the target has been developed. Experimental test runs show less than 2.5 pixels error on high-resolution Navcam for Pancam-to-Navcam handoff, and less than 4 pixels error on lower-resolution Hazcam for Navcam-to-Hazcam handoff.
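    For the idealized case the paper generalizes beyond (camera frame parallel to the masthead base frame, optical axis through the image center), the closed-form pan/tilt solution reduces to two arctangents; the target coordinates below are hypothetical:

```python
import math

def pan_tilt_to_target(x, y, z):
    """Closed-form mast pointing for the idealised case: pan about the
    vertical axis, then tilt, to aim the boresight at target (x, y, z)
    expressed in the masthead base frame."""
    pan = math.atan2(y, x)
    tilt = math.atan2(z, math.hypot(x, y))
    return pan, tilt

# Hypothetical target: equal forward/lateral offsets, elevated by sqrt(2).
pan, tilt = pan_tilt_to_target(1.0, 1.0, math.sqrt(2.0))
```

The paper's contribution is handling the non-parallel camera frames and off-center optical axis while still keeping the solution closed-form.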

  8. Photometric Calibration and Image Stitching for a Large Field of View Multi-Camera System

    PubMed Central

    Lu, Yu; Wang, Keyi; Fan, Gongshu

    2016-01-01

    A new compact large field of view (FOV) multi-camera system is introduced. The camera is based on seven tiny complementary metal-oxide-semiconductor sensor modules covering over 160° × 160° FOV. Although image stitching has been studied extensively, sensor and lens differences have not been considered in previous multi-camera devices. In this study, we have calibrated the photometric characteristics of the multi-camera device. Lenses were not mounted on the sensors during radiometric response calibration, to eliminate the focusing effect on the uniform light from an integrating sphere. The linearity range of the radiometric response, non-linearity response characteristics, sensitivity, and dark current of the camera response function are presented. The R, G, and B channels have different responses for the same illuminance. Vignetting artifact patterns have been tested. The actual luminance of the object is retrieved from the sensor calibration results and is used to blend images so that panoramas reflect the objective luminance more faithfully; this compensates for the limitation of making stitched images realistic through smoothing alone. The dynamic range limitation can be resolved by using multiple cameras that cover a large field of view instead of a single image sensor with a wide-angle lens; the dynamic range is expanded 48-fold in this system. We can obtain seven images in one shot with this multi-camera system, at 13 frames per second. PMID:27077857

  9. Miniature magnetically anchored and controlled camera system for trocar-less laparoscopy

    PubMed Central

    Dong, Ding-Hui; Zhu, Hao-Yang; Luo, Yu; Zhang, Hong-Ke; Xiang, Jun-Xi; Xue, Fei; Wu, Rong-Qian; Lv, Yi

    2017-01-01

    AIM To design a miniature magnetically anchored and controlled camera system to reduce the number of trocars required for laparoscopy. METHODS The system consists of a miniature magnetically anchored camera with a 30° downward angle, an external magnetic anchoring unit, and a vision output device. The camera weighs 12 g, measures Φ10.5 mm × 55 mm and has two magnets, a vision module, a light source, and a metal hexagonal nut. To test the prototype, the camera was inserted through a 12-mm conventional trocar in an ex vivo real-liver laparoscopic training system. A trocar-less laparoscopic cholecystectomy was performed 6 times using a 12-mm and a 5-mm conventional trocar. In addition, the same procedure was performed in four canine models. RESULTS Both procedures were successfully performed using only two conventional laparoscopic trocars. The cholecystectomy was completed without any major complication in 42 min (38-45 min) in vitro and in 50 min (45-53 min) in the animal model. The camera was anchored and controlled magnetically by an external unit on the abdominal wall and generated excellent images with no instrument collisions. CONCLUSION The camera system we designed provides excellent optics and can be easily maneuvered. The number of conventional trocars is reduced without adding technical difficulties.

  11. AWiFS camera for Resourcesat

    NASA Astrophysics Data System (ADS)

    Dave, Himanshu; Dewan, Chirag; Paul, Sandip; Sarkar, S. S.; Pandya, Himanshu; Joshi, S. R.; Mishra, Ashish; Detroja, Manoj

    2006-12-01

    Remote sensors have been developed and used extensively worldwide on aircraft and space platforms, and India has developed and launched many sensors into space to survey natural resources. The AWiFS is one such camera, launched onboard the Resourcesat-1 satellite by ISRO in 2003. It is a medium-resolution camera with a 5-day revisit, designed for studies related to forestry, vegetation, soil, snow and disaster warning. The camera provides 56 m (nadir) resolution from an 817 km altitude in three visible bands and one SWIR band. This paper deals with the configuration features of the AWiFS camera of Resourcesat-1 and its onboard performance, and highlights the camera being developed for Resourcesat-2. The AWiFS is realized as two identical cameras, AWiFS-A and AWiFS-B, which together cover a large field of view of 48°. Each camera consists of independent collecting optics with associated 6000-element detectors and electronics catering to four bands. The visible bands use linear silicon CCDs with 10 μm × 7 μm elements, while the SWIR band uses 13 μm staggered InGaAs linear active pixels. The camera electronics are custom designed for each detector based on detector and system requirements. The camera covers the total dynamic range up to 100% albedo with a single gain setting and 12-bit digitization, of which the 10 MSBs are transmitted. The saturation radiance of each band can also be selected by telecommand. The camera provides a very high SNR of about 700 near saturation. The camera components are housed in specially designed Invar structures. The AWiFS camera onboard Resourcesat-1 is providing excellent imagery, and the data are routinely used worldwide. The AWiFS for Resourcesat-2 is being developed with the same overall performance specifications; its electronics are miniaturized, with the number of hardware packages, size and weight reduced to one third.

  12. Effect of applied interparticle force on the static and dynamic angles of repose of spherical granular material.

    PubMed

    Forsyth, A J; Hutton, S R; Rhodes, M J; Osborne, C F

    2001-03-01

    The static angle of repose for iron spheres in a narrow box, and the dynamic angle of repose for iron spheres in a narrow, half-filled rotating drum, are investigated. A feature of this paper is the use of a homogeneous magnetic field to induce an attractive interparticle force, allowing a wide range of angles of repose to be investigated and characterized as a function of interparticle force. The static and dynamic angles of repose were found to increase approximately linearly with increasing interparticle force.

  13. Radiometric calibration for MWIR cameras

    NASA Astrophysics Data System (ADS)

    Yang, Hyunjin; Chun, Joohwan; Seo, Doo Chun; Yang, Jiyeon

    2012-06-01

    The Korean Multi-purpose Satellite-3A (KOMPSAT-3A), which weighs about 1,000 kg, is scheduled to be launched in 2013 and will be placed in a sun-synchronous orbit (SSO) at an altitude of 530 km. It is Korea's first satellite to orbit with a mid-wave infrared (MWIR) image sensor, currently under development at the Korea Aerospace Research Institute (KARI). The missions envisioned include forest fire surveillance, measurement of ocean surface temperature, national defense and crop harvest estimation. In this paper, we explain the MWIR scene generation software and atmospheric compensation techniques for the infrared (IR) camera that we are currently developing. The MWIR scene generation software we have developed takes into account sky thermal emission, path emission, target emission, sky solar scattering and ground reflection, based on MODTRAN data. This software will be used to generate the radiation image at the satellite camera, which requires an atmospheric compensation algorithm, and to validate the accuracy of the temperatures obtained in our results. The image visibility restoration algorithm is a method for removing the effect of the atmosphere between the camera and an object; here it works between the satellite and the Earth to predict object temperatures contaminated by the Earth's atmosphere and solar radiation. Commonly, software such as MODTRAN is used to model the atmosphere for atmospheric compensation. Our algorithm does not require additional software to obtain the surface temperature; however, the visibility restoration parameters need to be adjusted, and the precision of the result still requires further study.

  14. Imaging and radiometric performance simulation for a new high-performance dual-band airborne reconnaissance camera

    NASA Astrophysics Data System (ADS)

    Seong, Sehyun; Yu, Jinhee; Ryu, Dongok; Hong, Jinsuk; Yoon, Jee-Yeon; Kim, Sug-Whan; Lee, Jun-Ho; Shin, Myung-Jin

    2009-05-01

    In recent years, high-performance visible and IR cameras have been used widely for tactical airborne reconnaissance. Improving the efficient discrimination and analysis of complex target information from active battlefields requires simultaneous multi-band measurement from airborne platforms at various altitudes. We report a new dual-band airborne camera designed for simultaneous registration of visible and IR imagery from mid-altitude ranges. The camera design uses a common front-end optical telescope of about 0.3 m entrance aperture and several relay optical sub-systems capable of delivering both high-spatial-resolution visible and IR images to the detectors. The design benefits from several optical channels packaged in a compact space and the associated freedom to choose between wide (~3 degree) and narrow (~1 degree) fields of view. To investigate both the imaging and the radiometric performance of the camera, we generated an array of target scenes with optical properties such as reflection, refraction, scattering, transmission and emission. We then combined the target scenes and the camera optical system in an integrated ray-tracing simulation environment using the Monte Carlo computation technique. Taking realistic atmospheric radiative transfer characteristics into account, both imaging and radiometric performance were then investigated. The simulation results demonstrate that the camera design satisfies the NIIRS 7 detection criterion. The camera concept, details of the performance simulation, and the resulting performance are discussed together with the future development plan.

  15. One frame subnanosecond spectroscopy camera

    NASA Astrophysics Data System (ADS)

    Silkis, E. G.; Titov, V. D.; Fel'Dman, G. G.; Zhilkina, V. M.; Petrokovich, O. A.; Syrtsev, V. N.

    1991-04-01

    The recording of ultraweak spectra is presently undertaken by a high-speed multichannel-spectrum camera (HSMSC) with subnanosecond-range time resolution in its photon-counting mode. The HSMSC's photodetector is a one-frame streak tube equipped with a grid shutter and connected via a fiber-optic contact to a linear CCD. The gain furnished by the streak tube's microchannel plate is sufficiently high for recording single-photoelectron signals. The HSMSC is compact and easy to handle.

  16. Digital laser scanning fundus camera.

    PubMed

    Plesch, A; Klingbeil, U; Bille, J

    1987-04-15

    Imaging and documentation of the human retina for clinical diagnostics are conventionally achieved by classical optical methods. We designed a digital laser scanning fundus camera. The optoelectronic instrument is based on scanning laser illumination of the retina and a modified video imaging procedure. It is coupled to a digital image buffer and a microcomputer for image storage and processing. Aside from its high sensitivity, the LSF incorporates new ophthalmic imaging methods such as polarization differential contrast. We give design considerations as well as a description of the instrument and its performance.

  17. Video cameras on wild birds.

    PubMed

    Rutz, Christian; Bluff, Lucas A; Weir, Alex A S; Kacelnik, Alex

    2007-11-02

    New Caledonian crows (Corvus moneduloides) are renowned for using tools for extractive foraging, but the ecological context of this unusual behavior is largely unknown. We developed miniaturized, animal-borne video cameras to record the undisturbed behavior and foraging ecology of wild, free-ranging crows. Our video recordings enabled an estimate of the species' natural foraging efficiency and revealed that tool use, and choice of tool materials, are more diverse than previously thought. Video tracking has potential for studying the behavior and ecology of many other bird species that are shy or live in inaccessible habitats.

  18. Dynamic contact angle analysis of silicone hydrogel contact lenses.

    PubMed

    Read, Michael Leonard; Morgan, Philip Bruce; Kelly, Jeremiah Michael; Maldonado-Codina, Carole

    2011-07-01

    Contact angle measurements are used to infer the clinical wetting characteristics of contact lenses. Such characterization has become more commonplace since the introduction of silicone hydrogel contact lens materials, which have been associated with reduced in vivo wetting due to the inclusion of siloxane-containing components. Using consistent methodology and a single investigator, advancing and receding contact angles were measured for 11 commercially available silicone hydrogel contact lens types with a dynamic captive bubble technique employing customized, fully automated image analysis. Advancing contact angles were found to range between 20° and 72° with the lenses falling into six statistically discrete groupings. Receding contact angles fell within a narrower range, between 17° and 22°, with the lenses segregated into three groups. The relationship between these laboratory measurements and the clinical performance of the lenses requires further investigation.

  19. A Different Angle on Perspective

    ERIC Educational Resources Information Center

    Frantz, Marc

    2012-01-01

    When a plane figure is photographed from different viewpoints, lengths and angles appear distorted. Hence it is often assumed that lengths, angles, protractors, and compasses have no place in projective geometry. Here we describe a sense in which certain angles are preserved by projective transformations. These angles can be constructed with…

  20. Global viscous overstabilities in narrow rings

    NASA Astrophysics Data System (ADS)

    Longaretti, Pierre-Yves; French, Richard G.; Nicholson, Philip D.

    2016-10-01

    Local viscous overstabilities have been the focus of a number of theoretical analyses in recent decades due to the role they are believed to play in the creation of the small-scale structure of broad ring systems (Saturn, Uranus). Global viscous overstabilities were also investigated in the 1980s and 1990s as a potential source of narrow ring eccentricities (Longaretti and Rappaport, 1995, Icarus, 116, 376). An important feature of global viscous overstabilities is that they produce slow relative librating or circulating motions of narrow ring edges; they may also produce slowly librating or circulating components of edge modes. This process is potentially relevant to explaining the occurrence of unusually large apsidal shifts observed in some Saturnian ringlets and may also explain the existence of the free m=2 B ring edge mode that slowly circulates with respect to the component forced by Mimas. The time scale of such motions is primarily controlled by the ring self-gravity and can be quantified analytically in a two-streamline analysis, which yields a characteristic libration/circulation frequency Ωl = (n/π)(Mr/Mp)(a/δa)²H(q²), where n is the mean motion, Mr the ringlet or perturbed-region mass, Mp the planet mass, a the semi-major axis, δa the narrow ringlet or perturbed-region width, and H(q²) a dimensionless factor of order unity that depends on the streamline compression parameter q. The related time scale is of the order of a few years to a few tens of years, depending on the surface density and ringlet/perturbed-region geometry. Preliminary data analyses indicate that the Maxwell and Huygens ringlets are probably librating with periods consistent with this two-streamline estimate. The talk will briefly present the physics of global viscous overstabilities as well as more detailed applications to narrow rings and, if time permits, to edge modes.
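    The characteristic frequency above is straightforward to evaluate. A minimal sketch, in which every physical value except Saturn's mass is an illustrative assumption (not taken from the abstract):

```python
import math

def libration_frequency(n, m_ring, m_planet, a, delta_a, h_factor=1.0):
    """Two-streamline estimate: Omega_l = (n/pi)(Mr/Mp)(a/da)^2 H(q^2),
    with H(q^2) a dimensionless factor of order unity."""
    return (n / math.pi) * (m_ring / m_planet) * (a / delta_a) ** 2 * h_factor

# Illustrative (assumed) numbers loosely representative of a narrow ringlet:
n = 2 * math.pi / (14.3 * 3600.0)   # mean motion [rad/s] for a ~14.3 h orbit
m_ring = 2e16                        # ringlet mass [kg] (assumed)
m_planet = 5.68e26                   # Saturn's mass [kg]
a = 1.18e8                           # semi-major axis [m] (assumed)
delta_a = 5e4                        # ringlet width [m] (assumed)

omega_l = libration_frequency(n, m_ring, m_planet, a, delta_a)
period_years = 2 * math.pi / omega_l / (365.25 * 86400.0)
print(f"libration period ~ {period_years:.0f} yr")
```

With these assumed inputs the period comes out in the tens of years, consistent with the "few years to a few tens of years" range quoted in the abstract.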

  1. Are Quantity-Distances Narrowing in?

    DTIC Science & Technology

    2010-07-01

    … adopt AASTP-1 and AASTP-5 for ammunition storage. Quantity-Distance History: The history of QD goes far back; an early reference quotes: "Act for preventing the Mischiefs which may happen by keeping too great Quantities of gunpowder in or near…"

  2. Digital cameras with designs inspired by the arthropod eye.

    PubMed

    Song, Young Min; Xie, Yizhu; Malyarchuk, Viktor; Xiao, Jianliang; Jung, Inhwa; Choi, Ki-Joong; Liu, Zhuangjian; Park, Hyunsung; Lu, Chaofeng; Kim, Rak-Hwan; Li, Rui; Crozier, Kenneth B; Huang, Yonggang; Rogers, John A

    2013-05-02

    In arthropods, evolution has created a remarkably sophisticated class of imaging systems, with a wide-angle field of view, low aberrations, high acuity to motion and an infinite depth of field. A challenge in building digital cameras with the hemispherical, compound apposition layouts of arthropod eyes is that essential design requirements cannot be met with existing planar sensor technologies or conventional optics. Here we present materials, mechanics and integration schemes that afford scalable pathways to working, arthropod-inspired cameras with nearly full hemispherical shapes (about 160 degrees). Their surfaces are densely populated by imaging elements (artificial ommatidia), which are comparable in number (180) to those of the eyes of fire ants (Solenopsis fugax) and bark beetles (Hylastes nigrinus). The devices combine elastomeric compound optical elements with deformable arrays of thin silicon photodetectors into integrated sheets that can be elastically transformed from the planar geometries in which they are fabricated to hemispherical shapes for integration into apposition cameras. Our imaging results and quantitative ray-tracing-based simulations illustrate key features of operation. These general strategies seem to be applicable to other compound eye devices, such as those inspired by moths and lacewings (refracting superposition eyes), lobster and shrimp (reflecting superposition eyes), and houseflies (neural superposition eyes).

  3. Light field panorama by a plenoptic camera

    NASA Astrophysics Data System (ADS)

    Xue, Zhou; Baboulaz, Loic; Prandoni, Paolo; Vetterli, Martin

    2013-03-01

    The consumer-grade plenoptic camera Lytro has drawn a lot of interest from both the academic and industrial worlds. However, its low resolution in both the spatial and angular domains prevents it from being used for fine and detailed light field acquisition. This paper proposes to use a plenoptic camera as an image scanner and perform light field stitching to increase the size of the acquired light field data. We consider a simplified plenoptic camera model comprising a pinhole camera moving behind a thin lens. Based on this model, we describe how to perform light field acquisition and stitching under two different scenarios: by camera translation, or by camera translation and rotation. In both cases, we assume the camera motion to be known. In the case of camera translation, we show how the acquired light fields should be resampled to increase the spatial range and ultimately obtain a wider field of view. In the case of camera translation and rotation, the camera motion is calculated such that the light fields can be directly stitched and extended in the angular domain. Simulation results verify our approach and demonstrate the potential of the motion model for further light field applications such as registration and super-resolution.
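    Under a pure translation with known motion, stitching reduces to shifting the spatial coordinate of the light field onto a common grid. A toy 2-D (x, u) sketch; the array sizes and the overwrite-on-overlap policy are illustrative assumptions, not the paper's pipeline:

```python
import numpy as np

def stitch_translated(lf_ref, lf_moved, shift_px):
    """Stitch two (x, u) light-field slices when the second camera is
    translated by shift_px spatial samples along x."""
    n_x, n_u = lf_ref.shape
    out = np.full((n_x + shift_px, n_u), np.nan)
    out[:n_x, :] = lf_ref
    # a real pipeline would blend the overlap; here later samples simply win
    out[shift_px:shift_px + n_x, :] = lf_moved
    return out

lf_a = np.zeros((8, 4))   # reference light field L(x, u)
lf_b = np.ones((8, 4))    # light field after camera translation
merged = stitch_translated(lf_a, lf_b, 5)
print(merged.shape)       # (13, 4): wider spatial range, same angular range
```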

  4. The Pan-STARRS Gigapixel Camera

    NASA Astrophysics Data System (ADS)

    Tonry, J.; Onaka, P.; Luppino, G.; Isani, S.

    The Pan-STARRS project will undertake repeated surveys of the sky to find "Killer Asteroids", everything else which moves or blinks, and to build an unprecedented deep and accurate "static sky". The key enabling technology is a new generation of large format cameras that offer an order of magnitude improvement in size, speed, and cost compared to existing instruments. In this talk, we provide an overview of the camera research and development effort being undertaken by the Institute for Astronomy Camera Group in partnership with MIT Lincoln Laboratories. The main components of the camera subsystem will be identified and briefly described as an introduction to the more specialized talks presented elsewhere at this conference. We will focus on the development process followed at the IfA utilizing the orthogonal transfer CCD in building cameras of various sizes from a single OTA "mcam", to a 16-OTA "Test Camera", to the final 64-OTA 1.4 billion pixel camera (Gigapixel Camera #1 or GPC1) to be used for PS1 survey operations. We also show the design of a deployable Shack-Hartmann device residing in the camera and other auxiliary instrumentation used to support camera operations.

  5. Computational cameras: convergence of optics and processing.

    PubMed

    Zhou, Changyin; Nayar, Shree K

    2011-12-01

    A computational camera uses a combination of optics and processing to produce images that cannot be captured with traditional cameras. In the last decade, computational imaging has emerged as a vibrant field of research. A wide variety of computational cameras has been demonstrated to encode more useful visual information in the captured images, as compared with conventional cameras. In this paper, we survey computational cameras from two perspectives. First, we present a taxonomy of computational camera designs according to the coding approaches, including object side coding, pupil plane coding, sensor side coding, illumination coding, camera arrays and clusters, and unconventional imaging systems. Second, we use the abstract notion of light field representation as a general tool to describe computational camera designs, where each camera can be formulated as a projection of a high-dimensional light field to a 2-D image sensor. We show how individual optical devices transform light fields and use these transforms to illustrate how different computational camera designs (collections of optical devices) capture and encode useful visual information.
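    The light-field view in the last sentence can be made concrete: a camera forms an image by projecting (integrating) the light field over its angular coordinates, and a pupil-plane code simply weights that integration. A toy 2-D (x, u) sketch with an arbitrary illustrative mask:

```python
import numpy as np

rng = np.random.default_rng(0)
L = rng.random((16, 8))    # toy light field L(x, u): 16 spatial x 8 angular samples
mask = np.array([1, 0, 1, 1, 0, 1, 0, 1], dtype=float)  # pupil-plane code (assumed)

image_open = L.sum(axis=1)            # conventional camera: open-aperture sum over u
image_coded = (L * mask).sum(axis=1)  # pupil-plane coding weights the projection
print(image_open.shape)               # (16,): a 1-D "sensor" image
```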

  6. A Unifying Theory for Camera Calibration.

    PubMed

    Ramalingam, SriKumar; Sturm, Peter

    2016-07-19

    This paper proposes a unified theory for calibrating a wide variety of camera models, such as pinhole, fisheye, catadioptric, and multi-camera networks. We model any camera as a set of image pixels and their associated camera rays in space. Every pixel measures the light traveling along a (half-) ray in 3-space associated with that pixel. By this definition, calibration simply refers to the computation of the mapping between pixels and the associated 3D rays. Such a mapping can be computed using images of calibration grids, which are objects with known 3D geometry, taken from unknown positions. This general camera model allows us to represent non-central cameras; we also consider two special subclasses, namely central and axial cameras. In a central camera, all rays intersect in a single point, whereas the rays are completely arbitrary in a non-central one. Axial cameras are an intermediate case: the camera rays intersect a single line. In this work, we show the theory for calibrating central, axial and non-central models using calibration grids, which can be either three-dimensional or planar.
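    The paper's definition suggests a direct data structure: a camera is just a map from pixels to 3-D rays, and "central" becomes a testable property of that map. A minimal sketch (the function names and the pinhole example are illustrative, not from the paper):

```python
import numpy as np

def pinhole_rays(w, h, f):
    """Ray (origin, direction) for every pixel of an ideal pinhole camera
    whose center of projection sits at the world origin."""
    rays = {}
    for v in range(h):
        for u in range(w):
            d = np.array([u - w / 2, v - h / 2, f], dtype=float)
            rays[(u, v)] = (np.zeros(3), d / np.linalg.norm(d))
    return rays

def is_central(rays, tol=1e-9):
    """Central camera: all ray origins coincide in a single point."""
    origins = np.array([origin for origin, _ in rays.values()])
    return bool(np.allclose(origins, origins[0], atol=tol))

rays = pinhole_rays(4, 3, f=2.0)
print(is_central(rays))  # True
```

An axial camera would pass a weaker test (origins collinear), and a fully non-central one would pass neither.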

  7. Optimising camera traps for monitoring small mammals.

    PubMed

    Glen, Alistair S; Cockburn, Stuart; Nichols, Margaret; Ekanayake, Jagath; Warburton, Bruce

    2013-01-01

    Practical techniques are required to monitor invasive animals, which are often cryptic and occur at low density. Camera traps have potential for this purpose, but may have problems detecting and identifying small species. A further challenge is how to standardise the size of each camera's field of view so capture rates are comparable between different places and times. We investigated the optimal specifications for a low-cost camera trap for small mammals. The factors tested were 1) trigger speed, 2) passive infrared vs. microwave sensor, 3) white vs. infrared flash, and 4) still photographs vs. video. We also tested a new approach to standardise each camera's field of view. We compared the success rates of four camera trap designs in detecting and taking recognisable photographs of captive stoats (Mustela erminea), feral cats (Felis catus) and hedgehogs (Erinaceus europaeus). Trigger speeds of 0.2-2.1 s captured photographs of all three target species unless the animal was running at high speed. The camera with a microwave sensor was prone to false triggers, and often failed to trigger when an animal moved in front of it. A white flash produced photographs that were more readily identified to species than those obtained under infrared light. However, a white flash may be more likely to frighten target animals, potentially affecting detection probabilities. Video footage achieved similar success rates to still cameras but required more processing time and computer memory. Placing two camera traps side by side achieved a higher success rate than using a single camera. Camera traps show considerable promise for monitoring invasive mammal control operations. Further research should address how best to standardise the size of each camera's field of view, maximise the probability that an animal encountering a camera trap will be detected, and eliminate visible or audible cues emitted by camera traps.

  8. NARROW-K-BAND OBSERVATIONS OF THE GJ 1214 SYSTEM

    SciTech Connect

    Colón, Knicole D.; Gaidos, Eric

    2013-10-10

    GJ 1214 is a nearby M dwarf star that hosts a transiting super-Earth-size planet, making this system an excellent target for atmospheric studies. Most studies find that the transmission spectrum of GJ 1214b is flat, which favors either a high mean molecular weight or a cloudy/hazy, hydrogen (H) rich atmosphere model. Photometry at short wavelengths (<0.7 μm) and in the K band can discriminate the most between these different atmosphere models for GJ 1214b, but current observations do not have sufficiently high precision. We present photometry of seven transits of GJ 1214b through a narrow K-band (2.141 μm) filter with the Wide Field Camera on the 3.8 m United Kingdom Infrared Telescope. Our photometric precision is typically 1.7 × 10⁻³ (for a single transit), comparable with other ground-based observations of GJ 1214b. We measure a planet-star radius ratio of 0.1158 ± 0.0013, which, along with other studies, also supports a flat transmission spectrum for GJ 1214b. Since this does not exclude a scenario where GJ 1214b has an H-rich envelope with heavy elements that are sequestered below a cloud/haze layer, we compare K-band observations with models of H₂ collision-induced absorption in an atmosphere for a range of temperatures. While we find no evidence for deviation from a flat spectrum (slope s = 0.0016 ± 0.0038), an H₂-dominated upper atmosphere (<60 mbar) cannot be excluded. More precise observations at <0.7 μm and in the K band, as well as a uniform analysis of all published data, would be useful for establishing more robust limits on atmosphere models for GJ 1214b.
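    As a quick sanity check on the quoted radius ratio, the fraction of starlight blocked during transit follows directly from depth = (Rp/Rs)²:

```python
radius_ratio = 0.1158          # planet-star radius ratio from the abstract
depth = radius_ratio ** 2      # fractional flux drop during transit
print(f"transit depth ~ {depth * 100:.2f}% of the stellar flux")  # ~1.34%
```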

  9. The Zwicky Transient Facility Camera

    NASA Astrophysics Data System (ADS)

    Dekany, Richard; Smith, Roger M.; Belicki, Justin; Delacroix, Alexandre; Duggan, Gina; Feeney, Michael; Hale, David; Kaye, Stephen; Milburn, Jennifer; Murphy, Patrick; Porter, Michael; Reiley, Daniel J.; Riddle, Reed L.; Rodriguez, Hector; Bellm, Eric C.

    2016-08-01

    The Zwicky Transient Facility Camera (ZTFC) is a key element of the ZTF Observing System, the integrated system of optoelectromechanical instrumentation tasked to acquire the wide-field, high-cadence time-domain astronomical data at the heart of the Zwicky Transient Facility. The ZTFC consists of a compact cryostat with large vacuum window protecting a mosaic of 16 large, wafer-scale science CCDs and 4 smaller guide/focus CCDs, a sophisticated vacuum interface board which carries data as electrical signals out of the cryostat, an electromechanical window frame for securing externally inserted optical filter selections, and associated cryo-thermal/vacuum system support elements. The ZTFC provides an instantaneous 47 deg² field of view, limited by primary mirror vignetting in its Schmidt telescope prime focus configuration. We report here on the design and performance of the ZTF CCD camera cryostat and report results from extensive Joule-Thomson cryocooler tests that may be of broad interest to the instrumentation community.

  10. Still red light for red light cameras? An update.

    PubMed

    Høye, Alena

    2013-06-01

    The present study has replicated the results from a previous meta-analysis by Erke (2009) [Erke, A., 2009. Red light for red-light cameras? A meta-analysis of the effects of red-light cameras on crashes. Accident Analysis & Prevention 41 (5), 897-905.] based on a larger sample of RLC studies, and provides answers to the criticisms raised by Lund et al. (2009) [Lund, A.K., Kyrychenko, S.Y., Retting, R.A., 2009. Caution: a comment on Alena Erke's red light for red-light cameras? A meta-analysis of the effects of red-light cameras on crashes. Accident Analysis and Prevention 41, 895-896.] against the previous meta-analysis. The addition of recent studies to the meta-analysis and a more thorough investigation of potential moderator variables lead to a slight improvement of the estimated effects of RLC over the previous meta-analysis. The present study found a non-significant increase of all crashes by 6% and a non-significant decrease of all injury crashes by 13%. Right-angle collisions were found to decrease by 13% and rear-end collisions to increase by 39%. For right-angle injury collisions a decrease of 33% was found, and for rear-end injury collisions a smaller increase was found (+19%). The effects of RLC are likely to be more favorable when RLC warning signs are set up at main entrances to areas with RLC enforcement than when each RLC intersection is signposted. The effects of RLC may become more favorable over time; this could, however, not be investigated empirically. Several results indicate that spillover effects may occur for right-angle collisions, but most likely not for rear-end and other crashes. If spillover effects do not occur for rear-end crashes, which increase at RLC intersections, this would be a positive result for RLC. However, the results seem to be affected to some degree by publication bias, and the effects may therefore be somewhat less favorable than indicated by the results of the meta-analysis.

  11. CCD Camera Lens Interface for Real-Time Theodolite Alignment

    NASA Technical Reports Server (NTRS)

    Wake, Shane; Scott, V. Stanley, III

    2012-01-01

    Theodolites are a common instrument in the testing, alignment, and building of various systems ranging from a single optical component to an entire instrument. They provide a precise way to measure horizontal and vertical angles. They can be used to align multiple objects in a desired way at specific angles. They can also be used to reference a specific location or orientation of an object that has moved. Some systems may require a small margin of error in position of components. A theodolite can assist with accurately measuring and/or minimizing that error. The technology is an adapter for a CCD camera with lens to attach to a Leica Wild T3000 Theodolite eyepiece that enables viewing on a connected monitor, and thus can be utilized with multiple theodolites simultaneously. This technology removes a substantial part of human error by relying on the CCD camera and monitors. It also allows image recording of the alignment, and therefore provides a quantitative means to measure such error.

  12. Interference-induced angle-independent acoustical transparency

    SciTech Connect

    Qi, Lehua; Yu, Gaokun; Wang, Ning; Wang, Xinlong; Wang, Guibo

    2014-12-21

    It is revealed that the Fano-like interference leads to the extraordinary acoustic transmission through a slab metamaterial of thickness much smaller than the wavelength, with each unit cell consisting of a Helmholtz resonator and a narrow subwavelength slit. More importantly, both the theoretical analysis and experimental measurement show that the angle-independent acoustical transparency can be realized by grafting a Helmholtz resonator and a quarter-wave resonator to the wall of a narrow subwavelength slit in each unit cell of a slit array. The observed phenomenon results from the interferences between the waves propagating in the slit, those re-radiated by the Helmholtz resonator, and those re-radiated by the quarter-wave resonator. The proposed design may find its applications in designing angle-independent acoustical filters and controlling the phase of the transmitted waves.

  13. First results from the Faint Object Camera - SN 1987A

    NASA Technical Reports Server (NTRS)

    Jakobsen, P.; Albrecht, R.; Barbieri, C.; Blades, J. C.; Boksenberg, A.

    1991-01-01

    The first images of SN 1987A taken on day 1278 after outburst with the Faint Object Camera on board the Hubble Space Telescope are presented. The supernova is well detected and resolved spatially in three broadband ultraviolet exposures spanning the 1500-3800 A range and in a narrow-band image centered on the forbidden O III 5007 line. Simple uniform-disk fits to the profiles of SN 1987A yield an average angular diameter of 170 ± 30 mas, corresponding to an average expansion velocity of 6000 km/s. The derived broadband ultraviolet fluxes, when corrected for interstellar absorption, indicate a blue ultraviolet spectrum corresponding to a color temperature near 13,000 K.
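    The quoted numbers are mutually consistent: at the distance of the Large Magellanic Cloud (≈50 kpc, an assumption not stated in the abstract), a 170 mas uniform-disk diameter after 1278 days implies an expansion speed near 6000 km/s. A small-angle check:

```python
AU_KM = 1.496e8                    # kilometres per astronomical unit
distance_pc = 50_000.0             # assumed LMC distance [pc]
angular_radius_arcsec = 0.170 / 2  # half of the 170 mas uniform-disk diameter

# small-angle rule: 1 arcsec at 1 pc subtends 1 AU
radius_km = angular_radius_arcsec * distance_pc * AU_KM
elapsed_s = 1278 * 86400.0         # days since outburst, in seconds
v_km_s = radius_km / elapsed_s
print(f"mean expansion speed ~ {v_km_s:.0f} km/s")
```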

  14. Angles in the Sky?

    NASA Astrophysics Data System (ADS)

    Behr, Bradford

    2005-09-01

    Tycho Brahe lived and worked in the late 1500s before the telescope was invented. He made highly accurate observations of the positions of planets, stars, and comets using large angle-measuring devices of his own design. You can use his techniques to observe the sky as well. For example, the degree, a common unit of measurement in astronomy, can be measured by holding your fist at arm's length up to the sky. Open your fist and observe the distance across the sky covered by the width of your pinky fingernail. That is, roughly, a degree! After some practice, and knowing that one degree equals four minutes, you can measure elapsed time by measuring the angle of the distance that the Moon appears to have moved and multiplying that number by four. You can also figure distances and sizes of things. These are not precise measurements, but rough estimates that can give you a "close-enough" answer.
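    The degree-to-time conversion used above comes from the sky's rotation rate: 360 degrees in about 24 hours gives four minutes of time per degree:

```python
minutes_per_degree = 24 * 60 / 360   # full rotation: 360 degrees in 24 hours
print(minutes_per_degree)            # 4.0

def elapsed_minutes(angle_deg):
    """Elapsed time estimated from the apparent sky rotation, in minutes."""
    return angle_deg * minutes_per_degree

print(elapsed_minutes(7.5))          # 30.0
```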

  15. Contact angle hysteresis explained.

    PubMed

    Gao, Lichao; McCarthy, Thomas J

    2006-07-04

    A view of contact angle hysteresis from the perspectives of the three-phase contact line and of the kinetics of contact line motion is given. Arguments are made that advancing and receding are discrete events that have different activation energies. That hysteresis can be quantified as an activation energy by the changes in interfacial area is argued. That this is an appropriate way of viewing hysteresis is demonstrated with examples.

  16. Laser angle sensor

    NASA Technical Reports Server (NTRS)

    Pond, C. R.; Texeira, P. D.

    1985-01-01

    A laser angle measurement system was designed and fabricated for NASA Langley Research Center. The instrument is a fringe counting interferometer that monitors the pitch attitude of a model in a wind tunnel. A laser source and detector are mounted above the model. Interference fringes are generated by a small passive element on the model. The fringe count is accumulated and displayed by a processor in the wind tunnel control room. This report includes optical and electrical schematics, system maintenance and operation procedures.

  17. Narrow resonances and short-range interactions

    NASA Astrophysics Data System (ADS)

    Gelman, Boris A.

    2009-09-01

    Narrow resonances in systems with short-range interactions are discussed in an effective field theory (EFT) framework. An effective Lagrangian is formulated in the form of a combined expansion in powers of a momentum Q≪Λ—a short-distance scale—and an energy difference δɛ=|E-ɛ0|≪ɛ0—a resonance peak energy. At leading order in the combined expansion, a two-body scattering amplitude is the sum of a smooth background term of order Q0 and a Breit-Wigner term of order Q2(δɛ)-1 which becomes dominant for δɛ≲Q3. Such an EFT is applicable to systems in which short-distance dynamics generates a low-lying quasistationary state. The EFT is generalized to describe a narrow low-lying resonance in a system of charged particles. It is shown that in the case of Coulomb repulsion, a two-body scattering amplitude at leading order in a combined expansion is the sum of a Coulomb-modified background term and a Breit-Wigner amplitude with parameters renormalized by Coulomb interactions.
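    The leading-order amplitude structure described above can be written schematically in LaTeX (a sketch of the generic form only; the coefficients c₀ and c₂ and the width Γ stand in for EFT couplings that the abstract does not specify):

```latex
f(E) \;=\; \underbrace{c_0}_{\mathcal{O}(Q^0)\ \text{background}}
\;+\; \underbrace{\frac{c_2\, Q^2}{E - \epsilon_0 + i\,\Gamma/2}}_{\mathcal{O}\!\left(Q^2 (\delta\epsilon)^{-1}\right)\ \text{Breit--Wigner}},
\qquad \delta\epsilon = |E - \epsilon_0| \ll \epsilon_0 .
```

For δɛ ≲ Q³ the second term is of order Q² Q⁻³ = Q⁻¹ and dominates the O(Q⁰) background, as stated in the abstract.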

  18. Narrow resonances and short-range interactions

    SciTech Connect

    Gelman, Boris A.

    2009-09-15

    Narrow resonances in systems with short-range interactions are discussed in an effective field theory (EFT) framework. An effective Lagrangian is formulated in the form of a combined expansion in powers of a momentum Q≪Λ (a short-distance scale) and an energy difference δɛ=|E-ɛ0|≪ɛ0 (a resonance peak energy). At leading order in the combined expansion, a two-body scattering amplitude is the sum of a smooth background term of order Q⁰ and a Breit-Wigner term of order Q²(δɛ)⁻¹ which becomes dominant for δɛ≲Q³. Such an EFT is applicable to systems in which short-distance dynamics generates a low-lying quasistationary state. The EFT is generalized to describe a narrow low-lying resonance in a system of charged particles. It is shown that in the case of Coulomb repulsion, a two-body scattering amplitude at leading order in a combined expansion is the sum of a Coulomb-modified background term and a Breit-Wigner amplitude with parameters renormalized by Coulomb interactions.

  19. Narrow field electromagnetic sensor system and method

    DOEpatents

    McEwan, Thomas E.

    1996-01-01

    A narrow field electromagnetic sensor system and method of sensing a characteristic of an object provide the capability to realize a characteristic of an object such as density, thickness, or presence, for any desired coordinate position on the object. One application is imaging. The sensor can also be used as an obstruction detector or an electronic trip wire with a narrow field without the disadvantages of impaired performance when exposed to dirt, snow, rain, or sunlight. The sensor employs a transmitter for transmitting a sequence of electromagnetic signals in response to a transmit timing signal, a receiver for sampling only the initial direct RF path of the electromagnetic signal while excluding all other electromagnetic signals in response to a receive timing signal, and a signal processor for processing the sampled direct RF path electromagnetic signal and providing an indication of the characteristic of an object. Usually, the electromagnetic signal is a short RF burst and the obstruction must provide a substantially complete eclipse of the direct RF path. By employing time-of-flight techniques, a timing circuit controls the receiver to sample only the initial direct RF path of the electromagnetic signal while not sampling indirect path electromagnetic signals. The sensor system also incorporates circuitry for ultra-wideband spread spectrum operation that reduces interference to and from other RF services while allowing co-location of multiple electronic sensors without the need for frequency assignments.

  20. Narrow field electromagnetic sensor system and method

    DOEpatents

    McEwan, T.E.

    1996-11-19

    A narrow field electromagnetic sensor system and method of sensing a characteristic of an object provide the capability to realize a characteristic of an object such as density, thickness, or presence, for any desired coordinate position on the object. One application is imaging. The sensor can also be used as an obstruction detector or an electronic trip wire with a narrow field without the disadvantages of impaired performance when exposed to dirt, snow, rain, or sunlight. The sensor employs a transmitter for transmitting a sequence of electromagnetic signals in response to a transmit timing signal, a receiver for sampling only the initial direct RF path of the electromagnetic signal while excluding all other electromagnetic signals in response to a receive timing signal, and a signal processor for processing the sampled direct RF path electromagnetic signal and providing an indication of the characteristic of an object. Usually, the electromagnetic signal is a short RF burst and the obstruction must provide a substantially complete eclipse of the direct RF path. By employing time-of-flight techniques, a timing circuit controls the receiver to sample only the initial direct RF path of the electromagnetic signal while not sampling indirect path electromagnetic signals. The sensor system also incorporates circuitry for ultra-wideband spread spectrum operation that reduces interference to and from other RF services while allowing co-location of multiple electronic sensors without the need for frequency assignments. 12 figs.

  1. Studies of narrow autoionizing resonances in gadolinium

    SciTech Connect

    Bushaw, Bruce A.; Nortershauser, W.; Blaum, K.; Wendt, Klaus

    2003-06-30

    The autoionization (AI) spectrum of gadolinium between the first and second limits has been investigated by triple-resonance excitation with high-resolution cw lasers. A large number of narrow AI resonances have been observed and assigned total angular momentum J values. The resonances are further divided into members of AI Rydberg series converging to the second limit or other "interloping" levels. Fine structure in the Rydberg series has been identified and interpreted in terms of Jc-j coupling. A number of detailed studies have been performed on the interloping resonances: these include lifetime determination by lineshape analysis, isotope shifts, hyperfine structure, and photoionization saturation parameters. The electronic structure of the interloping levels is discussed in terms of these studies. Linewidths generally decrease with increasing total angular momentum and the J = 7 resonances are extremely narrow, with Lorentzian widths ranging from < 1 MHz up to 157 MHz. The strongest resonances are found to have cross-sections of ≈10⁻¹² cm² and photoionization can be saturated with powers available from cw diode lasers.

  2. Small Angle Neutron Scattering

    SciTech Connect

    Urban, Volker S

    2012-01-01

    Small Angle Neutron Scattering (SANS) probes structural details at the nanometer scale in a non-destructive way. This article gives an introduction to scientists who have no prior small-angle scattering knowledge, but who seek a technique that allows elucidating structural information in challenging situations that thwart approaches by other methods. SANS is applicable to a wide variety of materials including metals and alloys, ceramics, concrete, glasses, polymers, composites and biological materials. Isotope and magnetic interactions provide unique methods for labeling and contrast variation to highlight specific structural features of interest. In situ studies of a material's responses to temperature, pressure, shear, magnetic and electric fields, etc., are feasible as a result of the high penetrating power of neutrons. SANS provides statistical information on significant structural features averaged over the probed sample volume, and one can use SANS to quantify with high precision the structural details that are observed, for example, in electron microscopy. Neutron scattering is non-destructive; there is no need to cut specimens into thin sections, and neutrons penetrate deeply, providing information on the bulk material, free from surface effects. The basic principles of a SANS experiment are fairly simple, but the measurement, analysis and interpretation of small angle scattering data involves theoretical concepts that are unique to the technique and that are not widely known. This article includes a concise description of the basics, as well as practical know-how that is essential for a successful SANS experiment.
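    The length scale SANS probes follows from the momentum transfer q = (4π/λ) sin(θ/2), where θ is the full scattering angle, with real-space scale d ≈ 2π/q. Illustrative numbers, not tied to any particular instrument:

```python
import math

def q_momentum_transfer(wavelength_A, scattering_angle_deg):
    """Momentum transfer q = (4*pi/lambda) * sin(theta/2), in inverse Angstroms."""
    theta = math.radians(scattering_angle_deg)
    return 4 * math.pi / wavelength_A * math.sin(theta / 2)

q = q_momentum_transfer(6.0, 1.0)   # 6 A cold neutrons, 1 degree scattering angle
d = 2 * math.pi / q                 # probed real-space length scale [A]
print(f"q = {q:.4f} 1/A  ->  d ~ {d:.0f} A")
```

Small angles and long wavelengths push q down, which is why SANS reaches nanometer-to-hundreds-of-nanometers structures.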

  3. Focal lengths of Venus Monitoring Camera from limb locations

    NASA Astrophysics Data System (ADS)

    Limaye, Sanjay S.; Markiewicz, W. J.; Krauss, R.; Ignatiev, N.; Roatsch, T.; Matz, K. D.

    2015-08-01

    The Venus Monitoring Camera (VMC) carried by European Space Agency's Venus Express orbiter (Svedhem et al., 2007) consists of four optical units, each with a separate filter casting an image on a single CCD (Markiewicz et al., 2007a, 2007b). The desire to capture as much of the planet in a single frame during the spacecraft's 24 h, 0.84 eccentricity orbit led to optics with an 18° field of view. Analysis of Venus images obtained by the VMC indicated that the computed limb radius and altitude of haze layers were somewhat inconsistent with prior knowledge and expectations. Possible causes include errors in the knowledge of image geometry, misalignment of the optic axis from the pointing direction, and optical distortion. These were explored and eliminated, leaving only deviations from the ground (pre-solar damage) estimates of the focal lengths as the most likely reason. We use the location of the planet's limb to estimate the focal length of each camera using images of the planet when the orbiter was more than 20,000 km from the planet center. The method relies on the limb radius being constant at least over a small range of solar zenith angles. We were able to achieve better estimates for the focal lengths for all four cameras and also estimate small offsets to the boresight alignment. An outcome of this analysis is the finding that the slant unit optical depth varies more rapidly with solar zenith angle in the afternoon as compared to morning, with lowest values at local noon. A variation of this level is also observed with latitude. Both are indicative of the presence of overlying haze above the clouds, and the morning-afternoon asymmetry suggests different photochemical processes in destruction and production of the haze.
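The limb-based focal length estimate reduces to simple geometry: from range D the limb subtends a half-angle θ = asin(R/D), which projects to an image circle of radius r = f·tan(θ). A minimal sketch with made-up numbers (not VMC calibration values):

```python
import math

def focal_length_from_limb(r_image_mm, planet_radius_km, range_km):
    """Focal length from the measured limb radius on the detector.

    The limb subtends a half-angle theta = asin(R / D) as seen from the
    camera and projects to a circle of radius r = f * tan(theta), so
    f = r / tan(asin(R / D)).
    """
    theta = math.asin(planet_radius_km / range_km)
    return r_image_mm / math.tan(theta)

# Illustrative numbers: a 6,120 km limb radius (cloud-top altitude included)
# seen from 40,000 km range, measured as 2.0 mm on the detector.
f = focal_length_from_limb(2.0, 6120.0, 40000.0)
print(f"focal length ≈ {f:.2f} mm")
```

In practice the measured limb radius in pixels is converted to millimeters via the pixel pitch, and the fit is repeated over many images to average out limb-detection noise.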

  4. Propagation modeling results for narrow-beam undersea laser communications

    NASA Astrophysics Data System (ADS)

    Fletcher, Andrew S.; Hardy, Nicholas D.; Hamilton, Scott A.

    2016-03-01

    Communication links through ocean waters are challenging due to undersea propagation physics. Undersea optical communications at blue or green wavelengths can achieve high data rates (megabit- to gigabit-per-second class links) despite the challenging undersea medium. Absorption and scattering in ocean waters attenuate optical signals and distort the waveform through dense multipath. The exponential propagation loss and the temporal spread due to multipath limit the achievable link distance and data rate. In this paper, we describe the Monte Carlo modeling of the undersea scattering and absorption channel. We model photon signal attenuation levels, spatial photon distributions, time of arrival statistics, and angle of arrival statistics for a variety of lasercom scenarios through both clear and turbid water environments. Modeling results inform the design options for an undersea optical communication system, particularly illustrating the advantages of narrow-beam lasers compared to wide beam methods (e.g. LED sources). The modeled pupil plane and focal plane photon arrival distributions enable beam tracking techniques for robust pointing solutions, even in highly scattering harbor waters. Laser communication with collimated beams maximizes the photon transfer through the scattering medium and enables spatial and temporal filters to minimize waveform distortion and background interference.
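Two building blocks of the Monte Carlo channel model described above are sampling the free path between collisions from Beer's law and sampling scattering angles from a phase function such as Henyey-Greenstein. A minimal sketch with illustrative water properties (not the paper's parameters):

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative water optical properties: absorption a, scattering b,
# beam attenuation c = a + b, all in m^-1.
a, b = 0.05, 0.25
c = a + b
albedo = b / c          # probability a collision scatters rather than absorbs
g = 0.92                # Henyey-Greenstein asymmetry, typical of ocean water

# Free path lengths between collisions follow Beer's law: p(l) = c * exp(-c*l),
# sampled by inverting the CDF with uniform deviates.
paths = -np.log(rng.random(200_000)) / c

# Henyey-Greenstein sampling of the scattering-angle cosine (standard
# inverse-CDF formula); its mean equals the asymmetry parameter g.
u = rng.random(200_000)
cos_theta = (1 + g**2 - ((1 - g**2) / (1 - g + 2 * g * u)) ** 2) / (2 * g)

print(f"mean free path  = {paths.mean():.3f} m (expect {1/c:.3f})")
print(f"mean cos(theta) = {cos_theta.mean():.3f} (expect {g:.2f})")
```

A full channel model chains these draws per photon, tracking position, direction, and time of flight until absorption or arrival at the receiver pupil.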

  5. High Speed Digital Camera Technology Review

    NASA Technical Reports Server (NTRS)

    Clements, Sandra D.

    2009-01-01

    A High Speed Digital Camera Technology Review (HSD Review) is being conducted to evaluate the state-of-the-shelf in this rapidly progressing industry. Five HSD cameras supplied by four camera manufacturers participated in a Field Test during the Space Shuttle Discovery STS-128 launch. Each camera was also subjected to Bench Tests in the ASRC Imaging Development Laboratory. Evaluation of the data from the Field and Bench Tests is underway. Representatives from the imaging communities at NASA / KSC and the Optical Systems Group are participating as reviewers. A High Speed Digital Video Camera Draft Specification was updated to address Shuttle engineering imagery requirements based on findings from this HSD Review. This draft specification will serve as the template for a High Speed Digital Video Camera Specification to be developed for the wider OSG imaging community under OSG Task OS-33.

  6. Television camera video level control system

    NASA Technical Reports Server (NTRS)

    Kravitz, M.; Freedman, L. A.; Fredd, E. H.; Denef, D. E. (Inventor)

    1985-01-01

    A video level control system is provided which generates a normalized video signal for a camera processing circuit. The video level control system includes a lens iris which provides a controlled light signal to a camera tube. The camera tube converts the light signal provided by the lens iris into electrical signals. A feedback circuit, in response to the electrical signals generated by the camera tube, provides feedback signals to the lens iris and the camera tube. This assures that a normalized video signal is provided in a first illumination range. An automatic gain control loop, which is also responsive to the electrical signals generated by the camera tube, operates in tandem with the feedback circuit. This assures that the normalized video signal is maintained in a second illumination range.

  7. A telephoto camera system with shooting direction control by gaze detection

    NASA Astrophysics Data System (ADS)

    Teraya, Daiki; Hachisu, Takumi; Yendo, Tomohiro

    2015-05-01

    For safe driving, it is important for the driver to check traffic conditions, such as traffic lights or traffic signs, as early as possible. If an on-vehicle camera takes images of the objects needed to understand traffic conditions from a long distance and shows them to the driver, the driver can understand traffic conditions earlier. To image distant objects clearly, the focal length of the camera must be long. When the focal length is long, however, an on-vehicle camera does not have a sufficient field of view to check traffic conditions. Therefore, to obtain the necessary images from a long distance, the camera must combine a long focal length with a controllable shooting direction. In a previous study, the driver indicated the shooting direction on a displayed image taken by a wide-angle camera, and a direction-controllable camera took a telescopic image and displayed it to the driver. However, that study used a touch panel to indicate the shooting direction, which can disturb driving. We therefore propose a telephoto camera system for driving support whose shooting direction is controlled by the driver's gaze, to avoid disturbing driving. The proposed system is composed of a gaze detector and an active telephoto camera with a controllable shooting direction. We adopt a non-wearable detection method to avoid hindering driving. The gaze detector measures the driver's gaze by image processing. The shooting direction of the active telephoto camera is controlled by galvanometer scanners, and the direction can be switched within a few milliseconds. Experiments confirmed that the proposed system takes images in the direction of the subject's gaze.

  8. A state observer for using a slow camera as a sensor for fast control applications

    NASA Astrophysics Data System (ADS)

    Gahleitner, Reinhard; Schagerl, Martin

    2013-03-01

    This contribution addresses a problem that often arises in vision-based control, when a camera is used as a sensor for fast control applications, or more precisely, when the sample rate of the control loop is higher than the frame rate of the camera. In control applications for mechanical axes, e.g. in robotics or automated production, a camera and some image processing can be used as a sensor to detect positions or angles. The sample time in these applications is typically in the range of a few milliseconds or less, and this demands the use of a camera with a high frame rate up to 1000 fps. The presented solution is a special state observer that can work with a slower and therefore cheaper camera to estimate the state variables at the higher sample rate of the control loop. To simplify the image processing for the determination of positions or angles and make it more robust, LED markers are applied to the plant. Simulation and experimental results show that the concept can be used even if the plant is unstable, like the inverted pendulum.
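The core idea of such a multirate observer is to predict the state at every fast control step and correct only when a camera frame arrives. A minimal sketch using a generic double-integrator plant and a hand-tuned Luenberger gain (illustrative, not the paper's observer design):

```python
import numpy as np

# Plant: double integrator (position, velocity), simulated at 1 kHz.
# The camera measures position only, at 50 Hz (every 20th control step).
dt = 0.001
A = np.array([[1.0, dt], [0.0, 1.0]])
B = np.array([0.5 * dt**2, dt])
C = np.array([1.0, 0.0])
L = np.array([0.4, 4.0])        # observer gain, hand-tuned for this sketch

x = np.array([0.2, 0.0])        # true state: 0.2 m initial offset
x_hat = np.zeros(2)             # observer state, starts at zero

for k in range(2000):
    u = 0.0                     # open loop here; a controller would use x_hat
    x = A @ x + B * u           # true plant
    x_hat = A @ x_hat + B * u   # fast prediction every 1 ms
    if k % 20 == 0:             # slow correction when a camera frame arrives
        y = C @ x               # noise-free position measurement, for brevity
        x_hat = x_hat + L * (y - C @ x_hat)

print(f"estimation error = {abs(x[0] - x_hat[0]):.2e} m")
```

Between frames the estimate runs open loop on the model, so model accuracy and measurement noise set how far the camera frame rate can be reduced in practice.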

  9. Fraunhofer diffraction to determine the twin angle in single-crystal BaTiO3.

    PubMed

    Melnichuk, Mike; Wood, Lowell T

    2003-08-01

    We present a new method for determining the electrically induced twin angle alpha of a (100) bulk single crystal of barium titanate (BaTiO3) using a nondestructive optical technique based on Fraunhofer diffraction. The technique required two steps that were performed simultaneously. First, we analyzed the diffracted light intensity captured with a line camera. Second, we measured the size of the diffracting element by analyzing images of the crystal's surface taken with a CCD camera. The value obtained for the twin angle is 0.67 degrees +/- 0.05 degrees, which compares favorably with the theoretical value of 0.63 degrees.
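The Fraunhofer relationship underlying such measurements links feature size to fringe spacing: single-slit minima fall at sin(θ_m) = m·λ/a, so the width of a diffracting element follows from the spacing of adjacent minima on the line camera. A minimal sketch with illustrative numbers (not the paper's measured values):

```python
# In the small-angle limit, adjacent Fraunhofer minima recorded on a line
# camera at distance L are separated by dx = lambda * L / a, so the
# diffracting element width is a = lambda * L / dx.
wavelength = 632.8e-9      # He-Ne laser wavelength, m (illustrative choice)
L = 0.50                   # distance from crystal to line camera, m
dx = 1.58e-3               # measured spacing of adjacent minima, m

a = wavelength * L / dx
print(f"diffracting element width ≈ {a * 1e6:.1f} µm")
```

The same element size, measured independently from the CCD surface images, cross-checks the diffraction analysis from which the twin angle is extracted.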

  10. Electrostatic camera system functional design study

    NASA Technical Reports Server (NTRS)

    Botticelli, R. A.; Cook, F. J.; Moore, R. F.

    1972-01-01

    A functional design study for an electrostatic camera system for application to planetary missions is presented. The electrostatic camera can produce and store a large number of pictures and provide for transmission of the stored information at arbitrary times after exposure. Preliminary configuration drawings and circuit diagrams for the system are illustrated. The camera system's size, weight, power consumption, and performance are characterized. Tradeoffs between system weight, power, and storage capacity are identified.

  11. Stationary Camera Aims And Zooms Electronically

    NASA Technical Reports Server (NTRS)

    Zimmermann, Steven D.

    1994-01-01

    Microprocessors select, correct, and orient portions of hemispherical field of view. Video camera pans, tilts, zooms, and provides rotations of images of objects of field of view, all without moving parts. Used for surveillance in areas where movement of camera conspicuous or constrained by obstructions. Also used for closeup tracking of multiple objects in field of view or to break image into sectors for simultaneous viewing, thereby replacing several cameras.

  12. Development of biostereometric experiments. [stereometric camera system

    NASA Technical Reports Server (NTRS)

    Herron, R. E.

    1978-01-01

    The stereometric camera was designed for close-range techniques in biostereometrics. The camera focusing distance of 360 mm to infinity covers a broad field of close-range photogrammetry. The design provides for a separate unit for the lens system and interchangeable backs on the camera for the use of single frame film exposure, roll-type film cassettes, or glass plates. The system incorporates the use of a surface contrast optical projector.

  13. Comparative evaluation of consumer grade cameras and mobile phone cameras for close range photogrammetry

    NASA Astrophysics Data System (ADS)

    Chikatsu, Hirofumi; Takahashi, Yoji

    2009-08-01

    The authors have been concentrating on developing convenient 3D measurement methods using consumer grade digital cameras, and concluded that consumer grade digital cameras are expected to become useful photogrammetric devices in various close range application fields. Meanwhile, mobile phone cameras with 10 megapixels have appeared on the Japanese market. In these circumstances, we face the question of whether mobile phone cameras can take the place of consumer grade digital cameras in close range photogrammetric applications. In order to evaluate the potential of mobile phone cameras in close range photogrammetry, this paper presents a comparative evaluation of mobile phone cameras and consumer grade digital cameras with respect to lens distortion, reliability, stability and robustness. Calibration tests for 16 mobile phone cameras and 50 consumer grade digital cameras were conducted indoors using a test target. Furthermore, the practicability of mobile phone cameras for close range photogrammetry was evaluated outdoors. This paper shows that mobile phone cameras have the ability to take the place of consumer grade digital cameras and to develop the market in digital photogrammetric fields.
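The lens distortion compared in such calibration tests is commonly parameterized with a Brown-style radial model, where normalized image points are displaced by r² and r⁴ terms. A minimal sketch with illustrative coefficients (not values measured in this study):

```python
def radial_distort(x, y, k1, k2):
    """Brown radial distortion: a point (x, y) in normalized image
    coordinates is scaled by 1 + k1*r^2 + k2*r^4, r^2 = x^2 + y^2."""
    r2 = x**2 + y**2
    factor = 1 + k1 * r2 + k2 * r2**2
    return x * factor, y * factor

# Illustrative coefficients: barrel distortion (negative k1) of the kind
# camera calibration estimates, then inverts, for each tested camera.
x, y = 0.5, 0.3
xd, yd = radial_distort(x, y, k1=-0.12, k2=0.02)
print(f"({x}, {y}) -> ({xd:.4f}, {yd:.4f})")
```

Calibration estimates k1 and k2 (plus principal point and focal length) by bundle adjustment over target images; comparing the stability of these coefficients across sessions is one way to rank camera classes.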

  14. 33 CFR 117.561 - Kent Island Narrows.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 33 Navigation and Navigable Waters 1 2012-07-01 2012-07-01 false Kent Island Narrows. 117.561... DRAWBRIDGE OPERATION REGULATIONS Specific Requirements Maryland § 117.561 Kent Island Narrows. The draw of the U.S. Route 50/301 bridge, mile 1.0, Kent Island Narrows, operates as follows: (a) From November...

  15. 33 CFR 117.561 - Kent Island Narrows.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 33 Navigation and Navigable Waters 1 2010-07-01 2010-07-01 false Kent Island Narrows. 117.561... DRAWBRIDGE OPERATION REGULATIONS Specific Requirements Maryland § 117.561 Kent Island Narrows. The draw of the U.S. Route 50/301 bridge, mile 1.0, Kent Island Narrows, operates as follows: (a) From November...

  16. 33 CFR 117.561 - Kent Island Narrows.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 33 Navigation and Navigable Waters 1 2013-07-01 2013-07-01 false Kent Island Narrows. 117.561... DRAWBRIDGE OPERATION REGULATIONS Specific Requirements Maryland § 117.561 Kent Island Narrows. The draw of the U.S. Route 50/301 bridge, mile 1.0, Kent Island Narrows, operates as follows: (a) From November...

  17. 33 CFR 117.561 - Kent Island Narrows.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 33 Navigation and Navigable Waters 1 2014-07-01 2014-07-01 false Kent Island Narrows. 117.561... DRAWBRIDGE OPERATION REGULATIONS Specific Requirements Maryland § 117.561 Kent Island Narrows. The draw of the U.S. Route 50/301 bridge, mile 1.0, Kent Island Narrows, operates as follows: (a) From November...

  18. 33 CFR 117.561 - Kent Island Narrows.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 33 Navigation and Navigable Waters 1 2011-07-01 2011-07-01 false Kent Island Narrows. 117.561... DRAWBRIDGE OPERATION REGULATIONS Specific Requirements Maryland § 117.561 Kent Island Narrows. The draw of the U.S. Route 50/301 bridge, mile 1.0, Kent Island Narrows, operates as follows: (a) From November...

  19. 2. Photocopied July 1971 from photostat Jordan Narrows Folder #1, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. Photocopied July 1971 from photostat Jordan Narrows Folder #1, Engineering Department, Utah Power and Light Co., Salt Lake City, Utah. JORDAN NARROWS STATION. PLAN AND SECTION. - Salt Lake City Water & Electrical Power Company, Jordan Narrows Hydroelectric Plant, Jordan River, Riverton, Salt Lake County, UT

  20. Advanced High-Definition Video Cameras

    NASA Technical Reports Server (NTRS)

    Glenn, William

    2007-01-01

    A product line of high-definition color video cameras, now under development, offers a superior combination of desirable characteristics, including high frame rates, high resolutions, low power consumption, and compactness. Several of the cameras feature a 3,840 x 2,160-pixel format with progressive scanning at 30 frames per second. The power consumption of one of these cameras is about 25 W. The size of the camera, excluding the lens assembly, is 2 by 5 by 7 in. (about 5.1 by 12.7 by 17.8 cm). The aforementioned desirable characteristics are attained at relatively low cost, largely by utilizing digital processing in advanced field-programmable gate arrays (FPGAs) to perform all of the many functions (for example, color balance and contrast adjustments) of a professional color video camera. The processing is programmed in VHDL so that application-specific integrated circuits (ASICs) can be fabricated directly from the program. ["VHDL" signifies VHSIC Hardware Description Language, a computing language used by the United States Department of Defense for describing, designing, and simulating very-high-speed integrated circuits (VHSICs).] The image-sensor and FPGA clock frequencies in these cameras have generally been much higher than those used in video cameras designed and manufactured elsewhere. Frequently, the outputs of these cameras are converted to other video-camera formats by use of pre- and post-filters.

  1. Advantages of improved timing accuracy in PET cameras using LSOscintillator

    SciTech Connect

    Moses, William W.

    2002-12-02

    PET scanners based on LSO have the potential for significantly better coincidence timing resolution than the 6 ns fwhm typically achieved with BGO. This study analyzes the performance enhancements made possible by improved timing as a function of the coincidence time resolution. If 500 ps fwhm coincidence timing resolution can be achieved in a complete PET camera, the following four benefits can be realized for whole-body FDG imaging: 1) The random event rate can be reduced by using a narrower coincidence timing window, increasing the peak NECR by ~50 percent. 2) Using time-of-flight in the reconstruction algorithm will reduce the noise variance by a factor of 5. 3) Emission and transmission data can be acquired simultaneously, reducing the total scan time. 4) Axial blurring can be reduced by using time-of-flight to determine the correct axial plane that each event originated from. While time-of-flight was extensively studied in the 1980's, practical factors limited its effectiveness at that time, and little attention has been paid to timing in PET since then. As these potential improvements are substantial and the advent of LSO PET cameras gives us the means to obtain them without other sacrifices, efforts to improve PET timing should resume after their long dormancy.
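The factor-of-5 noise variance reduction cited for time-of-flight follows from a common rule of thumb: timing resolution dt localizes each event to dx = c·dt/2 along the line of response, and the variance gain for an object of diameter D is roughly D/dx. With the abstract's 500 ps and a torso-sized object:

```python
# Rule-of-thumb time-of-flight gain: dx = c * dt / 2 localizes each event
# along the line of response; variance reduction f ≈ D / dx = 2D / (c * dt).
c = 3.0e8              # speed of light, m/s
dt = 500e-12           # coincidence timing resolution, s (from the abstract)
D = 0.40               # object diameter, m (illustrative torso size)

dx = c * dt / 2        # per-event localization along the line of response
f = D / dx
print(f"dx = {dx * 100:.1f} cm, variance reduction ≈ {f:.1f}x")
```

This yields dx = 7.5 cm and f ≈ 5.3, consistent with the "factor of 5" quoted in the abstract for a 40 cm object.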

  2. The VENμS super-spectral camera

    NASA Astrophysics Data System (ADS)

    Topaz, Jeremy; Tinto, Francesc; Hagolle, Olivier

    2006-09-01

    A 5m GSD satellite camera with 12 narrow spectral bands in the VNIR region is being developed by El-Op, Israel, for a cooperative project between CNES (France) and the Israel Space Agency. The satellite, called "VENμS" (Vegetation and Environment monitoring on a New Micro-Satellite), will enable evaluation of the use of high-resolution, high-repetitivity, super-spectral imaging data for vegetation and environmental monitoring. The camera will image a limited number of selected sites around the globe with a two-day revisit interval. Highly demanding requirements for signal-to-noise ratio, radiometric accuracy, band-to-band registration and precise location on the ground will ensure the validity of the data. It will also help to define the optimal set of bands and the image processing algorithms of future instruments in the framework of the GMES program. The satellite bus will be built by Israel Aircraft Industries and will also carry an experimental ion propulsion system developed by Rafael (Israel).

  3. Optical Enhancement of Exoskeleton-Based Estimation of Glenohumeral Angles.

    PubMed

    Cortés, Camilo; Unzueta, Luis; de Los Reyes-Guzmán, Ana; Ruiz, Oscar E; Flórez, Julián

    2016-01-01

    In Robot-Assisted Rehabilitation (RAR) the accurate estimation of the patient limb joint angles is critical for assessing therapy efficacy. In RAR, the use of classic motion capture systems (MOCAPs) (e.g., optical and electromagnetic) to estimate the Glenohumeral (GH) joint angles is hindered by the exoskeleton body, which causes occlusions and magnetic disturbances. Moreover, the exoskeleton posture does not accurately reflect limb posture, as their kinematic models differ. To address the said limitations in posture estimation, we propose installing the cameras of an optical marker-based MOCAP in the rehabilitation exoskeleton. Then, the GH joint angles are estimated by combining the estimated marker poses and exoskeleton Forward Kinematics. Such a hybrid system prevents problems related to marker occlusions, reduced camera detection volume, and imprecise joint angle estimation due to the kinematic mismatch of the patient and exoskeleton models. This paper presents the formulation, simulation, and accuracy quantification of the proposed method with simulated human movements. In addition, a sensitivity analysis of the method accuracy to marker position estimation errors, due to system calibration errors and marker drifts, has been carried out. The results show that, even with significant errors in the marker position estimation, method accuracy is adequate for RAR.

  4. Optical Enhancement of Exoskeleton-Based Estimation of Glenohumeral Angles

    PubMed Central

    Cortés, Camilo; Unzueta, Luis; de los Reyes-Guzmán, Ana; Ruiz, Oscar E.; Flórez, Julián

    2016-01-01

    In Robot-Assisted Rehabilitation (RAR) the accurate estimation of the patient limb joint angles is critical for assessing therapy efficacy. In RAR, the use of classic motion capture systems (MOCAPs) (e.g., optical and electromagnetic) to estimate the Glenohumeral (GH) joint angles is hindered by the exoskeleton body, which causes occlusions and magnetic disturbances. Moreover, the exoskeleton posture does not accurately reflect limb posture, as their kinematic models differ. To address the said limitations in posture estimation, we propose installing the cameras of an optical marker-based MOCAP in the rehabilitation exoskeleton. Then, the GH joint angles are estimated by combining the estimated marker poses and exoskeleton Forward Kinematics. Such a hybrid system prevents problems related to marker occlusions, reduced camera detection volume, and imprecise joint angle estimation due to the kinematic mismatch of the patient and exoskeleton models. This paper presents the formulation, simulation, and accuracy quantification of the proposed method with simulated human movements. In addition, a sensitivity analysis of the method accuracy to marker position estimation errors, due to system calibration errors and marker drifts, has been carried out. The results show that, even with significant errors in the marker position estimation, method accuracy is adequate for RAR. PMID:27403044

  5. Gesture recognition on smart cameras

    NASA Astrophysics Data System (ADS)

    Dziri, Aziz; Chevobbe, Stephane; Darouich, Mehdi

    2013-02-01

    Gesture recognition is a feature in human-machine interaction that allows more natural interaction without the use of complex devices. For this reason, several methods of gesture recognition have been developed in recent years. However, most real-time methods are designed to operate on a Personal Computer with high computing resources and memory. In this paper, we analyze relevant methods found in the literature in order to investigate the ability of smart cameras to execute gesture recognition algorithms. We elaborate two hand gesture recognition pipelines. The first method is based on invariant moments extraction and the second on fingertip detection. The hand detection method used for both pipelines is based on skin color segmentation. The results obtained show that the un-optimized versions of the invariant moments method and the fingertip detection method can reach 10 fps on an embedded processor and use about 200 kB of memory.
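The invariant-moments pipeline relies on shape descriptors that do not change when the segmented hand shifts in the frame. A minimal numpy sketch of the first Hu invariant (η20 + η02) computed from a binary mask (illustrative, not the authors' implementation):

```python
import numpy as np

def hu_first_invariant(mask):
    """First Hu moment (eta20 + eta02) of a binary mask, built from
    central moments normalized for translation and scale."""
    ys, xs = np.nonzero(mask)
    m00 = len(xs)                       # zeroth moment: pixel count
    xbar, ybar = xs.mean(), ys.mean()   # centroid removes translation
    mu20 = ((xs - xbar) ** 2).sum()     # second-order central moments
    mu02 = ((ys - ybar) ** 2).sum()
    eta20 = mu20 / m00**2               # normalization (order p+q=2) removes scale
    eta02 = mu02 / m00**2
    return eta20 + eta02

# A simple blob and its translated copy give the same invariant.
img = np.zeros((64, 64), dtype=bool)
img[10:20, 12:30] = True
shifted = np.roll(np.roll(img, 25, axis=0), 20, axis=1)

print(hu_first_invariant(img), hu_first_invariant(shifted))
```

The full Hu set adds rotation invariance from higher-order moments; in the pipeline these descriptors feed a classifier that maps each segmented hand mask to a gesture class.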

  6. Illumination box and camera system

    DOEpatents

    Haas, Jeffrey S.; Kelly, Fredrick R.; Bushman, John F.; Wiefel, Michael H.; Jensen, Wayne A.; Klunder, Gregory L.

    2002-01-01

    A hand portable, field-deployable thin-layer chromatography (TLC) unit and a hand portable, battery-operated unit for development, illumination, and data acquisition of the TLC plates contain many miniaturized features that permit a large number of samples to be processed efficiently. The TLC unit includes a solvent tank, a holder for TLC plates, and a variety of tool chambers for storing TLC plates, solvent, and pipettes. After processing in the TLC unit, a TLC plate is positioned in a collapsible illumination box, where the box and a CCD camera are optically aligned for optimal pixel resolution of the CCD images of the TLC plate. The TLC system includes an improved development chamber for chemical development of TLC plates that prevents solvent overflow.

  7. Explosive Transient Camera (ETC) Program

    NASA Technical Reports Server (NTRS)

    Ricker, George

    1991-01-01

    Since the inception of the ETC program, a wide range of new technologies was developed to support this astronomical instrument. The prototype unit was installed at ETC Site 1. The first partially automated observations were made and some major renovations were later added to the ETC hardware. The ETC was outfitted with new thermoelectrically-cooled CCD cameras and a sophisticated vacuum manifold, which, together, made the ETC a much more reliable unit than the prototype. The ETC instrumentation and building were placed under full computer control, allowing the ETC to operate as an automated, autonomous instrument with virtually no human intervention necessary. The first fully-automated operation of the ETC was performed, during which the ETC monitored the error region of the repeating soft gamma-ray burster SGR 1806-21.

  8. Camera processing with chromatic aberration.

    PubMed

    Korneliussen, Jan Tore; Hirakawa, Keigo

    2014-10-01

    Since the refractive index of materials commonly used for lenses depends on the wavelength of light, practical camera optics fail to converge light to a single point on an image plane. Known as chromatic aberration, this phenomenon distorts image details by introducing magnification error, defocus blur, and color fringes. Though achromatic and apochromatic lens designs reduce chromatic aberration to a degree, they are complex and expensive and they do not offer a perfect correction. In this paper, we propose a new postcapture processing scheme designed to overcome these problems computationally. Specifically, the proposed solution comprises a chromatic aberration-tolerant demosaicking algorithm and post-demosaicking chromatic aberration correction. Experiments with simulated and real sensor data verify that the chromatic aberration is effectively corrected.
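The magnification-error component of lateral chromatic aberration can be pictured as each color channel being imaged at a slightly different scale about the optical center, so realigning a channel amounts to resampling it by the inverse scale. A toy nearest-neighbor sketch (illustrative only, far simpler than the joint demosaicking scheme the paper proposes):

```python
import numpy as np

def rescale_channel(chan, scale):
    """Resample one color channel by a radial magnification about the
    image center (nearest-neighbor), modeling lateral chromatic
    aberration as a per-channel scale difference."""
    h, w = chan.shape
    cy, cx = (h - 1) / 2, (w - 1) / 2
    yy, xx = np.mgrid[0:h, 0:w]
    src_y = np.clip(np.round(cy + (yy - cy) / scale).astype(int), 0, h - 1)
    src_x = np.clip(np.round(cx + (xx - cx) / scale).astype(int), 0, w - 1)
    return chan[src_y, src_x]

# A smooth test image; simulate a 2% magnification error on one channel
# (an illustrative value), then undo it with the inverse scale.
yy, xx = np.mgrid[0:65, 0:65]
green = (yy + xx) / 128.0
red_aberrated = rescale_channel(green, 1.02)     # simulated lateral CA
red_fixed = rescale_channel(red_aberrated, 1 / 1.02)
print("central misalignment:", np.abs(red_fixed - green)[20:45, 20:45].max())
```

Real corrections interpolate rather than snap to the nearest pixel and estimate the per-channel scale (and defocus) from the image itself, which is where the paper's joint treatment with demosaicking comes in.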

  9. Wide dynamic range video camera

    NASA Technical Reports Server (NTRS)

    Craig, G. D. (Inventor)

    1985-01-01

    A television camera apparatus is disclosed in which bright objects are attenuated to fit within the dynamic range of the system, while dim objects are not. The apparatus receives linearly polarized light from an object scene, the light being passed by a beam splitter and focused on the output plane of a liquid crystal light valve. The light valve is oriented such that, with no excitation from the cathode ray tube, all light is rotated 90 deg and focused on the input plane of the video sensor. The light is then converted to an electrical signal, which is amplified and used to excite the CRT. The resulting image is collected and focused by a lens onto the light valve which rotates the polarization vector of the light to an extent proportional to the light intensity from the CRT. The overall effect is to selectively attenuate the image pattern focused on the sensor.

  10. HRSC: High resolution stereo camera

    USGS Publications Warehouse

    Neukum, G.; Jaumann, R.; Basilevsky, A.T.; Dumke, A.; Van Gasselt, S.; Giese, B.; Hauber, E.; Head, J. W.; Heipke, C.; Hoekzema, N.; Hoffmann, H.; Greeley, R.; Gwinner, K.; Kirk, R.; Markiewicz, W.; McCord, T.B.; Michael, G.; Muller, Jan-Peter; Murray, J.B.; Oberst, J.; Pinet, P.; Pischel, R.; Roatsch, T.; Scholten, F.; Willner, K.

    2009-01-01

    The High Resolution Stereo Camera (HRSC) on Mars Express has delivered a wealth of image data, amounting to over 2.5 TB from the start of the mapping phase in January 2004 to September 2008. In that time, more than a third of Mars was covered at a resolution of 10-20 m/pixel in stereo and colour. After five years in orbit, HRSC is still in excellent shape, and it could continue to operate for many more years. HRSC has proven its ability to close the gap between the low-resolution Viking image data and the high-resolution Mars Orbiter Camera images, leading to a global picture of the geological evolution of Mars that is now much clearer than ever before. Derived highest-resolution terrain model data have closed major gaps and provided an unprecedented insight into the shape of the surface, which is paramount not only for surface analysis and geological interpretation, but also for combination with and analysis of data from other instruments, as well as in planning for future missions. This chapter presents the scientific output from data analysis and high-level data processing, complemented by a summary of how the experiment is conducted by the HRSC team members working in geoscience, atmospheric science, photogrammetry and spectrophotometry. Many of these contributions have been or will be published in peer-reviewed journals and special issues. They form a cross-section of the scientific output, either by summarising the new geoscientific picture of Mars provided by HRSC or by detailing some of the topics of data analysis concerning photogrammetry, cartography and spectral data analysis.

  11. Nondecaying hydrodynamic interactions along narrow channels

    NASA Astrophysics Data System (ADS)

    Misiunas, Karolis; Pagliara, Stefano; Lauga, Eric; Lister, John R.; Keyser, Ulrich

    Particle-particle interactions are of paramount importance in every multi-body system, as they determine the collective behaviour and coupling strength. Many well-known interactions, such as electrostatic, van der Waals or screened Coulomb, decay exponentially or with negative powers of the particle spacing r. Similarly, hydrodynamic interactions between particles undergoing Brownian motion decay as 1/r in bulk, and are assumed to decay in small channels. Such interactions are ubiquitous in biological and technological systems. Here we confine two particles undergoing Brownian motion in narrow, microfluidic channels and study their coupling through hydrodynamic interactions. Our experiments show that the hydrodynamic particle-particle interactions are distance-independent in these channels. This finding is of fundamental importance for the interpretation of experiments where dense mixtures of particles or molecules diffuse through finite-length, water-filled channels or pore networks.

  12. Line Narrowing Parameter Measurement by Modulation Spectroscopy

    NASA Technical Reports Server (NTRS)

    Dharamsi, Amin N.

    1998-01-01

    Accurate Characterization of Oxygen A-Band Line Parameters by Wavelength Modulation Spectroscopy with tunable diode lasers is ongoing research at Old Dominion University, under sponsorship from NASA Langley Research Center. The work proposed here will be undertaken under the guidance of Dr. William Chu and Dr. Lamont Poole of the Aerosol Research Branch at NASA Langley Research Center in Hampton, Virginia. The research was started about two years ago and utilizes wavelength modulation absorption spectroscopy with higher harmonic detection, a technique that we developed at Old Dominion University, to obtain the absorption line characteristics of the Oxygen A-band rovibronic lines. Accurate characterization of this absorption band is needed for processing of data that will be obtained in experiments such as the NASA Stratospheric Aerosol and Gas Experiment III (SAGE III) as part of the US Mission to Planet Earth. The Summer Fellowship research work undertook a measurement of the Dicke line-narrowing parameters of the Oxygen A-Band lines by using wavelength modulation spectroscopy. Our previous theoretical results had indicated that such a measurement could be done sensitively and in a convenient fashion by using this type of spectroscopy. In particular, theoretical results had indicated that the signal magnitude would depend on pressure in a manner that was very sensitive to the narrowing parameter. One of the major tasks undertaken during the summer of 1998 was to establish experimentally that these theoretical predictions were correct. This was done successfully and the results of the work are being prepared for publication. Experimental results were obtained in which the magnitude of the signal was measured as a function of pressure, for various harmonic detection orders (N = 1, 2, 3, 4, 5). A comparison with theoretical results was made, and it was shown that the agreement between theory and experiment was very good.
More importantly, however, it was shown

  13. Narrow UV Absorption Line Outflows from Quasars

    NASA Astrophysics Data System (ADS)

    Hamann, F.; Simon, L.; Rodriguez Hidalgo, P.; Capellupo, D.

    2012-08-01

    Narrow absorption line (NAL) outflows are an important yet poorly understood part of the quasar outflow phenomenon. We discuss one particular NAL outflow that has high speeds, time variability, and moderate ionizations like typical BAL flows, at an estimated location just ˜5 pc from the quasar. It also has a total column density and line widths (internal velocity dispersions) ˜100 times smaller than BALs, with no substantial X-ray absorption. We argue that radiative shielding (in the form of an X-ray/warm absorber) is not critical for the outflow acceleration and that the moderate ionizations occur in dense substructures that have an overall small volume filling factor in the flow. We also present new estimates of the overall incidence of quasar outflow lines; e.g., ˜43% of bright quasars have a C IV NAL outflow while ˜68% have a C IV outflow line of any variety (NAL, BAL, or mini-BAL).

  14. A grazing incidence x-ray streak camera for ultrafast, single-shot measurements

    SciTech Connect

    Feng, Jun; Engelhorn, K.; Cho, B.I.; Lee, H.J.; Greaves, M.; Weber, C.P.; Falcone, R.W.; Padmore, H. A.; Heimann, P.A.

    2010-02-18

    An ultrafast x-ray streak camera has been realized using a grazing incidence reflection photocathode. X-rays are incident on a gold photocathode at a grazing angle of 20 degrees, and the photoemitted electrons are focused by a large-aperture magnetic solenoid lens. The streak camera has high quantum efficiency, 600 fs temporal resolution, and a 6 mm imaging length in the spectral direction. Its single-shot capability eliminates temporal smearing due to sweep jitter and allows recording of the ultrafast dynamics of samples that undergo non-reversible changes.

  15. Mission Report on the Orbiter Camera Payload System (OCPS) Large Format Camera (LFC) and Attitude Reference System (ARS)

    NASA Technical Reports Server (NTRS)

    Mollberg, Bernard H.; Schardt, Bruton B.

    1988-01-01

    The Orbiter Camera Payload System (OCPS) is an integrated photographic system which is carried into earth orbit as a payload in the Space Transportation System (STS) Orbiter vehicle's cargo bay. The major component of the OCPS is a Large Format Camera (LFC), a precision wide-angle cartographic instrument that is capable of producing high resolution stereo photography of great geometric fidelity in multiple base-to-height (B/H) ratios. A secondary, supporting system to the LFC is the Attitude Reference System (ARS), which is a dual lens Stellar Camera Array (SCA) and camera support structure. The SCA is a 70-mm film system which is rigidly mounted to the LFC lens support structure and which, through the simultaneous acquisition of two star fields with each earth-viewing LFC frame, makes it possible to determine precisely the pointing of the LFC optical axis with reference to the earth nadir point. Other components complete the current OCPS configuration as a high precision cartographic data acquisition system. The primary design objective for the OCPS was to maximize system performance characteristics while maintaining a high level of reliability compatible with Shuttle launch conditions and the on-orbit environment. The full-up OCPS configuration was launched on a highly successful maiden voyage aboard the STS Orbiter vehicle Challenger on October 5, 1984, as a major payload aboard mission STS 41-G. This report documents the system design, the ground testing, the flight configuration, and an analysis of the results obtained during the Challenger mission STS 41-G.

  16. Laser angle measurement system

    NASA Technical Reports Server (NTRS)

    Pond, C. R.; Texeira, P. D.; Wilbert, R. E.

    1980-01-01

    The design and fabrication of a laser angle measurement system is described. The instrument is a fringe counting interferometer that monitors the pitch attitude of a model in a wind tunnel. A laser source and detector are mounted above the model. Interference fringes are generated by a small passive element on the model. The fringe count is accumulated and displayed by a processor in the wind tunnel control room. Optical and electrical schematics, system maintenance and operation procedures are included, and the results of a demonstration test are given.
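    The fringe-count-to-angle conversion at the heart of such an instrument can be sketched as follows. The wavelength and lever arm below are hypothetical illustrative values, not taken from the report: each counted fringe corresponds to an optical path change of half a wavelength, and for small angles the path change divided by the lever arm gives the pitch angle.

```python
import math

WAVELENGTH = 632.8e-9   # He-Ne laser wavelength in meters (assumed)
LEVER_ARM = 0.15        # pivot-to-reflector distance in meters (hypothetical)

def pitch_angle_rad(fringe_count: int) -> float:
    """Convert an accumulated fringe count to a small pitch angle in radians.

    One fringe = lambda/2 of optical path change; angle ~ path / lever arm.
    """
    path_change = fringe_count * WAVELENGTH / 2.0
    return path_change / LEVER_ARM

# with these assumed values, 1000 fringes correspond to about 2.1 mrad
angle = pitch_angle_rad(1000)
```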

  17. Switchable viewing angle display with a compact directional backlight and striped diffuser.

    PubMed

    Wang, Yi-Jun; Lu, Jian-Gang; Chao, Wei-Chung; Shieh, Han-Ping D

    2015-08-10

    A compact high-directionality backlight module combined with a striped diffuser is proposed to achieve an adjustable viewing angle for eco-displays. The micro-prisms on the compact light guide plate direct the emitted rays into the normal viewing angle, whereas a set of striped diffusers scatters the rays into a wide viewing angle. View cones of ±10°/±55° were obtained for the narrow/wide viewing modes with 88%/85% uniformity of spatial luminance, respectively. Compared with a conventional backlight, the optical efficiency was increased by factors of 1.47 and 1.38 in the narrow and wide viewing modes, respectively. In addition, only 5% of the power consumption was needed when the backlight worked in the private narrow viewing mode to maintain the same luminance as a conventional backlight.

  18. The effect of narrow provider networks on health care use.

    PubMed

    Atwood, Alicia; Lo Sasso, Anthony T

    2016-12-01

    Network design is an often overlooked aspect of health insurance contracts. Recent policy factors have resulted in narrower provider networks. We provide plausibly causal evidence on the effect of narrow network plans offered by a large national health insurance carrier in a major metropolitan market. Our econometric design exploits the fact that some firms offer a narrow network plan to their employees and some do not. Our results show that narrow network health plans lead to reductions in health care utilization and spending. We find evidence that narrow networks save money by selecting lower cost providers into the network.

  19. Fundamental study on identification of CMOS cameras

    NASA Astrophysics Data System (ADS)

    Kurosawa, Kenji; Saitoh, Naoki

    2003-08-01

    In this study, we discuss individual camera identification for CMOS cameras, since CMOS (complementary metal-oxide-semiconductor) imaging detectors have begun to encroach on CCD (charge-coupled device) territory in recent years. Whether given images were taken with a given CMOS camera can be determined by detecting the imager's intrinsic, unique fixed pattern noise (FPN), just as in the individual CCD camera identification method previously proposed by the authors. Both dark and bright pictures taken with CMOS cameras can be identified by the method, because not only dark current in the photodetectors but also the MOSFET amplifier incorporated in each pixel can produce pixel-to-pixel nonuniformity in sensitivity. Each pixel in a CMOS detector has its own amplifier, which degrades the quality of bright images through the nonuniformity of the amplifier gain. Two CMOS cameras were evaluated in our experiments: the WebCamGoPlus (Creative), a low-priced web camera, and the EOS D30 (Canon), a professional model. Images of a white plate were recorded with the cameras at plate luminances of 0 cd/m2 and 150 cd/m2. The recorded images were repeatedly integrated to reduce the random noise component. In the images from both cameras, characteristic dot patterns were observed: some bright dots appeared in the dark images, whereas some dark dots appeared in the bright images. The results show that the camera identification method is also effective for CMOS cameras.
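    The FPN-fingerprint idea can be sketched in a few lines. This is an illustrative reconstruction, not the authors' exact procedure: average many frames of a uniform scene so random noise cancels while the pixel-to-pixel pattern remains, then correlate the fingerprints extracted from two image sets.

```python
import numpy as np

def fixed_pattern_noise(frames: np.ndarray) -> np.ndarray:
    """Estimate a sensor's fixed pattern noise from a stack of images
    (N, H, W) of a uniform scene; averaging suppresses random noise
    while the pixel-to-pixel nonuniformity (the 'fingerprint') remains."""
    mean = frames.mean(axis=0)
    return mean - mean.mean()  # remove the global offset

def same_camera_score(fpn_a: np.ndarray, fpn_b: np.ndarray) -> float:
    """Normalized cross-correlation of two FPN estimates; values near 1
    suggest both image sets came from the same sensor."""
    a = fpn_a.ravel() - fpn_a.mean()
    b = fpn_b.ravel() - fpn_b.mean()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
```

With enough averaged frames the correlation between two stacks from the same sensor approaches 1, while stacks from different sensors correlate near 0.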

  20. Trajectory association across multiple airborne cameras.

    PubMed

    Sheikh, Yaser Ajmal; Shah, Mubarak

    2008-02-01

    A camera mounted on an aerial vehicle provides an excellent means for monitoring large areas of a scene. Utilizing several such cameras on different aerial vehicles allows further flexibility, in terms of increased visual scope and in the pursuit of multiple targets. In this paper, we address the problem of associating objects across multiple airborne cameras. Since the cameras are moving and often widely separated, direct appearance-based or proximity-based constraints cannot be used. Instead, we exploit geometric constraints on the relationship between the motion of each object across cameras, to test multiple association hypotheses, without assuming any prior calibration information. Given our scene model, we propose a likelihood function for evaluating a hypothesized association between observations in multiple cameras that is geometrically motivated. Since multiple cameras exist, ensuring coherency in association is an essential requirement, e.g. that transitive closure is maintained between more than two cameras. To ensure such coherency we pose the problem of maximizing the likelihood function as a k-dimensional matching and use an approximation to find the optimal assignment of association. Using the proposed error function, canonical trajectories of each object and optimal estimates of inter-camera transformations (in a maximum likelihood sense) are computed. Finally, we show that as a result of associating objects across the cameras, a concurrent visualization of multiple aerial video streams is possible and that, under special conditions, trajectories interrupted due to occlusion or missing detections can be repaired. Results are shown on a number of real and controlled scenarios with multiple objects observed by multiple cameras, validating our qualitative models, and through simulation quantitative performance is also reported.
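    For the two-camera case, the assignment step can be sketched with an exhaustive search over permutations; the paper's k-camera formulation instead requires an approximate k-dimensional matching, and the cost matrix below merely stands in for its geometrically motivated negative log-likelihoods.

```python
from itertools import permutations
import numpy as np

def best_association(cost: np.ndarray):
    """Exhaustively find the assignment of trajectories in camera A to
    trajectories in camera B that minimizes total cost (e.g. negative
    log-likelihood of each hypothesized pairing).  Fine for small n;
    larger problems need the Hungarian algorithm or, for k cameras,
    approximate k-dimensional matching."""
    n = cost.shape[0]
    best_total, best_perm = float("inf"), None
    for perm in permutations(range(n)):
        total = sum(cost[i, j] for i, j in enumerate(perm))
        if total < best_total:
            best_total, best_perm = total, perm
    return best_perm, best_total
```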

  1. Multi-camera calibration based on openCV and multi-view registration

    NASA Astrophysics Data System (ADS)

    Deng, Xiao-ming; Wan, Xiong; Zhang, Zhi-min; Leng, Bi-yan; Lou, Ning-ning; He, Shuai

    2010-10-01

    For multi-camera calibration systems, a method based on OpenCV combined with a multi-view registration calibration algorithm is proposed. First, a Zhang calibration plate (an 8x8 chessboard pattern) is imaged from different angles by the cameras (three industrial-grade CCDs), yielding nine groups of images, and OpenCV is used to quickly calibrate the intrinsic parameters of each camera. Second, based on the correspondences between the camera views, the computation of the rotation and translation matrices is formulated as a constrained optimization problem. Using the Kuhn-Tucker theorem and the properties of the derivative of a matrix-valued function, formulae for the rotation and translation matrices are derived via a singular value decomposition algorithm. An iterative method is then used to obtain the full coordinate transformation between pair-wise views, so that precise multi-view registration is conveniently achieved and the relative camera positions (the extrinsic parameters) are obtained. Experimental results show that the method is practical for multi-camera calibration.
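    The per-camera intrinsic step in Zhang's method rests on plane-to-image homographies. As a minimal, library-independent sketch of that building block, the direct linear transform (DLT) estimates the homography from four or more point correspondences (in practice, OpenCV's calibration routines handle this internally):

```python
import numpy as np

def homography_dlt(src: np.ndarray, dst: np.ndarray) -> np.ndarray:
    """Estimate the 3x3 homography H mapping src -> dst (N >= 4 point
    pairs, no three collinear) by the direct linear transform: stack
    two linear constraints per correspondence and take the null-space
    vector via SVD."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    h = vt[-1].reshape(3, 3)
    return h / h[2, 2]   # fix the projective scale
```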

  2. Calibration of hyperspectral close-range pushbroom cameras for plant phenotyping

    NASA Astrophysics Data System (ADS)

    Behmann, Jan; Mahlein, Anne-Katrin; Paulus, Stefan; Kuhlmann, Heiner; Oerke, Erich-Christian; Plümer, Lutz

    2015-08-01

    Hyperspectral sensors are able to detect biological processes of plants which are invisible to the naked eye. Close-range cameras in particular support the identification of biotic and abiotic stress reactions at an early stage. Up to now, their full potential has only been partially realized because geometrical factors such as leaf angle, curvature, and self-shading overlay the signal of biological processes. Suitable 3D plant models constitute an important step toward removing these factors from the data. Matching these 3D models to the hyperspectral image with sufficient accuracy, even for small leaf veins, relies on an adequate geometric calibration of the hyperspectral cameras. We present a method for the geometric calibration of hyperspectral pushbroom cameras at close range, which enables reliable and reproducible results at sub-pixel scale. The approach extends the linear pushbroom camera model with the ability to model non-linear components. The accuracy and reproducibility of the method are validated using a hyperspectral sensor system with two line cameras observing the reflected radiation in the spectral range from 400 to 2500 nm. We point out new potentials arising from the proposed camera calibration, e.g. hyperspectral 3D plant models, which have high potential for crop plant phenotyping.

  3. Spherical Gaussian mixture model and object tracking system for PTZ camera

    NASA Astrophysics Data System (ADS)

    Hwangbo, Seok; Lee, Chan-Su

    2015-05-01

    Recently, pan-tilt-zoom (PTZ) cameras have come into wide use in extensive-area surveillance applications. A number of background modeling methods have been proposed within existing object detection and tracking systems; however, conventional background models for PTZ cameras have difficulty covering an extensive field of view (FOV). This paper presents a novel object tracking system based on a spherical background model for PTZ cameras. The proposed system has two components. The first is the spherical Gaussian mixture model (S-GMM), which learns the background for all view angles of the PTZ camera; the Gaussian parameters of each pixel in the S-GMM are learned and updated. The second is an object tracking system with real-time foreground detection using the S-GMM. The proposed system covers a wide FOV better than a conventional background modeling system for PTZ cameras and is able to track moving objects accurately. We demonstrate the advantages of the proposed S-GMM for an object tracking system using a PTZ camera and expect it to support more advanced surveillance applications.
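    The per-pixel update behind GMM-style background models can be sketched with a single running Gaussian per pixel (the S-GMM keeps a full mixture per pixel of the spherical map); the learning rate and threshold below are illustrative, not from the paper.

```python
import numpy as np

class RunningGaussianBackground:
    """Simplified per-pixel background model: one running Gaussian per
    pixel.  alpha is the learning rate; k is the foreground threshold
    in standard deviations."""

    def __init__(self, first_frame, alpha=0.05, k=2.5):
        self.mean = first_frame.astype(float)
        self.var = np.full_like(self.mean, 15.0 ** 2)  # broad initial variance
        self.alpha, self.k = alpha, k

    def apply(self, frame):
        """Return a foreground mask, then fold the frame into the model."""
        frame = frame.astype(float)
        d2 = (frame - self.mean) ** 2
        fg = d2 > (self.k ** 2) * self.var             # outlier => foreground
        self.mean += self.alpha * (frame - self.mean)  # exponential updates
        self.var += self.alpha * (d2 - self.var)
        return fg
```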

  4. Camera self-calibration from translation by referring to a known camera.

    PubMed

    Zhao, Bin; Hu, Zhaozheng

    2015-09-01

    This paper presents a novel linear method for camera self-calibration by referring to a known (or calibrated) camera. The method requires at least three images, with two images generated by the uncalibrated camera from pure translation and one image generated by the known reference camera. We first propose a method to compute the infinite homography from scene depths. Based on this, we use two images generated by translating the uncalibrated camera to recover scene depths, which are further utilized to linearly compute the infinite homography between an arbitrary uncalibrated image, and the image from the known camera. With the known camera as reference, the computed infinite homography is readily decomposed for camera calibration. The proposed self-calibration method has been tested with simulation and real image data. Experimental results demonstrate that the method is practical and accurate. This paper proposes using a "known reference camera" for camera calibration. The pure translation, as required in the method, is much more maneuverable, compared with some strict motions in the literature, such as pure rotation. The proposed self-calibration method has good potential for solving online camera calibration problems, which has important applications, especially for multicamera and zooming camera systems.
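    The decomposition step can be sketched under the standard model: for views related by rotation only, the infinite homography satisfies H_inf = K R inv(K_ref), so multiplying by the known K_ref and taking an RQ decomposition recovers the unknown intrinsics K. This is a generic reconstruction of that algebra, not the paper's full pipeline (which first recovers H_inf from pure translation and scene depths).

```python
import numpy as np

def rq(M):
    """RQ decomposition: M = K @ R with K upper triangular (positive
    diagonal) and R orthogonal, built from a flipped QR decomposition."""
    P = np.flipud(np.eye(3))
    q, r = np.linalg.qr((P @ M).T)
    K = P @ r.T @ P
    R = P @ q.T
    S = np.diag(np.sign(np.diag(K)))   # force a positive diagonal on K
    return K @ S, S @ R

def calibrate_from_infinite_homography(H_inf, K_ref):
    """Recover K from H_inf = K @ R @ inv(K_ref): the product
    H_inf @ K_ref equals K @ R, whose triangular RQ factor is K."""
    K, _ = rq(H_inf @ K_ref)
    return K / K[2, 2]   # fix the projective scale
```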

  5. Narrow-band holographic optical filter using thick efficient holographic gratings

    NASA Astrophysics Data System (ADS)

    Billmers, Richard I.; Billmers, Elizabeth J.; Burzynski, Ryszard; Weibel, John F.; Heverley, L. H., III; Casstevens, Martin K.; Curran, Thomas P.; Contarino, V. Michael

    2002-06-01

    RL Associates, in conjunction with Hybrid Technologies, is developing a narrow-linewidth optical filter employing extremely thick volume holographic diffraction gratings. The gratings are written in MEMPLEX, a new holographic material invented by Laser Photonics Technology, Inc. and licensed to Hybrid Technologies. MEMPLEX has the following characteristics: (1) excellent optical clarity, (2) preparable at any thickness up to 10 mm, (3) large dynamic range for plane wave holograms, (4) hard, freestanding, stable, polishable, and coatable. We have written and characterized numerous gratings in 1.8 mm thick samples to study the effect of writing geometry on the spectral linewidth and field-of-view of a single grating in the reflection geometry. We have succeeded in writing some very efficient gratings at 15 degree internal write angles with external slant angles of 5 degrees. These gratings exhibit linewidths of < 0.2 nm and diffraction efficiencies of better than 70 percent. The measured angular acceptance of these gratings ranges from 0.1 to 0.24 degrees. We have also written some initial angle-multiplexed gratings, which include three efficient gratings in the same volume, in an attempt to increase the angular acceptance. In this manner we hope to achieve a highly efficient optical filter with extremely narrow spectral linewidth and wide angular acceptance. Filters based on thick volume holograms show great promise in lidar applications and should result in superior S/N ratios.

  6. Variable angle correlation spectroscopy

    SciTech Connect

    Lee, Young Kyo

    1994-05-01

    In this dissertation, a novel nuclear magnetic resonance (NMR) technique, variable angle correlation spectroscopy (VACSY), is described and demonstrated with 13C nuclei in rapidly rotating samples. These experiments focus on one of the basic problems in solid state NMR: how to extract the wealth of information contained in the anisotropic component of the NMR signal while still maintaining spectral resolution. Analysis of the anisotropic spectral patterns from polycrystalline systems reveals information concerning molecular structure and dynamics, yet in all but the simplest of systems, the overlap of spectral patterns from chemically distinct sites renders the spectral analysis difficult if not impossible. One solution to this problem is to perform multi-dimensional experiments where the high-resolution, isotropic spectrum in one dimension is correlated with the anisotropic spectral patterns in the other dimensions. The VACSY technique incorporates the angle between the spinner axis and the static magnetic field as an experimental parameter that may be incremented during the course of the experiment to help correlate the isotropic and anisotropic components of the spectrum. The two-dimensional version of the VACSY experiment is used to extract chemical shift anisotropy tensor values from multi-site organic molecules, to study molecular dynamics in the intermediate time regime, and to examine the ordering properties of partially oriented samples. The VACSY technique is then extended to three-dimensional experiments to study slow molecular reorientations in a multi-site polymer system.

  7. A Note on Angle Construction

    ERIC Educational Resources Information Center

    Francis, Richard L.

    1978-01-01

    The author investigates the construction of angles (using Euclidean tools) through a numerical approach. He calls attention to the surprising impossibility of constructing the conventional units of angle measure--the degree, minute, second, radian, and mil. (MN)

  8. The Pluto System At Small Phase Angles

    NASA Astrophysics Data System (ADS)

    Verbiscer, Anne J.; Buie, Marc W.; Binzel, Richard; Ennico, Kimberly; Grundy, William M.; Olkin, Catherine B.; Showalter, Mark Robert; Spencer, John R.; Stern, S. Alan; Weaver, Harold A.; Young, Leslie; New Horizons Science Team

    2016-10-01

    Hubble Space Telescope observations of the Pluto system acquired during the New Horizons encounter epoch (HST Program 13667, M. Buie, PI) span the phase angle range from 0.06 to 1.7 degrees, enabling the measurement and characterization of the opposition effect for Pluto and its satellites at 0.58 microns using HST WFC3/UVIS with the F350LP filter, which has a broadband response and a pivot wavelength of 0.58 microns. At these small phase angles, differences in the opposition effect width and amplitude appear. The small satellites Nix and Hydra both exhibit a very narrow opposition surge, while the considerably larger moon Charon has a broader opposition surge. Microtextural surface properties derived from the shape and magnitude of the opposition surge of each surface contain a record of the collisional history of the system. We combine these small phase angle observations with those made at larger phase angles by the New Horizons Long Range Reconnaissance Imager (LORRI), which also has a broadband response with a pivot wavelength of 0.61 microns, to produce the most complete disk-integrated solar phase curves that we will have for decades to come. Modeling these disk-integrated phase curves generates sets of photometric parameters that will inform spectral modeling of the satellite surfaces as well as terrains on Pluto from spatially resolved New Horizons Ralph Linear Etalon Imaging Spectral Array (LEISA) data from 1.2 to 2.5 microns. Rotationally resolved phase curves of Pluto reveal opposition effects that only appear at phase angles less than 0.1 degree and have widths and amplitudes that are highly dependent on longitude and therefore on Pluto's diverse terrains. The high albedo region informally known as Sputnik Planum dominates the disk-integrated reflectance of Pluto on the New Horizons encounter hemisphere. 
These results lay the groundwork for observations at true opposition in 2018, when the Pluto system will be observable at phase angles so small that

  9. New camera tube improves ultrasonic inspection system

    NASA Technical Reports Server (NTRS)

    Berger, H.; Collis, W. J.; Jacobs, J. E.

    1968-01-01

    Electron multiplier, incorporated into the camera tube of an ultrasonic imaging system, improves resolution, effectively shields low level circuits, and provides a high level signal input to the television camera. It is effective for inspection of metallic materials for bonds, voids, and homogeneity.

  10. Making a room-sized camera obscura

    NASA Astrophysics Data System (ADS)

    Flynt, Halima; Ruiz, Michael J.

    2015-01-01

    We describe how to convert a room into a camera obscura as a project for introductory geometrical optics. The view for our camera obscura is a busy street scene set against a beautiful mountain skyline. We include a short video with project instructions, ray diagrams and delightful moving images of cars driving on the road outside.
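    The geometry of the projected street scene follows from similar triangles through the pinhole. A tiny sketch, with all numbers purely illustrative:

```python
def image_height(object_height, object_distance, screen_distance):
    """Similar triangles through the pinhole: h' = h * d_screen / d_object.
    The image is inverted; only its size is computed here."""
    return object_height * screen_distance / object_distance

# e.g. a 10 m tall building 50 m from the pinhole, projected onto a wall
# 3 m behind it, forms an inverted image 0.6 m tall
h = image_height(10.0, 50.0, 3.0)
```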

  11. Solid state replacement of rotating mirror cameras

    NASA Astrophysics Data System (ADS)

    Frank, Alan M.; Bartolick, Joseph M.

    2007-01-01

    Rotating mirror cameras have been the mainstay of mega-frame-per-second imaging for decades. There is still no electronic camera that can match a film-based rotary mirror camera for the combination of frame count, speed, resolution, and dynamic range. The rotary mirror cameras are predominantly used in the range of 0.1 to 100 microseconds per frame, for 25 to more than a hundred frames. Electron tube gated cameras dominate the sub-microsecond regime but are frame-count limited. Video cameras are pushing into the microsecond regime but are resolution limited by the high data rates. An all-solid-state architecture, dubbed 'In-situ Storage Image Sensor' or 'ISIS', by Prof. Goji Etoh, has made its first appearance in the market, and its evaluation is discussed. Recent work at Lawrence Livermore National Laboratory has concentrated both on evaluation of the presently available technologies and on exploring the capabilities of the ISIS architecture. It is clear that, although there is presently no single-chip camera that can simultaneously match the rotary mirror cameras, the ISIS architecture has the potential to approach their performance.

  12. Single chip camera active pixel sensor

    NASA Technical Reports Server (NTRS)

    Shaw, Timothy (Inventor); Pain, Bedabrata (Inventor); Olson, Brita (Inventor); Nixon, Robert H. (Inventor); Fossum, Eric R. (Inventor); Panicacci, Roger A. (Inventor); Mansoorian, Barmak (Inventor)

    2003-01-01

    A totally digital single chip camera includes communications to operate most of its structure in serial communication mode. The digital single chip camera includes a D/A converter for converting an input digital word into an analog reference signal. The chip includes all of the necessary circuitry for operating the chip using a single pin.

  13. Solid State Replacement of Rotating Mirror Cameras

    SciTech Connect

    Frank, A M; Bartolick, J M

    2006-08-25

    Rotating mirror cameras have been the mainstay of mega-frame-per-second imaging for decades. There is still no electronic camera that can match a film-based rotary mirror camera for the combination of frame count, speed, resolution, and dynamic range. The rotary mirror cameras are predominantly used in the range of 0.1 to 100 microseconds per frame, for 25 to more than a hundred frames. Electron tube gated cameras dominate the sub-microsecond regime but are frame-count limited. Video cameras are pushing into the microsecond regime but are resolution limited by the high data rates. An all-solid-state architecture, dubbed ''In-situ Storage Image Sensor'' or ''ISIS'', by Prof. Goji Etoh, has made its first appearance in the market, and its evaluation is discussed. Recent work at Lawrence Livermore National Laboratory has concentrated both on evaluation of the presently available technologies and on exploring the capabilities of the ISIS architecture. It is clear that, although there is presently no single-chip camera that can simultaneously match the rotary mirror cameras, the ISIS architecture has the potential to approach their performance.

  14. 21 CFR 892.1110 - Positron camera.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 8 2012-04-01 2012-04-01 false Positron camera. 892.1110 Section 892.1110 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES RADIOLOGY DEVICES Diagnostic Devices § 892.1110 Positron camera. (a) Identification. A...

  15. 21 CFR 892.1110 - Positron camera.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 8 2014-04-01 2014-04-01 false Positron camera. 892.1110 Section 892.1110 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES RADIOLOGY DEVICES Diagnostic Devices § 892.1110 Positron camera. (a) Identification. A...

  16. 21 CFR 892.1110 - Positron camera.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Positron camera. 892.1110 Section 892.1110 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES RADIOLOGY DEVICES Diagnostic Devices § 892.1110 Positron camera. (a) Identification. A...

  17. 21 CFR 892.1110 - Positron camera.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Positron camera. 892.1110 Section 892.1110 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES RADIOLOGY DEVICES Diagnostic Devices § 892.1110 Positron camera. (a) Identification. A...

  18. 21 CFR 892.1110 - Positron camera.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 8 2013-04-01 2013-04-01 false Positron camera. 892.1110 Section 892.1110 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES RADIOLOGY DEVICES Diagnostic Devices § 892.1110 Positron camera. (a) Identification. A...

  19. Controlled Impact Demonstration (CID) tail camera video

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The Controlled Impact Demonstration (CID) was a joint research project by NASA and the FAA to test a survivable aircraft impact using a remotely piloted Boeing 720 aircraft. The tail camera movie is one shot running 27 seconds. It shows the impact from the perspective of a camera mounted high on the vertical stabilizer, looking forward over the fuselage and wings.

  20. AIM: Ames Imaging Module Spacecraft Camera

    NASA Technical Reports Server (NTRS)

    Thompson, Sarah

    2015-01-01

    The AIM camera is a small, lightweight, low power, low cost imaging system developed at NASA Ames. Though it has imaging capabilities similar to those of $1M plus spacecraft cameras, it does so on a fraction of the mass, power and cost budget.