Sample records for camera narrow angle

  1. Lunar Reconnaissance Orbiter Camera Narrow Angle Cameras: Laboratory and Initial Flight Calibration

    Microsoft Academic Search

    D. C. Humm; M. Tschimmel; B. W. Denevi; S. Lawrence; P. Mahanti; T. N. Tran; P. C. Thomas; E. Eliason; M. S. Robinson

    2009-01-01

    The Lunar Reconnaissance Orbiter Camera (LROC) has two identical Narrow Angle Cameras (NACs). Each NAC is a monochrome pushbroom scanner, providing images with a pixel scale of 50 cm from a 50-km orbit. A single NAC image has a swath width of 2.5 km and a length of up to 26 km. The NACs are mounted to acquire side-by-side imaging

  2. Lunar Reconnaissance Orbiter Camera Narrow Angle Cameras: Laboratory and Initial Flight Calibration

    NASA Astrophysics Data System (ADS)

    Humm, D. C.; Tschimmel, M.; Denevi, B. W.; Lawrence, S.; Mahanti, P.; Tran, T. N.; Thomas, P. C.; Eliason, E.; Robinson, M. S.

    2009-12-01

    The Lunar Reconnaissance Orbiter Camera (LROC) has two identical Narrow Angle Cameras (NACs). Each NAC is a monochrome pushbroom scanner, providing images with a pixel scale of 50 cm from a 50-km orbit. A single NAC image has a swath width of 2.5 km and a length of up to 26 km. The NACs are mounted to acquire side-by-side imaging for a combined swath width of 5 km. The NAC is designed to fully characterize future human and robotic landing sites in terms of scientific and resource merit, trafficability, and hazards. The North and South poles will be mapped at 1-meter-scale poleward of 85.5 degrees latitude. Stereo coverage is achieved by pointing the NACs off-nadir, which requires planning in advance. Read noise is 91 and 93 e- and the full well capacity is 334,000 and 352,000 e- for NAC-L and NAC-R respectively. Signal-to-noise ranges from 42 for low-reflectance material with 70 degree illumination to 230 for high-reflectance material with 0 degree illumination. Longer exposure times and 2x binning are available to further increase signal-to-noise with loss of spatial resolution. Lossy data compression from 12 bits to 8 bits uses a companding table selected from a set optimized for different signal levels. A model of focal plane temperatures based on flight data is used to command dark levels for individual images, optimizing the performance of the companding tables and providing good matching of the NAC-L and NAC-R images even before calibration. The preliminary NAC calibration pipeline includes a correction for nonlinearity at low signal levels with an offset applied for DN>600 and a logistic function for DN<600. Flight images taken on the limb of the Moon provide a measure of stray light performance. Averages over many lines of images provide a measure of flat field performance in flight. These are comparable with laboratory data taken with a diffusely reflecting uniform panel.

  3. Two Years of Digital Terrain Model Production Using the Lunar Reconnaissance Orbiter Narrow Angle Camera

    NASA Astrophysics Data System (ADS)

    Burns, K.; Robinson, M. S.; Speyerer, E.; LROC Science Team

    2011-12-01

    One of the primary objectives of the Lunar Reconnaissance Orbiter Camera (LROC) is to gather stereo observations with the Narrow Angle Camera (NAC). These stereo observations are used to generate digital terrain models (DTMs). The NAC has a pixel scale of 0.5 to 2.0 meters but was not designed for stereo observations and thus requires the spacecraft to roll off-nadir to acquire these images. Slews interfere with the data collection of the other instruments, so opportunities are currently limited to four per day. Arizona State University has produced DTMs from 95 stereo pairs for 11 Constellation Project (CxP) sites (Aristarchus, Copernicus crater, Gruithuisen domes, Hortensius domes, Ina D-caldera, Lichtenberg crater, Mare Ingenii, Marius hills, Reiner Gamma, South Pole-Aitkin Rim, Sulpicius Gallus) as well as 30 other regions of scientific interest (including: Bhabha crater, highest and lowest elevation points, Highland Ponds, Kugler Anuchin, Linne Crater, Planck Crater, Slipher crater, Sears Crater, Mandel'shtam Crater, Virtanen Graben, Compton/Belkovich, Rumker Domes, King Crater, Luna 16/20/23/24 landing sites, Ranger 6 landing site, Wiener F Crater, Apollo 11/14/15/17, fresh craters, impact melt flows, Larmor Q crater, Mare Tranquillitatis pit, Hansteen Alpha, Moore F Crater, and Lassell Massif). To generate DTMs, the USGS ISIS software and SOCET SET° from BAE Systems are used. To increase the absolute accuracy of the DTMs, data obtained from the Lunar Orbiter Laser Altimeter (LOLA) is used to coregister the NAC images and define the geodetic reference frame. NAC DTMs have been used in examination of several sites, e.g. Compton-Belkovich, Marius Hills and Ina D-caldera [1-3]. LROC will continue to acquire high-resolution stereo images throughout the science phase of the mission and any extended mission opportunities, thus providing a vital dataset for scientific research as well as future human and robotic exploration. [1] B.L. 
Jolliff (2011) Nature Geoscience, in press. [2] Lawrence et al. (2011) LPSC XLII, Abst 2228. [3] Garry et al. (2011) LPSC XLII, Abst 2605.

  4. High-resolution topomapping of candidate MER landing sites with Mars Orbiter Camera narrow-angle images

    Microsoft Academic Search

    Randolph L. Kirk; Elpitha Howington-Kraus; Bonnie Redding; Donna Galuszka; Trent M. Hare; Brent A. Archinal; Laurence A. Soderblom; Janet M. Barrett

    2003-01-01

    We analyzed narrow-angle Mars Orbiter Camera (MOC-NA) images to produce high-resolution digital elevation models (DEMs) in order to provide topographic and slope information needed to assess the safety of candidate landing sites for the Mars Exploration Rovers (MER) and to assess the accuracy of our results by a variety of tests. The mapping techniques developed also support geoscientific studies and

  5. Narrow Angle movie

    NASA Technical Reports Server (NTRS)

    1999-01-01

    This brief three-frame movie of the Moon was made from three Cassini narrow-angle images as the spacecraft passed by the Moon on the way to its closest approach with Earth on August 17, 1999. The purpose of this particular set of images was to calibrate the spectral response of the narrow-angle camera and to test its 'on-chip summing mode' data compression technique in flight. From left to right, they show the Moon in the green, blue and ultraviolet regions of the spectrum in 40, 60 and 80 millisecond exposures, respectively. All three images have been scaled so that the brightness of Crisium basin, the dark circular region in the upper right, is the same in each image. The spatial scale in the blue and ultraviolet images is 1.4 miles per pixel (2.3 kilometers). The original scale in the green image (which was captured in the usual manner and then reduced in size by 2x2 pixel summing within the camera system) was 2.8 miles per pixel (4.6 kilometers). It has been enlarged for display to the same scale as the other two. The imaging data were processed and released by the Cassini Imaging Central Laboratory for Operations (CICLOPS) at the University of Arizona's Lunar and Planetary Laboratory, Tucson, AZ.

    Photo Credit: NASA/JPL/Cassini Imaging Team/University of Arizona

    Cassini, launched in 1997, is a joint mission of NASA, the European Space Agency and Italian Space Agency. The mission is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Space Science, Washington DC. JPL is a division of the California Institute of Technology, Pasadena, CA.

  6. High-resolution topomapping of candidate MER landing sites with Mars Orbiter Camera narrow-angle images

    USGS Publications Warehouse

    Kirk, R.L.; Howington-Kraus, E.; Redding, B.; Galuszka, D.; Hare, T.M.; Archinal, B.A.; Soderblom, L.A.; Barrett, J.M.

    2003-01-01

    We analyzed narrow-angle Mars Orbiter Camera (MOC-NA) images to produce high-resolution digital elevation models (DEMs) in order to provide topographic and slope information needed to assess the safety of candidate landing sites for the Mars Exploration Rovers (MER) and to assess the accuracy of our results by a variety of tests. The mapping techniques developed also support geoscientific studies and can be used with all present and planned Mars-orbiting scanner cameras. Photogrammetric analysis of MOC stereopairs yields DEMs with 3-pixel (typically 10 m) horizontal resolution, vertical precision consistent with ???0.22 pixel matching errors (typically a few meters), and slope errors of 1-3??. These DEMs are controlled to the Mars Orbiter Laser Altimeter (MOLA) global data set and consistent with it at the limits of resolution. Photoclinometry yields DEMs with single-pixel (typically ???3 m) horizontal resolution and submeter vertical precision. Where the surface albedo is uniform, the dominant error is 10-20% relative uncertainty in the amplitude of topography and slopes after "calibrating" photoclinometry against a stereo DEM to account for the influence of atmospheric haze. We mapped portions of seven candidate MER sites and the Mars Pathfinder site. Safety of the final four sites (Elysium, Gusev, Isidis, and Meridiani) was assessed by mission engineers by simulating landings on our DEMs of "hazard units" mapped in the sites, with results weighted by the probability of landing on those units; summary slope statistics show that most hazard units are smooth, with only small areas of etched terrain in Gusev crater posing a slope hazard.

  7. Narrow-angle Astrometry with SUSI

    NASA Astrophysics Data System (ADS)

    Kok, Y.; Ireland, M. J.; Robertson, J. G.; Tuthill, P. G.; Warrington, B. A.; Tango, W. J.

    2014-09-01

    SUSI (Sydney University Stellar Interferometer) is currently being fitted with a 2nd beam combiner, MUSCA (Micro-arcsecond University of Sydney Companion Astrometry), for the purpose of narrow-angle astrometry. With an aim to achieve ˜10 micro-arcseconds of angular resolution at its best, MUSCA allows SUSI to search for planets around bright binary stars, which are its primary targets. While the first beam combiner, PAVO (Precision Astronomical Visible Observations), is used to track stellar fringes during an observation, MUSCA will be used to measure separations of binary stars. MUSCA is a Michelson interferometer and its setup at SUSI will be described in this poster.

  8. Peripapillary Schisis in Glaucoma Patients With Narrow Angles and

    E-print Network

    Srinivasan, Vivek J.

    Peripapillary Schisis in Glaucoma Patients With Narrow Angles and Increased Intraocular Pressure cases of peripapillary retinal schisis in patients with glaucoma without evidence of optic nerve pits patient was followed over time. RESULTS: The first patient, diagnosed with narrow angle glaucoma

  9. A camera for a narrow and deep welding groove

    NASA Astrophysics Data System (ADS)

    Vehmanen, Miika S.; Korhonen, Mika; Mäkynen, Anssi J.

    2008-06-01

    In this paper welding seam imaging in a very narrow and deep groove is presented. Standard camera optics can not be used as it does not reach the bottom of the groove. Therefore, selecting suitable imaging optics and components was the main challenge of the study. The implementation is based on image transmission via a borescope. The borescope has a long and narrow tube with graded index relay optics inside. To avoid excessive heating, the borescope tube is enclosed in a cooling pipe. The performance of the imaging system was tested by measuring its modulation transfer function (MTF) and visually evaluated its distortion. The results show that a borescope providing VGA resolution is adequate for the application. The spectrum of the welding processes was studied to determine optimum window to observe the welding seam and electrode. Optimal bandwidth was found in region of 700nm-1000nm.

  10. SCDU (Spectral Calibration Development Unit) Testbed Narrow Angle Astrometric Performance

    NASA Technical Reports Server (NTRS)

    Wang, Xu; Goullioud, Renaud; Nemati, Bijan; Shao, Michael; Wehmeier, Udo J.; Weilert, Mark A.; Werne, Thomas A.; Wu, Janet P.; Zhai, Chengxing

    2010-01-01

    The most stringent astrometric performance requirements on NASA's SIM(Space Interferometer Mission)-Lite mission will come from the so-called Narrow-Angle (NA) observing scenario, aimed at finding Earth-like exoplanets, where the interferometer chops between the target star and several nearby reference stars multiple times over the course of a single visit. Previously, about 20 pm NA error with various shifts was reported. Since then, investigation has been under way to understand the mechanisms that give rise to these shifts. In this paper we report our findings, the adopted mitigation strategies, and the resulting testbed performance.

  11. New comparative ultrasound biomicroscopic findings between fellow eyes of acute angle closure and glaucomatous eyes with narrow angle

    Microsoft Academic Search

    Rafael Vidal Mérula; Sebastião Cronemberger; Alberto Diniz Filho; Nassim Calixto

    2008-01-01

    Purpose: To compare morphometric features between fellow acute primary angle-closure (APAC) eyes and glaucomatous or suspect eyes with narrow angle (NA). Methods: Fellow eyes of 30 patients with unilateral APAC and 30 with NA were evaluated by ultrasound biomicroscopy (UBM) under light and dark conditions. UBM parameters such as anterior chamber depth (ACD), angle opening distance at 250 µm\\/500 µm

  12. Plane-based external camera calibration with accuracy measured by relative deflection angle

    E-print Network

    Nigan, King Ngi

    -camera system can be accomplished. In order to evaluate and compare the calibration results for different cameraPlane-based external camera calibration with accuracy measured by relative deflection angle Chunhui Keywords: Camera calibration Homography Accuracy evaluation a b s t r a c t In this paper, we present

  13. Image reconstruction from limited angle Compton camera data

    NASA Astrophysics Data System (ADS)

    Tomitani, T.; Hirasawa, M.

    2002-06-01

    The Compton camera is used for imaging the distributions of ? ray direction in a ? ray telescope for astrophysics and for imaging radioisotope distributions in nuclear medicine without the need for collimators. The integration of ? rays on a cone is measured with the camera, so that some sort of inversion method is needed. Parra found an analytical inversion algorithm based on spherical harmonics expansion of projection data. His algorithm is applicable to the full set of projection data. In this paper, six possible reconstruction algorithms that allow image reconstruction from projections with a finite range of scattering angles are investigated. Four algorithms have instability problems and two others are practical. However, the variance of the reconstructed image diverges in these two cases, so that window functions are introduced with which the variance becomes finite at a cost of spatial resolution. These two algorithms are compared in terms of variance. The algorithm based on the inversion of the summed back-projection is superior to the algorithm based on the inversion of the summed projection.

  14. Integral three-dimensional capture system with enhanced viewing angle by using camera array

    NASA Astrophysics Data System (ADS)

    Miura, Masato; Okaichi, Naoto; Arai, Jun; Mishina, Tomoyuki

    2015-03-01

    A three-dimensional (3D) capture system based on integral imaging with an enhanced viewing zone by using a camera array was developed. The viewing angle of the 3D image can be enlarged depending on the number of cameras consisting of the camera array. The 3D image was captured by using seven high-definition cameras, and converted to be displayed by using a 3D display system with a 4K LCD panel, and it was confirmed that the viewing angle of the 3D image can be enlarged by a factor of 2.5 compared with that of a single camera.

  15. 10. 22'X34' original blueprint, VariableAngle Launcher, 'SIDE VIEW CAMERA CARSTEEL ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    10. 22'X34' original blueprint, Variable-Angle Launcher, 'SIDE VIEW CAMERA CAR-STEEL FRAME AND AXLES' drawn at 1/2'=1'-0'. (BOURD Sketch # 209124). - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  16. 13. 22'X34' original vellum, VariableAngle Launcher, 'SIDEVIEW CAMERA CAR TRACK ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    13. 22'X34' original vellum, Variable-Angle Launcher, 'SIDEVIEW CAMERA CAR TRACK DETAILS' drawn at 1/4'=1'-0' (BUORD Sketch # 208078, PAPW 908). - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  17. Narrow Field-Of Visual Odometry Based on a Focused Plenoptic Camera

    NASA Astrophysics Data System (ADS)

    Zeller, N.; Quint, F.; Stilla, U.

    2015-03-01

    In this article we present a new method for visual odometry based on a focused plenoptic camera. This method fuses the depth data gained by a monocular Simultaneous Localization and Mapping (SLAM) algorithm and the one received from a focused plenoptic camera. Our algorithm uses the depth data and the totally focused images supplied by the plenoptic camera to run a real-time semi-dense direct SLAM algorithm. Based on this combined approach, the scale ambiguity of a monocular SLAM system can be overcome. Furthermore, the additional light-field information highly improves the tracking capabilities of the algorithm. Thus, visual odometry even for narrow field of view (FOV) cameras is possible. We show that not only tracking profits from the additional light-field information. By accumulating the depth information over multiple tracked images, also the depth accuracy of the focused plenoptic camera can be highly improved. This novel approach improves the depth error by one order of magnitude compared to the one received from a single light-field image.

  18. Improved iris localization by using wide and narrow field of view cameras for iris recognition

    NASA Astrophysics Data System (ADS)

    Kim, Yeong Gon; Shin, Kwang Yong; Park, Kang Ryoung

    2013-10-01

    Biometrics is a method of identifying individuals by their physiological or behavioral characteristics. Among other biometric identifiers, iris recognition has been widely used for various applications that require a high level of security. When a conventional iris recognition camera is used, the size and position of the iris region in a captured image vary according to the X, Y positions of a user's eye and the Z distance between a user and the camera. Therefore, the searching area of the iris detection algorithm is increased, which can inevitably decrease both the detection speed and accuracy. To solve these problems, we propose a new method of iris localization that uses wide field of view (WFOV) and narrow field of view (NFOV) cameras. Our study is new as compared to previous studies in the following four ways. First, the device used in our research acquires three images, one each of the face and both irises, using one WFOV and two NFOV cameras simultaneously. The relation between the WFOV and NFOV cameras is determined by simple geometric transformation without complex calibration. Second, the Z distance (between a user's eye and the iris camera) is estimated based on the iris size in the WFOV image and anthropometric data of the size of the human iris. Third, the accuracy of the geometric transformation between the WFOV and NFOV cameras is enhanced by using multiple matrices of the transformation according to the Z distance. Fourth, the searching region for iris localization in the NFOV image is significantly reduced based on the detected iris region in the WFOV image and the matrix of geometric transformation corresponding to the estimated Z distance. Experimental results showed that the performance of the proposed iris localization method is better than that of conventional methods in terms of accuracy and processing time.

  19. A switchable light field camera architecture with Angle Sensitive Pixels and dictionary-based sparse coding

    E-print Network

    Hirsch, Matthew Waggener

    We propose a flexible light field camera architecture that is at the convergence of optics, sensor electronics, and applied mathematics. Through the co-design of a sensor that comprises tailored, Angle Sensitive Pixels and ...

  20. The design and fabricate of wide angle 905nm narrow band filter

    NASA Astrophysics Data System (ADS)

    Shi, Baohua; Li, Zaijin; Li, Hongyu; Qu, Yi

    2014-12-01

    All-dielectric film narrow band filter is widely used in laser system owing to its excellent optical capability, manufacturability and environmental adaptability. But 905nm infrared semiconductor laser system have large divergence angel so we designed entrance light cone angle 905nm narrow band filter. And center wavelength shift, due to entrance light cone angle, affects its spectral selective power seriously. In order to reduce these impacts, an informal dielectric film narrowband filter is designed. Changes of transmission characteristics with oblique incidence of Gaussian beam of uneven illumination are analyzed. The relationship between the angle of incidence and the central wavelength shift quantificational are Solved. A ± 30 ° incident 905nm narrowband filter was fabricated. Between 880nm and 950nm, the average transmittance is above 90%, and at the cut-off band the average transmittance is below 1%.

  1. Data analysis of narrow-angle field dependent tests in the MAM testbed interferometer

    NASA Astrophysics Data System (ADS)

    Shen, T. J.; Goullioud, R.; Catanzarite, J.; Shao, M.; Yu, J.; Machuzak, R.

    2002-12-01

    The Microarcsecond Metrology Testbed (MAM) developed at the Jet Propulsion Laboratory is a single-baseline interferometer coupled with a precision pseudostar. It is designed to test the ability of the SIM science interferometer to perform microarcsecond stellar astrometry over both narrow-angle (1 degree) and wide-angle (7.5 degree) fields. The MAM Testbed features an optical interferometer with a white light source, all major optical components of a stellar interferometer and heterodyne metrology sensors. This paper will describe the performance metric used to evaluate our narrow-angle field dependent data and presents the results of the analysis. The narrow-angle 3 star observation scenario implemented in the MAM testbed consists of 1 target (T) star and 2 accompanied reference (R1,R2) stars, which are 1 degree apart horizontally from the target star. The observation of target (science) and reference stars are interlaced (R1,T,R2,T,repeat) in order to remove temporal and spatial drifts between consecutive measurements of the target star. The total observation time for target star is twice that of the 2 reference companions. Cyclic averaging was implemented in our observations in addition to interlacing. The least squares algorithm tested in our field independent measurements is applied to solve for the delays (or paths) of the target star and the 2 reference stars. A super chop variance was adopted as our performance metric. This super chop variance will remove the drifts of the target star path from its path differences with respect to the 2 reference stars. Recent data is presented which demonstrates agreement between the metrology and starlight paths to be better than 150pm in the 3 star narrow angle field of view. The research described was performed at the Jet Propulsion Laboratory, California Institute of Technology, under a contract with the National Aeronautics and Sp ace Administration.

  2. Alternative approach to precision narrow-angle astrometry for Antarctic long baseline interferometry

    NASA Astrophysics Data System (ADS)

    Kok, Yitping; Ireland, Michael J.; Rizzuto, Aaron C.; Tuthill, Peter G.; Robertson, J. Gordon; Warrington, Benjamin A.; Tango, William J.

    2014-07-01

    The conventional approach to high-precision narrow-angle astrometry using a long baseline interferometer is to directly measure the fringe packet separation of a target and a nearby reference star. This is done by means of a technique known as phase-referencing which requires a network of dual beam combiners and laser metrology systems. Using an alternative approach that does not rely on phase-referencing, the narrow-angle astrometry of several closed binary stars (with separation less than 2''), as described in this paper, was carried out by observing the fringe packet crossing event of the binary systems. Such an event occurs twice every sidereal day when the line joining the two stars of the binary is is perpendicular to the projected baseline of the interferometer. Observation of these events is well suited for an interferometer in Antarctica. Proof of concept observations were carried out at the Sydney University Stellar Interferometer (SUSI) with targets selected according to its geographical location. Narrow-angle astrometry using this indirect approach has achieved sub-100 micro-arcsecond precision.

  3. On-Orbit Geometric Calibration of the Lunar Reconnaissance Orbiter Wide Angle Camera

    NASA Astrophysics Data System (ADS)

    Speyerer, E. J.; Wagner, R.; Robinson, M. S.

    2013-12-01

    Lunar Reconnaissance Orbiter (LRO) is equipped with a single Wide Angle Camera (WAC) [1] designed to collect monochromatic and multispectral observations of the lunar surface. Cartographically accurate image mosaics and stereo image based terrain models requires the position of each pixel in a given image be known to a corresponding point on the lunar surface with a high degree of accuracy and precision. The Lunar Reconnaissance Orbiter Camera (LROC) team initially characterized the WAC geometry prior to launch at the Malin Space Science Systems calibration facility. After lunar orbit insertion, the LROC team recognized spatially varying geometric offsets between color bands. These misregistrations made analysis of the color data problematic and showed that refinements to the pre-launch geometric analysis were necessary. The geometric parameters that define the WAC optical system were characterized from statistics gathered from co-registering over 84,000 image pairs. For each pair, we registered all five visible WAC bands to a precisely rectified Narrow Angle Camera (NAC) image (accuracy <15 m) [2] to compute key geometric parameters. In total, we registered 2,896 monochrome and 1,079 color WAC observations to nearly 34,000 NAC observations and collected over 13.7 million data points across the visible portion of the WAC CCD. Using the collected statistics, we refined the relative pointing (yaw, pitch and roll), effective focal length, principal point coordinates, and radial distortion coefficients. This large dataset also revealed spatial offsets between bands after orthorectification due to chromatic aberrations in the optical system. As white light enters the optical system, the light bends at different magnitudes as a function of wavelength, causing a single incident ray to disperse in a spectral spread of color [3,4]. 
This lateral chromatic aberration effect, also known as 'chromatic difference in magnification' [5] introduces variation to the effective focal length for each WAC band. Secondly, tangential distortions caused by minor decentering in the optical system altered the derived exterior orientation parameters for each 14-line WAC band. We computed the geometric parameter sets separately for each band to characterize the lateral chromatic aberrations and the decentering components in the WAC optical system. From this approach, we negated the need for additional tangential terms in the distortion model, thus reducing the number of computations during image orthorectification and therefore expediting the orthorectification process. We undertook a similar process for refining the geometry for the UV bands (321 and 360 nm), except we registered each UV bands to orthorectified visible bands of the same WAC observation (the visible bands have resolutions 4 times greater than the UV). The resulting 7-band camera model with refined geometric parameters enables map projection with sub-pixel accuracy. References: [1] Robinson et al. (2010) Space Sci. Rev. 150, 81-124 [2] Wagner et al. (2013) Lunar Sci Forum [3] Mahajan, V.N. (1998) Optical Imaging and Aberrations [4] Fiete, R.D. (2013), Manual of Photogrammetry, pp. 359-450 [5] Brown, D.C. (1966) Photometric Eng. 32, 444-462.

  4. Phase Referencing and Narrow-Angle Astrometry in Current and Future Interferometers

    E-print Network

    Benjamin F. Lane; Matthew W. Muterspaugh

    2004-07-14

    Atmospheric turbulence is a serious problem for ground-based interferometers. It places tight limits on both sensitivity and measurement precision. Phase referencing is a method to overcome these limitations via the use of a bright reference star. The Palomar Testbed Interferometer was designed to use phase referencing and so can provide a pair of phase-stabilized starlight beams to a second (science) beam combiner. We have used this capability for several interesting studies, including very narrow angle astrometry. For close (1-arcsecond) pairs of stars we are able to achieve a differential astrometric precision in the range 20-30 micro-arcseconds.

  5. GRAVITY: the VLTI 4-beam combiner for narrow-angle astrometry and interferometric imaging

    E-print Network

    Blind, N; Gillessen, S; Kok, Y; Lippa, M; Perrin, G; Dembet, R; Fedou, P; Lacour, S; Perraut, K; Jocou, L; Burtscher, L; Hans, O; Haug, M; Haussmann, F; Huber, S; Janssen, A; Kellner, S; Ott, T; Pfuhl, O; Sturm, E; Weber, J; Wieprecht, E; Amorim, A; Brandner, W; Straubmeier, C

    2015-01-01

    GRAVITY is the second generation Very Large Telescope Interferometer instrument for precision narrow-angle astrometry and interferometric imaging in the Near Infra-Red (NIR). It shall provide precision astrometry of order 10 microarcseconds, and imaging capability at a few milliarcsecond resolution, and hence will revolutionise dynamical measurements of celestial objects. GRAVITY is currently in the last stages of its integration and tests in Garching at MPE, and will be delivered to the VLT Interferometer (VLTI) in 2015. We present here the instrument, with a particular focus on the components making use of fibres: integrated optics beam combiners, polarisation rotators, fibre differential delay lines, and the metrology.

  6. Narrow-angle tail radio sources and the distribution of galaxy orbits in Abell clusters

    NASA Technical Reports Server (NTRS)

    O'Dea, Christopher P.; Sarazin, Craig L.; Owen, Frazer N.

    1987-01-01

    The present data on the orientations of the tails with respect to the cluster centers of a sample of 70 narrow-angle-tail (NAT) radio sources in Abell clusters show the distribution of tail angles to be inconsistent with purely radial or circular orbits in all the samples, while being consistent with isotropic orbits in (1) the whole sample, (2) the sample of NATs far from the cluster center, and (3) the samples of morphologically regular Abell clusters. Evidence for very radial orbits is found, however, in the sample of NATs near the cluster center. If these results can be generalized to all cluster galaxies, then the presence of radial orbits near the center of Abell clusters suggests that violent relaxation may not have been fully effective even within the cores of the regular clusters.

  7. Improving sensitivity of a small angle x-ray scattering camera with pinhole collimation using separated optical elements

    E-print Network

    Paris-Sud XI, Université de

    Our aim is to increase the sensitivity of a pinhole camera, in order to be able to analyze a wide [...] We show that a significant improvement in the sensitivity of a Huxley-Holmes design for a small angle x-ray camera [...]

  8. A two camera video imaging system with application to parafoil angle of attack measurements

    NASA Technical Reports Server (NTRS)

    Meyn, Larry A.; Bennett, Mark S.

    1991-01-01

    This paper describes the development of a two-camera, video imaging system for the determination of three-dimensional spatial coordinates from stereo images. This system successfully measured angle of attack at several span-wise locations for large-scale parafoils tested in the NASA Ames 80- by 120-Foot Wind Tunnel. Measurement uncertainty for angle of attack was less than 0.6 deg. The stereo ranging system was the primary source for angle of attack measurements since inclinometers sewn into the fabric ribs of the parafoils had unknown angle offsets acquired during installation. This paper includes discussions of the basic theory and operation of the stereo ranging system, system measurement uncertainty, experimental set-up, calibration results, and test results. Planned improvements and enhancements to the system are also discussed.
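
    The geometry behind such a stereo ranging system can be illustrated for the simplest case of a rectified (parallel-axis) pinhole camera pair. This is a hedged sketch of the general principle only; the function names, axis conventions, and units are assumptions, not the paper's implementation:

    ```python
    import math

    def stereo_point(x_left, x_right, y, focal_px, baseline_m):
        """Recover a 3D point from a rectified stereo pair.

        x_left/x_right: horizontal image coordinates (px) of the same feature
        in the left and right cameras; y: vertical coordinate (px);
        focal_px: focal length in pixels; baseline_m: camera separation (m).
        """
        disparity = x_left - x_right           # shift between the two views (px)
        z = focal_px * baseline_m / disparity  # depth, from similar triangles
        x = x_left * z / focal_px              # lateral position (m)
        yw = y * z / focal_px                  # vertical position (m)
        return (x, yw, z)

    def chord_angle_deg(leading_edge, trailing_edge):
        """Angle (deg) of the chord line between two reconstructed 3D points,
        measured against the x axis in the x-y plane."""
        dx = trailing_edge[0] - leading_edge[0]
        dy = trailing_edge[1] - leading_edge[1]
        return math.degrees(math.atan2(dy, dx))
    ```

    With two such points reconstructed at a rib's leading and trailing edges, the local angle of attack follows directly from the chord angle relative to the free-stream direction.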

  9. Narrow-angle tail radio sources and evidence for radial orbits in Abell clusters

    NASA Technical Reports Server (NTRS)

    O'Dea, Christopher P.; Owen, Frazer N.; Sarazin, Craig L.

    1986-01-01

    Published observational data on the tail orientations (TOs) of 60 narrow-angle-tail (NAT) radio sources in Abell clusters of galaxies are analyzed statistically using a maximum-likelihood approach. The results are presented in a table, and it is found that the observed TO distributions in the whole sample and in subsamples of morphologically regular NATs and NATs with pericentric distances d greater than 500 kpc are consistent with isotropic orbits, whereas the TOs for NATs with d less than 500 kpc are consistent with highly radial orbits. If radial orbits were observed near the centers of other types of cluster galaxies as well, it could be inferred that violent relaxation during cluster formation was incomplete, and that clusters form by spherical collapse and secondary infall, as proposed by Gunn (1977).

  10. Development of equatorial visible/infrared wide angle viewing system and radial neutron camera for ITER

    Microsoft Academic Search

    Sophie Salasca; Basilio Esposito; Yann Corre; Maryline Davi; Christian Dechelle; Florian Pasdeloup; Roger Reichle; Jean-Marcel Travère; Giorgio Brolatti; Daniele Marocco; Fabio Moro; Luigino Petrizzi; Tonio Pinna; Marco Riva; Rosaria Villari; Eduardo De La Cal; Carlos Hidalgo; Ana Manzanares; Jose Luis De Pablos; Rafael Vila; Gabor Hordosy; Daniel Nagy; Sandor Recsei; Szilveszter Tulipan; Andre Neto; Carlos Silva; Luciano Bertalot; Chris Walker; Christian Ingesson; Yuri Kaschuck

    2009-01-01

    The exploitation of the ITER tokamak will require diagnostics for machine protection, inputs to plasma control systems, and evaluation and analysis of plasma parameters and performance. The equatorial visible/infrared wide angle viewing system and the radial neutron camera are the two main diagnostics of Procurement Package 11 (PP11), one of the diagnostic procurements under the responsibility of Europe, which also contains Equatorial Port

  11. LROC - Lunar Reconnaissance Orbiter Camera

    Microsoft Academic Search

    M. S. Robinson; E. Eliason; H. Hiesinger; B. L. Jolliff; A. McEwen; M. C. Malin; M. A. Ravine; P. C. Thomas; E. P. Turtle

    2009-01-01

    The Lunar Reconnaissance Orbiter (LRO) went into lunar orbit on 23 June 2009. The LRO Camera (LROC) acquired its first lunar images on June 30 and commenced full scale testing and commissioning on July 10. The LROC consists of two narrow-angle cameras (NACs) that provide 0.5 m scale panchromatic images over a combined 5 km swath, and a wide-angle camera

  12. Calibration of the Lunar Reconnaissance Orbiter Camera

    Microsoft Academic Search

    M. Tschimmel; M. S. Robinson; D. C. Humm; B. W. Denevi; S. J. Lawrence; S. Brylow; M. Ravine; T. Ghaemi

    2008-01-01

    The Lunar Reconnaissance Orbiter Camera (LROC) onboard the NASA Lunar Reconnaissance Orbiter (LRO) spacecraft consists of three cameras: the Wide-Angle Camera (WAC) and two identical Narrow Angle Cameras (NAC-L, NAC-R). The WAC is a push-frame imager with 5 visible wavelength filters (415 to 680 nm) at a spatial resolution of 100 m/pixel and 2 UV filters (315 and 360 nm) with

  13. Lunar Reconnaissance Orbiter Camera (LROC) Instrument Overview

    Microsoft Academic Search

    M. S. Robinson; S. M. Brylow; M. Tschimmel; D. Humm; S. J. Lawrence; P. C. Thomas; B. W. Denevi; E. Bowman-Cisneros; J. Zerr; M. A. Ravine; M. A. Caplinger; F. T. Ghaemi; J. A. Schaffner; M. C. Malin; P. Mahanti; A. Bartels; J. Anderson; T. N. Tran; E. M. Eliason; A. S. McEwen; E. Turtle; B. L. Jolliff; H. Hiesinger

    2010-01-01

    The Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) and Narrow Angle Cameras (NACs) are on the NASA Lunar Reconnaissance Orbiter (LRO). The WAC is a 7-color push-frame camera (100 and 400 m/pixel visible and UV, respectively), while the two NACs are monochrome narrow-angle linescan imagers (0.5 m/pixel). The primary mission of LRO is to obtain measurements of the Moon that

  14. Comparison of Scheimpflug imaging and spectral domain anterior segment optical coherence tomography for detection of narrow anterior chamber angles

    Microsoft Academic Search

    D S Grewal; G S Brar; R Jain; S P S Grewal

    2011-01-01

    Purpose: To compare the performance of anterior chamber volume (ACV) and anterior chamber depth (ACD) obtained using Scheimpflug imaging with angle opening distance (AOD500) and trabecular-iris space area (TISA500) obtained using spectral domain anterior segment optical coherence tomography (SD-ASOCT) in detecting narrow angles classified using gonioscopy. Methods: In this prospective, cross-sectional observational study, 265 eyes of 265 consecutive patients underwent sequential Scheimpflug imaging,

  15. Dynamic closed-loop test for real-time drift angle adjustment of space camera on the Earth

    NASA Astrophysics Data System (ADS)

    Hu, Jun; Cao, Xiaotao; Wang, Dong; Wu, Weiping; Xu, Shuyan

    2010-10-01

    In order to eliminate the influence of spacecraft attitude angle on the image quality of a space camera, and to ensure that the camera's drift angle can be accurately adjusted in orbit, a novel closed-loop method for on-the-ground testing of real-time drift angle adjustment is presented. A long-focal-length dynamic target generator is used to simulate the image motion and the varying drift angle, and to verify the precision of the image motion compensation mechanism and the capability of the drift angle control system. A computer system controls the dynamic target generator, performs the data processing, and transmits and receives the data messages. A seamless connection and data link between the target generator and the spacecraft simulation devices is established. The commands, parameters, and drift angle data transmitted by the simulation devices are received by the space camera in real time; photos are then taken and the drift angle is adjusted simultaneously. The results show that the drift angle can be accurately tracked by the space camera in real time, and that the test method satisfies the test requirements.
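
    For context, the drift angle a pushbroom or frame-transfer space camera must compensate is conventionally the angle between the focal-plane image-motion velocity vector and the along-track (scan) direction. A minimal sketch of that relation, under a simple two-component velocity model (not taken from the abstract; names are illustrative):

    ```python
    import math

    def drift_angle_deg(v_along, v_cross):
        """Drift angle: angle between the image-motion vector and the
        along-track direction. v_along and v_cross are the along-track and
        cross-track image-motion velocity components (same units)."""
        return math.degrees(math.atan2(v_cross, v_along))
    ```

    A test rig such as the one described would compare this commanded angle against the angle the camera's drift-angle mechanism actually tracked.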

  16. The Mars observer camera

    NASA Technical Reports Server (NTRS)

    Malin, M. C.; Danielson, G. E.; Ingersoll, A. P.; Masursky, H.; Veverka, J.; Soulanille, T.; Ravine, M.

    1987-01-01

    A camera designed to operate under the extreme constraints of the Mars Observer Mission was selected by NASA in April, 1986. Contingent upon final confirmation in mid-November, the Mars Observer Camera (MOC) will begin acquiring images of the surface and atmosphere of Mars in September-October 1991. The MOC incorporates both a wide angle system for low resolution global monitoring and intermediate resolution regional targeting, and a narrow angle system for high resolution selective surveys. Camera electronics provide control of image clocking and on-board, internal editing and buffering to match whatever spacecraft data system capabilities are allocated to the experiment. The objectives of the MOC experiment follow.

  17. Low sooting combustion of narrow-angle wall-guided sprays in an HSDI diesel engine with retarded injection timings

    Microsoft Academic Search

    Tiegang Fang; Chia-fon F. Lee

    2011-01-01

    An optically accessible single-cylinder high speed direct-injection (HSDI) diesel engine was used to investigate the spray and combustion processes with narrow-angle wall-guided sprays. Influences of injection timings and injection pressure on combustion characteristics and emissions were studied. In-cylinder pressure was measured and used for heat release analysis. High-speed spray and combustion videos were captured. NOx emissions were measured in the

  18. Empirical Photometric Normalization for the Seven Band UV-VIS Lunar Reconnaissance Orbiter Wide Angle Camera

    NASA Astrophysics Data System (ADS)

    Boyd, A. K.; Robinson, M. S.; Nuno, R. G.; Sato, H.

    2014-12-01

    We present results on a near-global (80°S to 80°N) seven-color Wide Angle Camera (WAC) photometric normalization and color analysis. Over 100,000 WAC color observations were calibrated to reflectance (radiance factor, I/F), and the photometric angles (i, e, g), latitude, and longitude were calculated and stored for each WAC pixel. Photometric angles were calculated using the WAC GLD100 [1], and the six-dimensional data set (3 spatial and 3 photometric dimensions) was reduced to three by photometrically normalizing the I/F with a global wavelength-dependent, 3rd-order multivariate polynomial. The multispectral mosaic was normalized to a standard viewing geometry (incidence angle = 30°, emission angle = 0°, phase angle = 30°). The WAC has a 60° cross-track field-of-view in color mode, which allows the acquisition of a near-global data set each month; however, the phase angle can change by as much as 60° across each image. These large changes in viewing geometry present challenges for the required photometric normalization. In the ratio of the 321 nm and 689 nm wavelengths, the Moon has a standard deviation of less than 3% in the highlands and 7% globally; thus, to allow confident identification of true color differences, the photometric normalization must be precise. Pyroclastic deposits in Marius Hills, Sinus Aestuum, and Mare Serenitatis are among the least reflective materials, with 643 nm normalized reflectance values less than 0.036. Low-reflectance deposits are generally concentrated close to the equator on the nearside, whereas high-reflectance materials are dispersed globally. The highest-reflectance materials occur at Giordano Bruno and Virtanen craters and are attributed to exposure of immature materials. Immature ejecta has a shallower spectral slope (321 nm to 689 nm) than the mean highlands spectrum, and UV weathering characteristics can be seen when comparing Copernican ejecta of different ages [2]. Copernican ejecta is found to have 643 nm reflectance values greater than 0.36 in some areas. The range of reflectance on the Moon is 10x from the least to the most reflective. The new empirical normalized reflectance presented here correlates with an independent Hapke-model-based normalization [3] with an R-squared value of 0.985. [1] Scholten et al. LPSC XVII (2011) [2] Denevi et al. JGR Planets (2014) [3] Sato et al. JGR Planets (2014)
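
    In spirit, a photometric normalization of this kind divides each observed I/F by a photometric model evaluated at the observed geometry and multiplies by the model at the standard geometry. The sketch below is a deliberate simplification (a cosine incidence correction plus a cubic polynomial in phase angle only, with hypothetical coefficients) rather than the 3rd-order multivariate polynomial in (i, e, g) the abstract describes:

    ```python
    import math

    def cubic(coeffs, x):
        """Evaluate c0 + c1*x + c2*x^2 + c3*x^3."""
        c0, c1, c2, c3 = coeffs
        return c0 + c1 * x + c2 * x ** 2 + c3 * x ** 3

    def normalize_iof(iof, inc_deg, phase_deg, coeffs,
                      std_inc=30.0, std_phase=30.0):
        """Normalize an observed I/F to the standard geometry
        (i = 30 deg, g = 30 deg) by ratioing a simple photometric model
        at the observed and standard geometries."""
        limb = math.cos(math.radians(inc_deg)) / math.cos(math.radians(std_inc))
        phase = cubic(coeffs, phase_deg) / cubic(coeffs, std_phase)
        return iof / (limb * phase)
    ```

    At the standard geometry both correction factors reduce to 1, so the normalization leaves I/F unchanged there; elsewhere it removes the modeled illumination dependence so only intrinsic reflectance differences remain.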

  19. A triple axis double crystal multiple reflection camera for ultra small angle X-ray scattering

    NASA Astrophysics Data System (ADS)

    Lambard, Jacques; Lesieur, Pierre; Zemb, Thomas

    1992-06-01

    To extend the domain of small angle X-ray scattering requires multiple-reflection crystals to collimate the beam. A double-crystal, triple-axis X-ray camera using multiply reflecting channel-cut crystals is described. Procedures for measuring the desmeared scattering cross-section on an absolute scale are described, as well as measurements from several typical samples: fibrils of collagen, 0.3 μm diameter silica spheres, 0.16 μm diameter interacting latex spheres, porous lignite coal, liquid crystals in a surfactant-water system, and a colloidal crystal of 0.32 μm diameter silica spheres.

  20. Development of a large-angle pinhole gamma camera with depth-of-interaction capability for small animal imaging

    NASA Astrophysics Data System (ADS)

    Baek, C.-H.; An, S. J.; Kim, H.-I.; Choi, Y.; Chung, Y. H.

    2012-01-01

    A large-angle gamma camera was developed for imaging small-animal models used in medical and biological research. A simulation study shows that a large field of view (FOV) system provides higher sensitivity than typical pinhole gamma cameras by reducing the distance between the pinhole and the object. However, such a gamma camera suffers from degradation of the spatial resolution in the periphery due to parallax error from obliquely incident photons. We propose a new method to measure the depth of interaction (DOI) using three layers of monolithic scintillators to reduce the parallax error. The detector module consists of three layers of monolithic CsI(Tl) crystals with dimensions of 50.0 × 50.0 × 2.0 mm³, a Hamamatsu H8500 PSPMT, and a large-angle pinhole collimator with an acceptance angle of 120°. The 3-dimensional event positions were determined by a maximum-likelihood position-estimation (MLPE) algorithm and a pre-generated look-up table (LUT). The spatial resolution (FWHM) for a Co-57 point-like source was measured at different source positions with the conventional method (Anger logic) and with DOI information. We showed that high sensitivity can be achieved without degradation of spatial resolution using a large-angle pinhole gamma camera; this system can be used as a small-animal imaging tool.
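
    The MLPE step can be sketched as a search over a pre-generated look-up table of expected light patterns, scoring each candidate (x, y, DOI-layer) entry with a Poisson log-likelihood against the observed photosensor counts. The table contents and channel layout below are hypothetical, for illustration only:

    ```python
    import math

    def mlpe(observed, lut):
        """Pick the LUT entry -- a candidate (x, y, DOI-layer) key mapped to
        the expected photosensor counts for an event at that position -- that
        maximizes the Poisson log-likelihood of the observed channel counts."""
        best_key, best_ll = None, -math.inf
        for key, expected in lut.items():
            # Poisson log-likelihood up to a constant: sum(o*ln(e) - e)
            ll = sum(o * math.log(e) - e for o, e in zip(observed, expected))
            if ll > best_ll:
                best_key, best_ll = key, ll
        return best_key
    ```

    In practice the LUT is generated in advance (by calibration or simulation) for a dense grid of positions in each crystal layer, so the estimate returns 3D coordinates including the DOI layer.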

  1. Copernican craters: Early results from the Lunar Reconnaissance Orbiter Camera

    Microsoft Academic Search

    A. S. McEwen; H. Hiesinger; P. C. Thomas; M. S. Robinson; C. van der Bogert; L. Ostrach; J. B. Plescia; V. J. Bray; L. L. Tornabene

    2009-01-01

    The youngest (Copernican) craters on the Moon provide the best examples of original crater morphology and a record of the impact flux over the last ~1 Ga in the Earth-Moon system. The LRO Narrow Angle Cameras (NAC) provide 50 cm pixels from an altitude of 50 km. With changing incidence angle, global access, and very high data rates, these cameras

  2. Numerical simulations of the bending of narrow-angle-tail radio jets by ram pressure or pressure gradients

    NASA Technical Reports Server (NTRS)

    Soker, Noam; Sarazin, Craig L.; O'Dea, Christopher P.

    1988-01-01

    Three-dimensional numerical hydrodynamic simulations are used to study the bending of radio jets. The simulations are compared with observations of jets in narrow-angle-tail radio sources. Two mechanisms for the observed bending are considered: direct bending of quasi-continuous jets by ram pressure from intergalactic gas and bending by pressure gradients in the interstellar gas of the host galaxy, the pressure gradients themselves being the result of ram pressure by intergalactic gas. It is shown that the pressure gradients are much less effective in bending jets, implying that the jets have roughly 30 times lower momentum fluxes if they are bent by this mechanism. Ram-pressure bending produces jets with 'kidney-shaped' cross sections; when observed from the side, these jets appear to have diffuse extensions on the downstream side. On the other hand, pressure-gradient bending causes the jets to be densest near their upstream side.
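
    The momentum-flux comparison in this abstract can be illustrated with the standard order-of-magnitude balance for ram-pressure bending (a textbook relation, not taken from the abstract): the bending radius scales with the jet momentum flux divided by the ambient ram pressure.

    ```python
    def bending_radius(rho_jet, v_jet, rho_ambient, v_galaxy, r_jet):
        """Order-of-magnitude bending radius for a jet of radius r_jet:
        R_bend ~ r_jet * (rho_jet * v_jet**2) / (rho_ambient * v_galaxy**2).
        A jet with lower momentum flux (rho_jet * v_jet**2) bends more
        sharply for the same external ram pressure."""
        return r_jet * (rho_jet * v_jet ** 2) / (rho_ambient * v_galaxy ** 2)
    ```

    Under this scaling, producing the same observed curvature with a mechanism that applies ~30 times less effective pressure requires a jet with ~30 times lower momentum flux, which is the sense of the comparison drawn in the abstract.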

  3. Early direct-injection, low-temperature combustion of diesel fuel in an optical engine utilizing a 15-hole, dual-row, narrow-included-angle nozzle.

    SciTech Connect

    Gehrke, Christopher R. (Caterpillar Inc.); Radovanovic, Michael S. (Caterpillar Inc.); Milam, David M. (Caterpillar Inc.); Martin, Glen C.; Mueller, Charles J.

    2008-04-01

    Low-temperature combustion of diesel fuel was studied in a heavy-duty, single-cylinder optical engine employing a 15-hole, dual-row, narrow-included-angle nozzle (10 holes × 70° and 5 holes × 35°) with 103-μm-diameter orifices. This nozzle configuration provided the spray targeting necessary to contain the direct-injected diesel fuel within the piston bowl for injection timings as early as 70° before top dead center. Spray-visualization movies, acquired using a high-speed camera, show that impingement of liquid fuel on the piston surface can result when the in-cylinder temperature and density at the time of injection are sufficiently low. Seven single- and two-parameter sweeps around a 4.82-bar gross indicated mean effective pressure load point were performed to map the sensitivity of the combustion and emissions to variations in injection timing, injection pressure, equivalence ratio, simulated exhaust-gas recirculation, intake temperature, intake boost pressure, and load. High-speed movies of natural luminosity were acquired by viewing through a window in the cylinder wall and through a window in the piston to provide quasi-3D information about the combustion process. These movies revealed that advanced combustion phasing resulted in intense pool fires within the piston bowl, after the end of significant heat release. These pool fires are a result of fuel films created when the injected fuel impinged on the piston surface. The emissions results showed a strong correlation with pool-fire activity. Smoke and NOx emissions rose steadily as pool-fire intensity increased, whereas HC and CO showed a dramatic increase with near-zero pool-fire activity.

  4. Angles

    NSDL National Science Digital Library

    Jo Edkins

    2007-01-01

    This set of eight interactive activities lets the user explore angles from many different perspectives. Activities include (1) visualizing the size of an angle; (2) examining objects that will stand or fall with right and non-right angles; (3) identifying obtuse, right, acute and straight angles; (4) guessing angle measures with different levels of precision; (5) exploring regular shapes and their angle measures; (6) studying angles in a fractal tree that is drawn with user inputs of the same angle measure between the branches at each stage; (7) exploring angle measures through firing a cannon; and (8) drawing with a Logo activity.

  5. Influence of the angle of incidence on the sensitivity of gamma camera based PET

    Microsoft Academic Search

    Stefaan Vandenberghe; Yves D'Asseler; Jeff Kolthammer; Rik Van de Walle; Ignace Lemahieu; Rudi A. Dierckx

    2002-01-01

    Thicker crystals have been used to increase the detection efficiency of gamma cameras for coincidence imaging. This results in a higher detection probability for oblique incidences than for perpendicular incidences. As the point sensitivity at different radial distances is composed of coincidences with different oblique incidences, the thickness of the crystal will have an effect on the sensitivity profiles. To

  6. Angles

    NSDL National Science Digital Library

    Practice your knowledge of acute, obtuse, and alternate angles. Also, practice relationships between angles - vertical, adjacent, alternate, same-side, and corresponding. Angles is one of the Interactivate assessment explorers.

  7. Enantiopure narrow bite-angle P-OP ligands: synthesis and catalytic performance in asymmetric hydroformylations and hydrogenations.

    PubMed

    Fernández-Pérez, Héctor; Benet-Buchholz, Jordi; Vidal-Ferran, Anton

    2014-11-17

    Herein is reported the preparation of a set of narrow bite-angle P-OP ligands whose backbone contains a stereogenic carbon atom. The synthesis was based on a Corey-Bakshi-Shibata (CBS)-catalyzed asymmetric reduction of phosphomides. The structure of the resulting 1,1-P-OP ligands, which was selectively tuned through an adequate combination of the configuration of the stereogenic carbon atom, its substituent, and the phosphite fragment, proved crucial for providing a rigid environment around the metal center, as evidenced by X-ray crystallography. These new ligands displayed very good catalytic properties in the Rh-mediated enantioselective hydrogenation and hydroformylation of challenging and model substrates (up to 99 % ee). Whereas for asymmetric hydrogenation the optimal P-OP ligand depended on the substrate, for hydroformylation a single ligand was the highest-performing one for almost all studied substrates: it contains an R-configured stereogenic carbon atom between the two phosphorus ligating groups, and an S-configured 3,3'-diphenyl-substituted biaryl unit. PMID:25335770

  8. Optical system design for wide-angle airborne mapping camera with diffractive optical element

    NASA Astrophysics Data System (ADS)

    Niu, Hai-Jun; Zhang, Jian; Yan, A.-qi; Leng, Han-bing; Fei, Jia-qi; Wu, Deng-shan; Cao, Jian-zhong

    2015-02-01

    With the development of digital airborne photogrammetry technology, higher performance is required of the optical systems for airborne mapping cameras, such as longer focal lengths and wider fields of view (FOV); at the same time, secondary-spectrum correction becomes more important and more difficult in the optical design. A high-performance optical system for an airborne mapping camera with a 200 mm focal length and a 2ω = 60° FOV is designed in this paper. The working wavelength range is 430 nm to 885 nm. A two-layer HDOE with a negative dispersion characteristic is used to eliminate the secondary spectrum in the optical design. The diffraction efficiency of the designed two-layer HDOE is up to 90%. The design results show that the MTF in all fields is over 0.5 at 90 lp/mm, indicating excellent image quality. A thermal analysis over the temperature range from -20°C to 40°C shows that image quality is maintained across this range, meeting the design requirements.

  9. A crystal camera for ultra-small-angle x-ray scattering using synchrotron radiation.

    PubMed

    Pahl, R; Bonse, U

    1995-01-01

    We describe a novel USAXS camera that combines the use of synchrotron radiation with collimation by perfect-crystal optics. The outstanding result is that high measuring intensities and extreme angular resolution are achieved even with a point-focusing geometry. Along the principles of the original design (U. Bonse and M. Hart, Z. Phys. 189, 151 (1966)), which had to be operated on an x-ray tube, we employ two sets of pairs of multiply reflecting channel-cut crystals diffracting in the horizontal and vertical planes. The collimation characteristics thus obtained are equivalent to the point-focusing geometry of conventional SAXS cameras based on slit collimation. We present results from samples of polystyrene spheres which were used for test measurements performed with synchrotron radiation of DORIS at HASYLAB/DESY in Hamburg. Taking into account the number of reflections within the channel-cut crystals, the theoretical resolution was calculated and found to agree well with that derived from measured scattering patterns. Structures as large as about 1.3 μm could easily be identified from the scattering curves. As expected with point-focusing geometry, desmearing of raw data was unnecessary. PMID:21307506

  10. Angles

    NSDL National Science Digital Library

    Shodor Education Foundation

    2004-01-01

    This Java applet enables students to investigate acute, obtuse, and right angles. The student decides to work with one or two transversals and a pair of parallel lines. Angle measure is given for one angle. The student answers a short series of questions about the size of other angles, identifying relationships such as vertical and adjacent angles and alternate interior and alternate exterior angles. In addition to automatically checking the student's answers, the applet can keep score of correct answers. From the activity page, What, How, and Why buttons open pages that explain the activity's purpose, function, and how the mathematics fits into the curriculum. Supplemental resources include lesson plans and a handout with a grid for showing the relationship between all possible angles that occur when parallel lines are cut by a transversal. Copyright 2005 Eisenhower National Clearinghouse

  11. Matching the Best Viewing Angle in Depth Cameras for Biomass Estimation Based on Poplar Seedling Geometry

    PubMed Central

    Andújar, Dionisio; Fernández-Quintanilla, César; Dorado, José

    2015-01-01

    In energy crops for biomass production a proper plant structure is important to optimize wood yields. A precise crop characterization in early stages may contribute to the choice of proper cropping techniques. This study assesses the potential of the Microsoft Kinect for Windows v.1 sensor to determine the best viewing angle of the sensor to estimate the plant biomass based on poplar seedling geometry. Kinect Fusion algorithms were used to generate a 3D point cloud from the depth video stream. The sensor was mounted in different positions facing the tree in order to obtain depth (RGB-D) images from different angles. Individuals of two different ages, e.g., one month and one year old, were scanned. Four different viewing angles were compared: top view (0°), 45° downwards view, front view (90°) and ground upwards view (-45°). The ground-truth used to validate the sensor readings consisted of a destructive sampling in which the height, leaf area and biomass (dry weight basis) were measured in each individual plant. The depth image models agreed well with 45°, 90° and -45° measurements in one-year poplar trees. Good correlations (0.88 to 0.92) between dry biomass and the area measured with the Kinect were found. In addition, plant height was accurately estimated with a few centimeters error. The comparison between different viewing angles revealed that top views showed poorer results because the top leaves occluded the rest of the tree. However, the other views led to good results. Conversely, small poplars showed better correlations with actual parameters from the top view (0°). Therefore, although the Microsoft Kinect for Windows v.1 sensor provides good opportunities for biomass estimation, the viewing angle must be chosen taking into account the developmental stage of the crop and the desired parameters. The results of this study indicate that Kinect is a promising tool for a rapid canopy characterization, i.e., for estimating crop biomass production, with several important advantages: low cost, low power needs and a high frame rate (frames per second) when dynamic measurements are required. PMID:26053748
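
    The validation step reported here, correlating a depth-camera area proxy against destructively sampled dry biomass, amounts to a simple least-squares regression with a correlation coefficient. A minimal sketch (function name and data are illustrative, not the study's code):

    ```python
    def linear_fit_r(areas, biomass):
        """Fit biomass ~ a + b*area by least squares and return (a, b, r),
        where r is the Pearson correlation coefficient used to judge how
        well the sensor-derived area predicts dry biomass."""
        n = len(areas)
        mx = sum(areas) / n
        my = sum(biomass) / n
        sxx = sum((x - mx) ** 2 for x in areas)
        syy = sum((y - my) ** 2 for y in biomass)
        sxy = sum((x - mx) * (y - my) for x, y in zip(areas, biomass))
        b = sxy / sxx
        a = my - b * mx
        r = sxy / (sxx * syy) ** 0.5
        return a, b, r
    ```

    Correlations in the 0.88 to 0.92 range, as reported for the one-year trees, would correspond to r values from fits of exactly this form, one per viewing angle.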

  12. Matching the Best Viewing Angle in Depth Cameras for Biomass Estimation Based on Poplar Seedling Geometry.

    PubMed

    Andújar, Dionisio; Fernández-Quintanilla, César; Dorado, José

    2015-01-01

    In energy crops for biomass production a proper plant structure is important to optimize wood yields. A precise crop characterization in early stages may contribute to the choice of proper cropping techniques. This study assesses the potential of the Microsoft Kinect for Windows v.1 sensor to determine the best viewing angle of the sensor to estimate the plant biomass based on poplar seedling geometry. Kinect Fusion algorithms were used to generate a 3D point cloud from the depth video stream. The sensor was mounted in different positions facing the tree in order to obtain depth (RGB-D) images from different angles. Individuals of two different ages, e.g., one month and one year old, were scanned. Four different viewing angles were compared: top view (0°), 45° downwards view, front view (90°) and ground upwards view (-45°). The ground-truth used to validate the sensor readings consisted of a destructive sampling in which the height, leaf area and biomass (dry weight basis) were measured in each individual plant. The depth image models agreed well with 45°, 90° and -45° measurements in one-year poplar trees. Good correlations (0.88 to 0.92) between dry biomass and the area measured with the Kinect were found. In addition, plant height was accurately estimated with a few centimeters error. The comparison between different viewing angles revealed that top views showed poorer results because the top leaves occluded the rest of the tree. However, the other views led to good results. Conversely, small poplars showed better correlations with actual parameters from the top view (0°). Therefore, although the Microsoft Kinect for Windows v.1 sensor provides good opportunities for biomass estimation, the viewing angle must be chosen taking into account the developmental stage of the crop and the desired parameters. The results of this study indicate that Kinect is a promising tool for a rapid canopy characterization, i.e., for estimating crop biomass production, with several important advantages: low cost, low power needs and a high frame rate (frames per second) when dynamic measurements are required. PMID:26053748

  13. The wavelength dependence of the lunar phase curve as seen by the Lunar Reconnaissance Orbiter wide-angle camera

    NASA Astrophysics Data System (ADS)

    Hapke, Bruce; Denevi, Brett; Sato, Hiroyuki; Braden, Sarah; Robinson, Mark

    2012-03-01

    The Lunar Reconnaissance Orbiter wide-angle camera measured the bidirectional reflectances of two areas on the Moon at seven wavelengths between 321 and 689 nm and at phase angles between 0° and 120°. It is not possible to account for the phase curves unless both coherent backscatter and shadow hiding contribute to the opposition effect. For the analyzed highlands area, coherent backscatter contributes nearly 40% in the UV, increasing to over 60% in the red. This conclusion is supported by laboratory measurements of the circular polarization ratios of Apollo regolith samples, which also indicate that the Moon's opposition effect contains a large component of coherent backscatter. The angular width of the lunar opposition effect is almost independent of wavelength, contrary to theories of the coherent backscatter which, for the Moon, predict that the width should be proportional to the square of the wavelength. When added to the large body of other experimental evidence, this lack of wavelength dependence reinforces the argument that our current understanding of the coherent backscatter opposition effect is incomplete or perhaps incorrect. It is shown that phase reddening is caused by the increased contribution of interparticle multiple scattering as the wavelength and albedo increase. Hence, multiple scattering cannot be neglected in lunar photometric analyses. A simplified semiempirical bidirectional reflectance function is proposed for the Moon that contains four free parameters and that is mathematically simple and straightforward to invert. This function should be valid everywhere on the Moon for phase angles less than about 120°, except at large viewing and incidence angles close to the limb, terminator, and poles.

  14. field are seen in the THEMIS data (Fig. 2), yet are absent or barely discernible in the MOC wide-angle camera (MOC WA) and Mars

    E-print Network

…wide-angle camera (MOC WA) and Mars Orbiter Laser Altimeter topographic data. One MOC NA image crosses several. In the Opportunity landing site crater, ~1/2-m-thick, finely layered units are exposed in the rim and embayed; whatever aqueous process altered, and perhaps formed, the layered units at the landing site must have…

  15. Early direct-injection, low-temperature combustion of diesel fuel in an optical engine utilizing a 15-hole, dual-row, narrow-included-angle nozzle

    Microsoft Academic Search

    Christopher R. Gehrke; Michael S. Radovanovic; David M. Milam; Glen C. Martin; Charles J. Mueller

    2008-01-01

Low-temperature combustion of diesel fuel was studied in a heavy-duty, single-cylinder optical engine employing a 15-hole, dual-row, narrow-included-angle nozzle (10 holes × 70° and 5 holes × 35°) with 103-μm-diameter orifices. This nozzle configuration provided the spray targeting necessary to contain the direct-injected diesel fuel within the piston bowl for injection timings as early as 70° before top dead center.

  16. 2. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

2. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH CAMERA STATION ABOVE LOOKING WEST TAKEN FROM RESERVOIR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  17. 6. VAL CAMERA CAR, DETAIL OF COMMUNICATION EQUIPMENT INSIDE CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

6. VAL CAMERA CAR, DETAIL OF COMMUNICATION EQUIPMENT INSIDE CAMERA CAR WITH CAMERA MOUNT IN FOREGROUND. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  18. 7. VAL CAMERA CAR, DETAIL OF 'FLARE' OR TRAJECTORY CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

7. VAL CAMERA CAR, DETAIL OF 'FLARE' OR TRAJECTORY CAMERA INSIDE CAMERA CAR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  19. 1060-nm multi quantum well diode lasers with narrow vertical divergence angle of 8° and high internal efficiency

    Microsoft Academic Search

    A. Pietrzak; P. Crump; F. Bugge; H. Wenzel; G. Erbert; G. Trankle

    2009-01-01

MQW 1060 nm structures with an extremely thick 8.64 μm waveguide, resulting in an 8° vertical divergence angle, have been grown and tested. Measurements of uncoated lasers promise high-optical-power operation with a nearly circular beam shape.

  20. Mars Global Surveyor Mars Orbiter Camera Image Gallery

    NSDL National Science Digital Library

    Malin Space Science Systems

    This site from Malin Space Science Systems provides access to all of the images acquired by the Mars Orbiter Camera (MOC) during the Mars Global Surveyor mission through March 2005. MOC consists of several cameras: A narrow angle system that provides grayscale high resolution views of the planet's surface (typically, 1.5 to 12 meters/pixel), and red and blue wide angle cameras that provide daily global weather monitoring, context images to determine where the narrow angle views were actually acquired, and regional coverage to monitor variable surface features such as polar frost and wind streaks. Ancillary data for each image is provided and instructions regarding gallery usage are also available on the site.

  1. Camera Calibration with Super-Wide-Angle and Low-Distortion Lens Using Higher Degree Polynomial Model

    E-print Network

    Ohya, Akihisa

[Extraction of this Japanese-language abstract is incomplete. Recoverable content: the work calibrates a super-wide-angle, low-distortion lens (Theia Technologies LLC) using OpenCV 2.4 [1, 2]. The pinhole projection model x = fx·X/Z + cx, y = fy·Y/Z + cy relates a 3D point (X, Y, Z) to image coordinates (x, y) via the camera intrinsic parameters fx, fy, cx, cy; lens distortion is handled with a higher-degree polynomial extension of the Brown model [3]. Graduate School of Systems and Information Engineering, University of Tsukuba.]
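The pinhole projection that survives in the extracted fragment (x = fx·X/Z + cx, y = fy·Y/Z + cy) can be sketched directly. This is a minimal illustration of the standard model only, not the paper's polynomial distortion extension, and the intrinsic values below are made up:

```python
# Minimal sketch of the pinhole projection x = fx*X/Z + cx, y = fy*Y/Z + cy.
# The intrinsic values used below are illustrative, not from the paper.

def project(point, fx, fy, cx, cy):
    """Project a 3D camera-frame point (X, Y, Z) to pixel coordinates."""
    X, Y, Z = point
    if Z <= 0:
        raise ValueError("point must lie in front of the camera (Z > 0)")
    return (fx * X / Z + cx, fy * Y / Z + cy)

# A point one meter ahead of the camera and slightly off-axis:
x, y = project((0.1, -0.05, 1.0), fx=800.0, fy=800.0, cx=320.0, cy=240.0)
# x = 800*0.1/1 + 320 = 400.0; y = 800*(-0.05)/1 + 240 = 200.0
```

A distortion model such as Brown's would be applied to the normalized coordinates (X/Z, Y/Z) before the multiplication by fx, fy; wide-angle lenses are exactly where those extra polynomial terms matter.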

  2. LRO Camera Imaging of the Moon: Apollo 17 and other Sites for Ground Truth

    Microsoft Academic Search

    B. L. Jolliff; S. M. Wiseman; M. S. Robinson; S. Lawrence; B. W. Denevi; J. F. Bell

    2009-01-01

    One of the fundamental goals of the Lunar Reconnaissance Orbiter (LRO) is the determination of mineralogic and compositional distributions and their relation to geologic features on the Moon's surface. Through a combination of imaging with the LRO narrow-angle cameras and wide-angle camera (NAC, WAC), very fine-scale geologic features are resolved with better than meter-per-pixel resolution (NAC) and correlated to spectral

  3. Why do I sometimes see bright speckles in an image of the Terrain product, particularly at the oblique camera angles?

    Atmospheric Science Data Center

    2014-12-08

    MISR Level 1B2 data products use various high data values to signify fill, and one of the fill values (16377) in the 14 MSB's of the scaled radiances signifies that this location on the SOM grid was obscured from the camera's view by...

  4. 3. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

3. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH THE VAL TO THE RIGHT, LOOKING NORTHEAST. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  5. Automated texture mapping of 3D city models with images of wide-angle and light small combined digital camera system for UAV

    NASA Astrophysics Data System (ADS)

    Gui, De-zhu; Lin, Zong-jian; Zhang, Cheng-cheng; Zhi, Xiao-dong

    2009-10-01

To overcome the disadvantages of a single non-metric camera with small format and short baseline, a wide-angle camera system combining four digital cameras (UAV-LAC) was developed to meet low-altitude photogrammetric requirements for acquiring high-resolution texture of both facades and rooftops for 3D city models. A relative self-calibration method based on tie points in the overlapping areas of each pair of sub-images compensates for deformation errors due to the lightweight, simply constructed mechanical frame. AAT (automatic aerial triangulation) is used to rapidly retrieve pose parameters of the mosaic image and the original inclined images. Space forward intersection is then used to gradually refine the building model while matching each spatial edge of the building. Next, the textures are geometrically rectified, and the best textures for walls and roofs are selected by taking into account occlusion, image resolution, surface normal orientation, and coherence with neighboring triangles. Finally, by incorporating the necessary human interaction into the texture reconstruction algorithm, a semiautomatic system is designed to reconstruct texture from UAV-LAC imagery.

6. 1. VARIABLE-ANGLE LAUNCHER CAMERA CAR, VIEW OF CAMERA CAR AND ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

1. VARIABLE-ANGLE LAUNCHER CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH CAMERA STATION ABOVE LOOKING NORTH TAKEN FROM RESERVOIR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

7. A Location Awareness System using Wide-angle Camera and Active IR-Tag

    Microsoft Academic Search

    Muneyuki Sakata; Yoshihiro Yasumuro; Masataka Imura; Yoshitsugu Manabe; Kunihiro Chihara

    2002-01-01

This paper proposes a new location-awareness system, ALTAIR (Automatic Location Tracking with Active IR-tag), that automatically detects and tracks the location of mobile PC (Personal Computer) users. The IR (InfraRed) tag is stably detected and distinguished by an IR-filtering camera. The combination of the IR-tag and wireless LAN (Local Area Network) enables ALTAIR to control the IR-tags through the network to perform stable

  8. A survey of Martian dust devil activity using Mars Global Surveyor Mars Orbiter Camera images

    Microsoft Academic Search

    Jenny A. Fisher; Mark I. Richardson; Claire E. Newman; Mark A. Szwast; Chelsea Graf; Shabari Basu; Shawn P. Ewald; Anthony D. Toigo; R. John Wilson

    2005-01-01

    A survey of dust devils using the Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) wide- and narrow-angle (WA and NA) images has been undertaken. The survey comprises two parts: (1) sampling of nine broad regions from September 1997 to July 2001 and (2) a focused seasonal monitoring of variability in the Amazonis region, an active dust devil site, from

  9. LROC - Lunar Reconnaissance Orbiter Camera

    NASA Astrophysics Data System (ADS)

    Robinson, M. S.; Bowman-Cisneros, E.; Brylow, S. M.; Eliason, E.; Hiesinger, H.; Jolliff, B. L.; McEwen, A. S.; Malin, M. C.; Roberts, D.; Thomas, P. C.; Turtle, E.

    2006-12-01

    The Lunar Reconnaissance Orbiter Camera (LROC) is designed to address two of the prime LRO measurement requirements. 1) Assess meter and smaller-scale features to facilitate safety analysis for potential lunar landing sites near polar resources, and elsewhere on the Moon. 2) Acquire multi-temporal synoptic imaging of the poles every orbit to characterize the polar illumination environment (100 m scale), identifying regions of permanent shadow and permanent or near-permanent illumination over a full lunar year. The LROC consists of two narrow-angle camera components (NACs) to provide 0.5-m scale panchromatic images over a 5-km swath, a wide-angle camera component (WAC) to provide images at a scale of 100 and 400 m in seven color bands over a 100-km swath, and a common Sequence and Compressor System (SCS). In addition to acquiring the two LRO prime measurement sets, LROC will return six other high-value datasets that support LRO goals, the Robotic Lunar Exploration Program (RLEP), and basic lunar science. These additional datasets include: 3) meter-scale mapping of regions of permanent or near-permanent illumination of polar massifs; 4) multiple co-registered observations of portions of potential landing sites and elsewhere for derivation of high-resolution topography through stereogrammetric and photometric stereo analyses; 5) a global multispectral map in 7 wavelengths (300-680 nm) to characterize lunar resources, in particular ilmenite; 6) a global 100-m/pixel basemap with incidence angles (60-80°) favorable for morphologic interpretations; 7) sub-meter imaging of a variety of geologic units to characterize physical properties, variability of the regolith, and key science questions; and 8) meter-scale coverage overlapping with Apollo era Panoramic images (1-2 m/pixel) to document the number of small impacts since 1971-1972, to ascertain hazards for future surface operations and interplanetary travel.

  10. Wide Angle Movie

    NASA Technical Reports Server (NTRS)

    1999-01-01

This brief movie illustrates the passage of the Moon through the Saturn-bound Cassini spacecraft's wide-angle camera field of view as the spacecraft passed by the Moon on the way to its closest approach with Earth on August 17, 1999. From beginning to end of the sequence, 25 wide-angle images (with a spatial scale of about 14 miles (about 23 kilometers) per pixel) were taken over the course of 7 1/2 minutes through a series of narrow- and broadband spectral filters and polarizers, ranging from the violet to the near-infrared regions of the spectrum, to calibrate the spectral response of the wide-angle camera. The exposure times range from 5 milliseconds to 1.5 seconds. Two of the exposures were smeared and have been discarded and replaced with nearby images to make a smooth movie sequence. All images were scaled so that the brightness of Crisium basin, the dark circular region in the upper right, is approximately the same in every image. The imaging data were processed and released by the Cassini Imaging Central Laboratory for Operations (CICLOPS) at the University of Arizona's Lunar and Planetary Laboratory, Tucson, AZ.

    Photo Credit: NASA/JPL/Cassini Imaging Team/University of Arizona

    Cassini, launched in 1997, is a joint mission of NASA, the European Space Agency and Italian Space Agency. The mission is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Space Science, Washington DC. JPL is a division of the California Institute of Technology, Pasadena, CA.

  11. Camera Obscura

    NSDL National Science Digital Library

    Mr. Engelman

    2008-10-28

Before photography was invented there was the camera obscura, useful for studying the sun, as an aid to artists, and for general entertainment. What is a camera obscura and how does it work? Camera = Latin for room; Obscura = Latin for dark. But what is a Camera Obscura? The Magic Mirror of Life. [Image captions: A French drawing camera with supplies; Drawing Camera Obscuras with Lens at the top.] Read the first three paragraphs of this article. Under the portion Early Observations and Use in Astronomy you will find the answers to the ...

  12. 8. VAL CAMERA CAR, CLOSEUP VIEW OF 'FLARE' OR TRAJECTORY ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

8. VAL CAMERA CAR, CLOSE-UP VIEW OF 'FLARE' OR TRAJECTORY CAMERA ON SLIDING MOUNT. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  13. Camera Animation

    NSDL National Science Digital Library

    A general discussion of the use of cameras in computer animation. This section includes principles of traditional film techniques and suggestions for the use of a camera during an architectural walkthrough. This section includes html pages, images and one video.

  14. Camera Committee 

    E-print Network

    Unknown

    2011-08-17

    objects on stereoscopic still video images. Digital picture elements (pixels) were used as units of measurement. A scale model was devised to emulate low altitude videography. Camera distance was set at 1500 cm to simulate a flight altitude of 1500 feet... above ground level. Accordingly, the model was designed for rods 40 to 100 cm long to represent poles measuring 40 to 100 feet in height. Absolute orientation of each stereoscopic image was obtained by surveying each nadir, camera location and camera...

  15. LROC - Lunar Reconnaissance Orbiter Camera

    NASA Astrophysics Data System (ADS)

    Robinson, M. S.; Eliason, E.; Hiesinger, H.; Jolliff, B. L.; McEwen, A.; Malin, M. C.; Ravine, M. A.; Thomas, P. C.; Turtle, E. P.

    2009-12-01

    The Lunar Reconnaissance Orbiter (LRO) went into lunar orbit on 23 June 2009. The LRO Camera (LROC) acquired its first lunar images on June 30 and commenced full scale testing and commissioning on July 10. The LROC consists of two narrow-angle cameras (NACs) that provide 0.5 m scale panchromatic images over a combined 5 km swath, and a wide-angle camera (WAC) to provide images at a scale of 100 m per pixel in five visible wavelength bands (415, 566, 604, 643, and 689 nm) and 400 m per pixel in two ultraviolet bands (321 nm and 360 nm) from the nominal 50 km orbit. Early operations were designed to test the performance of the cameras under all nominal operating conditions and provided a baseline for future calibrations. Test sequences included off-nadir slews to image stars and the Earth, 90° yaw sequences to collect flat field calibration data, night imaging for background characterization, and systematic mapping to test performance. LRO initially was placed into a terminator orbit resulting in images acquired under low signal conditions. Over the next three months the incidence angle at the spacecraft’s equator crossing gradually decreased towards high noon, providing a range of illumination conditions. Several hundred south polar images were collected in support of impact site selection for the LCROSS mission; details can be seen in many of the shadows. Commissioning phase images not only proved the instruments’ overall performance was nominal, but also that many geologic features of the lunar surface are well preserved at the meter-scale. 
Of particular note is the variety of impact-induced morphologies preserved in a near-pristine state in and around kilometer-scale and larger young Copernican-age impact craters. These include abundant evidence of impact melt with a variety of rheological properties, such as coherent flows whose surface textures and planimetric properties reflect supersolidus (e.g., liquid melt) emplacement; blocks delicately perched on terraces and rims; and both large and small radial and circumferential ejecta patterns extending out more than a crater diameter, reflecting their ballistic emplacement and their interaction with pre-existing topography and with topography created by earlier ejecta. Early efforts at reducing NAC stereo observations to topographic models show spatial resolutions of 2.5 m to 5 m will be possible from the 50 km orbit. Systematic seven-color WAC observations will commence at the beginning of the primary mapping phase. A key goal of the LROC experiment is to characterize future exploration targets in cooperation with the NASA Constellation program. By the end of the commissioning phase all fifty high-priority targets will have partial reconnaissance-mode coverage (0.5 m to 2 m per pixel).

  16. Dynamics of an oscillating bubble in a narrow gap

    E-print Network

    Azam, Fahad Ibn

    The complex dynamics of a single bubble of a few millimeters in size oscillating inside a narrow fluid-filled gap between two parallel plates is studied using high-speed videography. Two synchronized high-speed cameras ...

  17. Changes in local energy spectra with SPECT rotation for two Anger cameras

    SciTech Connect

    Koral, K.F.; Luo, J.Q.; Ahmad, W.; Buchbinder, S.; Ficaro, E.

    1995-08-01

The authors investigated the shift of local energy spectra with SPECT rotation for the GE 400 AT and the Picker Prism 3000 tomographs. A Co-57 flood source was taped to the parallel-beam collimator of the GE 400 AT; a Tc-99m line source was placed at the focus of the fan-beam collimator of one head of the Picker Prism. The count-based method, which employs a narrow window (about 4 keV) on the maximum slope of the photopeak, was used with both systems. Nonlinear polynomial spectral fitting was applied to x-y-E data acquisitions with the GE camera. The fitting yielded either shifts, or shifts and width changes. Results show (1) the shifts are pseudo-sinusoidal with angle and similar for different spatial locations, (2) the average of their absolute value is 0.71 keV and 0.13 keV for the GE and Picker cameras, respectively, (3) width changes for the GE camera are small and appear random, and (4) the shifts calculated with the count-based method for the central part of the GE camera are correlated with those from the spectral fitting method, and are 12% smaller. The conclusion is that energy shifts with angle may be present with many rotating cameras, although they may be smaller with newer cameras. It might be necessary to account for them in schemes designed for high-accuracy compensation of Compton-scattered gamma rays, although they could possibly be ignored for newer cameras.
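A pseudo-sinusoidal shift versus rotation angle, as described above, can be captured with a linear least-squares fit on a sin/cos basis. This is a hypothetical sketch, not the paper's fitting procedure; the 0.71 keV value is borrowed from the reported mean |shift| purely as an illustrative amplitude:

```python
import math

# Hypothetical sketch: model the energy shift as
#   shift(theta) = c + a*sin(theta) + b*cos(theta).
# For angles evenly spaced over a full rotation the basis functions are
# orthogonal, so the least-squares coefficients reduce to Fourier sums.

def fit_shift_curve(angles_deg, shifts_kev):
    n = len(shifts_kev)
    t = [math.radians(a) for a in angles_deg]
    c = sum(shifts_kev) / n
    a = 2.0 * sum(s * math.sin(x) for s, x in zip(shifts_kev, t)) / n
    b = 2.0 * sum(s * math.cos(x) for s, x in zip(shifts_kev, t)) / n
    return c, a, b

# Synthetic shifts with a 0.71 keV amplitude (illustrative value borrowed
# from the GE camera's reported mean |shift|) and an arbitrary phase:
angles = list(range(0, 360, 6))  # one shift estimate per projection angle
shifts = [0.71 * math.sin(math.radians(a) + 0.4) for a in angles]
c, a, b = fit_shift_curve(angles, shifts)
amplitude = math.hypot(a, b)  # recovers the 0.71 keV amplitude
```

Once amplitude and phase are known per camera, a correction of this form could in principle be folded into the scatter-compensation scheme the authors mention.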

  18. Mirror-Based Extrinsic Camera Calibration

    Microsoft Academic Search

    Joel A. Hesch; Anastasios I. Mourikis; Stergios I. Roumeliotis

    2008-01-01

    This paper presents a method for determining the six degrees-of-freedom transformation between a camera and a base frame of interest. A planar mirror is maneuvered so as to allow the camera to observe the environment from sev- eral viewing angles. Points, whose coordinates in the base frame are known, are observed by the camera via their re?ections in the mirror.

  19. LRO Camera Imaging of Constellation Sites

    NASA Astrophysics Data System (ADS)

    Gruener, J.; Jolliff, B. L.; Lawrence, S.; Robinson, M. S.; Plescia, J. B.; Wiseman, S. M.; Li, R.; Archinal, B. A.; Howington-Kraus, A. E.

    2009-12-01

    One of the top priorities for Lunar Reconnaissance Orbiter Camera (LROC) imaging during the "exploration" phase of the mission is thorough coverage of 50 sites selected to represent a wide variety of terrain types and geologic features that are of interest for human exploration. These sites, which are broadly distributed around the Moon and include locations at or near both poles, will provide the Constellation Program with data for a set of targets that represent a diversity of scientific and resource opportunities, thus forming a basis for planning for scientific exploration, resource development, and mission operations including traverse and habitation zone planning. Identification of the Constellation targets is not intended to be a site-selection activity. Sites include volcanic terrains (surfaces with young and old basalt flows, pyroclastic deposits, vents, fissures, domes, low shields, rilles, wrinkle ridges, and lava tubes), impact craters and basins (crater floors, central peaks, terraces and walls; impact-melt and ejecta deposits, basin ring structures; and antipodal terrain), and contacts of geologic features in areas of complex geology. Sites at the poles represent different lighting conditions and include craters with areas of permanent shadow. Sites were also chosen that represent typical feldspathic highlands terrain, areas in the highlands with anomalous compositions, and unusual features such as magnetic anomalies. These sites were reviewed by the Lunar Exploration Analysis Group (LEAG). These sites all have considerable scientific and exploration interest and were derived from previous studies of potential lunar landing sites, supplemented with areas that capitalize on discoveries from recent orbital missions. Each site consists of nested regions of interest (ROI), including 10×10 km, 20×20 km, and 40×40 km areas. 
Within the 10×10 and 20×20 ROIs, the goal is to compile a set of narrow-angle-camera (NAC) observations for a controlled mosaic, photometric and geometric stereo, and images taken at low and high sun to enhance morphology and albedo, respectively. These data will provide the basis for topographic maps, digital elevation models, and slope and boulder hazard maps that could be used to establish landing or habitation zones. Within the 40×40 ROIs, images will be taken to achieve the best possible high-resolution mosaics. All ROIs will have wide-angle-camera context images covering the sites and surrounding areas. At the time of writing (prior to the end of the LRO commissioning phase), over 500 individual NAC frames have been acquired for 47 of the 50 sites. Because of the polar orbit, the majority of repeat coverage occurs for the polar and high latitude sites. Analysis of the environment for several representative Constellation site ROIs will be presented.

  20. Calibration of the Lunar Reconnaissance Orbiter Camera

    NASA Astrophysics Data System (ADS)

    Tschimmel, M.; Robinson, M. S.; Humm, D. C.; Denevi, B. W.; Lawrence, S. J.; Brylow, S.; Ravine, M.; Ghaemi, T.

    2008-12-01

The Lunar Reconnaissance Orbiter Camera (LROC) onboard the NASA Lunar Reconnaissance Orbiter (LRO) spacecraft consists of three cameras: the Wide-Angle Camera (WAC) and two identical Narrow Angle Cameras (NAC-L, NAC-R). The WAC is a push-frame imager with 5 visible wavelength filters (415 to 680 nm) at a spatial resolution of 100 m/pixel and 2 UV filters (315 and 360 nm) with a resolution of 400 m/pixel. In addition to multicolor imaging, the WAC can operate in monochrome mode to provide a global large-incidence-angle basemap and a time-lapse movie of the illumination conditions at both poles. The WAC has a highly linear response, a read noise of 72 e- and a full well capacity of 47,200 e-. The signal-to-noise ratio in each band is 140 in the worst case. There are no out-of-band leaks and the spectral response of each filter is well characterized. Each NAC is a monochrome pushbroom scanner, providing images with a resolution of 50 cm/pixel from a 50-km orbit. A single NAC image has a swath width of 2.5 km and a length of up to 26 km. The NACs are mounted to acquire side-by-side imaging for a combined swath width of 5 km. The NAC is designed to fully characterize future human and robotic landing sites in terms of topography and hazard risks. The North and South poles will be mapped at 1-meter scale poleward of 85.5° latitude. Stereo coverage can be provided by pointing the NACs off-nadir. The NACs are also highly linear. Read noise is 71 e- for NAC-L and 74 e- for NAC-R and the full well capacity is 248,500 e- for NAC-L and 262,500 e- for NAC-R. The focal lengths are 699.6 mm for NAC-L and 701.6 mm for NAC-R; the system MTF is 28% for NAC-L and 26% for NAC-R. The signal-to-noise ratio is at least 46 (terminator scene) and can be higher than 200 (high sun scene). Both NACs exhibit a stray-light feature caused by out-of-field sources, with a magnitude of 1-3%.
However, as this feature is well understood it can be greatly reduced during ground processing. All three cameras were calibrated in the laboratory under ambient conditions. Future thermal vacuum tests will characterize critical behaviors across the full range of lunar operating temperatures. In-flight tests will check for changes in response after launch and provide key data for meeting the requirements of 1% relative and 10% absolute radiometric calibration.
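The quoted signal-to-noise figures are consistent with simple shot-noise-plus-read-noise accounting. As a sketch, assuming these are the only noise terms (the abstract does not state its noise model) and using illustrative signal levels:

```python
import math

# Sketch of CCD signal-to-noise: SNR = S / sqrt(S + r^2), where S is the
# collected signal in electrons and r is the read noise in electrons.
# Assumes shot noise and read noise are the only noise contributors.

def snr(signal_e, read_noise_e):
    return signal_e / math.sqrt(signal_e + read_noise_e ** 2)

# With NAC-L's 71 e- read noise, a signal of roughly 4,500 e- corresponds
# to the quoted terminator-scene SNR of ~46, and roughly 44,500 e- to the
# high-sun SNR of ~200 -- well under the 248,500 e- full well.
low = snr(4500, 71)    # ~46
high = snr(44500, 71)  # ~200
```

The signal levels here are back-solved for illustration; the abstract gives only the SNR endpoints, the read noise, and the full well.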

  1. Camera Projector

    NSDL National Science Digital Library

    Oakland Discovery Center

    2011-01-01

    In this activity (posted on March 14, 2011), learners follow the steps to construct a camera projector to explore lenses and refraction. First, learners use relatively simple materials to construct the projector. Then, learners discover that lenses project images upside down and backwards. They explore this phenomenon by creating their own slides (must be drawn upside down and backwards to appear normally). Use this activity to also introduce learners to spherical aberration and chromatic aberration.

  2. Camera Calibration from Video of a Walking Human

    E-print Network

    Southern California, University of

Camera Calibration from Video of a Walking Human. Fengjun Lv, Member, IEEE, Tao Zhao, Member, IEEE, and Ramakant Nevatia, Fellow, IEEE. Abstract: A self-calibration method to estimate a camera's intrinsic … to various viewing angles and subjects. Index Terms: Camera calibration, self-calibration, vanishing point

  3. Omnidirectional Underwater Camera Design and Calibration

    PubMed Central

    Bosch, Josep; Gracias, Nuno; Ridao, Pere; Ribas, David

    2015-01-01

This paper presents the development of an underwater omnidirectional multi-camera system (OMS) based on a commercially available six-camera system originally designed for land applications. A full calibration method is presented for the estimation of both the intrinsic and extrinsic parameters, which is able to cope with wide-angle lenses and non-overlapping cameras simultaneously. This method is valid for any OMS in both land and water applications. For underwater use, a customized housing is required, which often leads to strong image distortion due to refraction among the different media. This phenomenon makes the basic pinhole camera model invalid for underwater cameras, especially when using wide-angle lenses, and requires the explicit modeling of the individual optical rays. To address this problem, a ray tracing approach has been adopted to create a field-of-view (FOV) simulator for underwater cameras. The simulator allows for the testing of different housing geometries and optics for the cameras to ensure complete hemisphere coverage in underwater operation. This paper describes the design and testing of a compact custom housing for a commercial off-the-shelf OMS camera (Ladybug 3) and presents the first results of its use. A proposed three-stage calibration process allows for the estimation of all of the relevant camera parameters. Experimental results are presented, which illustrate the performance of the calibration method and validate the approach. PMID:25774707
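The refraction that breaks the pinhole model can be illustrated with Snell's law at flat interfaces. This is a simplified one-ray sketch with assumed refractive indices; the paper's simulator traces full 3D rays through the actual housing geometry, which is not reproduced here:

```python
import math

# Simplified sketch: refract a ray at flat interfaces using Snell's law,
# n1*sin(theta1) = n2*sin(theta2). Here one ray is followed through
# air -> glass -> water; the indices below are typical assumed values.

def refract(theta_in, n_in, n_out):
    """Return the refracted angle (radians), or None on total internal reflection."""
    s = n_in * math.sin(theta_in) / n_out
    if abs(s) > 1.0:
        return None  # total internal reflection
    return math.asin(s)

n_air, n_glass, n_water = 1.0, 1.5, 1.33
theta_air = math.radians(40.0)  # ray angle leaving the camera lens
theta_glass = refract(theta_air, n_air, n_glass)
theta_water = refract(theta_glass, n_glass, n_water)
# For parallel flat interfaces, n_air*sin(theta_air) == n_water*sin(theta_water):
# the glass thickness only offsets the ray, and the water-side angle depends
# only on the outer medium. The ray bends toward the normal, shrinking the FOV.
```

This angle compression at the port is exactly why a flat-housed wide-angle camera loses field of view underwater, and why the housing geometry must be simulated to guarantee hemisphere coverage.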

  4. Omnidirectional underwater camera design and calibration.

    PubMed

    Bosch, Josep; Gracias, Nuno; Ridao, Pere; Ribas, David

    2015-01-01

This paper presents the development of an underwater omnidirectional multi-camera system (OMS) based on a commercially available six-camera system originally designed for land applications. A full calibration method is presented for the estimation of both the intrinsic and extrinsic parameters, which is able to cope with wide-angle lenses and non-overlapping cameras simultaneously. This method is valid for any OMS in both land and water applications. For underwater use, a customized housing is required, which often leads to strong image distortion due to refraction among the different media. This phenomenon makes the basic pinhole camera model invalid for underwater cameras, especially when using wide-angle lenses, and requires the explicit modeling of the individual optical rays. To address this problem, a ray tracing approach has been adopted to create a field-of-view (FOV) simulator for underwater cameras. The simulator allows for the testing of different housing geometries and optics for the cameras to ensure complete hemisphere coverage in underwater operation. This paper describes the design and testing of a compact custom housing for a commercial off-the-shelf OMS camera (Ladybug 3) and presents the first results of its use. A proposed three-stage calibration process allows for the estimation of all of the relevant camera parameters. Experimental results are presented, which illustrate the performance of the calibration method and validate the approach. PMID:25774707

  5. New two-dimensional photon camera

    NASA Technical Reports Server (NTRS)

    Papaliolios, C.; Mertz, L.

    1982-01-01

A photon-sensitive camera, applicable to speckle imaging of astronomical sources, high-resolution spectroscopy of faint galaxies in a crossed-dispersion spectrograph, or narrow-band direct imaging of galaxies, is presented. The camera is shown to supply 8-bit by 8-bit photon positions (256 × 256 pixels) for as many as 10^6 photons/sec with a maximum linear resolution of approximately 10 microns. The sequence of photon positions is recorded digitally with a VHS-format video tape recorder or formed into an immediate image via a microcomputer. The four basic elements of the camera are described in detail: a high-gain image intensifier with fast-decay output phosphor, a glass-prism optical beam splitter, a set of Gray-coded masks, and a photomultiplier tube for each mask. The characteristics of the camera are compared to those of other photon cameras.
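Gray-coded masks are attractive for position encoding because adjacent positions differ in only one mask bit, so a photon landing on a cell boundary is decoded at most one pixel off. A minimal sketch of the binary-reflected Gray code follows; the camera's specific mask patterns are not described in the abstract, so this illustrates the code itself, not the instrument's layout:

```python
# Binary-reflected Gray code: adjacent values differ in exactly one bit.
# Eight bits cover the camera's 256 positions per axis, one mask per bit.

def to_gray(n):
    """Encode an integer position as its Gray-code value."""
    return n ^ (n >> 1)

def from_gray(g):
    """Decode a Gray-code value back to the integer position."""
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

codes = [to_gray(i) for i in range(256)]
# Round trip, plus the one-bit-difference property between neighbors:
assert all(from_gray(c) == i for i, c in enumerate(codes))
assert all(bin(codes[i] ^ codes[i + 1]).count("1") == 1 for i in range(255))
```

With one photomultiplier per mask, each detected photon yields one bit per axis per mask, and decoding the two 8-bit Gray words gives the (x, y) pixel directly.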

  6. High-power, narrow-band, high-repetition-rate, 5.9 eV coherent light source using passive optical cavity for laser-based angle-resolved photoelectron spectroscopy.

    PubMed

    Omachi, J; Yoshioka, K; Kuwata-Gonokami, M

    2012-10-01

We demonstrate a scheme for efficient generation of a 5.9 eV coherent light source with an average power of 23 mW, 0.34 meV linewidth, and 73 MHz repetition rate from a Ti:sapphire picosecond mode-locked laser with an output power of 1 W. Second-harmonic light is generated in a passive optical cavity by a BiB3O6 crystal with a conversion efficiency as high as 67%. By focusing the second-harmonic light transmitted from the cavity into a β-BaB2O4 crystal, we obtain fourth-harmonic light at 5.9 eV. This light source offers stable operation for at least a week. We discuss the suitability of the laser light source for high-resolution angle-resolved photoelectron spectroscopy by comparing it with other sources (synchrotron radiation facilities and gas discharge lamps). PMID:23188317

  7. Characterization of previously unidentified lunar pyroclastic deposits using Lunar Reconnaissance Orbiter Camera (LROC) data

    USGS Publications Warehouse

    Gustafson, J. Olaf; Bell, James F.; Gaddis, Lisa R.; Hawke, B. Ray; Giguere, Thomas A.

    2012-01-01

    We used a Lunar Reconnaissance Orbiter Camera (LROC) global monochrome Wide-angle Camera (WAC) mosaic to conduct a survey of the Moon to search for previously unidentified pyroclastic deposits. Promising locations were examined in detail using LROC multispectral WAC mosaics, high-resolution LROC Narrow Angle Camera (NAC) images, and Clementine multispectral (ultraviolet-visible or UVVIS) data. Out of 47 potential deposits chosen for closer examination, 12 were selected as probable newly identified pyroclastic deposits. Potential pyroclastic deposits were generally found in settings similar to previously identified deposits, including areas within or near mare deposits adjacent to highlands, within floor-fractured craters, and along fissures in mare deposits. However, a significant new finding is the discovery of localized pyroclastic deposits within floor-fractured craters Anderson E and F on the lunar farside, isolated from other known similar deposits. Our search confirms that most major regional and localized low-albedo pyroclastic deposits have been identified on the Moon down to ~100 m/pix resolution, and that additional newly identified deposits are likely to be either isolated small deposits or additional portions of discontinuous, patchy deposits.

  8. Pre-hibernation performances of the OSIRIS cameras onboard the Rosetta spacecraft

    NASA Astrophysics Data System (ADS)

    Magrin, S.; La Forgia, F.; Da Deppo, V.; Lazzarin, M.; Bertini, I.; Ferri, F.; Pajola, M.; Barbieri, M.; Naletto, G.; Barbieri, C.; Tubiana, C.; Küppers, M.; Fornasier, S.; Jorda, L.; Sierks, H.

    2015-02-01

    Context. The ESA cometary mission Rosetta was launched in 2004. In the past years and until the spacecraft hibernation in June 2011, the two cameras of the OSIRIS imaging system (Narrow Angle and Wide Angle Camera, NAC and WAC) observed many different sources. On 20 January 2014 the spacecraft successfully exited hibernation to start observing the primary scientific target of the mission, comet 67P/Churyumov-Gerasimenko. Aims: A study of the cameras' past performance is now needed to determine whether the system has remained stable over time and to derive, if necessary, additional analysis methods for the future precise calibration of the cometary data. Methods: The instrumental responses and filter passbands were used to estimate the efficiency of the system. A comparison with acquired images of specific calibration stars was made, and a refined photometric calibration was computed, both for the absolute flux and for the reflectivity of small bodies of the solar system. Results: We found the instrumental performance to be stable within ±1.5% from 2007 to 2010, with no evidence of an aging effect on the optics or detectors. The efficiency of the instrumentation is found to be as expected in the visible range, but lower than expected in the UV and IR range. A photometric calibration implementation was discussed for the two cameras. Conclusions: The calibration derived from pre-hibernation phases of the mission will be checked as soon as possible after the awakening of OSIRIS and will be continuously monitored until the end of the mission in December 2015. A list of additional calibration sources has been determined that are to be observed during the forthcoming phases of the mission to ensure a better coverage across the wavelength range of the cameras and to study the possible dust contamination of the optics.

  9. Caught on Camera.

    ERIC Educational Resources Information Center

    Milshtein, Amy

    2002-01-01

    Describes the benefits of and rules to be followed when using surveillance cameras for school security. Discusses various camera models, including indoor and outdoor fixed position cameras, pan-tilt zoom cameras, and pinhole-lens cameras for covert surveillance. (EV)

  10. Bacterial motion in narrow capillaries.

    PubMed

    Ping, Liyan; Wasnik, Vaibhav; Emberly, Eldon

    2015-02-01

    Motile bacteria often have to pass through small tortuous pores in soil or tissue of higher organisms. However, their motion in this prevalent type of niche is not fully understood. Here, we modeled it with narrow glass capillaries and identified a critical radius (Rc) for bacterial motion. Near the surface of capillaries narrower than that, the swimming trajectories are helices. In larger capillaries, they swim in distorted circles. Under the no-slip condition, the peritrichous Escherichia coli swam in left-handed helices with an Rc of ~10 μm near the glass surface. However, slipping could occur in the fast monotrichous Pseudomonas fluorescens when a speed threshold was exceeded, and thus both left-handed and right-handed helices were executed in glass capillaries. In the natural non-cylindrical pores, the near-surface trajectories would be spirals and twisted loops. Engaging in such motions reduces the bacterial migration rate. With a given pore size, the run length and the tumbling angle of the bacterium determine the probability and duration of their near-surface motion. Shear flow and chemotaxis potentially enhance it. These observations help interpret previously puzzling reports of bacterial migration in porous environments. PMID:25764548

  11. Angle performance on Optima MDxt

    SciTech Connect

    David, Jonathan; Kamenitsa, Dennis [Axcelis Technologies, Inc., 108 Cherry Hill Dr, Beverly, MA 01915 (United States)

    2012-11-06

    Angle control on medium current implanters is important due to the high angle-sensitivity of typical medium current implants, such as halo implants. On the Optima MDxt, beam-to-wafer angles are controlled in both the horizontal and vertical directions. In the horizontal direction, the beam angle is measured through six narrow slits, and any angle adjustment is made by electrostatically steering the beam, while cross-wafer beam parallelism is adjusted by changing the focus of the electrostatic parallelizing lens (P-lens). In the vertical direction, the beam angle is measured through a high aspect ratio mask, and any angle adjustment is made by slightly tilting the wafer platen prior to implant. A variety of tests were run to measure the accuracy and repeatability of Optima MDxt's angle control. SIMS profiles of a high energy, channeling sensitive condition show both the cross-wafer angle uniformity and the small-angle resolution of the system. Angle repeatability was quantified by running a channeling sensitive implant as a regular monitor over a seven month period and measuring the sheet resistance-to-angle sensitivity. Even though crystal cut error was not controlled for in this case, when attributing all Rs variation to angle changes, the overall angle repeatability was measured as 0.16° (1σ). A separate angle repeatability test involved running a series of V-curve tests over a four month period using low crystal cut wafers selected from the same boule. The results of this test showed the angle repeatability to be <0.1° (1σ).
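
    The repeatability figure comes from attributing the entire sheet-resistance spread of the channeling-sensitive monitor to angle variation through the measured Rs-to-angle sensitivity, which makes it an upper bound. A sketch of that conversion (function name and the numbers below are illustrative, not from the paper):

```python
import statistics

def angle_repeatability_1sigma(rs_values, rs_per_degree):
    """Upper bound on 1-sigma angle repeatability: attribute the whole
    sheet-resistance spread of a channeling-sensitive monitor implant to
    beam-angle changes via the sensitivity dRs/dtheta."""
    return statistics.stdev(rs_values) / abs(rs_per_degree)

# Illustrative monitor readings (ohm/sq) and sensitivity (ohm/sq per degree).
sigma_deg = angle_repeatability_1sigma([100.0, 101.0, 99.0, 100.0], 5.0)
```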

  12. Angle Hunting

    NSDL National Science Digital Library

    Exploratorium

    2010-01-01

    In this activity, learners use a hand-made protractor to measure angles they find in playground equipment. Learners will observe that an angle's measure does not change with the distance from which it is viewed: angle measure is distance invariant. Note: The "Pocket Protractor" activity should be done ahead as a separate activity (see related resource), but a standard protractor can be used as a substitute.

  13. Determining Camera Gain in Room Temperature Cameras

    SciTech Connect

    Joshua Cogliati

    2010-12-01

    James R. Janesick provides a method for determining the amplification of a CCD or CMOS camera when only access to the raw images is provided. However, the equation that is provided ignores the contribution of dark current. For CCD or CMOS cameras that are cooled well below room temperature, this is not a problem, however, the technique needs adjustment for use with room temperature cameras. This article describes the adjustment made to the equation, and a test of this method.
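
    The underlying mean/variance (photon transfer) relation, with a dark-frame adjustment of the kind the article describes, can be sketched as follows. This is a simplified illustration rather than the article's exact correction: differencing paired frames cancels fixed-pattern noise, and the dark frames remove the dark-current contribution to both the mean and the variance.

```python
import numpy as np

def camera_gain(flat1, flat2, dark1, dark2):
    """Estimate gain K (e-/DN) from two flat frames and two dark frames.

    Photon transfer: for shot-noise-limited signal, K = mean(DN) / var(DN).
    Differencing identical frames cancels fixed-pattern noise (the factor
    1/2 restores the single-frame variance); dark statistics are subtracted
    so dark current does not bias the estimate.
    """
    signal = (flat1.mean() + flat2.mean()) / 2 - (dark1.mean() + dark2.mean()) / 2
    var_flat = np.var(flat1.astype(float) - flat2.astype(float)) / 2
    var_dark = np.var(dark1.astype(float) - dark2.astype(float)) / 2
    return signal / (var_flat - var_dark)
```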

  14. Experience with duplex bearings in narrow angle oscillating applications

    NASA Technical Reports Server (NTRS)

    Phinney, D. D.; Pollard, C. L.; Hinricks, J. T.

    1988-01-01

    Duplex ball bearings are matched pairs on which the abutting faces of the rings have been accurately ground so that when the rings are clamped together, a controlled amount of interference (preload) exists across the balls. These bearings are vulnerable to radial temperature gradients, blocking in oscillation and increased sensitivity to contamination. These conditions decrease the service life of these bearings. It was decided that an accelerated thermal vacuum life test should be conducted. The test apparatus and results are described and the rationale is presented for reducing a multiyear life test on oil lubricated bearings to less than a year.

  15. 9. COMPLETED ROLLING CAMERA CAR ON RAILROAD TRACK AND BRIDGE ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    9. COMPLETED ROLLING CAMERA CAR ON RAILROAD TRACK AND BRIDGE LOOKING WEST, APRIL 26, 1948. (ORIGINAL PHOTOGRAPH IN POSSESSION OF DAVE WILLIS, SAN DIEGO, CALIFORNIA.) - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  16. Contrail study with ground-based cameras

    NASA Astrophysics Data System (ADS)

    Schumann, U.; Hempel, R.; Flentje, H.; Garhammer, M.; Graf, K.; Kox, S.; Lösslein, H.; Mayer, B.

    2013-12-01

    Photogrammetric methods and analysis results for contrails observed with wide-angle cameras are described. Four cameras of two different types (view angle < 90° or whole-sky imager) at the ground at various positions are used to track contrails and to derive their altitude, width, and horizontal speed. Camera models for both types are described to derive the observation angles for given image coordinates and their inverse. The models are calibrated with sightings of the Sun, the Moon and a few bright stars. The methods are applied and tested in a case study. Four persistent contrails crossing each other, together with a short-lived one, are observed with the cameras. Vertical and horizontal positions of the contrails are determined from the camera images to an accuracy of better than 230 m and horizontal speed to 0.2 m s⁻¹. With this information, the aircraft causing the contrails are identified by comparison to traffic waypoint data. The observations are compared with synthetic camera pictures of contrails simulated with the contrail prediction model CoCiP, a Lagrangian model using air traffic movement data and numerical weather prediction (NWP) data as input. The results provide tests for the NWP and contrail models. The cameras show spreading and thickening contrails, suggesting ice-supersaturation in the ambient air. The ice-supersaturated layer is found thicker and more humid in this case than predicted by the NWP model used. The simulated and observed contrail positions agree up to differences caused by uncertain wind data. The contrail widths, which depend on wake vortex spreading, ambient shear and turbulence, were partly wider than simulated.
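
    Once each camera model turns image coordinates into viewing directions, the contrail altitude follows from triangulating two nearly intersecting rays. A geometric sketch in a local east/north/up frame (function names are illustrative; the paper's camera models and calibration are not reproduced here):

```python
import numpy as np

def ray(az_deg, el_deg):
    """Unit line-of-sight vector (east, north, up) from azimuth/elevation."""
    az, el = np.radians(az_deg), np.radians(el_deg)
    return np.array([np.sin(az) * np.cos(el),
                     np.cos(az) * np.cos(el),
                     np.sin(el)])

def triangulate(p1, d1, p2, d2):
    """Midpoint of the shortest segment between two rays p_i + t_i * d_i.

    Solves the 2x2 normal equations of min |(p1 + t1 d1) - (p2 + t2 d2)|^2.
    """
    b = p2 - p1
    A = np.array([[d1 @ d1, -(d1 @ d2)],
                  [d1 @ d2, -(d2 @ d2)]])
    t1, t2 = np.linalg.solve(A, np.array([d1 @ b, d2 @ b]))
    return (p1 + t1 * d1 + p2 + t2 * d2) / 2.0
```

    The third component of the returned point is the altitude above the cameras' reference plane; the accuracy then depends on the pointing calibration and the camera baseline.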

  17. Contrail study with ground-based cameras

    NASA Astrophysics Data System (ADS)

    Schumann, U.; Hempel, R.; Flentje, H.; Garhammer, M.; Graf, K.; Kox, S.; Lösslein, H.; Mayer, B.

    2013-08-01

    Photogrammetric methods and analysis results for contrails observed with wide-angle cameras are described. Four cameras of two different types (view angle < 90° or whole-sky imager) at the ground at various positions are used to track contrails and to derive their altitude, width, and horizontal speed. Camera models for both types are described to derive the observation angles for given image coordinates and their inverse. The models are calibrated with sightings of the Sun, the Moon and a few bright stars. The methods are applied and tested in a case study. Four persistent contrails crossing each other together with a short-lived one are observed with the cameras. Vertical and horizontal positions of the contrails are determined from the camera images to an accuracy of better than 200 m and horizontal speed to 0.2 m s⁻¹. With this information, the aircraft causing the contrails are identified by comparison to traffic waypoint data. The observations are compared with synthetic camera pictures of contrails simulated with the contrail prediction model CoCiP, a Lagrangian model using air traffic movement data and numerical weather prediction (NWP) data as input. The results provide tests for the NWP and contrail models. The cameras show spreading and thickening contrails suggesting ice-supersaturation in the ambient air. The ice-supersaturated layer is found thicker and more humid in this case than predicted by the NWP model used. The simulated and observed contrail positions agree up to differences caused by uncertain wind data. The contrail widths, which depend on wake vortex spreading, ambient shear and turbulence, were partly wider than simulated.

  18. Constrained space camera assembly

    DOEpatents

    Heckendorn, Frank M. (Aiken, SC); Anderson, Erin K. (Augusta, GA); Robinson, Casandra W. (Trenton, SC); Haynes, Harriet B. (Aiken, SC)

    1999-01-01

    A constrained space camera assembly which is intended to be lowered through a hole into a tank, a borehole or another cavity. The assembly includes a generally cylindrical chamber comprising a head and a body and a wiring-carrying conduit extending from the chamber. Means are included in the chamber for rotating the body about the head without breaking an airtight seal formed therebetween. The assembly may be pressurized and accompanied with a pressure sensing means for sensing if a breach has occurred in the assembly. In one embodiment, two cameras, separated from their respective lenses, are installed on a mounting apparatus disposed in the chamber. The mounting apparatus includes means allowing both longitudinal and lateral movement of the cameras. Moving the cameras longitudinally focuses the cameras, and moving the cameras laterally away from one another effectively converges the cameras so that close objects can be viewed. The assembly further includes means for moving lenses of different magnification forward of the cameras.
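
    Moving the cameras laterally toward one another effectively toes the two optical axes in; for a baseline b and an object at distance d, the total vergence needed for the axes to cross at the object is 2·atan(b/2d). A small sketch of that geometry (names and numbers are illustrative, not from the patent):

```python
import math

def vergence_angle_deg(baseline_m: float, distance_m: float) -> float:
    """Total convergence angle (degrees) so that two optical axes,
    separated by `baseline_m`, intersect `distance_m` in front of the rig."""
    return math.degrees(2.0 * math.atan(baseline_m / (2.0 * distance_m)))

# Cameras 10 cm apart viewing an object 1 m away need ~5.7 degrees of vergence.
angle = vergence_angle_deg(0.1, 1.0)
```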

  19. Vacuum Camera Cooler

    NASA Technical Reports Server (NTRS)

    Laugen, Geoffrey A.

    2011-01-01

    Acquiring cheap, moving video was impossible in a vacuum environment, due to camera overheating. This overheating is brought on by the lack of cooling media in vacuum. A water-jacketed camera cooler enclosure machined and assembled from copper plate and tube has been developed. The camera cooler (see figure) is cup-shaped and cooled by circulating water or nitrogen gas through copper tubing. The camera, a store-bought "spy type," is not designed to work in a vacuum. With some modifications the unit can be thermally connected when mounted in the cup portion of the camera cooler. The thermal conductivity is provided by copper tape between parts of the camera and the cooled enclosure. During initial testing of the demonstration unit, the camera cooler kept the CPU (central processing unit) of this video camera at operating temperature. This development allowed video recording of an in-progress test, within a vacuum environment.

  20. The MMT all-sky camera

    NASA Astrophysics Data System (ADS)

    Pickering, T. E.

    2006-06-01

    The MMT all-sky camera is a low-cost, wide-angle camera system that takes images of the sky every 10 seconds, day and night. It is based on an Adirondack Video Astronomy StellaCam II video camera and utilizes an auto-iris fish-eye lens to allow safe operation under all lighting conditions, even direct sunlight. This combined with the anti-blooming characteristics of the StellaCam's detector allows useful images to be obtained during sunny days as well as brightly moonlit nights. Under dark skies the system can detect stars as faint as 6th magnitude as well as very thin cirrus and low surface brightness zodiacal features such as gegenschein. The total hardware cost of the system was less than $3500 including computer and framegrabber card, a fraction of the cost of comparable systems utilizing traditional CCD cameras.

  1. The camera in space

    Microsoft Academic Search

    H. J. P. Arnold

    1974-01-01

    Photography in the U.S. space program for engineering information, lunar mapping, science and applications (e.g., studies of astronomical phenomena, terrain and weather, and the lunar surface), and general purposes. After Gemini 9, each manned mission (Apollo, Skylab) had a documented, detailed photographic plan. Photographic equipment (including still cameras, variable-sequence cameras, lunar surface close-up stereo camera, and lunar mapping cameras) and

  2. Calibrating Distributed Camera Networks

    Microsoft Academic Search

    Dhanya Devarajan; Zhaolin Cheng; Richard J. Radke

    2008-01-01

    Recent developments in wireless sensor networks have made feasible distributed camera networks, in which cameras and processing nodes may be spread over a wide geographical area, with no centralized processor and limited ability to communicate a large amount of information over long distances. This paper overviews distributed algorithms for the calibration of such camera networks- that is, the automatic estimation

  3. Evaluating intensified camera systems

    SciTech Connect

    S. A. Baker

    2000-07-01

    This paper describes image evaluation techniques used to standardize camera system characterizations. Key areas of performance include resolution, noise, and sensitivity. This team has developed a set of analysis tools, in the form of image processing software used to evaluate camera calibration data, to aid an experimenter in measuring a set of camera performance metrics. These performance metrics identify capabilities and limitations of the camera system, while establishing a means for comparing camera systems. Analysis software is used to evaluate digital camera images recorded with charge-coupled device (CCD) cameras. Several types of intensified camera systems are used in the high-speed imaging field. Electro-optical components are used to provide precise shuttering or optical gain for a camera system. These components including microchannel plate or proximity focused diode image intensifiers, electro-static image tubes, or electron-bombarded CCDs affect system performance. It is important to quantify camera system performance in order to qualify a system as meeting experimental requirements. The camera evaluation tool is designed to provide side-by-side camera comparison and system modeling information.

  4. Ultra-fast framing camera tube

    DOEpatents

    Kalibjian, Ralph (1051 Batavia Ave., Livermore, CA 94550)

    1981-01-01

    An electronic framing camera tube features focal plane image dissection and synchronized restoration of the dissected electron line images to form two-dimensional framed images. Ultra-fast framing is performed by first streaking a two-dimensional electron image across a narrow slit, thereby dissecting the two-dimensional electron image into sequential electron line images. The dissected electron line images are then restored into a framed image by a restorer deflector operated synchronously with the dissector deflector. The number of framed images on the tube's viewing screen is equal to the number of dissecting slits in the tube. The distinguishing features of this ultra-fast framing camera tube are the focal plane dissecting slits, and the synchronously-operated restorer deflector which restores the dissected electron line images into a two-dimensional framed image. The framing camera tube can produce image frames having high spatial resolution of optical events in the sub-100 picosecond range.

  5. Tower Camera Handbook

    SciTech Connect

    Moudry, D

    2005-01-01

    The tower camera in Barrow provides hourly images of ground surrounding the tower. These images may be used to determine fractional snow cover as winter arrives, for comparison with the albedo that can be calculated from downward-looking radiometers, as well as some indication of present weather. Similarly, during spring time, the camera images show the changes in the ground albedo as the snow melts. The tower images are saved in hourly intervals. In addition, two other cameras, the skydeck camera in Barrow and the piling camera in Atqasuk, show the current conditions at those sites.
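
    A first-order way to turn such images into a fractional snow cover number is to threshold pixel brightness over the ground region. A toy sketch (the threshold value and 0-1 scaling are assumptions for illustration, not the ARM processing chain):

```python
import numpy as np

def snow_fraction(gray, threshold=0.6):
    """Fractional snow cover: share of ground pixels brighter than
    `threshold`, for a grayscale image scaled to the 0..1 range."""
    img = np.asarray(gray, dtype=float)
    return float((img > threshold).mean())

# Toy frame: two snow pixels (0.9) and two bare-ground pixels (0.2).
demo = np.array([[0.2, 0.9],
                 [0.9, 0.2]])
```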

  6. Automated Camera Calibration

    NASA Technical Reports Server (NTRS)

    Chen, Siqi; Cheng, Yang; Willson, Reg

    2006-01-01

    Automated Camera Calibration (ACAL) is a computer program that automates the generation of calibration data for camera models used in machine vision systems. Machine vision camera models describe the mapping between points in three-dimensional (3D) space in front of the camera and the corresponding points in two-dimensional (2D) space in the camera s image. Calibrating a camera model requires a set of calibration data containing known 3D-to-2D point correspondences for the given camera system. Generating calibration data typically involves taking images of a calibration target where the 3D locations of the target s fiducial marks are known, and then measuring the 2D locations of the fiducial marks in the images. ACAL automates the analysis of calibration target images and greatly speeds the overall calibration process.
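
    In its simplest linear form, fitting a camera model to known 3D-to-2D correspondences is the Direct Linear Transform. The sketch below recovers a 3×4 projection matrix from fiducial correspondences; it illustrates the calibration problem ACAL automates, not ACAL's own algorithm:

```python
import numpy as np

def dlt_projection_matrix(pts3d, pts2d):
    """Direct Linear Transform: fit the 3x4 projection matrix P (up to
    scale) such that [u, v, 1] ~ P [X, Y, Z, 1], from >= 6 known
    3D-to-2D point correspondences in general position."""
    rows = []
    for (X, Y, Z), (u, v) in zip(pts3d, pts2d):
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    # Null vector of the stacked system = smallest right singular vector.
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    return vt[-1].reshape(3, 4)

def project(P, X):
    """Apply a projection matrix to a 3D point, returning pixel (u, v)."""
    x = P @ np.append(np.asarray(X, dtype=float), 1.0)
    return x[:2] / x[2]
```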

  7. High Resolution Measurements of Beach Face Morphology Using Stereo Video Cameras

    Microsoft Academic Search

    L. Clarke; R. Holman

    2006-01-01

    High resolution measurements of beach elevation are computed using images from a pair of video cameras viewing the same scene from different angles. Given the camera positions and camera calibration data, the beach face can be accurately reconstructed from 3-D coordinates computed at positions corresponding to every image pixel. Measurements of subaerial beach morphology at Duck Beach, North Carolina and

  8. Ultraviolet spectroscopy of narrow coronal mass ejections

    E-print Network

    D. Dobrzycka; J. C. Raymond; D. A. Biesecker; J. Li; A. Ciaravella

    2003-01-31

    We present Ultraviolet Coronagraph Spectrometer (UVCS) observations of 5 narrow coronal mass ejections (CMEs) that were among 15 narrow CMEs originally selected by Gilbert et al. (2001). Two events (1999 March 27, April 15) were "structured", i.e. in white light data they exhibited well defined interior features, and three (1999 May 9, May 21, June 3) were "unstructured", i.e. appeared featureless. In UVCS data the events were seen as 4-13 deg wide enhancements of the strongest coronal lines H I Ly-alpha and O VI (1032, 1037 Å). We derived electron densities for several of the events from the Large Angle Spectrometric Coronagraph (LASCO) C2 white light observations. They are comparable to or smaller than densities inferred for other CMEs. We modeled the observable properties of examples of the structured (1999 April 15) and unstructured (1999 May 9) narrow CMEs at different heights in the corona between 1.5 and 2 R⊙. The derived electron temperatures, densities and outflow speeds are similar for those two types of ejections. They were compared with properties of polar coronal jets and other CMEs. We discuss different scenarios of narrow CME formation either as a jet formed by reconnection onto open field lines or CME ejected by expansion of closed field structures. Overall, we conclude that the existing observations do not definitively place the narrow CMEs into the jet or the CME picture, but the acceleration of the 1999 April 15 event resembles acceleration seen in many CMEs, rather than constant speeds or deceleration observed in jets.

  9. UV Cameras for Volcanic Monitoring

    NASA Astrophysics Data System (ADS)

    Tamburello, G.; Swanson, E.

    2011-12-01

    Levels of SO2 emission provide valuable information on the activity status of volcanic systems and are routinely used in hazard and risk assessment. A recent development in this field is UV camera technology, an effective and easy to use method for remote monitoring of volcanic emissions, which provides information across the full field of view and real time analysis of equipment set-up and performance. This study, carried out on Stromboli, Italy, in July 2010 sought to explore the range of data available from this technique and improve issues relating to instrument calibration, building on the findings of Kantzas et al (2010) and Kern et al (2010). A 1 Hz passive and explosive degassing data set was obtained using a dual camera set-up, filters focused on 310 nm and 330 nm wavelengths, in conjunction with a fixed point USB2000 spectrometer. The cameras were initially calibrated using cells containing known values of SO2. During recording periods the adoption of a new rapid calibration protocol provided enhanced data quality whilst minimising monitoring down time. Data were analysed using an in-house LabVIEW VI routine (Tamburello et al 2011). The ability to take multi-directional plume cross sections improved the accuracy of obliquely angled plume data, whilst enabling within-program measurement of plume speed. Explosive masses were also measured, with values obtained for both short duration and prolonged release events. In addition to emitted SO2, the visual aspect of data sets enabled measurement and monitoring of ascent velocities, direction of ejection, plume collimation and changes between explosive types. Furthermore, flexibility within post-processing set-up permitted concurrent analysis of passive and active degassing behaviours. Time shifting of plume traces to the start times of explosive events allowed the interplay between these two behaviours to be studied directly. This work demonstrates that UV cameras are versatile and a valuable contributor to the systematic study of volcanic degassing processes.

  10. Exploiting Mutual Camera Visibility in Multi-camera Motion Estimation

    Microsoft Academic Search

    Christian Kurz; Thorsten Thormählen; Bodo Rosenhahn; Hans-peter Seidel

    2009-01-01

    This paper addresses the estimation of camera motion and 3D reconstruction from image sequences for multiple independently moving cameras. If multiple moving cameras record the same scene, a camera is often visible in another camera's field of view. This poses a constraint on the position of the observed camera, which can be included into the conjoined optimization process. The paper

  11. Follow up of focal narrowing of retinal arterioles in glaucoma

    PubMed Central

    Papastathopoulos, K.; Jonas, J.

    1999-01-01

    AIM—To evaluate whether focal narrowing of retinal arterioles increases with progressive glaucomatous optic neuropathy. METHODS—Focal narrowing of retinal arterioles and area of neuroretinal rim were morphometrically evaluated on colour stereo optic disc photographs of 59 patients with primary open angle glaucoma, 22 patients with normal pressure glaucoma, 11 patients with secondary open angle glaucoma, and 31 patients with ocular hypertension. Minimum follow up was 8 months. Focal arteriolar narrowing was quantified by calculating the ratio of the vessel width in the broadest to the narrowest vessel part. RESULTS—In the subgroup of patients with progressive glaucomatous optic nerve damage (n=37), focal narrowing of retinal arterioles increased significantly (p<0.005) with decreasing neuroretinal rim area. In the subgroup of patients with stable appearance of the optic disc (n=86), focal narrowing of retinal arterioles did not change significantly (p=0.79). The positive correlation between increasing focal thinning of retinal arterioles and progression of glaucomatous optic neuropathy was present, although not statistically significant, in all the glaucoma subtypes examined. The location of focal thinning of retinal arterioles did not change in the follow up. CONCLUSIONS—Focal narrowing of retinal arterioles increases significantly with progressive glaucomatous optic neuropathy, independent of the type of glaucoma. It is stable in patients with non-progressive glaucoma. The findings agree with previous reports on a higher degree of focal arteriole narrowing in eyes with pronounced optic nerve damage in comparison with those with moderate optic nerve atrophy or normal eyes. In the clinical management of patients with glaucoma, in some eyes, increasing focal arteriole narrowing may suggest progression of disease. Keywords: focal narrowing; retinal arterioles; glaucoma PMID:10365034

  12. Microchannel plate streak camera

    DOEpatents

    Wang, C.L.

    1984-09-28

    An improved streak camera in which a microchannel plate electron multiplier is used in place of or in combination with the photocathode used in prior streak cameras. The improved streak camera is far more sensitive to photons (uv to gamma-rays) than the conventional x-ray streak camera which uses a photocathode. The improved streak camera offers gamma-ray detection with high temporal resolution. It also offers low-energy x-ray detection without attenuation inside the cathode. Using the microchannel plate in the improved camera has resulted in a time resolution of about 150 ps, and has provided a sensitivity sufficient for 1000 keV x-rays.

  13. Microchannel plate streak camera

    DOEpatents

    Wang, C.L.

    1989-03-21

    An improved streak camera in which a microchannel plate electron multiplier is used in place of or in combination with the photocathode used in prior streak cameras is disclosed. The improved streak camera is far more sensitive to photons (UV to gamma-rays) than the conventional x-ray streak camera which uses a photocathode. The improved streak camera offers gamma-ray detection with high temporal resolution. It also offers low-energy x-ray detection without attenuation inside the cathode. Using the microchannel plate in the improved camera has resulted in a time resolution of about 150 ps, and has provided a sensitivity sufficient for 1,000 keV x-rays. 3 figs.

  14. Microchannel plate streak camera

    DOEpatents

    Wang, Ching L. (Livermore, CA)

    1989-01-01

    An improved streak camera in which a microchannel plate electron multiplier is used in place of or in combination with the photocathode used in prior streak cameras. The improved streak camera is far more sensitive to photons (UV to gamma-rays) than the conventional x-ray streak camera which uses a photocathode. The improved streak camera offers gamma-ray detection with high temporal resolution. It also offers low-energy x-ray detection without attenuation inside the cathode. Using the microchannel plate in the improved camera has resulted in a time resolution of about 150 ps, and has provided a sensitivity sufficient for 1000 keV x-rays.

  15. Electron bombardment CCD camera

    Microsoft Academic Search

    Tadashi Maruno; Masahiko Shirai; Fumio Iwase; Naotaka Hakamata

    1998-01-01

    Two kinds of electron bombardment CCD (EB-CCD) camera are newly developed, employing an EB-CCD sensor made by Hamamatsu Photonics. The slow scan cooled CCD camera installs the full frame transfer type EB-CCD sensor with 512 × 512 pixel format and the standard video rate camera installs the frame transfer type EB-CCD sensor with 658 × 490 pixel format. For slow scan

  16. The Dawn Framing Camera

    Microsoft Academic Search

    H. Sierks; H. U. Keller; R. Jaumann; H. Michalik; T. Behnke; F. Bubenhagen; I. Büttner; U. Carsenty; U. Christensen; R. Enge; B. Fiethe; P. Gutiérrez Marqués; H. Hartwig; H. Krüger; W. Kühne; T. Maue; S. Mottola; A. Nathues; K.-U. Reiche; M. L. Richards; T. Roatsch; S. E. Schröder; I. Szemerey; M. Tschentscher

    2011-01-01

    The Framing Camera (FC) is the German contribution to the Dawn mission. The camera will map 4 Vesta and 1 Ceres through a clear filter and 7 band-pass filters covering the wavelengths from the visible to the near-IR. The camera will allow the determination of the physical parameters of the asteroids, the reconstruction of their global shape as well as

  17. Narrow band 3 × 3 Mueller polarimetric endoscopy.

    PubMed

    Qi, Ji; Ye, Menglong; Singh, Mohan; Clancy, Neil T; Elson, Daniel S

    2013-01-01

    Mueller matrix polarimetric imaging has shown potential in tissue diagnosis but is challenging to implement endoscopically. In this work, a narrow band 3 × 3 Mueller matrix polarimetric endoscope was designed by rotating the endoscope to generate 0°, 45° and 90° linearly polarized illumination and positioning a rotating filter wheel in front of the camera containing three polarisers to permit polarization state analysis for backscattered light. The system was validated with a rotating linear polarizer and a diffuse reflection target. Initial measurements of 3 × 3 Mueller matrices on a rat are demonstrated, followed by matrix decomposition into the depolarization and retardance matrices for further analysis. Our work shows the feasibility of implementing polarimetric imaging in a rigid endoscope conveniently and economically in order to reveal diagnostic information. PMID:24298405

  18. Narrow band 3 × 3 Mueller polarimetric endoscopy

    PubMed Central

    Qi, Ji; Ye, Menglong; Singh, Mohan; Clancy, Neil T.; Elson, Daniel S.

    2013-01-01

    Mueller matrix polarimetric imaging has shown potential in tissue diagnosis but is challenging to implement endoscopically. In this work, a narrow band 3 × 3 Mueller matrix polarimetric endoscope was designed by rotating the endoscope to generate 0°, 45° and 90° linearly polarized illumination and positioning a rotating filter wheel in front of the camera containing three polarisers to permit polarization state analysis for backscattered light. The system was validated with a rotating linear polarizer and a diffuse reflection target. Initial measurements of 3 × 3 Mueller matrices on a rat are demonstrated, followed by matrix decomposition into the depolarization and retardance matrices for further analysis. Our work shows the feasibility of implementing polarimetric imaging in a rigid endoscope conveniently and economically in order to reveal diagnostic information. PMID:24298405
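
    The recovery step described in the abstract can be sketched in a few lines. This is not the authors' exact pipeline, but the standard linear-algebra view of the same idea: with three linearly polarized illumination states (0°, 45°, 90°) and three analyzer angles, the nine intensity measurements determine the 3 × 3 linear Mueller matrix, assuming ideal polarizers and the Malus-law measurement model I = ½ aᵀ M s.

```python
def mat_inv3(m):
    """Invert a 3x3 matrix via the adjugate (cofactor) formula."""
    a, b, c = m[0]; d, e, f = m[1]; g, h, i = m[2]
    det = a*(e*i - f*h) - b*(d*i - f*g) + c*(d*h - e*g)
    adj = [[e*i - f*h, c*h - b*i, b*f - c*e],
           [f*g - d*i, a*i - c*g, c*d - a*f],
           [d*h - e*g, b*g - a*h, a*e - b*d]]
    return [[x / det for x in row] for row in adj]

def mat_mul(p, q):
    """Multiply two 3x3 matrices."""
    return [[sum(p[i][k] * q[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

# Analyzer rows a(theta) = (1, cos 2*theta, sin 2*theta) for 0, 45, 90 degrees.
A = [[1, 1, 0], [1, 0, 1], [1, -1, 0]]
# Input (linear) Stokes vectors for 0, 45, 90 degree illumination, as columns.
S = [[1, 1, 1], [1, 0, -1], [0, 1, 0]]

def recover_mueller(I):
    """Recover the 3x3 linear Mueller matrix M from the measured intensity
    table I[j][k] = 0.5 * (A M S)[j][k], i.e. M = 2 * A^-1 I S^-1."""
    M_half = mat_mul(mat_inv3(A), mat_mul(I, mat_inv3(S)))
    return [[2 * x for x in row] for row in M_half]
```

    For a non-depolarizing, non-retarding sample (M = identity) the predicted intensity table is ½ A S, and `recover_mueller` returns the identity back, which is a convenient self-check before applying the decomposition steps mentioned in the abstract.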

  19. LSST Camera Optics Design

    SciTech Connect

    Riot, V J; Olivier, S; Bauman, B; Pratuch, S; Seppala, L; Gilmore, D; Ku, J; Nordby, M; Foss, M; Antilogus, P; Morgado, N

    2012-05-24

    The Large Synoptic Survey Telescope (LSST) uses a novel three-mirror telescope design feeding a camera system that includes a set of broad-band filters and three refractive corrector lenses to produce a flat field at the focal plane with a wide field of view. The optical design of the camera lenses and filters is integrated with the optical design of the telescope mirrors to optimize performance. We discuss the rationale for the LSST camera optics design, describe the methodology for fabricating, coating, mounting and testing the lenses and filters, and present the results of detailed analyses demonstrating that the camera optics will meet their performance goals.

  20. Ringfield lithographic camera

    DOEpatents

    Sweatt, William C. (Albuquerque, NM)

    1998-01-01

    A projection lithography camera is presented with a wide ringfield optimized so as to make efficient use of extreme ultraviolet radiation from a large area radiation source (e.g., D_source ≈ 0.5 mm). The camera comprises four aspheric mirrors optically arranged on a common axis of symmetry with an increased etendue for the camera system. The camera includes an aperture stop that is accessible through a plurality of partial aperture stops to synthesize the theoretical aperture stop. Radiation from a mask is focused to form a reduced image on a wafer, relative to the mask, by reflection from the four aspheric mirrors.

  1. Diffusion-induced Ramsey narrowing

    E-print Network

    Yanhong Xiao; Irina Novikova; David F. Phillips; Ronald L. Walsworth

    2005-07-19

    A novel form of Ramsey narrowing is identified and characterized. For long-lived coherent atomic states coupled by laser fields, the diffusion of atoms in-and-out of the laser beam induces a spectral narrowing of the atomic resonance lineshape. Illustrative experiments and an intuitive analytical model are presented for this diffusion-induced Ramsey narrowing, which occurs commonly in optically-interrogated systems.

  2. Measurement of Dicke Narrowing in Electromagnetically Induced Transparency

    E-print Network

    M. Shuker; O. Firstenberg; R. Pugatch; A. Ben-Kish; A. Ron; N. Davidson

    2007-03-13

    Dicke narrowing is a phenomenon that dramatically reduces the Doppler width of spectral lines, due to frequent velocity-changing collisions. A similar phenomenon occurs for electromagnetically induced transparency (EIT) resonances, and facilitates ultra-narrow spectral features in room-temperature vapor. We directly measure the Dicke-like narrowing by studying EIT line-shapes as a function of the angle between the pump and the probe beams. The measurements are in good agreement with an analytic theory with no fit parameters. The results show that Dicke narrowing can substantially increase the tolerance of hot-vapor EIT to angular deviations. We demonstrate the importance of this effect for applications such as imaging and spatial solitons using a single-shot imaging experiment, and discuss the implications for the feasibility of storing images in atomic vapor.

  3. LSST Camera Optics Design

    Microsoft Academic Search

    V J Riot; S Olivier; B Bauman; S Pratuch; L Seppala; D Gilmore; J Ku; M Nordby; M Foss; P Antilogus; N Morgado

    2012-01-01

    The Large Synoptic Survey Telescope (LSST) uses a novel three-mirror telescope design feeding a camera system that includes a set of broad-band filters and three refractive corrector lenses to produce a flat field at the focal plane with a wide field of view. The optical design of the camera lenses and filters is integrated with the optical design of the telescope

  4. Multi camera image tracking

    Microsoft Academic Search

    James Black; Tim Ellis

    2006-01-01

    This paper presents a method for multi-camera image tracking in the context of image surveillance. The approach differs from most methods in that we exploit multiple camera views to resolve object occlusion. Moving objects are detected by using background subtraction. Viewpoint correspondence between the detected objects is then established by using the ground plane homography constraint. The Kalman Filter is

  5. Replacing 16 mm film cameras with high definition digital cameras

    SciTech Connect

    Balch, K.S. [Eastman Kodak Co., San Diego, CA (United States). Motion Analysis Systems Div.

    1995-12-31

    For many years 16 mm film cameras have been used in severe environments. These film cameras are used on Hy-G automotive sleds, airborne gun cameras, range tracking and other hazardous environments. The companies and government agencies using these cameras are in need of replacing them with a more cost-effective solution. Film-based cameras still produce the best resolving capability; however, film development time, chemical disposal, recurring media cost, and faster digital analysis are factors influencing the desire for a 16 mm film camera replacement. This paper will describe a new camera from Kodak that has been designed to replace 16 mm high-speed film cameras.

  6. Multi-angle Imaging SpectroRadiometer (MISR) on-board calibrator (OBC) in-flight performance studies

    Microsoft Academic Search

    Nadine L. Chrien; Carol J. Bruegge; Robert R. Ando

    2002-01-01

    The Multi-angle Imaging SpectroRadiometer (MISR) consists of nine cameras pointing from nadir to an extreme of 70.5° in the view angle. It is a pushbroom imager with four spectral bands per camera. Instrument specifications call for each camera to be calibrated to an absolute uncertainty of 3% and to within 1% relative to the other cameras. To accomplish this, the

  7. Optimizing Compton camera geometries.

    PubMed

    Chelikani, Sudhakar; Gore, John; Zubal, George

    2004-04-21

    Compton cameras promise to improve the characteristics of nuclear medicine imaging, wherein mechanical collimation is replaced with electronic collimation. This leads to huge gains in sensitivity and, consequently, a reduction in the radiation dosage that needs to be administered to the patient. Design modifications that improve the sensitivity invariably compromise resolution. The scope of the current project was to determine an optimal design and configuration of a Compton camera that strikes a balance between these two properties. Transport of the photon flux from the source to the detectors was simulated with the camera geometry serving as the parameter to be optimized. Two variations of the Boltzmann photon transport equation, with and without photon polarization, were employed to model the flux. Doppler broadening of the energy spectra was also included. The simulation was done in a Monte Carlo framework using GEANT4. Two clinically relevant energies, 140 keV and 511 keV, corresponding to 99mTc and 18F were simulated. The gain in the sensitivity for the Compton camera over the conventional camera was 100 fold. Neither Doppler broadening nor polarization had any significant effect on the sensitivity of the camera. However, the spatial resolution of the camera was affected by these processes. Doppler broadening had a deleterious effect on the spatial resolution, but polarization improved the resolution when accounted for in the reconstruction algorithm. PMID:15152681
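
    The electronic collimation mentioned in the abstract rests on kinematics that are easy to sketch. The following is not tied to the simulated geometries above, only to the textbook Compton relation: the two energy deposits (scatterer plus absorber) fix the opening angle of the cone on which the source must lie, assuming the photon is fully absorbed.

```python
import math

M_E_C2_KEV = 511.0  # electron rest energy in keV

def compton_cone_angle_deg(e_scatter_kev, e_absorb_kev):
    """Opening angle of the Compton cone reconstructed from the energy
    deposited in the scatter detector (e_scatter) and in the absorber
    (e_absorb): cos(theta) = 1 - m_e c^2 * (1/E' - 1/E0),
    where E' = e_absorb and E0 = e_scatter + e_absorb."""
    e0 = e_scatter_kev + e_absorb_kev  # incident photon energy
    cos_theta = 1.0 - M_E_C2_KEV * (1.0 / e_absorb_kev - 1.0 / e0)
    return math.degrees(math.acos(cos_theta))
```

    Doppler broadening, discussed in the abstract, blurs exactly this angle: the bound electron's momentum perturbs the deposited energies, so the reconstructed cone is slightly wrong even for a perfect detector.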

  8. Calibration of action cameras for photogrammetric purposes.

    PubMed

    Balletti, Caterina; Guerra, Francesco; Tsioukas, Vassilios; Vernier, Paolo

    2014-01-01

    The use of action cameras for photogrammetry purposes is not widespread due to the fact that until recently the images provided by the sensors, using either still or video capture mode, were not big enough to perform and provide the appropriate analysis with the necessary photogrammetric accuracy. However, several manufacturers have recently produced and released new lightweight devices which are: (a) easy to handle, (b) capable of performing under extreme conditions and more importantly (c) able to provide both still images and video sequences of high resolution. In order to be able to use the sensor of action cameras we must apply a careful and reliable self-calibration prior to the use of any photogrammetric procedure, a relatively difficult scenario because of the short focal length of the camera and its wide-angle lens that is used to obtain the maximum possible resolution of images. Special software, using functions of the OpenCV library, has been created to perform both the calibration and the production of undistorted scenes for each of the still and video image capturing modes of a novel action camera, the GoPro Hero 3, which can provide still images up to 12 Mp and video up to 8 Mp resolution. PMID:25237898
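
    What a calibration like the one above estimates, beyond the pinhole intrinsics, are the coefficients of a lens distortion model. As a sketch (not the paper's code), the Brown-Conrady model used by OpenCV's `calibrateCamera` maps ideal normalized image coordinates to their distorted positions; all coefficient values below are hypothetical.

```python
def distort(x, y, k1=0.0, k2=0.0, p1=0.0, p2=0.0):
    """Apply Brown-Conrady distortion to normalized image coordinates (x, y):
    radial coefficients k1, k2 and tangential (decentering) p1, p2."""
    r2 = x * x + y * y
    radial = 1.0 + k1 * r2 + k2 * r2 * r2
    x_d = x * radial + 2.0 * p1 * x * y + p2 * (r2 + 2.0 * x * x)
    y_d = y * radial + p1 * (r2 + 2.0 * y * y) + 2.0 * p2 * x * y
    return x_d, y_d
```

    Wide-angle action-camera lenses typically show strong barrel distortion (negative k1), so off-axis points are pulled toward the image center; undistorting a scene amounts to numerically inverting this mapping per pixel.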

  9. IR interferometers using modern cameras

    NASA Astrophysics Data System (ADS)

    Ai, Chiayu

    1997-11-01

    Laser interferometers have been used widely in the optics and disk drive industries. Often the surface of the sample is either too curved to resolve the fringes or too rough to reflect the incident beam back into the interferometer. Illuminating at a grazing incidence angle effectively increases the equivalent wavelength, and hence the reflectivity, but the image of a circular aperture becomes elliptical. Lasers with a long IR wavelength seem to be the solution. However, the spatial resolution of the vidicon cameras is usually poor, and the image lag is often too long. These limit the accuracy of an IR phase-shifting interferometer. Recently, we have designed two types of interferometers for 3.39 micrometers and 10.6 micrometers using an InSb array and a micro-bolometer array, respectively. These modern cameras have a high resolution and hence greatly extend the range of measurable material from a blank to a finished optic. Because the refractive index of the optical material at the IR wavelength is usually very high, the anti-reflection coating of the optics at IR is more critical than that at a visible wavelength. The interferometer's design, the resolution, the dependence of the fringe contrast on the sample roughness, and the measurement results of various samples are presented.

  10. Comparing Cosmic Cameras

    NSDL National Science Digital Library

    Learners will take and then compare images taken by a camera to learn about focal length (and its effects on field of view), resolution, and ultimately how cameras take close-up pictures of far away objects. Finally, they will apply this knowledge to the images of comet Tempel 1 taken by two different spacecraft with three different cameras, in this case Deep Impact and those expected/obtained from Stardust-NExT. This lesson could easily be adapted for use with other NASA missions.
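
    The focal-length/field-of-view relationship this lesson explores follows from simple geometry. As a sketch (the lesson itself gives no formula), for an ideal rectilinear camera the angular field of view across a sensor dimension is:

```python
import math

def fov_deg(sensor_size_mm, focal_length_mm):
    """Angular field of view of an ideal rectilinear camera:
    FOV = 2 * atan(sensor_size / (2 * focal_length))."""
    return math.degrees(2.0 * math.atan(sensor_size_mm / (2.0 * focal_length_mm)))
```

    A 36 mm-wide sensor behind a 50 mm lens sees roughly 40 degrees, while the same sensor behind a 500 mm lens sees only about 4 degrees: the longer the focal length, the narrower the field of view, which is why spacecraft use long-focal-length narrow-angle cameras for close-up imaging of distant targets.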

  11. Ringfield lithographic camera

    DOEpatents

    Sweatt, W.C.

    1998-09-08

    A projection lithography camera is presented with a wide ringfield optimized so as to make efficient use of extreme ultraviolet radiation from a large area radiation source (e.g., D_source ≈ 0.5 mm). The camera comprises four aspheric mirrors optically arranged on a common axis of symmetry. The camera includes an aperture stop that is accessible through a plurality of partial aperture stops to synthesize the theoretical aperture stop. Radiation from a mask is focused to form a reduced image on a wafer, relative to the mask, by reflection from the four aspheric mirrors. 11 figs.

  12. Sensing driver awareness by combining fisheye camera and Kinect

    NASA Astrophysics Data System (ADS)

    Wuhe, Z.; Lei, Z.; Ning, D.

    2014-11-01

    In this paper, we propose a Driver's Awareness Catching System to sense the driver's awareness. The system consists of a fisheye camera and a Kinect. The Kinect, mounted inside the vehicle, is used to recognize and locate the 3D face of the driver. The fisheye camera, mounted outside the vehicle, is used to monitor the road. The relative pose between the two cameras is calibrated via a state-of-the-art method for calibrating cameras with non-overlapping fields of view. The camera system works in this way: First, the SDK of Kinect released by Microsoft is used to track the driver's face and capture the eyes' location together with the sight direction. Secondly, the eyes' location and the sight direction are transformed to the coordinate system of the fisheye camera. Thirdly, the corresponding view field is extracted from the fisheye image. As there is a small displacement between the driver's eyes and the optical center of the fisheye camera, it will lead to a view angle deviation. Finally, we performed a systematic analysis of the error distribution by numerical simulation and proved the feasibility of our camera system. On the other hand, we realized this camera system and achieved the desired effect in a real-world experiment.
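
    The second step in the pipeline above, transforming the eye location into the fisheye camera's coordinate system, is a rigid-body transform using the calibrated relative pose. A minimal sketch (function name and pose values are illustrative, not from the paper):

```python
def to_fisheye_frame(R, t, p):
    """Map a 3-D point p from the Kinect frame into the fisheye-camera
    frame using the calibrated relative pose (R, t): p' = R p + t.
    R is a 3x3 rotation matrix (list of rows), t and p are 3-tuples."""
    return tuple(sum(R[i][j] * p[j] for j in range(3)) + t[i] for i in range(3))
```

    The small eye-to-optical-center displacement discussed in the abstract is exactly the residual translation left after this mapping, which is why it produces a view-angle deviation that grows as the viewed object gets closer.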

  13. Lights, Camera,...MAGIC!

    ERIC Educational Resources Information Center

    Kolakowski, Pat; Dean, Jean E.

    1991-01-01

    Describes a four-session workshop called "Lights, Camera, Action" in which students are videotaped as they read letters they wrote from the point of view of a character in a book. Notes a positive effect on student motivation. (MG)

  14. The Star Formation Camera

    Microsoft Academic Search

    Paul A. Scowen; Rolf Jansen; Matthew Beasley; Daniela Calzetti; Steven Desch; Alex Fullerton; John Gallagher; Doug Lisman; Steve Macenka; Sangeeta Malhotra; Mark McCaughrean; Shouleh Nikzad; Robert O'Connell; Sally Oey; Deborah Padgett; James Rhoads; Aki Roberge; Oswald Siegmund; Stuart Shaklan; Nathan Smith; Daniel Stern; Jason Tumlinson; Rogier Windhorst; Robert Woodruff

    2009-01-01

    The Star Formation Camera (SFC) is a wide-field (~15'×19', >280 arcmin^2), high-resolution (18×18 mas pixels) UV/optical dichroic camera designed for the Theia 4-m space telescope concept. SFC will deliver diffraction-limited images at lambda > 300 nm in both a blue (190-517 nm) and a red (517-1075 nm) channel simultaneously. Our aim is to conduct a comprehensive and systematic study of the

  15. Gamma ray camera

    SciTech Connect

    Robbins, C.D.; Wang, S.

    1980-09-09

    An Anger gamma-ray camera is improved by the substitution of a gamma-ray-sensitive, proximity-type image intensifier tube for the scintillator screen in the Anger camera, the image intensifier tube having a negatively charged flat scintillator screen, a flat photocathode layer, and a grounded, flat output phosphor display screen, all of the same dimension (unity image magnification) and all within a grounded metallic tube envelope, and having a metallic, inwardly concave input window between the scintillator screen and the collimator.

  16. Spacecraft camera image registration

    NASA Technical Reports Server (NTRS)

    Kamel, Ahmed A. (Inventor); Graul, Donald W. (Inventor); Chan, Fred N. T. (Inventor); Gamble, Donald W. (Inventor)

    1987-01-01

    A system for achieving spacecraft camera (1, 2) image registration comprises a portion external to the spacecraft and an image motion compensation system (IMCS) portion onboard the spacecraft. Within the IMCS, a computer (38) calculates an image registration compensation signal (60) which is sent to the scan control loops (84, 88, 94, 98) of the onboard cameras (1, 2). At the location external to the spacecraft, the long-term orbital and attitude perturbations on the spacecraft are modeled. Coefficients (K, A) from this model are periodically sent to the onboard computer (38) by means of a command unit (39). The coefficients (K, A) take into account observations of stars and landmarks made by the spacecraft cameras (1, 2) themselves. The computer (38) takes as inputs the updated coefficients (K, A) plus synchronization information indicating the mirror position (AZ, EL) of each of the spacecraft cameras (1, 2), operating mode, and starting and stopping status of the scan lines generated by these cameras (1, 2), and generates in response thereto the image registration compensation signal (60). The sources of periodic thermal errors on the spacecraft are discussed. The system is checked by calculating measurement residuals, the difference between the landmark and star locations predicted at the external location and the landmark and star locations as measured by the spacecraft cameras (1, 2).

  17. 3-D Flow Visualization with a Light-field Camera

    NASA Astrophysics Data System (ADS)

    Thurow, B.

    2012-12-01

    Light-field cameras have received attention recently due to their ability to acquire photographs that can be computationally refocused after they have been acquired. In this work, we describe the development of a light-field camera system for 3D visualization of turbulent flows. The camera developed in our lab, also known as a plenoptic camera, uses an array of microlenses mounted next to an image sensor to resolve both the position and angle of light rays incident upon the camera. For flow visualization, the flow field is seeded with small particles that follow the fluid's motion and are imaged using the camera and a pulsed light source. The tomographic MART algorithm is then applied to the light-field data in order to reconstruct a 3D volume of the instantaneous particle field. 3D, 3C velocity vectors are then determined from a pair of 3D particle fields using conventional cross-correlation algorithms. As an illustration of the concept, 3D/3C velocity measurements of a turbulent boundary layer produced on the wall of a conventional wind tunnel are presented. Future experiments are planned to use the camera to study the influence of wall permeability on the 3-D structure of the turbulent boundary layer.
    [Figure: Schematic illustrating the concept of a plenoptic camera, where each pixel represents both the position and angle of light rays entering the camera; this information can be used to computationally refocus an image after it has been acquired.]
    [Figure: Instantaneous 3D velocity field of a turbulent boundary layer determined using light-field data captured by a plenoptic camera.]
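
    The cross-correlation step that turns a pair of particle fields into velocity vectors can be sketched in one dimension. This is not the authors' implementation, only the core idea: the displacement that maximizes the correlation between two interrogation windows is the (integer) particle shift between exposures.

```python
def best_shift(a, b, max_shift):
    """Integer displacement of signal b relative to a that maximizes the
    zero-padded cross-correlation -- a 1-D toy version of the correlation
    step used in particle image velocimetry (PIV)."""
    def corr(s):
        return sum(a[i] * b[i + s] for i in range(len(a)) if 0 <= i + s < len(b))
    return max(range(-max_shift, max_shift + 1), key=corr)
```

    Dividing the recovered shift by the inter-pulse time gives one velocity component; practical PIV codes do this in 3D with FFT-based correlation and sub-pixel peak fitting, but the principle is the same.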

  18. Cryogenic Detectors (Narrow Field Instruments)

    Microsoft Academic Search

    H. Hoevers; P. Verhoeve

    2003-01-01

    Two cryogenic imaging spectrometer arrays are currently considered as focal plane instruments for XEUS. The narrow field imager 1 (NFI 1) will cover the energy range from 0.05 to 3 keV with an energy resolution of 2 eV, or better, at 500 eV. A second narrow field imager (NFI 2) covers the energy range from 1 to 15 keV with

  19. Deployable Wireless Camera Penetrators

    NASA Technical Reports Server (NTRS)

    Badescu, Mircea; Jones, Jack; Sherrit, Stewart; Wu, Jiunn Jeng

    2008-01-01

    A lightweight, low-power camera dart has been designed and tested for context imaging of sampling sites and ground surveys from an aerobot or an orbiting spacecraft in a microgravity environment. The camera penetrators also can be used to image any line-of-sight surface, such as cliff walls, that is difficult to access. Tethered cameras to inspect the surfaces of planetary bodies use both power and signal transmission lines to operate. A tether adds the possibility of inadvertently anchoring the aerobot, and requires some form of station-keeping capability of the aerobot if extended examination time is required. The new camera penetrators are deployed without a tether, weigh less than 30 grams, and are disposable. They are designed to drop from any altitude, with the boost in transmitting power currently demonstrated at approximately 100-m line-of-sight. The penetrators also can be deployed to monitor lander or rover operations from a distance, and can be used for surface surveys or for context information gathering from a touch-and-go sampling site. Thanks to wireless operation, the complexity of the sampling or survey mechanisms may be reduced. The penetrators may be battery powered for short-duration missions, or have solar panels for longer or intermittent-duration missions. The imaging device is embedded in the penetrator, which is dropped or projected at the surface of a study site at 90° to the surface. Mirrors can be used in the design to image the ground or the horizon. Some of the camera features were tested using commercial "nanny" or "spy" camera components with the charge-coupled device (CCD) looking in a direction parallel to the ground. Figure 1 shows components of one camera that weighs less than 8 g and occupies a volume of 11 cm3. This camera could transmit a standard television signal, including sound, up to 100 m. Figure 2 shows the CAD models of a version of the penetrator.
A low-volume array of such penetrator cameras could be deployed from an aerobot or a spacecraft onto a comet or asteroid. A system of 20 of these penetrators could be designed and built in a 1- to 2-kg mass envelope. Possible future modifications of the camera penetrators, such as the addition of a chemical spray device, would allow the study of simple chemical reactions of reagents sprayed at the landing site and looking at the color changes. Zoom lenses also could be added for future use.

  20. Mars Exploration Rover Engineering Cameras

    Microsoft Academic Search

    J. N. Maki; J. F. Bell; K. E. Herkenhoff; S. W. Squyres; A. Kiely; M. Klimesh; M. Schwochert; T. Litwin; R. Willson; A. Johnson; M. Maimone; E. Baumgartner; A. Collins; M. Wadsworth; S. T. Elliot; A. Dingizian; D. Brown; E. C. Hagerott; L. Scherr; R. Deen; D. Alexander; J. Lorre

    2003-01-01

    NASA's Mars Exploration Rover (MER) Mission will place a total of 20 cameras (10 per rover) onto the surface of Mars in early 2004. Fourteen of the 20 cameras are designated as engineering cameras and will support the operation of the vehicles on the Martian surface. Images returned from the engineering cameras will also be of significant importance to the

  1. Theoretical description of functionality, applications, and limitations of SO2 cameras for the remote sensing of volcanic plumes

    NASA Astrophysics Data System (ADS)

    Kern, C.; Kick, F.; Lübcke, P.; Vogel, L.; Wöhrbach, M.; Platt, U.

    2010-06-01

    The SO2 camera is a novel device for the remote sensing of volcanic emissions using solar radiation scattered in the atmosphere as a light source for the measurements. The method is based on measuring the ultra-violet absorption of SO2 in a narrow wavelength window around 310 nm by employing a band-pass interference filter and a two-dimensional UV-sensitive CCD detector. The effect of aerosol scattering can in part be compensated by additionally measuring the incident radiation around 325 nm, where the absorption of SO2 is about 30 times weaker, thus rendering the method applicable to optically thin plumes. For plumes with high aerosol optical densities, collocation of an additional moderate resolution spectrometer is desirable to enable a correction of radiative transfer effects. The ability to deliver spatially resolved images of volcanic SO2 distributions at a frame rate on the order of 1 Hz makes the SO2 camera a very promising technique for volcanic monitoring and for studying the dynamics of volcanic plumes in the atmosphere. This study gives a theoretical basis for the pertinent aspects of working with SO2 camera systems, including the measurement principle, instrument design, data evaluation and technical applicability. Several issues are identified that influence camera calibration and performance. For one, changes in the solar zenith angle lead to a variable light path length in the stratospheric ozone layer and therefore change the spectral distribution of scattered solar radiation incident at the Earth's surface. The varying spectral illumination causes a shift in the calibration of the SO2 camera's results. Secondly, the lack of spectral resolution inherent in the measurement technique leads to a non-linear relationship between measured weighted average optical density and the SO2 column density.
    Thirdly, as is the case with all remote sensing techniques that use scattered solar radiation as a light source, the radiative transfer between the sun and the instrument is variable, with both "radiative dilution" and multiple scattering occurring. These effects can lead to either over- or underestimation of the SO2 column density by more than an order of magnitude. As the accurate assessment of volcanic emissions depends on our ability to correct for these issues, recommendations for correcting the individual effects during data analysis are given. Aside from the above-mentioned intrinsic effects, the particular technical design of the SO2 camera can also greatly influence its performance, depending on the setup chosen. A general description of an instrument setup is given, and the advantages and disadvantages of certain specific instrument designs are discussed. Finally, several measurement examples are shown and possibilities to combine SO2 camera measurements with other remote sensing techniques are explored.
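
    The two-band measurement principle described above reduces, per pixel, to a pair of Beer-Lambert apparent absorbances. A minimal sketch (the effective cross section `sigma_eff` would come from calibration cells and is a hypothetical input here, not a value from the paper):

```python
import math

def so2_column(i_310, i0_310, i_325, i0_325, sigma_eff):
    """Aerosol-corrected SO2 slant column density from the two-band scheme:
    apparent absorbance tau = -ln(I_plume / I_background) at 310 nm and
    325 nm; the 325 nm band (weak SO2 absorption) removes the broadband
    aerosol contribution, and the difference is divided by an effective
    absorption cross section sigma_eff [cm^2/molecule]."""
    tau_310 = -math.log(i_310 / i0_310)
    tau_325 = -math.log(i_325 / i0_325)
    return (tau_310 - tau_325) / sigma_eff  # molecules per cm^2
```

    The non-linearity the abstract warns about enters precisely here: because the filter passband is broad, `sigma_eff` itself drifts with column density and solar zenith angle, so a fixed value is only a first approximation.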

  2. Theoretical description of functionality, applications, and limitations of SO2 cameras for the remote sensing of volcanic plumes

    NASA Astrophysics Data System (ADS)

    Kern, C.; Kick, F.; Lübcke, P.; Vogel, L.; Wöhrbach, M.; Platt, U.

    2010-02-01

    The SO2 camera is a novel technique for the remote sensing of volcanic emissions using solar radiation scattered in the atmosphere as a light source for the measurements. The method is based on measuring the ultra-violet absorption of SO2 in a narrow wavelength window around 310 nm by employing a band-pass interference filter and a 2-D UV-sensitive CCD detector. The effect of aerosol scattering can be eliminated by additionally measuring the incident radiation around 325 nm where the absorption of SO2 is no longer significant, thus rendering the method applicable to optically opaque plumes. The ability to deliver spatially resolved images of volcanic SO2 distributions at a frame rate on the order of 1 Hz makes the SO2 camera a very promising technique for volcanic monitoring and for studying the dynamics of volcanic plumes in the atmosphere. This study gives a theoretical basis for the pertinent aspects of working with SO2 camera systems, including the measurement principle, instrument design, data evaluation and technical applicability. Several issues are identified that influence camera calibration and performance. For one, changes in the solar zenith angle lead to a variable light path length in the stratospheric ozone layer and therefore change the spectral distribution of scattered solar radiation incident at the Earth's surface. This varying spectral illumination causes a shift in the calibration of the SO2 camera's results. Secondly, the lack of spectral resolution inherent in the measurement technique leads to a non-linear relationship between measured weighted average optical density and the SO2 column density. In addition, as is the case with all remote sensing techniques that use scattered solar radiation as a light source, the radiative transfer between the sun and the instrument is variable, with both radiative dilution and multiple scattering occurring.
These effects can lead to either over- or underestimation of the SO2 column density by more than an order of magnitude. As the accurate assessment of volcanic emissions depends on our ability to correct for these issues, recommendations for correcting the individual effects during data analysis are given. Aside from the above-mentioned intrinsic effects, the particular technical design of the SO2 camera can also greatly influence its performance, depending on the chosen setup. A general description of the instrument setup is given, and the advantages and disadvantages of certain specific instrument designs are discussed. Finally, several measurement examples are shown and possibilities to combine SO2 camera measurements with other remote sensing techniques are explored.

  3. Range camera self-calibration with scattering compensation

    NASA Astrophysics Data System (ADS)

    Lichti, Derek D.; Qi, Xiaojuan; Ahmed, Tanvir

    2012-11-01

    Time-of-flight range camera data are prone to the scattering bias caused by multiple internal reflections of light received from a highly reflective object in the camera's foreground that induce a phase shift in the light received from background targets. The corresponding range bias can have serious implications for the quality of captured scene data as well as for the geometric self-calibration of range cameras. In order to minimise the impact of the scattering range biases, the calibration must be performed over a planar target field rather than a more desirable 3D target field. This significantly impacts the quality of the rangefinder offset parameter estimation due to its high correlation with the camera perspective centre position. In this contribution a new model to compensate for scattering-induced range errors is proposed that allows range camera self-calibration to be conducted over a 3D target field. Developed from experimental observations of scattering behaviour under specific scene conditions, it comprises four new additional parameters that are estimated in the self-calibrating bundle adjustment. The results of experiments conducted on five range camera datasets demonstrate the model's efficacy in compensating for the scattering error without compromising model fidelity. It is further demonstrated that it actually reduces the rangefinder offset-perspective centre correlation and that its use with a 3D target field is the preferred method for calibrating narrow field-of-view range cameras.

  4. Limits on neutrino oscillations in the Fermilab narrow band beam

    SciTech Connect

    Brucker, E.B.; Jacques, P.F.; Kalelkar, M.; Koller, E.L.; Plano, R.J.; Stamer, P.E.; Baker, N.J.; Connolly, P.L.; Kahn, S.A.; Murtagh, M.J.

    1986-01-01

A search for neutrino oscillations was made using the Fermilab narrow-band neutrino beam and the 15-ft bubble chamber. No positive signal for neutrino oscillations was observed. Limits were obtained on mixing angles and neutrino mass differences for νμ → νe, νμ → ντ, and νe → νe. 5 refs.

  5. Artificial human vision camera

    NASA Astrophysics Data System (ADS)

    Goudou, J.-F.; Maggio, S.; Fagno, M.

    2014-10-01

In this paper we present a real-time vision system modeling the human visual system. Our purpose is to draw inspiration from human vision biomechanics to improve robotic capabilities for tasks such as object detection and tracking. This work first describes the biomechanical discrepancies between human vision and classic cameras, and the retinal processing stage that takes place in the eye before the optic nerve. The second part describes our implementation of these principles in a 3-camera optical, mechanical and software model of the human eyes and an associated bio-inspired attention model.

6. In-flight calibration of the Cassini imaging science sub-system cameras

    E-print Network

We describe in-flight calibration of the Cassini Imaging Science Sub-system narrow- and wide-angle cameras. The Cassini imaging science sub-system (ISS) consists of two cameras on the Cassini spacecraft.

  7. Multi-camera calibration based on openCV and multi-view registration

    Microsoft Academic Search

    Xiao-Ming Deng; Xiong Wan; Zhi-Min Zhang; Bi-Yan Leng; Ning-Ning Lou; Shuai He

    2010-01-01

For multi-camera calibration systems, a method combining OpenCV-based calibration with multi-view registration is proposed. First, a Zhang calibration plate (8×8 chessboard pattern) is imaged from different angles by a number of cameras (three industrial-grade CCDs), yielding 9 groups of images, and OpenCV is used to quickly calibrate the intrinsic parameters of each camera. Secondly, based on the
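The planar-target step underlying Zhang-style calibration can be sketched in a few lines. The numpy example below is an illustrative direct linear transform (DLT) for the plate-to-image homography, not the authors' OpenCV pipeline (which would use routines such as cv2.findChessboardCorners and cv2.calibrateCamera); the grid size and ground-truth homography are made-up test values.

```python
import numpy as np

def estimate_homography(obj_pts, img_pts):
    """Estimate the 3x3 planar homography H (img ~ H @ obj) via the DLT.

    obj_pts, img_pts: (N, 2) arrays of corresponding points, N >= 4.
    """
    A = []
    for (X, Y), (u, v) in zip(obj_pts, img_pts):
        A.append([-X, -Y, -1, 0, 0, 0, u * X, u * Y, u])
        A.append([0, 0, 0, -X, -Y, -1, v * X, v * Y, v])
    # The homography is the null vector of A (smallest singular value).
    _, _, Vt = np.linalg.svd(np.asarray(A))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]              # fix the arbitrary scale so H[2,2] == 1

def project(H, pts):
    """Apply a homography to (N, 2) points."""
    ph = np.hstack([pts, np.ones((len(pts), 1))]) @ H.T
    return ph[:, :2] / ph[:, 2:3]

# Example: an 8x8 "chessboard" of planar corner coordinates observed
# through a known ground-truth homography.
grid = np.array([[x, y] for x in range(8) for y in range(8)], float)
H_true = np.array([[1.2, 0.1, 30.0], [-0.05, 1.1, 40.0], [1e-4, 2e-4, 1.0]])
observed = project(H_true, grid)
H_est = estimate_homography(grid, observed)
err = np.abs(project(H_est, grid) - observed).max()
```

In a real calibration these homographies, one per view, feed the closed-form estimate of the intrinsic matrix.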

  8. Do Speed Cameras Reduce Collisions?

    PubMed Central

    Skubic, Jeffrey; Johnson, Steven B.; Salvino, Chris; Vanhoy, Steven; Hu, Chengcheng

    2013-01-01

We investigated the effects of speed cameras along a 26 mile segment in metropolitan Phoenix, Arizona. Motor vehicle collisions were retrospectively identified according to three time periods – before cameras were placed, while cameras were in place and after cameras were removed. A 14 mile segment in the same area without cameras was used for control purposes. Five confounding variables were eliminated. In this study, the placement or removal of interstate highway speed cameras did not independently affect the incidence of motor vehicle collisions. PMID:24406979

  9. Characterization of gravity waves at Venus cloud top from the Venus Monitoring Camera images

    NASA Astrophysics Data System (ADS)

    Piccialli, A.; Titov, D.; Svedhem, H.; Markiewicz, W. J.

    2012-04-01

Since 2006 the European mission Venus Express (VEx) has been studying the atmosphere of Venus with a focus on atmospheric dynamics and circulation. Recently, several experiments on board Venus Express have detected waves in the Venus atmosphere, both as oscillations in the temperature and wind fields and as patterns on the cloud layer. Waves could play an important role in the maintenance of the atmospheric circulation of Venus since they can transport energy and momentum. High-resolution images of Venus' Northern hemisphere obtained with the Venus Monitoring Camera (VMC/VEx) show distinct wave patterns at the cloud tops (~70 km altitude) interpreted as gravity waves. The Venus Monitoring Camera (VMC) is a CCD-based camera specifically designed to take images of Venus in four narrow-band filters in the UV (365 nm), visible (513 nm), and near-IR (965 and 1000 nm). A systematic visual search for waves in VMC images was performed; more than 1700 orbits were analyzed and wave patterns were observed in about 200 images. With the aim of characterizing the wave types and their possible origin, we retrieved wave properties such as location (latitude and longitude), local time, solar zenith angle, packet length and width, and orientation. A wavelet analysis was also applied to determine the wavelength and the region of dominance of each wave. Four types of waves were identified in VMC images: long, medium, short and irregular waves. The long-type waves are characterized by long, narrow, straight features extending more than a few hundred kilometers, with wavelengths in the range of 7 to 48 km. Medium-type waves have irregular wavefronts extending more than 100 km, with wavelengths in the range of 8 to 21 km. Short wave packets have widths of several tens of kilometers, extend to a few hundred kilometers, and are characterized by small wavelengths (3 to 16 km). Often short wave trains are observed at the edges of long features and seem connected to them.
Irregular wave fields extend beyond the field of view of VMC and appear to be the result of wave breaking or wave interference. The waves are often identified in all channels and are mostly found at high latitudes (60-80°N) in the Northern hemisphere. They seem to be concentrated above Ishtar Terra, a continental-size highland that includes the highest mountain belts of the planet, suggesting a possible orographic origin of the waves. However, at the moment it is not possible to rule out an observational bias due to the spacecraft orbit, which prevents waves from being seen at lower latitudes (because of lower resolution) and on the night side of the planet.
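As a simplified stand-in for the wavelet analysis described above, a plain FFT periodogram already illustrates how a dominant wavelength is retrieved from a brightness profile (synthetic data, not VMC imagery; the 12 km wave and sampling are assumed values):

```python
import numpy as np

def dominant_wavelength(profile, dx_km):
    """Return the dominant spatial wavelength (km) of a 1-D brightness profile.

    profile: brightness samples along a track; dx_km: sample spacing in km.
    """
    n = len(profile)
    spec = np.abs(np.fft.rfft(profile - profile.mean()))
    freqs = np.fft.rfftfreq(n, d=dx_km)          # spatial frequency, cycles/km
    k = np.argmax(spec[1:]) + 1                  # skip the DC bin
    return 1.0 / freqs[k]

# Synthetic cloud-top brightness track: a 12 km wave over 256 km,
# sampled every 0.5 km, with additive noise.
rng = np.random.default_rng(0)
x = np.arange(0, 256, 0.5)
profile = np.sin(2 * np.pi * x / 12.0) + 0.2 * rng.standard_normal(x.size)
lam = dominant_wavelength(profile, dx_km=0.5)
```

A wavelet transform additionally localizes where along the track each wavelength dominates, which a global periodogram cannot do.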

  10. Far UV camera \\/FAUST

    Microsoft Academic Search

    G. Riviere; J.-M. Deharveng

    1978-01-01

    This instrument consists of a Wynne telescope coupled with an image detector including an ultraviolet image intensifier with film recording. The resolution is about 2 arcmin. Owing to its large field (7.5 deg) and high sensitivity (limiting magnitude of 17), this camera is well suited for far-UV deep surveys; it will be flown on the first Spacelab flight.

  11. Communities, Cameras, and Conservation

    ERIC Educational Resources Information Center

    Patterson, Barbara

    2012-01-01

    Communities, Cameras, and Conservation (CCC) is the most exciting and valuable program the author has seen in her 30 years of teaching field science courses. In this citizen science project, students and community volunteers collect data on mountain lions ("Puma concolor") at four natural areas and public parks along the Front Range of Colorado.…

  12. Make a Pinhole Camera

    ERIC Educational Resources Information Center

    Fisher, Diane K.; Novati, Alexander

    2009-01-01

    On Earth, using ordinary visible light, one can create a single image of light recorded over time. Of course a movie or video is light recorded over time, but it is a series of instantaneous snapshots, rather than light and time both recorded on the same medium. A pinhole camera, which is simple to make out of ordinary materials and using ordinary…

  13. The LSST Camera Overview

    SciTech Connect

    Gilmore, Kirk; Kahn, Steven A.; Nordby, Martin; Burke, David; O'Connor, Paul; Oliver, John; Radeka, Veljko; Schalk, Terry; Schindler, Rafe; /SLAC

    2007-01-10

The LSST camera is a wide-field optical (0.35-1 μm) imager designed to provide a 3.5 degree FOV with better than 0.2 arcsecond sampling. The detector format will be a circular mosaic providing approximately 3.2 Gigapixels per image. The camera includes a filter mechanism and shuttering capability. It is positioned in the middle of the telescope, where cross-sectional area is constrained by optical vignetting and heat dissipation must be controlled to limit thermal gradients in the optical beam. The fast, f/1.2 beam will require tight tolerances on the focal plane mechanical assembly. The focal plane array operates at a temperature of approximately -100 C to achieve desired detector performance. The focal plane array is contained within an evacuated cryostat, which incorporates detector front-end electronics and thermal control. The cryostat lens serves as an entrance window and vacuum seal for the cryostat. Similarly, the camera body lens serves as an entrance window and gas seal for the camera housing, which is filled with a suitable gas to provide the operating environment for the shutter and filter change mechanisms. The filter carousel can accommodate 5 filters, each 75 cm in diameter, for rapid exchange without external intervention.

  14. Spas color camera

    NASA Technical Reports Server (NTRS)

    Toffales, C.

    1983-01-01

    The procedures to be followed in assessing the performance of the MOS color camera are defined. Aspects considered include: horizontal and vertical resolution; value of the video signal; gray scale rendition; environmental (vibration and temperature) tests; signal to noise ratios; and white balance correction.

  15. Miniaturized fundus camera

    Microsoft Academic Search

    Christine Gliss; Jean-Marie A. Parel; John T. Flynn; Hans S. Pratisto; Peter F. Niederer

    2003-01-01

Retinopathy of Prematurity (ROP) denotes a pathologic development of the retina in prematurely born children. In order to prevent severe permanent damage to the eye and enable timely treatment, the fundus of the eye in such children has to be examined according to established procedures. By means of a miniaturized fundus camera, it is intended to record digital

  16. A smart fast camera

    NASA Astrophysics Data System (ADS)

    Ragazzoni, Roberto; Arcidiacono, Carmelo; Diolaiti, Emiliano; Farinato, Jacopo; Moore, Anna M.; Soci, Roberto

    2004-09-01

It is generally believed that very fast cameras imaging large Fields of View translate into huge optomechanics and mosaics of very large contiguous CCDs. It has already been suggested that seeing-limited imaging cameras for telescopes whose diameters are larger than 20m are virtually impossible at a reasonable cost. We show here that, using existing technology and at a moderate price, one can build a Smart Fast Camera: a device that, placed in an aberrated Field of View, including those with slow focal ratios, is able to provide imaging at an equivalent focal ratio as low as F/1, with a size identical to that of the large-focal-ratio focal plane. The design allows for easy correction of aberrations over the Field of View. It has low weight and size with respect to any focal reducer or prime focus station of the same performance. It can be applied to existing 8m-class telescopes to provide a wide-field fast focal plane, or to achieve seeing-limited imaging on Extremely Large Telescopes. As it offers inherently fast read-out in a massively parallel mode, the SFC can be used as a pupil-plane or focal-plane camera for pupil-plane or Shack-Hartmann wavefront sensing on 30-100m class telescopes.

  17. Jack & the Video Camera

    ERIC Educational Resources Information Center

    Charlan, Nathan

    2010-01-01

    This article narrates how the use of video camera has transformed the life of Jack Williams, a 10-year-old boy from Colorado Springs, Colorado, who has autism. The way autism affected Jack was unique. For the first nine years of his life, Jack remained in his world, alone. Functionally non-verbal and with motor skill problems that affected his…

  18. Multispectral Photometry of the Moon and Absolute Calibration of the Clementine UV/Vis Camera

    NASA Astrophysics Data System (ADS)

    Hillier, John K.; Buratti, Bonnie J.; Hill, Kathryn

    1999-10-01

We present a multispectral photometric study of the Moon between solar phase angles of 0 and 85°. Using Clementine images obtained between 0.4 and 1.0 μm, we produce a comprehensive study of the lunar surface containing the following results: (1) empirical photometric functions for the spectral range and viewing and illumination geometries mentioned, (2) photometric modeling that derives the physical properties of the upper regolith and includes a detailed study of the causes of the lunar opposition surge, (3) an absolute calibration of the Clementine UV/Vis camera. The calibration procedure given on the Clementine calibration web site produces reflectances relative to a halon standard, which appear significantly higher than those seen in groundbased observations. By comparing Clementine observations with prior groundbased observations of 15 sites on the Moon, we have determined a good absolute calibration of the Clementine UV/Vis camera. A correction factor of 0.532 has been determined to convert the web site (www.planetary.brown.edu/clementine/calibration.html) reflectances to absolute values. From the calibrated data, we calculate empirical phase functions useful for performing photometric corrections to observations of the Moon between solar phase angles of 0 and 85° and in the spectral range 0.4 to 1.0 μm. Finally, the calibrated data are used to fit a version of Hapke's photometric model modified to incorporate a new formulation, developed in this paper, of the lunar opposition surge which includes coherent backscatter. Recent studies of the lunar opposition effect have yielded contradictory results as to the mechanism responsible: shadow hiding, coherent backscatter, or both. We find that most of the surge can be explained by shadow hiding with a halfwidth of ~8°.
However, for the brightest regions (the highlands at 0.75-1.0 μm) a small additional narrow component (halfwidth of <2°) of total amplitude ~1/6 to 1/4 that of the shadow-hiding surge is observed, which may be attributed to coherent backscatter. Interestingly, no evidence for the narrow component is seen in the maria or in the highlands at 0.415 μm. A natural explanation for this is that these regions are too dark to exhibit enough multiple scattering for the effects of coherent backscatter to be seen. Finally, because the Moon is the only celestial body for which we have "ground truth" measurements, our results provide an important test for the robustness of photometric models of remote sensing observations.
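The quoted ~8° halfwidth can be illustrated with the standard Hapke shadow-hiding opposition term B(g) = B0 / (1 + tan(g/2)/h). This is a sketch of the class of model involved, not the paper's exact modified formulation; the amplitude and width values below are assumptions.

```python
import numpy as np

def shadow_hiding_surge(g_deg, B0, h):
    """Hapke-style shadow-hiding opposition term B(g) = B0 / (1 + tan(g/2)/h).

    g_deg: solar phase angle in degrees; B0: surge amplitude;
    h: angular-width parameter.
    """
    g = np.radians(g_deg)
    return B0 / (1.0 + np.tan(g / 2.0) / h)

# The surge drops to half its zero-phase amplitude where tan(g/2) = h,
# i.e. at g_half = 2*arctan(h); choose h to place that at 8 degrees.
h = np.tan(np.radians(8.0) / 2.0)
g_half = 2.0 * np.degrees(np.arctan(h))
```

With this h, shadow_hiding_surge(8.0, B0, h) returns B0/2, matching the halfwidth quoted in the abstract.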

  19. Dual cameras acquisition and display system of retina-like sensor camera and rectangular sensor camera

    NASA Astrophysics Data System (ADS)

    Cao, Nan; Cao, Fengmei; Lin, Yabin; Bai, Tingzhu; Song, Shengyu

    2015-04-01

For a new kind of retina-like sensor camera and a traditional rectangular-sensor camera, a dual-camera acquisition and display system needs to be built. We introduce the principle and development of the retina-like sensor. Image coordinate transformation and sub-pixel interpolation must be implemented for the retina-like sensor's special pixel distribution. The hardware platform is composed of the retina-like sensor camera, the rectangular-sensor camera, an image grabber and a PC. Combining the MIL and OpenCV libraries, the software is written in VC++ on VS 2010. Experimental results show that the system realizes acquisition and display for both cameras.
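The sub-pixel interpolation step mentioned above can be sketched in numpy. The log-polar sampling geometry below is an assumed stand-in for the actual retina-like pixel distribution, which the abstract does not specify:

```python
import numpy as np

def bilinear(img, x, y):
    """Sample image img at fractional (x, y) with bilinear interpolation."""
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    dx, dy = x - x0, y - y0
    # Weighted average of the four neighbouring pixels.
    return ((1 - dx) * (1 - dy) * img[y0, x0] + dx * (1 - dy) * img[y0, x0 + 1]
            + (1 - dx) * dy * img[y0 + 1, x0] + dx * dy * img[y0 + 1, x0 + 1])

def sample_logpolar(img, cx, cy, n_rings=64, n_spokes=128, r_max=100.0):
    """Resample a rectangular image onto a log-polar (retina-like) grid."""
    out = np.zeros((n_rings, n_spokes))
    radii = np.geomspace(1.0, r_max, n_rings)    # ring spacing grows outward
    for i, r in enumerate(radii):
        for j in range(n_spokes):
            t = 2 * np.pi * j / n_spokes
            x, y = cx + r * np.cos(t), cy + r * np.sin(t)
            if 0 <= x < img.shape[1] - 1 and 0 <= y < img.shape[0] - 1:
                out[i, j] = bilinear(img, x, y)
    return out

# Bilinear sampling reproduces a linear ramp exactly at any sub-pixel point.
ramp = np.add.outer(np.arange(8.0), np.arange(8.0))   # img[y, x] = x + y
val = bilinear(ramp, 2.25, 3.5)                       # expect 2.25 + 3.5
polar = sample_logpolar(ramp, 3.5, 3.5, n_rings=4, n_spokes=8, r_max=3.0)
```

The inverse mapping (retina-like samples back to a rectangular display grid) uses the same interpolation with the roles of the grids swapped.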

  20. An Initial Assessment of the CALIPSO Wide Field Camera Performance

    Microsoft Academic Search

    M. C. Pitts; W. S. Luck; Y. Hu; D. M. Winker

    2006-01-01

    The Wide Field Camera (WFC) is one of three instruments in the CALIPSO science payload, with the other two being the Cloud-Aerosol LIdar with Orthogonal Polarization (CALIOP) and the Infrared Imaging Radiometer (IIR). The WFC is a narrow-band, push-broom imager that provides continuous high-spatial-resolution imagery during the daylight segments of the orbit over a swath centered on the lidar footprint.

  1. Visual Feedback Stabilization of Balancing Tasks with Camera Misalignment

    NASA Astrophysics Data System (ADS)

    Hirata, Kentaro; Mizuno, Takashi

    In this paper, we consider visual feedback stabilization which tolerates small camera misalignment. Specifically, a balancing task with a cart-pendulum system using camera image is examined. Such a task is known to rely heavily on the detection of the vertical direction and the angle measurement error due to the camera misalignment could be fatal for stabilization. From a mathematical model of the measurement error, the effect of the misalignment is naturally represented by affine perturbation to the coefficient matrix of the output equation. Motivated by this fact, a special type of robust dynamic output feedback stabilization against polytopic uncertainty is investigated. By solving the related BMI, one can design a controller which tolerates the camera misalignment to some extent. The result is verified via experiments.

  2. Streak camera receiver definition study

    NASA Technical Reports Server (NTRS)

    Johnson, C. B.; Hunkler, L. T., Sr.; Letzring, S. A.; Jaanimagi, P.

    1990-01-01

    Detailed streak camera definition studies were made as a first step toward full flight qualification of a dual channel picosecond resolution streak camera receiver for the Geoscience Laser Altimeter and Ranging System (GLRS). The streak camera receiver requirements are discussed as they pertain specifically to the GLRS system, and estimates of the characteristics of the streak camera are given, based upon existing and near-term technological capabilities. Important problem areas are highlighted, and possible corresponding solutions are discussed.

  3. Cameras Would Withstand High Accelerations

    NASA Technical Reports Server (NTRS)

    Meinel, Aden B.; Meinel, Marjorie P.; Macenka, Steven A.; Puerta, Antonio M.

    1992-01-01

    Very rugged cameras with all-reflective optics proposed for use in presence of high accelerations. Optics consist of four coaxial focusing mirrors in Cassegrain configuration. Mirrors are conics or aspherics. Optics achromatic,and imaging system overall passes light from extreme ultraviolet to far infrared. Charge-coupled-device video camera, film camera, or array of photodetectors placed at focal plane. Useful as portable imagers subject to rough handling, or instrumentation cameras mounted on severely vibrating or accelerating vehicles.

  4. Scale invariant feature matching with wide angle images

    Microsoft Academic Search

    Peter Hansen; Peter Corke; Wageeh Boles; Kostas Daniilidis

    2007-01-01

    Numerous scale-invariant feature matching algorithms using scale-space analysis have been proposed for use with perspective cameras, where scale-space is defined as convolution with a Gaussian. The contribution of this work is a method suitable for use with wide angle cameras. Given an input image, we map it to the unit sphere and obtain scale-space images by convolution with the solution
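The first step the abstract describes, mapping image points to the unit sphere, can be sketched for the simple pinhole case (for a genuinely wide-angle lens the ray direction would come from the lens model instead; the intrinsics below are assumed values):

```python
import numpy as np

def pixel_to_sphere(u, v, fx, fy, cx, cy):
    """Back-project a pixel to a unit ray on the viewing sphere.

    (fx, fy, cx, cy) are assumed pinhole intrinsics: focal lengths in
    pixels and the principal point.
    """
    ray = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
    return ray / np.linalg.norm(ray)     # normalize onto the unit sphere

# Example: back-project one pixel of a 640x480 image.
p = pixel_to_sphere(400.0, 300.0, fx=500.0, fy=500.0, cx=320.0, cy=240.0)
```

Once the image lives on the sphere, scale-space is built by smoothing on the sphere rather than by planar Gaussian convolution, which is what makes the matching valid for wide-angle views.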

  5. Camera Phone Based Presentation Control

    Microsoft Academic Search

    Carolyn Hafernik; Faculty Mentor; John Canny; Jingtao Wang

    Due to the recent progress of technology, camera phones are becoming an indispensable part of our daily life. Their functions are no longer limited to pure voice communications. Instead camera phones can be used for many additional functions such as taking pictures or listening to music. This project aims to explore the possibility of converting a camera phone to an

  6. Camera Calibration with Known Rotation

    Microsoft Academic Search

    Jan-michael Frahm; Reinhard Koch

    2003-01-01

We address the problem of using external rotation information with uncalibrated video sequences. The main problem addressed is: what is the benefit of the orientation information for camera calibration? It is shown that in the case of a rotating camera the camera calibration problem is linear even in the case that all intrinsic parameters vary. For arbitrarily moving

  7. SELF-CALIBRATION OF CENTRAL CAMERAS BY MINIMIZING ANGULAR ERROR

    E-print Network

    Brandt, Sami

is usable for many narrow-angle lenses but it is not sufficient for omnidirectional cameras, which may have more than a 180° field of view (Micušík and Pajdla, 2006). Nevertheless, most cameras, even wide-angle ones, are radially symmetric so that the distortion is merely in the radial direction. Recently, there has been a lot

  8. LRO Camera Imaging of the Moon: Apollo 17 and other Sites for Ground Truth

    NASA Astrophysics Data System (ADS)

    Jolliff, B. L.; Wiseman, S. M.; Robinson, M. S.; Lawrence, S.; Denevi, B. W.; Bell, J. F.

    2009-12-01

    One of the fundamental goals of the Lunar Reconnaissance Orbiter (LRO) is the determination of mineralogic and compositional distributions and their relation to geologic features on the Moon’s surface. Through a combination of imaging with the LRO narrow-angle cameras and wide-angle camera (NAC, WAC), very fine-scale geologic features are resolved with better than meter-per-pixel resolution (NAC) and correlated to spectral variations mapped with the lower resolution, 7-band WAC (400-m/pix, ultraviolet bands centered at 321 and 360 nm; 100-m/pix, visible bands centered at 415, 566, 604, 643, and 689 nm). Keys to understanding spectral variations in terms of composition, and relationships between compositional variations and surface geology, are ground-truth sites where surface compositions and mineralogy, as well as geology and geologic history, are well known. The Apollo 17 site is especially useful because the site geology includes a range of features from high-Ti mare basalts to Serenitatis-Basin-related massifs containing basin impact-melt breccia and feldspathic highlands materials, and a regional black and orange pyroclastic deposit. Moreover, relative and absolute ages of these features are known. In addition to rock samples, astronauts collected well-documented soil samples at 22 different sample locations across this diverse area. Many of these sample sites can be located in the multispectral data using the co-registered NAC images. Digital elevation data are used to normalize illumination geometry and thus fully exploit the multispectral data and compare derived compositional parameters for different geologic units. Regolith characteristics that are known in detail from the Apollo 17 samples, such as maturity and petrography of mineral, glass, and lithic components, contribute to spectral variations and are considered in the assessment of spectral variability at the landing site. 
In this work, we focus on variations associated with the ilmenite content (a Ti-rich mineral) of the soils and with known compositional and mineralogic characteristics of different geomorphic units. Results will be compared to those derived from analysis of data from the Clementine UV-VIS camera and from the Hubble Space Telescope.

  9. LSST Camera Optics

    SciTech Connect

    Olivier, S S; Seppala, L; Gilmore, K; Hale, L; Whistler, W

    2006-06-05

    The Large Synoptic Survey Telescope (LSST) is a unique, three-mirror, modified Paul-Baker design with an 8.4m primary, a 3.4m secondary, and a 5.0m tertiary feeding a camera system that includes corrector optics to produce a 3.5 degree field of view with excellent image quality (<0.3 arcsecond 80% encircled diffracted energy) over the entire field from blue to near infra-red wavelengths. We describe the design of the LSST camera optics, consisting of three refractive lenses with diameters of 1.6m, 1.0m and 0.7m, along with a set of interchangeable, broad-band, interference filters with diameters of 0.75m. We also describe current plans for fabricating, coating, mounting and testing these lenses and filters.

  10. NSTX Tangential Divertor Camera

    SciTech Connect

    A.L. Roquemore; Ted Biewer; D. Johnson; S.J. Zweben; Nobuhiro Nishino; V.A. Soukhanovskii

    2004-07-16

    Strong magnetic field shear around the divertor x-point is numerically predicted to lead to strong spatial asymmetries in turbulence driven particle fluxes. To visualize the turbulence and associated impurity line emission near the lower x-point region, a new tangential observation port has been recently installed on NSTX. A reentrant sapphire window with a moveable in-vessel mirror images the divertor region from the center stack out to R 80 cm and views the x-point for most plasma configurations. A coherent fiber optic bundle transmits the image through a remotely selected filter to a fast camera, for example a 40500 frames/sec Photron CCD camera. A gas puffer located in the lower inboard divertor will localize the turbulence in the region near the x-point. Edge fluid and turbulent codes UEDGE and BOUT will be used to interpret impurity and deuterium emission fluctuation measurements in the divertor.

  11. LSST Camera Electronics

    Microsoft Academic Search

    F. Mitchell Newcomer; S. Bailey; C. L. Britton; N. Felt; J. Geary; K. Hashimi; H. Lebbolo; Z. Ning; P. O'Connor; J. Oliver; V. Radeka; R. Sefri; V. Tocut; R. Van Berg

    2009-01-01

    The 3.2 Gpixel LSST camera will be read out by means of 189 highly segmented 4K x 4K CCDs. A total of 3024 video channels will be processed by a modular, in-cryostat electronics package based on two custom multichannel analog ASICs now in development. Performance goals of 5 electrons noise, .01% electronic crosstalk, and 80 mW power dissipation per channel

  12. Orbiter Camera Payload System

    NASA Astrophysics Data System (ADS)

    1980-12-01

    Components for an orbiting camera payload system (OCPS) include the large format camera (LFC), a gas supply assembly, and ground test, handling, and calibration hardware. The LFC, a high resolution large format photogrammetric camera for use in the cargo bay of the space transport system, is also adaptable to use on an RB-57 aircraft or on a free flyer satellite. Carrying 4000 feet of film, the LFC is usable over the visible to near IR, at V/h rates of from 11 to 41 milliradians per second, overlap of 10, 60, 70 or 80 percent and exposure times of from 4 to 32 milliseconds. With a 12 inch focal length it produces a 9 by 18 inch format (long dimension in line of flight) with full format low contrast resolution of 88 lines per millimeter (AWAR), full format distortion of less than 14 microns and a complement of 45 Reseau marks and 12 fiducial marks. Weight of the OCPS as supplied, fully loaded is 944 pounds and power dissipation is 273 watts average when in operation, 95 watts in standby. The LFC contains an internal exposure sensor, or will respond to external command. It is able to photograph starfields for inflight calibration upon command.

  13. Hemispherical Laue camera

    DOEpatents

    Li, James C. M. (Pittsford, NY); Chu, Sungnee G. (Rochester, NY)

    1980-01-01

    A hemispherical Laue camera comprises a crystal sample mount for positioning a sample to be analyzed at the center of sphere of a hemispherical, X-radiation sensitive film cassette, a collimator, a stationary or rotating sample mount and a set of standard spherical projection spheres. X-radiation generated from an external source is directed through the collimator to impinge onto the single crystal sample on the stationary mount. The diffracted beam is recorded on the hemispherical X-radiation sensitive film mounted inside the hemispherical film cassette in either transmission or back-reflection geometry. The distances travelled by X-radiation diffracted from the crystal to the hemispherical film are the same for all crystal planes which satisfy Bragg's Law. The recorded diffraction spots or Laue spots on the film thereby preserve both the symmetry information of the crystal structure and the relative intensities which are directly related to the relative structure factors of the crystal orientations. The diffraction pattern on the exposed film is compared with the known diffraction pattern on one of the standard spherical projection spheres for a specific crystal structure to determine the orientation of the crystal sample. By replacing the stationary sample support with a rotating sample mount, the hemispherical Laue camera can be used for crystal structure determination in a manner previously provided in conventional Debye-Scherrer cameras.
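The recording condition invoked above is Bragg's law, nλ = 2d sin θ. A quick numeric companion (illustrative values, not from the patent):

```python
import math

def bragg_angle_deg(wavelength_nm, d_spacing_nm, order=1):
    """Diffraction angle theta from Bragg's law: n*lambda = 2*d*sin(theta)."""
    s = order * wavelength_nm / (2.0 * d_spacing_nm)
    if not 0.0 < s <= 1.0:
        raise ValueError("no diffraction possible for this order and spacing")
    return math.degrees(math.asin(s))

# Example: Cu K-alpha radiation (~0.154 nm) on a plane with d = 0.2 nm,
# first-order reflection.
theta = bragg_angle_deg(0.154, 0.2)
```

Every crystal plane whose (λ, d, θ) combination satisfies this condition produces a Laue spot, which is why the hemispherical film geometry, keeping the crystal-to-film distance equal for all such planes, preserves the relative spot intensities.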

  14. Orbiter Camera Payload System

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Components for an orbiting camera payload system (OCPS) include the large format camera (LFC), a gas supply assembly, and ground test, handling, and calibration hardware. The LFC, a high resolution large format photogrammetric camera for use in the cargo bay of the space transport system, is also adaptable to use on an RB-57 aircraft or on a free flyer satellite. Carrying 4000 feet of film, the LFC is usable over the visible to near IR, at V/h rates of from 11 to 41 milliradians per second, overlap of 10, 60, 70 or 80 percent and exposure times of from 4 to 32 milliseconds. With a 12 inch focal length it produces a 9 by 18 inch format (long dimension in line of flight) with full format low contrast resolution of 88 lines per millimeter (AWAR), full format distortion of less than 14 microns and a complement of 45 Reseau marks and 12 fiducial marks. Weight of the OCPS as supplied, fully loaded is 944 pounds and power dissipation is 273 watts average when in operation, 95 watts in standby. The LFC contains an internal exposure sensor, or will respond to external command. It is able to photograph starfields for inflight calibration upon command.

  15. Gamma ray camera

    DOEpatents

    Perez-Mendez, Victor (Berkeley, CA)

    1997-01-01

A gamma ray camera for detecting rays emanating from a radiation source such as an isotope. The gamma ray camera includes a sensor array formed of a scintillator crystal for converting incident gamma rays to a plurality of corresponding visible light photons, and a photosensor array responsive to the visible light photons in order to form an electronic image of the radiation therefrom. The photosensor array is adapted to record an integrated amount of charge proportional to the incident gamma rays closest to it, and includes a transparent metallic layer and a photodiode consisting of a p-i-n structure formed on one side of the transparent metallic layer, comprising an upper p-type layer, an intermediate layer and a lower n-type layer. In the preferred mode, the scintillator crystal is composed essentially of a cesium iodide (CsI) crystal, preferably doped with a predetermined amount of impurity, and the upper p-type layer, the intermediate layer and the n-type layer are essentially composed of hydrogenated amorphous silicon (a-Si:H). The gamma ray camera further includes a collimator interposed between the radiation source and the sensor array, and a readout circuit formed on one side of the photosensor array.

  16. Synthetic Doppler spectroscopy and curvilinear camera diagnostics in the ERO code

    NASA Astrophysics Data System (ADS)

    Makkonen, T.; Groth, M.; Airila, M. I.; Dux, R.; Janzer, A.; Kurki-Suonio, T.; Lunt, T.; Mueller, H. W.; Puetterich, T.; Viezzer, E.

    2013-08-01

    We present a set of new synthetic diagnostics, recently implemented in the ERO code, that were developed to facilitate direct comparisons between experiments and modeling of tokamak scrape-off-layer plasmas. The diagnostics calculate the spectroscopic Doppler shift and Doppler broadening of impurity lines of interest for any line of sight, and they also generate camera images from arbitrary viewing angles allowing for curvilinear (e.g., wide-angle or fisheye) lenses. The synthetic camera diagnostics can either replicate the distortions caused by curvilinear lenses or create a rectilinear synthetic camera image and correct the curvilinear distortions in the experimental image. Comparison to experimental data is presented.
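A curvilinear lens of the kind such a synthetic camera must replicate or correct can be sketched with the common equidistant fisheye model, r = f·θ, versus the rectilinear (pinhole) projection r = f·tan θ. This is a generic illustration, with an assumed focal length, not ERO's actual lens description:

```python
import numpy as np

def fisheye_radius(theta, f):
    """Equidistant fisheye model: image radius r = f * theta."""
    return f * theta

def rectilinear_radius(theta, f):
    """Pinhole (rectilinear) model: r = f * tan(theta)."""
    return f * np.tan(theta)

def undistort_fisheye_radius(r_fish, f):
    """Map an equidistant-fisheye radius to the rectilinear radius that a
    pinhole camera with the same focal length would record."""
    return f * np.tan(r_fish / f)

f = 8.0                        # focal length (assumed value, arbitrary units)
theta = np.radians(40.0)       # off-axis angle of an incoming ray
r_f = fisheye_radius(theta, f)           # radius in the fisheye image
r_p = undistort_fisheye_radius(r_f, f)   # corrected rectilinear radius
```

Applying the forward model to synthetic images replicates the curvilinear distortion; applying the inverse per pixel performs the correction of an experimental image that the abstract describes.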

  17. Image processing-based wheel steer angle detection

    NASA Astrophysics Data System (ADS)

    Shu, Tian; Zheng, Yongan; Shi, Zhongke

    2013-10-01

    Wheel steer angle information is crucial for the estimation of vehicle sideslip. Unlike previous detection methods that use angle sensors, this work presents a new wheel steer angle detection method based on a computer vision system providing image sequences recorded by a camera mounted on a car. The difference in wheel appearance between steering right and steering left is analyzed to determine the steer direction, while the wheel steer angle in the image is derived from the wheel contour extracted by threshold segmentation and edge extraction. Experimental results show the efficiency of the proposed approach.
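    The contour-to-angle step can be sketched in a few lines. The following illustrative Python is not the authors' code; the helper name and the PCA-based line fit are assumptions about one plausible way to turn edge pixels into an image-plane angle:

```python
import numpy as np

def steer_angle_from_edge(edge_points):
    """Fit the principal direction of wheel-contour edge pixels via PCA (SVD);
    the line's inclination in the image approximates the projected steer angle.
    Illustrative only: the paper's pipeline also resolves the steer direction
    by comparing the wheel's appearance when steering left vs. right."""
    pts = np.asarray(edge_points, dtype=float)
    centered = pts - pts.mean(axis=0)          # remove the centroid
    _, _, vt = np.linalg.svd(centered)         # first row of vt = main direction
    dx, dy = vt[0]
    return np.degrees(np.arctan2(dy, dx)) % 180.0  # fold into [0, 180)

# synthetic edge pixels along a line inclined at 30 degrees
t = np.linspace(0, 10, 50)
pts = np.c_[t * np.cos(np.radians(30)), t * np.sin(np.radians(30))]
print(round(steer_angle_from_edge(pts), 1))  # 30.0
```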

  18. Object recognition through turbulence with a modified plenoptic camera

    NASA Astrophysics Data System (ADS)

    Wu, Chensheng; Ko, Jonathan; Davis, Christopher

    2015-03-01

    Atmospheric turbulence adds accumulated distortion to images obtained by cameras and surveillance systems. When the turbulence grows stronger or when the object is further away from the observer, increasing the recording device resolution helps little to improve the quality of the image. Many sophisticated methods to correct the distorted images have been invented, such as using a known feature on or near the target object to perform a deconvolution process, or use of adaptive optics. However, most of the methods depend heavily on the object's location, and optical ray propagation through the turbulence is not directly considered. Alternatively, selecting a lucky image over many frames provides a feasible solution, but at the cost of time. In our work, we propose an innovative approach to improving image quality through turbulence by making use of a modified plenoptic camera. This type of camera adds a micro-lens array to a traditional high-resolution camera to form a semi-camera array that records duplicate copies of the object as well as "superimposed" turbulence at slightly different angles. By performing several steps of image reconstruction, turbulence effects will be suppressed to reveal more details of the object independently (without finding references near the object). Meanwhile, the redundant information obtained by the plenoptic camera raises the possibility of performing lucky image algorithmic analysis with fewer frames, which is more efficient. In our work, the details of our modified plenoptic cameras and image processing algorithms will be introduced. The proposed method can be applied to coherently illuminated objects as well as incoherently illuminated objects. Our result shows that the turbulence effect can be effectively suppressed by the plenoptic camera in the hardware layer and a reconstructed "lucky image" can help the viewer identify the object even when a "lucky image" by ordinary cameras is not achievable.

  19. Mars Science Laboratory Engineering Cameras

    NASA Technical Reports Server (NTRS)

    Maki, Justin N.; Thiessen, David L.; Pourangi, Ali M.; Kobzeff, Peter A.; Lee, Steven W.; Dingizian, Arsham; Schwochert, Mark A.

    2012-01-01

    NASA's Mars Science Laboratory (MSL) Rover, which launched to Mars in 2011, is equipped with a set of 12 engineering cameras. These cameras are build-to-print copies of the Mars Exploration Rover (MER) cameras, which were sent to Mars in 2003. The engineering cameras weigh less than 300 grams each and use less than 3 W of power. Images returned from the engineering cameras are used to navigate the rover on the Martian surface, deploy the rover robotic arm, and ingest samples into the rover sample processing system. The navigation cameras (Navcams) are mounted to a pan/tilt mast and have a 45-degree square field of view (FOV) with a pixel scale of 0.82 mrad/pixel. The hazard avoidance cameras (Hazcams) are body-mounted to the rover chassis in the front and rear of the vehicle and have a 124-degree square FOV with a pixel scale of 2.1 mrad/pixel. All of the cameras utilize a frame-transfer CCD (charge-coupled device) with a 1024x1024 imaging region and red/near IR bandpass filters centered at 650 nm. The MSL engineering cameras are grouped into two sets of six: one set of cameras is connected to rover computer A and the other set is connected to rover computer B. The MSL rover carries 8 Hazcams and 4 Navcams.

  20. Bundle Adjustment for Multi-Camera Systems with Points at Infinity

    NASA Astrophysics Data System (ADS)

    Schneider, J.; Schindler, F.; Läbe, T.; Förstner, W.

    2012-07-01

    We present a novel approach for a rigorous bundle adjustment for omnidirectional and multi-view cameras, which enables an efficient maximum-likelihood estimation with image and scene points at infinity. Multi-camera systems are used to increase the resolution, to combine cameras with different spectral sensitivities (Z/I DMC, Vexcel Ultracam) or - like omnidirectional cameras - to augment the effective aperture angle (Blom Pictometry, Rollei Panoscan Mark III). Additionally, multi-camera systems are gaining importance for the acquisition of complex 3D structures. For stabilizing camera orientations - especially rotations - one should generally include points at the horizon, observed over long periods of time, in the bundle adjustment, which classical bundle adjustment programs are not capable of. We use a minimal representation of homogeneous coordinates for image and scene points. Instead of eliminating the scale factor of the homogeneous vectors by Euclidean normalization, we normalize the homogeneous coordinates spherically. This way we can use images of omnidirectional cameras with a single viewpoint, such as fisheye cameras, and scene points which are far away or at infinity. We demonstrate the feasibility and the potential of our approach on real data taken with a single camera, the stereo camera FinePix Real 3D W3 from Fujifilm and the multi-camera system Ladybug 3 from Point Grey.
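    The central trick, normalizing homogeneous vectors to unit length instead of dividing by the last coordinate, can be illustrated in a few lines of Python (a minimal sketch of the idea, not the authors' implementation):

```python
import numpy as np

def normalize_euclidean(x):
    """Divide by the last coordinate; undefined for points at infinity,
    whose last (homogeneous) coordinate is zero."""
    return x / x[-1]

def normalize_spherical(x):
    """Scale the homogeneous vector to unit length; well-defined for
    finite points and points at infinity alike."""
    return x / np.linalg.norm(x)

finite_pt = np.array([2.0, 4.0, 4.0, 2.0])    # homogeneous form of (1, 2, 2)
infinity_pt = np.array([1.0, 2.0, 2.0, 0.0])  # pure direction: point at infinity

print(normalize_euclidean(finite_pt))    # [1. 2. 2. 1.]
print(normalize_spherical(infinity_pt))  # unit vector, no division by zero
```

Spherical normalization therefore lets far-away scene points and true directions enter the adjustment on the same footing as ordinary points.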

  1. Enhanced narrow-bandwidth emission during high-order harmonic generation from aligned molecules.

    PubMed

    Zhang, Chaojin; Yao, Jinping; Umran, Fadhil A; Ni, Jielei; Zeng, Bin; Li, Guihua; Lin, Di

    2013-02-11

    We theoretically investigate the selective enhancement of high-harmonic generation (HHG) in a small spectral range when an orthogonal-polarized two-color laser field interacts with aligned O2 molecules. It is shown clearly that the enhanced narrow-bandwidth emission near the cutoff of the HHG spectrum can be effectively controlled by the molecular alignment angle, laser intensity and the relative phase of the two-color laser fields. Furthermore, the strong dependence of narrow-bandwidth HHG on the molecular alignment angle indicates that it encodes information about O2 molecular orbitals, so it may be an alternative method for reconstruction of O2 molecular orbitals. PMID:23481785

  2. 2000 FPS digital airborne camera

    NASA Astrophysics Data System (ADS)

    Balch, Kris S.

    1998-11-01

    For many years 16 mm film cameras have been used in severe environments. These film cameras are used on Hy-G automotive sleds, airborne weapon testing, range tracking, and other hazardous environments. The companies and government agencies using these cameras are in need of replacing them with a more cost-effective solution. Film-based cameras still produce the best resolving capability. However, film development time, chemical disposal, non-optimal lighting conditions, recurring media cost, and faster digital analysis are factors influencing the desire for a 16 mm film camera replacement. This paper will describe a new imager from Kodak that has been designed to replace 16 mm high-speed film cameras. Also included is a detailed configuration, operational scenario, and cost analysis of Kodak's imager for airborne applications. The KODAK EKTAPRO HG Imager, Model 2000 is a high-resolution color or monochrome CCD camera especially designed for replacement of rugged high-speed film cameras. The HG Imager is a self-contained camera. It features a high-resolution [512x384], light-sensitive CCD sensor with an electronic shutter. This shutter provides blooming protection that prevents "smearing" of bright light sources, e.g., the camera looking into a bright sun reflection. The HG Imager is a very rugged camera packaged in a highly integrated housing. This imager operates from +22 to 42 VDC. The HG Imager has a similar interface and form factor to that of high-speed film cameras, e.g., Photosonics 1B. However, the HG also has digital interfaces such as 100BaseT Ethernet and RS-485 that enable control and image transfer. The HG Imager is designed to replace 16 mm film cameras that support rugged testing applications.

  3. Safe Folding/Unfolding with Conditional Narrowing?

    E-print Network

    Alpuente, María

    Safe Folding/Unfolding with Conditional Narrowing. We study the combination of this technique with a folding transformation rule in the case of innermost conditional narrowing. We also discuss a relationship between unfold/fold transformations

  4. Neutron Imaging Camera

    NASA Technical Reports Server (NTRS)

    Hunter, Stanley; deNolfo, G. A.; Barbier, L. M.; Link, J. T.; Son, S.; Floyd, S. R.; Guardala, N.; Skopec, M.; Stark, B.

    2008-01-01

    The Neutron Imaging Camera (NIC) is based on the Three-dimensional Track Imager (3-DTI) technology developed at GSFC for gamma-ray astrophysics applications. The 3-DTI, a large volume time-projection chamber, provides accurate, approximately 0.4 mm resolution, 3-D tracking of charged particles. The incident directions of fast neutrons, En > 0.5 MeV, are reconstructed from the momenta and energies of the proton and triton fragments resulting from 3He(n,p)3H interactions in the 3-DTI volume. The performance of the NIC from laboratory and accelerator tests is presented.
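    Because the 3He target is essentially at rest, momentum conservation makes the incident neutron's momentum the vector sum of the fragment momenta. A minimal sketch of that reconstruction step (illustrative Python; the function name is an assumption, not the NIC pipeline):

```python
import numpy as np

def neutron_direction(p_proton, p_triton):
    """With the 3He target at rest, the incident neutron's momentum equals
    the vector sum of the proton and triton momenta from 3He(n,p)3H
    (momentum conservation); its direction is that sum, normalized."""
    p_n = np.asarray(p_proton, float) + np.asarray(p_triton, float)
    return p_n / np.linalg.norm(p_n)

# fragment momenta whose transverse components cancel: net momentum along +z
print(neutron_direction([0.3, 0.1, 0.5], [-0.3, -0.1, 0.7]))  # [0. 0. 1.]
```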

  5. Applications of intelligent cameras

    NASA Astrophysics Data System (ADS)

    McLeod, Alastair; Braband, Tord; Smastuen, Steinar; Nicklasson, Roy; Sommerfelt, Arne; Lillekjendlie, Bjorn; Strom, Eivind

    1993-12-01

    Development of vision systems which are easy to use is the `holy grail' of vision system engineering. This paper describes the architecture of our so-called `SmartCamera' hardware platform and describes the type of modules and functions which can be provided in such a system. The aim of the paper is to describe how this approach can simplify vision system application in some cases. By providing an integrated software and hardware package which has been configured for a highly specific application, the system integrator or end-user will find the applications engineering much easier and most importantly, less time-consuming.

  6. Automatic feature extraction for panchromatic Mars Global Surveyor Mars Orbiter camera imagery

    NASA Astrophysics Data System (ADS)

    Plesko, Catherine S.; Brumby, Steven P.; Leovy, Conway B.

    2002-01-01

    The Mars Global Surveyor Mars Orbiter Camera (MOC) has produced tens of thousands of images, which contain a wealth of information about the surface of the planet Mars. Current manual analysis techniques are inadequate for the comprehensive analysis of such a large dataset, while development of handwritten feature extraction algorithms is laborious and expensive. This project investigates application of an automatic feature extraction approach to analysis of the MOC narrow angle panchromatic dataset, using an evolutionary computation software package called GENIE. GENIE uses a genetic algorithm to assemble feature extraction tools from low-level image operators. Each generated tool is evaluated against training data provided by the user. The best tools in each generation are allowed to 'reproduce' to produce the next generation, and the population of tools is permitted to evolve until it converges to a solution or reaches a level of performance specified by the user. Craters are one of the most scientifically interesting and most numerous features in the MOC data set, and present a wide range of shapes at many spatial scales. We now describe preliminary results on development of a crater finder algorithm using the GENIE software.
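    The evolutionary loop described above, random pipelines of low-level operators scored against training data with the fittest reproducing under mutation, can be caricatured in plain Python. Everything here (the toy integer operators, the mutation rate, the fitness definition) is an illustrative assumption, not GENIE's actual operator set or scoring:

```python
import random

# Hypothetical "low-level operators" standing in for GENIE's image operators
OPS = {
    "inc": lambda x: x + 1,
    "dec": lambda x: x - 1,
    "dbl": lambda x: x * 2,
    "hlv": lambda x: x // 2,
}

def run(pipeline, x):
    """Apply a pipeline of operators to an input value."""
    for op in pipeline:
        x = OPS[op](x)
    return x

def fitness(pipeline, samples):
    """Score a candidate tool against training pairs (input, desired output)."""
    return -sum(abs(run(pipeline, a) - b) for a, b in samples)

def evolve(samples, pop_size=30, length=4, generations=40, seed=0):
    """GENIE-style loop (simplified): random pipelines are scored against
    training data; the fittest third survives and reproduces with mutation."""
    rng = random.Random(seed)
    names = list(OPS)
    pop = [[rng.choice(names) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: fitness(p, samples), reverse=True)
        elite = pop[: pop_size // 3]            # elitism: best tools survive
        children = []
        while len(elite) + len(children) < pop_size:
            parent = rng.choice(elite)
            children.append([g if rng.random() > 0.3 else rng.choice(names)
                             for g in parent])  # point mutation
        pop = elite + children
    return max(pop, key=lambda p: fitness(p, samples))

# evolve a "tool" approximating the target mapping x -> 2x + 2
samples = [(1, 4), (2, 6), (5, 12)]
best = evolve(samples)
print(best, fitness(best, samples))
```

With elitism, the best pipeline found so far is never lost between generations, mirroring how the best tools in each GENIE generation are allowed to reproduce.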

  7. In-flight calibration of the Dawn Framing Camera II: Flat fields and stray light correction

    NASA Astrophysics Data System (ADS)

    Schröder, S. E.; Mottola, S.; Matz, K.-D.; Roatsch, T.

    2014-05-01

    The NASA Dawn spacecraft acquired thousands of images of asteroid Vesta during its year-long orbital tour, and is now on its way to asteroid Ceres. A method for calibrating images acquired by the onboard Framing Camera was described by Schröder et al. (Schröder et al. [2013]. Icarus 226, 1304). However, their method is only valid for point sources. In this paper we extend the calibration to images of extended sources like Vesta. For this, we devise a first-order correction for in-field stray light, which is known to plague images taken through the narrow band filters, and revise the flat fields that were acquired in an integrating sphere before launch. We used calibrated images of the Vesta surface to construct simple photometric models for all filters that allow us to study how the spectrum changes with increasing phase angle (phase reddening). In combination with these models, our calibration method can be used to create near-seamless mosaics that are radiometrically accurate to a few percent. Such mosaics are provided in JVesta, the Vesta version of the JMARS geographic information system.

  8. 980 nm narrow linewidth Yb-doped phosphate fiber laser

    NASA Astrophysics Data System (ADS)

    Li, Pingxue; Yao, Yifei; Hu, Haowei; Chi, Junjie; Yang, Chun; Zhao, Ziqiang; Zhang, Guangju

    2014-12-01

    A narrow-linewidth ytterbium (Yb)-doped phosphate fiber laser based on a fiber Bragg grating (FBG) operating around 980 nm is reported. Two different kinds of cavity are applied to obtain the 980 nm narrow-linewidth output. One kind of cavity consists of a high-reflection FBG with a 0.35 nm bandwidth and the Yb-doped phosphate fiber end cleaved at a 0° angle, which generates a maximum output power of 25 mW. The other kind of resonator is composed of a single-mode Yb-doped phosphate fiber and a pair of FBGs. Over 10.7 mW of stable continuous-wave output is obtained with two longitudinal modes at 980 nm. We give a detailed analysis and discussion of the results.

  9. MEMS digital camera

    NASA Astrophysics Data System (ADS)

    Gutierrez, R. C.; Tang, T. K.; Calvet, R.; Fossum, E. R.

    2007-02-01

    MEMS technology uses photolithography and etching of silicon wafers to enable mechanical structures with less than 1 µm tolerance, important for the miniaturization of imaging systems. In this paper, we present the first silicon MEMS digital auto-focus camera for use in cell phones with a focus range of 10 cm to infinity. At the heart of the new silicon MEMS digital camera, a simple and low-cost electromagnetic actuator impels a silicon MEMS motion control stage on which a lens is mounted. The silicon stage ensures precise alignment of the lens with respect to the imager, and enables precision motion of the lens over a range of 300 µm with < 5 µm hysteresis and < 2 µm repeatability. Settling time is < 15 ms for a 200 µm step, and < 5 ms for a 20 µm step, enabling AF within 0.36 sec at 30 fps. The precise motion allows COTS optics to maintain MTF > 0.8 at 20 cy/mm up to 80% field over the full range of motion. Accelerated lifetime testing has shown that the alignment and precision of motion is maintained after 8,000 g shocks, thermal cycling from -40 °C to 85 °C, and operation over 20 million cycles.

  10. Planetary Camera Observations of NGC 2440

    NASA Astrophysics Data System (ADS)

    Heap, S. R.; Lindler, D. J.; Malumuth, E.

    1992-05-01

    NGC 2440 is a planetary nebula harboring one of the hottest stars known. IUE spectra and narrow-band CCD imagery at Kitt Peak suggest a stellar temperature, Teff ~ 200,000 K (Heap and Hintzen 1990). Because of seeing problems, however, it is very difficult to detect the central star against the bright nebular background. This problem is virtually eliminated by the high angular resolution of the Hubble Space Telescope. Not only is the star visible on narrow-band continuum (F517N) images taken by the Planetary Camera, it is even visible on Hβ (F487M) images. We deconvolved the PC images using three different restoration algorithms: Lucy-Richardson, Maximum Entropy Method, and the Block-Iterative Method. We measured the brightness of the central star on these restored images and on the PODPS-reduced image and then applied the appropriate aperture corrections and absolute calibrations as given in the WFPC Science Verification Report (Faber and Westphal 1992). The resulting stellar magnitude confirms earlier findings of a HI Zanstra temperature, Tz(HI) ~ 200,000 K. At the meeting, we will compare the results of the various methods of image restoration and stellar photometry as well as the nebular structure.

  11. The All Sky Camera Network

    NSDL National Science Digital Library

    Andy Caldwell

    2005-02-01

    In 2001, the All Sky Camera Network came to life as an outreach program to connect the Denver Museum of Nature and Science (DMNS) exhibit Space Odyssey with Colorado schools. The network is comprised of cameras placed strategically at schools throughout Colorado to capture fireballs--rare events that produce meteorites. Students involved in the network participate in an authentic, inquiry-based experience by tracking meteor events. This article discusses the past, present, and future of the All Sky Camera Network.

  12. Combination Calibration of Digital Cameras

    Microsoft Academic Search

    Feng Qiqiang; Li Zongchun; Li Guangyun; Chen Xin

    2009-01-01

    The paper presents a combination calibration method of digital cameras based on the ten-parameter model and the finite element model. The predictable distortion errors can be compensated with the ten-parameter model while the unpredictable ones compensated with the finite element model. A calibration experiment has been carried out on the metric camera Inca3 and non-metric cameras Canon 5D Mark II

  13. The New WHT Mosaic Camera

    NASA Astrophysics Data System (ADS)

    Tulloch, S.

    2000-03-01

    Work on this camera started at the RGO in Cambridge shortly before its closure and was continued at the ATC in Edinburgh and subsequently at the ING. The design incorporates two EEV42-80 CCDs mounted in a standard 2.5 l Oxford Instruments cryostat normally used for single-chip cameras. This camera therefore has the same mechanical interface to the telescope and could in principle be used at any port where a single-chip camera is normally used. In practice it will be dedicated for use at Prime Focus (PF) and at UES on the WHT.

  14. What's Your Angle?

    NSDL National Science Digital Library

    2010-01-01

    In this activity, students devise procedures for using a protractor to measure the number of degrees in an angle, and use inductive reasoning to develop "angle sense." Then they describe circumstances and careers that require a working knowledge of angles and their measurements.

  15. Angles All Around

    NSDL National Science Digital Library

    Mrs. Bennett

    2011-12-14

    Standard: Identify and measure right, obtuse, and acute angles. This is a two day activity. OBJECTIVE: We have learned about five different types of angles: right, acute, obtuse, straight, and reflex. We have also learned how to use a protractor to measure angles. With this lesson, you will practice what ...

  16. Particle friction angles in steep mountain channels

    NASA Astrophysics Data System (ADS)

    Prancevic, Jeff P.; Lamb, Michael P.

    2015-02-01

    Sediment transport rates in steep mountain channels are typically an order of magnitude lower than predicted by models developed for lowland rivers. One hypothesis for this observation is that particles are more stable in mountain channels due to particle-particle interlocking or bridging across the channel width. This hypothesis has yet to be tested, however, because we lack direct measurements of particle friction angles in steep mountain channels. Here we address this data gap by directly measuring the minimum force required to dislodge sediment (pebbles to boulders) and the sediment weight in mountain channels using a handheld force gauge. At eight sites in California, with reach-averaged bed angles ranging from 0.5° to 23° and channel widths ranging from 2 m to 16 m, we show that friction angles in natural streams average 68° and are 16° larger than those typically measured in laboratory experiments, which is likely due to particle interlocking and burial. Results also show that larger grains are disproportionately more stable than predicted by existing models and that grains organized into steps are twice as stable as grains outside of steps. However, the mean particle friction angle does not vary systematically with bed slope. These results do not support systematic increases in friction angle in steeper and narrower channels to explain the observed low sediment transport rates in mountain channels. Instead, the spatial pattern and grain-size dependence of particle friction angles may indirectly lower transport rates in steep, narrow channels by stabilizing large clasts and channel-spanning steps, which act as momentum sinks due to form drag.
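    As a rough illustration of how a friction angle can be reduced from such measurements, consider a particle of weight W resting on a bed sloping at β, dislodged by a downslope force F. At incipient motion a simple Coulomb force balance gives tan φ = (F/W + sin β)/cos β. The equation and function below are illustrative assumptions about that generic balance, not the authors' exact data reduction:

```python
import math

def friction_angle_deg(dislodge_force, weight, bed_angle_deg):
    """Hypothetical Coulomb force balance at incipient motion: the applied
    downslope force F plus the downslope weight component W*sin(beta)
    equals the frictional resistance W*cos(beta)*tan(phi); solve for phi."""
    beta = math.radians(bed_angle_deg)
    return math.degrees(math.atan(
        (dislodge_force / weight + math.sin(beta)) / math.cos(beta)))

# on a flat bed, a dislodging force equal to the particle's weight gives 45 deg
print(round(friction_angle_deg(10.0, 10.0, 0.0), 6))  # 45.0
```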

  17. Camera Calibration for Uav Application Using Sensor of Mobile Camera

    NASA Astrophysics Data System (ADS)

    Takahashi, Y.; Chikatsu, H.

    2015-05-01

    Recently, 3D measurements using small unmanned aerial vehicles (UAVs) have increased in Japan, because small UAVs are easily available at low cost and the analysis software can easily create 3D models. However, small UAVs have a problem: they have very short flight times and a small payload. In particular, as the payload of a small UAV increases, its flight time decreases. Therefore, it is advantageous to use lightweight sensors in small UAVs. A mobile camera is lightweight and has many sensors such as an accelerometer, a magnetic field sensor, and a gyroscope. Moreover, these sensors can be used simultaneously. Therefore, the authors think that the problems of small UAVs can be solved using the mobile camera. The authors executed camera calibration using a test target for evaluating sensor values measured using a mobile camera. Consequently, the authors confirmed the same accuracy as with normal camera calibration.

  18. Calibration of a stereo system with small relative angles

    Microsoft Academic Search

    Behrooz Kamgar-Parsi; Roger D. Eastman

    1988-01-01

    The authors present an algorithm for the calibration of a stereo system with small relative angles in an uncontrolled environment. This algorithm has two advantages: (1) it is more accurate than the existing algorithms in the computer vision and photogrammetry literatures; and (2) it provides useful insight into the problem of camera calibration and relative orientation. This is done by

  19. Neutron Imaging Camera

    NASA Technical Reports Server (NTRS)

    Hunter, Stanley D.; DeNolfo, Georgia; Floyd, Sam; Krizmanic, John; Link, Jason; Son, Seunghee; Guardala, Noel; Skopec, Marlene; Stark, Robert

    2008-01-01

    We describe the Neutron Imaging Camera (NIC) being developed for DTRA applications by NASA/GSFC and NSWC/Carderock. The NIC is based on the Three-dimensional Track Imager (3-DTI) technology developed at GSFC for gamma-ray astrophysics applications. The 3-DTI, a large volume time-projection chamber, provides accurate, approximately 0.4 mm resolution, 3-D tracking of charged particles. The incident directions of fast neutrons, En > 0.5 MeV, are reconstructed from the momenta and energies of the proton and triton fragments resulting from 3He(n,p)3H interactions in the 3-DTI volume. We present angular and energy resolution performance of the NIC derived from accelerator tests.

  20. A Motionless Camera

    NASA Technical Reports Server (NTRS)

    1994-01-01

    Omniview, a motionless, noiseless, exceptionally versatile camera was developed for NASA as a receiving device for guiding space robots. The system can see in one direction and provide as many as four views simultaneously. Developed by Omniview, Inc. (formerly TRI) under a NASA Small Business Innovation Research (SBIR) grant, the system's image transformation electronics produce a real-time image from anywhere within a hemispherical field. Lens distortion is removed, and a corrected "flat" view appears on a monitor. Key elements are a high resolution charge coupled device (CCD), image correction circuitry and a microcomputer for image processing. The system can be adapted to existing installations. Applications include security and surveillance, teleconferencing, imaging, virtual reality, broadcast video and military operations. Omniview technology is now called IPIX. The company was founded in 1986 as TeleRobotics International, became Omniview in 1995, and changed its name to Interactive Pictures Corporation in 1997.

  1. The Field Camera Unit project for the WSO-UV space telescope

    Microsoft Academic Search

    S. Scuderi; I. Pagano; M. Fiorini; L. Gambicorti; A. Gherardi; F. Gianotti; D. Magrin; M. Miccolis; M. Munari; E. Pace; C. Pontoni; M. Trifoglio; M. Uslenghi; B. Shustov

    2007-01-01

    The Field Camera Unit (FCU) is one of the focal plane instruments aboard the WSO-UV telescope, a 1.7 m UV optimized instrument that will investigate numerous astrophysical phenomena from planetary science to cosmology. The FCU will perform deep UV and diffraction limited optical imaging in both wide and narrow band filters using three channels (FUV, NUV and UVO) optimized in

  2. Challenges in SingleCamera Remote Eye Tracking Martin Bhme and Erhardt Barth

    E-print Network

    unsatisfactory for most AAC (Augmentative and Alternative Communication) applications. So-called "remote" eye ... high-resolution image of the eye. Because of the narrow field of view, the user's head movements must ... field of view. With recent increases in the resolution of CCD and CMOS cameras, it has become feasible

  3. Testing of the Apollo 15 Metric Camera System.

    NASA Technical Reports Server (NTRS)

    Helmering, R. J.; Alspaugh, D. H.

    1972-01-01

    Description of tests conducted (1) to assess the quality of Apollo 15 Metric Camera System data and (2) to develop production procedures for total block reduction. Three strips of metric photography over the Hadley Rille area were selected for the tests. These photographs were utilized in a series of evaluation tests culminating in an orbitally constrained block triangulation solution. Results show that film deformations up to 25 and 5 microns are present in the mapping and stellar materials, respectively. Stellar reductions can provide mapping camera orientations with an accuracy that is consistent with the accuracies of other parameters in the triangulation solutions. Pointing accuracies of 4 to 10 microns can be expected for the mapping camera materials, depending on variations in resolution caused by changing sun angle conditions.

  4. Star Identification Algorithm for Uncalibrated, Wide FOV Cameras

    NASA Astrophysics Data System (ADS)

    Ajdadi, Mohamad Javad; Ghafarzadeh, Mahdi; Taheri, Mojtaba; Mosadeq, Ehsan; Khakian Ghomi, Mahdi

    2015-06-01

    A novel method is proposed for star identification via uncalibrated cameras with wide fields of view (FOVs). In this approach some of the triangles created by the stars in the FOV are selected for pattern recognition. The triangles are selected considering the sensitivity of their interior angles to the calibration error. The algorithm is based on the intersection between sets of triangles that are found in the database for each selected triangle of the image. By this method, most of the image stars contribute to pattern recognition and thereby it is very robust against the noise and the calibration error. The algorithm is performed on 150 night sky images, which are taken by an uncalibrated camera in FOV of 114° ± 12° with a success rate of 94% and no false positives. Based on the identification approach, an adaptive method is also developed for calibrating and obtaining the projection function of an uncalibrated camera.
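    The triangle signature at the heart of the method, interior angles computed from star positions in the image, can be sketched as follows (illustrative Python; the database lookup and the sensitivity-based triangle selection described in the abstract are omitted):

```python
import math

def interior_angles(p1, p2, p3):
    """Interior angles (degrees) of the triangle formed by three image stars.
    The angle triple serves as the pattern signature matched against the
    star database."""
    def angle_at(a, b, c):
        # angle at vertex a between rays a->b and a->c
        v1 = (b[0] - a[0], b[1] - a[1])
        v2 = (c[0] - a[0], c[1] - a[1])
        dot = v1[0] * v2[0] + v1[1] * v2[1]
        n1 = math.hypot(*v1)
        n2 = math.hypot(*v2)
        return math.degrees(math.acos(dot / (n1 * n2)))
    return (angle_at(p1, p2, p3), angle_at(p2, p1, p3), angle_at(p3, p1, p2))

a = interior_angles((0, 0), (1, 0), (0, 1))
print(a)  # approximately (90.0, 45.0, 45.0)
```

Triangles whose interior angles change least under the camera's calibration error are the robust signatures the abstract describes selecting.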

  5. Pointing with the eyes: Gaze estimation using a static\\/active camera system and 3D iris disk model

    Microsoft Academic Search

    Michael Reale; Terry Hung; Lijun Yin

    2010-01-01

    The ability to capture the direction the eyes point in while the subject is a distance away from the camera offers the potential for intuitive human-computer interfaces, allowing for a greater interactivity, more intelligent HCI behavior, and increased flexibility. In this paper, we present a two-camera system that detects the face from a fixed, wide-angle camera, estimates a rough location

  6. Tacoma Narrows Bridge: Extreme History

    NSDL National Science Digital Library

    Stretching across the southern portion of Puget Sound, the elegant Tacoma Narrows bridge is considered one of the finest suspension bridges in the United States. The current bridge is the second on the site, as it was constructed in 1950 to serve as a replacement to the famous "Galloping Gertie" bridge, which collapsed in a windstorm in the fall of 1940. Currently, the Washington State Department of Transportation is building a bridge to replace the existing structure, and it is anticipated that it will be completed in 2007. This site offers a host of materials on all three structures, including ample information on the construction of the bridges and their aesthetic appeal. Along with these materials, the site also provides a glossary of related terms, Weird Facts, and some information about the dog "Tubby", who perished when "Galloping Gertie" collapsed on that fateful fall day back in 1940.

  7. Distributed Calibration of Smart Cameras

    Microsoft Academic Search

    John Jannotti; Jie Mao

    Localization in sensornets determines the location of sensor nodes, and allows applications to make geographically sensitive queries. Smart camera networks must not only be localized, but calibrated. Calibration goes beyond localization to include orientation and position information that is sufficiently fine-grained to allow fusion between overlapping camera views. This paper introduces Lighthouse, a distributed calibration system that

  8. A Shaped Temporal Filter Camera

    Microsoft Academic Search

    Martin Fuchs; Tongbo Chen; Oliver Wang; Ramesh Raskar; Hans-Peter Seidel; Hendrik P. A. Lensch

    2009-01-01

    Digital movie cameras only perform a discrete sampling of real-world imagery. While spatial sampling effects are well studied in the literature, there has not been as much work in regards to temporal sampling. As cameras get faster and faster, the need for conventional frame-rate video that matches the abilities of human perception remains. In this article, we

  9. Multi-PSPMT scintillation camera

    SciTech Connect

    Pani, R.; Pellegrini, R.; Trotta, G.; Scopinaro, F. [Univ. of Rome (Italy). Dept. of Experimental Medicine]; Soluri, A.; Vincentis, G. de [CNR (Italy). Inst. of Biomedical Technologies]; Scafe, R. [ENEA-INN, Rome (Italy)]; Pergola, A. [PSDD, Rome (Italy)]

    1999-06-01

    Gamma ray imaging is usually accomplished by the use of a relatively large scintillating crystal coupled to either a number of photomultipliers (PMTs) (Anger Camera) or to a single large Position Sensitive PMT (PSPMT). Recently the development of new diagnostic techniques, such as scintimammography and radio-guided surgery, have highlighted a number of significant limitations of the Anger camera in such imaging procedures. In this paper a dedicated gamma camera is proposed for clinical applications with the aim of improving image quality by utilizing detectors with an appropriate size and shape for the part of the body under examination. This novel scintillation camera is based upon an array of PSPMTs (Hamamatsu R5900-C8). The basic concept of this camera is identical to the Anger Camera with the exception of the substitution of PSPMTs for the PMTs. In this configuration it is possible to use the high resolution of the PSPMTs and still correctly position events lying between PSPMTs. In this work the test configuration is a 2 by 2 array of PSPMTs. Some advantages of this camera are: spatial resolution less than 2 mm FWHM, good linearity, thickness less than 3 cm, light weight, lower cost than equivalent area PSPMT, large detection area when coupled to scintillating arrays, small dead boundary zone (< 3 mm) and flexibility in the shape of the camera.
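    The positioning principle this camera shares with the Anger camera, estimating the event location as the charge-weighted centroid of the photosensor signals, can be sketched as follows (a textbook Anger-logic illustration, not the authors' readout electronics):

```python
def anger_centroid(signals, positions):
    """Classic Anger-logic position estimate: the event position is the
    charge-weighted centroid of the photosensor signals."""
    total = sum(signals)
    x = sum(q * p[0] for q, p in zip(signals, positions)) / total
    y = sum(q * p[1] for q, p in zip(signals, positions)) / total
    return x, y

# 2 by 2 array of PSPMTs at unit positions; event nearest the (0, 0) tube
tubes = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
q = [6.0, 2.0, 2.0, 0.0]
print(anger_centroid(q, tubes))  # (0.2, 0.2)
```

The same weighting also positions events that fall between PSPMTs, which is the capability the abstract highlights for the 2 by 2 test configuration.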

  10. THE DEATH OF THE CAMERA

    Microsoft Academic Search

    Warren Buckland

    2006-01-01

    In this paper I examine how Edward Branigan, in his new book Projecting a Camera: Language-Games in Film Theory (2006), uses Wittgenstein's later philosophy to describe the multiple, contradictory, literal and metaphorical meanings of fundamental concepts in film theory—such as ‘movement’, ‘point of view’, ‘camera’, ‘frame’ and ‘causality’. Towards the end of the paper I rationally reconstruct Branigan's main arguments

  11. Mars Exploration Rover engineering cameras

    USGS Publications Warehouse

    Maki, J.N.; Bell, J.F., III; Herkenhoff, K.E.; Squyres, S.W.; Kiely, A.; Klimesh, M.; Schwochert, M.; Litwin, T.; Willson, R.; Johnson, Aaron H.; Maimone, M.; Baumgartner, E.; Collins, A.; Wadsworth, M.; Elliot, S.T.; Dingizian, A.; Brown, D.; Hagerott, E.C.; Scherr, L.; Deen, R.; Alexander, D.; Lorre, J.

    2003-01-01

    NASA's Mars Exploration Rover (MER) Mission will place a total of 20 cameras (10 per rover) onto the surface of Mars in early 2004. Fourteen of the 20 cameras are designated as engineering cameras and will support the operation of the vehicles on the Martian surface. Images returned from the engineering cameras will also be of significant importance to the scientific community for investigative studies of rock and soil morphology. The Navigation cameras (Navcams, two per rover) are a mast-mounted stereo pair each with a 45° square field of view (FOV) and an angular resolution of 0.82 milliradians per pixel (mrad/pixel). The Hazard Avoidance cameras (Hazcams, four per rover) are a body-mounted, front- and rear-facing set of stereo pairs, each with a 124° square FOV and an angular resolution of 2.1 mrad/pixel. The Descent camera (one per rover), mounted to the lander, has a 45° square FOV and will return images with spatial resolutions of ~4 m/pixel. All of the engineering cameras utilize broadband visible filters and 1024 x 1024 pixel detectors. Copyright 2003 by the American Geophysical Union.
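
As a rough cross-check of the quoted figures (not the mission's calibrated values, which also reflect lens distortion), the angular resolution follows from spreading the FOV over the 1024-pixel detector:

```python
import math

# Back-of-envelope angular resolution: field of view divided by pixel count.
def angular_resolution_mrad(fov_deg, pixels):
    return math.radians(fov_deg) / pixels * 1000.0

navcam = angular_resolution_mrad(45.0, 1024)   # ~0.77 mrad/pixel (quoted: 0.82)
hazcam = angular_resolution_mrad(124.0, 1024)  # ~2.11 mrad/pixel (quoted: 2.1)
print(round(navcam, 2), round(hazcam, 2))
```

The small gap between the naive Navcam number and the quoted 0.82 mrad/pixel is expected, since the simple division assumes a distortion-free mapping across the field.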

  12. CZT gamma camera for scintimammography.

    PubMed

    Blevis, Ira M; O'Connor, M K; Keidar, Z; Pansky, A; Altman, H; Hugg, J W

    2006-01-01

    A high performance prototype gamma camera based on the semiconductor radiation detector Cd(Zn)Te is described. The camera features high spatial resolution, high-energy resolution, a reduced dead space on the edge of the field of view, and a compact format. The camera performance was first examined by comparison of small field of view examinations with those from an Elscint SP6HR standard clinical gamma camera. The new camera was found to give equal or improved image quality. The camera was then used for a systematic phantom study of small lesions in a background as would be found in breast cancer imaging. In this study the camera was able to systematically detect smaller, deeper, and fainter lesions. The camera is presently being used in a clinical trial aimed to assess its value in scintimammography where previous limitations of image quality and detector size have restricted the use of the functional imaging techniques. Preliminary results from 40 patients show high sensitivity and specificity with respect to X-ray mammography and surgery. PMID:17645995

  13. Sensitivity analysis of camera calibration

    Microsoft Academic Search

    Jan Heikkila

    1992-01-01

    To utilize the full potential of CCD cameras a careful design must be performed. The main contribution with respect to the final precision and reliability comes from the camera calibration. Both the precision of the estimated parameters or any functions of them (e.g., object coordinates) and the sensitivity of the system with respect to the undetected model errors are of

  14. SEOS frame camera applications study

    NASA Technical Reports Server (NTRS)

    1974-01-01

    A research and development satellite is discussed which will provide opportunities for observation of transient phenomena that fall within the fixed viewing circle of the spacecraft. Possible applications of frame cameras for SEOS are evaluated, and the computed lens characteristics for each camera are listed.

  15. High-speed camera synchronization

    NASA Technical Reports Server (NTRS)

    Rojec, E. A.

    1968-01-01

    Photoelectric sensor enables synchronization of the rotating mirror in a high-speed framing camera with the passage of a very-high-velocity droplet to obtain direct photographic data on droplet breakup. It detects droplet movement across a high intensity light beam and generates a signal triggering the camera.

  16. Video Cameras on School Buses.

    ERIC Educational Resources Information Center

    Fields, Lynette J.

    1998-01-01

    Because student misbehavior on school buses can endanger the driver, other students, motorists, and pedestrians, schools are considering technological solutions, such as mounted video cameras. The cameras deter misbehavior and help administrators detect inappropriate activities and determine appropriate action. Program implementation is…

  17. An Educational PET Camera Model

    ERIC Educational Resources Information Center

    Johansson, K. E.; Nilsson, Ch.; Tegner, P. E.

    2006-01-01

    Positron emission tomography (PET) cameras are now in widespread use in hospitals. A model of a PET camera has been installed in Stockholm House of Science and is used to explain the principles of PET to school pupils as described here.

  18. The "All Sky Camera Network"

    ERIC Educational Resources Information Center

    Caldwell, Andy

    2005-01-01

    In 2001, the "All Sky Camera Network" came to life as an outreach program to connect the Denver Museum of Nature and Science (DMNS) exhibit "Space Odyssey" with Colorado schools. The network is comprised of cameras placed strategically at schools throughout Colorado to capture fireballs--rare events that produce meteorites. Meteorites have great…

  19. Digital camera self-calibration

    NASA Astrophysics Data System (ADS)

    Fraser, C.

    1997-08-01

    Over the 25 years since the introduction of analytical camera self-calibration there has been a revolution in close-range photogrammetric image acquisition systems. High-resolution, large-area "digital" CCD sensors have all but replaced film cameras. Throughout the period of this transition, self-calibration models have remained essentially unchanged. This paper reviews the application of analytical self-calibration to digital cameras. Computer vision perspectives are touched upon, the quality of self-calibration is discussed, and an overview is given of each of the four main sources of departures from collinearity in CCD cameras. Practical issues are also addressed and experimental results are used to highlight important characteristics of digital camera self-calibration.

  20. Digital camera self-calibration

    NASA Astrophysics Data System (ADS)

    Fraser, Clive S.

    Over the 25 years since the introduction of analytical camera self-calibration there has been a revolution in close-range photogrammetric image acquisition systems. High-resolution, large-area 'digital' CCD sensors have all but replaced film cameras. Throughout the period of this transition, self-calibration models have remained essentially unchanged. This paper reviews the application of analytical self-calibration to digital cameras. Computer vision perspectives are touched upon, the quality of self-calibration is discussed, and an overview is given of each of the four main sources of departures from collinearity in CCD cameras. Practical issues are also addressed and experimental results are used to highlight important characteristics of digital camera self-calibration.

  1. Accuracy in fixing ship's positions by camera survey of bearings

    NASA Astrophysics Data System (ADS)

    Naus, Krzysztof; Wąż, Mariusz

    2011-01-01

    The paper presents the results of research on the possibilities of fixing ship position coordinates based on bearings of navigational marks surveyed with a CCD camera. The accuracy of the determined ship position coordinates, expressed in terms of the mean error, was assumed as the basic criterion of this estimation. The first part of the paper describes the method for determining the resolution and the mean error of an angle measured with a camera, and the method for determining the mean error of position coordinates when two or more bearings are measured. Three software applications were developed for producing navigational sea charts with accuracy areas mapped onto them. The second part contains the results of a study of the accuracy of fixing ship position coordinates, carried out in the Gulf of Gdansk, using bearings obtained with Rolleiflex and Sony cameras. The results are presented as diagrams of the mean error of angle measurement and as navigational charts with accuracy fields mapped onto them. In the final part, based on the results obtained, the applicability of CCD cameras to the automation of coastal navigation is discussed.
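
The basic geometry behind a bearing fix can be sketched as the intersection of two lines of position. The mark coordinates and bearings below are invented, and the sketch ignores the measurement errors that the paper quantifies.

```python
import math

# Hedged sketch of a two-bearing fix: each bearing from the ship to a charted
# mark constrains the ship to a line through that mark; the fix is the
# intersection of the two lines.

def fix_from_two_bearings(mark1, brg1_deg, mark2, brg2_deg):
    """Bearings in degrees clockwise from north; positions as (east, north)."""
    d1 = (math.sin(math.radians(brg1_deg)), math.cos(math.radians(brg1_deg)))
    d2 = (math.sin(math.radians(brg2_deg)), math.cos(math.radians(brg2_deg)))
    # Ship P satisfies mark1 = P + r1*d1 and mark2 = P + r2*d2; solve for r1.
    det = d1[0] * d2[1] - d1[1] * d2[0]
    dx, dy = mark1[0] - mark2[0], mark1[1] - mark2[1]
    r1 = (dx * d2[1] - dy * d2[0]) / det
    return mark1[0] - r1 * d1[0], mark1[1] - r1 * d1[1]

# A ship at the origin sees one mark due north (bearing 000) and one due
# east (bearing 090); the fix recovers the origin.
p = fix_from_two_bearings((0.0, 5.0), 0.0, (7.0, 0.0), 90.0)
print(p)  # (0.0, 0.0)
```

With more than two bearings the lines generally do not meet in a point, which is why the paper works with least-squares estimates and mean errors instead.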

  2. Augmented heat transfer in rectangular channels of narrow aspect ratios with rib turbulators

    NASA Technical Reports Server (NTRS)

    Han, J. C.; Ou, S.; Park, J. S.; Lei, C. K.

    1989-01-01

    The effects of the rib angle-of-attack on the distributions of the local heat transfer coefficient and on the friction factors in short rectangular channels of narrow aspect ratios with a pair of opposite rib-roughened walls are determined for Reynolds numbers from 10,000 to 60,000. The channel width-to-height ratios are 2/4 and 1/4; the rib angles-of-attack are 90, 60, 45, and 30 deg. The results indicate that the narrow-aspect-ratio channels give better heat transfer performance than the wide-aspect-ratio channels for a constant pumping power. Semiempirical friction and heat transfer correlations are obtained. The results can be used in the design of turbine cooling channels of narrow aspect ratios.

  3. Overview of Neutrino Mixing Models and Their Mixing Angle Predictions

    SciTech Connect

    Albright, Carl H.

    2009-11-01

    An overview of neutrino-mixing models is presented with emphasis on the types of horizontal flavor and vertical family symmetries that have been invoked. Distributions for the mixing angles of many models are displayed. Ways to differentiate among the models and to narrow the list of viable models are discussed.

  4. Design of motion compensation mechanism of satellite remote sensing camera

    NASA Astrophysics Data System (ADS)

    Gu, Song; Yan, Yong; Xu, Kai; Jin, Guang

    2011-08-01

    With the development of aerospace remote sensing technology, the ground resolution of remote sensing cameras is continually improving. Because there is relative motion between the camera and the ground target during exposure, the target image recorded on the detector is smeared and blurred. To improve the imaging quality and resolution of the camera, this image motion must be compensated, and this paper investigates image motion compensation for a space camera. First, the cause of the drift angle and the principle of its adjustment are analyzed, and the composition and transmission principle of the image motion compensation mechanism are introduced. Second, the system adopts an 80C31 microcontroller as the drift-angle controller, a stepping motor as the actuator, and an absolute photoelectric encoder as the drift-angle measuring element. The control mathematical model of the image motion compensation mechanism is then derived, achieving closed-loop control of the drift-angle position. Finally, the transmission precision of the mechanism is analyzed; the actual precision of the image motion compensation mechanism was measured experimentally and compared with the theoretical analysis. There are two major contributions in this paper. First, traditional image motion compensation mechanisms are large and heavy, which does not fit the trend toward miniaturized, lightweight space cameras, while simply shrinking a mechanism degrades its precision and stiffness. This paper presents an image motion compensation mechanism that combines small size and light weight with high precision and stiffness, making it applicable to small, high-resolution optical cameras.
    Second, traditional mechanism control requires correction, fitting, and iteration of the control formula to obtain an optimal control mathematical model. The control formula derived in this paper is of high precision, so high-precision control is achieved without fitting, which also simplifies the establishment of the control mathematical model. The designed adjustment range of the image motion compensation mechanism is -5° to +5°. Taking -5°, -4°, -3°, -2°, -1°, 0°, +1°, +2°, +3°, +4°, and +5° as the expected values of the imaginary drift angle, ten groups of measured drift-angle adjustment data were obtained. The test results show that the precision of the drift-angle control system is within 1', which meets the system requirement that the precision of the control system be less than 3', achieving high-precision image motion compensation.
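
The drift-angle geometry being compensated can be illustrated as follows. This is a generic textbook relation, not the authors' control law, and the velocity figures are assumed round numbers.

```python
import math

# Illustrative drift-angle geometry: the image drifts because Earth's rotation
# adds a cross-track velocity component, so the focal plane is rotated by the
# drift angle to align the scan direction with the ground-track motion.

def drift_angle_deg(v_cross_track, v_along_track):
    return math.degrees(math.atan2(v_cross_track, v_along_track))

v_ground = 6800.0                 # sub-satellite ground speed, m/s (assumed)
v_earth = 465.0 * math.cos(0.0)   # Earth-rotation speed at the equator, m/s

delta = drift_angle_deg(v_earth, v_ground)
print(round(delta, 2))  # 3.91
```

A worst-case value of roughly this size is consistent with the ±5° adjustment range quoted in the abstract.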

  5. Reductions in Injury Crashes Associated With Red Light Camera Enforcement in Oxnard, California

    PubMed Central

    Retting, Richard A.; Kyrychenko, Sergey Y.

    2002-01-01

    Objectives. This study estimated the impact of red light camera enforcement on motor vehicle crashes in one of the first US communities to employ such cameras—Oxnard, California. Methods. Crash data were analyzed for Oxnard and for 3 comparison cities. Changes in crash frequencies were compared for Oxnard and control cities and for signalized and nonsignalized intersections by means of a generalized linear regression model. Results. Overall, crashes at signalized intersections throughout Oxnard were reduced by 7% and injury crashes were reduced by 29%. Right-angle crashes, those most associated with red light violations, were reduced by 32%; right-angle crashes involving injuries were reduced by 68%. Conclusions. Because red light cameras can be a permanent component of the transportation infrastructure, crash reductions attributed to camera enforcement should be sustainable. (Am J Public Health. 2002;92:1822–1825) PMID:12406815

  6. Camera sensitivity study

    NASA Astrophysics Data System (ADS)

    Schlueter, Jonathan; Murphey, Yi L.; Miller, John W. V.; Shridhar, Malayappan; Luo, Yun; Khairallah, Farid

    2004-12-01

    As the cost/performance ratio of vision systems improves with time, new classes of applications become feasible. One such area, automotive applications, is currently being investigated. Applications include occupant detection, collision avoidance and lane tracking. Interest in occupant detection has been spurred by federal automotive safety rules in response to injuries and fatalities caused by deployment of occupant-side air bags. In principle, a vision system could control airbag deployment to prevent this type of mishap. Employing vision technology here, however, presents a variety of challenges, including cost control, uncontrolled illumination, development and training of a reliable classification system, and loss of performance due to production variations arising from manufacturing tolerances and customer options. This paper describes the measures that have been developed to evaluate the sensitivity of an occupant detection system to these types of variations. Two procedures are described for evaluating how sensitive the classifier is to camera variations. The first procedure is based on classification accuracy while the second evaluates feature differences.

  7. Interline transfer CCD camera

    SciTech Connect

    Prokop, M.S.; McCurnin, T.W.; Stump, C.J.; Stradling, G.L.

    1993-12-31

    An interline CCD sensing device for use in a camera system includes an imaging area sensitive to impinging light, for generating charges corresponding to the intensity of the impinging light. Sixteen independent registers R1 - R16 sequentially receive the interline data from the imaging area, corresponding to the generated charges. Sixteen output amplifiers S1 - S16 and sixteen ports P1 - P16 sequentially transfer the interline data, one pixel at a time, in order to supply a desired image transfer speed. The imaging area is segmented into sixteen independent imaging segments A1 - A16, each of which corresponds to one register, one output amplifier, and one output port. Each of the imaging segments A1 - A16 includes an array of rows and columns of pixels. Each pixel includes a photogate area, an interline CCD channel area, and an anti-blooming area. The anti-blooming area is, in turn, divided into an anti-blooming barrier and an anti-blooming drain.

  8. Cameras for digital microscopy.

    PubMed

    Spring, Kenneth R

    2013-01-01

    This chapter reviews the fundamental characteristics of charge-coupled devices (CCDs) and related detectors, outlines the relevant parameters for their use in microscopy, and considers promising recent developments in detector technology. Electronic imaging with a CCD involves three stages: interaction of a photon with the photosensitive surface, storage of the liberated charge, and readout or measurement of the stored charge. The most demanding applications in fluorescence microscopy may require as much as four orders of magnitude greater sensitivity. The image in the present-day light microscope is usually acquired with a CCD camera. The CCD is composed of a large matrix of photosensitive elements (often referred to as "pixels," shorthand for picture elements) that simultaneously capture an image over the entire detector surface. The light-intensity information for each pixel is stored as electronic charge and is converted to an analog voltage by a readout amplifier. This analog voltage is subsequently converted to a numerical value by a digitizer situated on the CCD chip, or very close to it. In complementary metal oxide semiconductor (CMOS) sensors, several (three to six) amplifiers are required for each pixel, and to date, uniform images with a homogeneous background have been a problem because of the inherent difficulty of balancing the gain in all of the amplifiers. CMOS sensors also exhibit relatively high noise associated with the requisite high-speed switching. Both of these deficiencies are being addressed, and sensor performance is nearing that required for scientific imaging. PMID:23931507
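
The three-stage chain described above (photon capture, charge storage, readout and digitization) can be sketched with illustrative parameters. The quantum efficiency, gain, bias, and full-well values below are assumptions, not the chapter's numbers.

```python
# Sketch of the CCD signal chain: photons liberate charge (quantum
# efficiency), the stored charge saturates at the full-well capacity, and
# readout converts charge to digital numbers (DN) through an amplifier gain
# plus a bias offset. All parameter values are illustrative.

def ccd_output_dn(photons, qe=0.6, gain_e_per_dn=2.0, bias_dn=100,
                  full_well=30000):
    electrons = min(photons * qe, full_well)   # charge storage saturates
    return bias_dn + int(electrons / gain_e_per_dn)

print(ccd_output_dn(1000))    # 400  (100 bias + 600 e- / 2 e-/DN)
print(ccd_output_dn(10**6))   # 15100  (well saturated at 30000 e-)
```

The saturation branch is the behavior that limits the dynamic range of any charge-integration sensor, CCD or CMOS alike.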

  9. Fast camera objective designs for spectrograph of Mont Megantique telescope

    NASA Astrophysics Data System (ADS)

    Thibault, Simon; Wang, Min

    2004-02-01

    An all-reflective design is conventionally required for extended spectral coverage in an astronomical spectrograph, but the spatial resolution of all-reflective optics is usually inadequate when a large-format CCD is used. In this paper, an all-refractive design is investigated for a fast (F/1.55), wide-angle camera objective with large spectral coverage, from the UV through the visible to the NIR, for use with a large-format CCD on the focal plane of the spectrograph of the Mont Megantique telescope. Achromatic and apochromatic conditions were investigated for the control of axial and lateral color. The proposed solutions were optimized with two and three different glass combinations to achieve high throughput over the large spectral coverage, especially in the UV region, and the number of components was minimized to reduce inherent light loss. Monochromatic aberrations were corrected and controlled by optimizing lens bendings and shapes so that the camera resolution matches the CCD pixel. Ray-tracing results demonstrate good optical performance of the camera over the 350-1000 nm spectral region with high resolution. A broadband AR coating, enhanced in the UV region, will be used on each lens surface in the camera. The final throughput of the designed camera is estimated and given in the paper.
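
The achromatic condition mentioned above can be illustrated with the standard thin-lens doublet relation. This is a generic formula, not the authors' actual design, and the Abbe numbers are typical crown/flint values, assumed here.

```python
# Thin-lens achromatic doublet: the chromatic focal shift of two glasses
# cancels when phi1/V1 + phi2/V2 = 0, where phi is optical power and V the
# Abbe number. That condition fixes how a required total power phi splits
# between the two elements.

def achromat_powers(phi_total, v1, v2):
    phi1 = phi_total * v1 / (v1 - v2)
    phi2 = -phi_total * v2 / (v1 - v2)
    return phi1, phi2

# A 100 mm focal-length doublet from crown (V=60) and flint (V=36) glass.
phi1, phi2 = achromat_powers(1 / 100.0, v1=60.0, v2=36.0)
print(round(1 / phi1, 1), round(1 / phi2, 1))  # 40.0 -66.7 (element focal lengths, mm)
```

Apochromatic correction, also mentioned in the abstract, adds a third glass and a further constraint on partial dispersions; the structure of the calculation is the same.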

  10. X-ray Pinhole Camera Measurements

    SciTech Connect

    Nelson, D. S. [NSTec; Berninger, M. J. [NSTec; Flores, P. A. [NSTec; Good, D. E. [NSTec; Henderson, D. J. [NSTec; Hogge, K. W. [NSTec; Huber, S. R. [NSTec; Lutz, S. S. [NSTec; Mitchell, S. E. [NSTec; Howe, R. A. [NSTec; Mitton, C. V. [NSTec; Molina, I. [NSTec; Bozman, D. R. [SNL; Cordova, S. R. [SNL; Mitchell, D. R. [SNL; Oliver, B. V. [SNL; Ormond, E. C. [SNL

    2013-07-01

    The development of the rod pinch diode [1] has led to high-resolution radiography for dynamic events such as explosive tests. Rod pinch diodes use a small diameter anode rod, which extends through the aperture of a cathode plate. Electrons borne off the aperture surface can self-insulate and pinch onto the tip of the rod, creating an intense, small x-ray source (Primary Pinch). This source has been utilized as the main diagnostic on numerous experiments that include high-value, single-shot events. In such applications there is an emphasis on machine reliability, x-ray reproducibility, and x-ray quality [2]. In tests with the baseline rod pinch diode, we have observed that an additional pinch (Secondary Pinch) occurs at the interface near the anode rod and the rod holder. This suggests that stray electrons exist that are not associated with the Primary Pinch. In this paper we present measurements on both pinches using an x-ray pinhole camera. The camera is placed downstream of the Primary Pinch at an angle of 60° with respect to the diode centerline. This diagnostic will be employed to diagnose x-ray reproducibility and quality. In addition, we will investigate the performance of hybrid diodes relating to the formation of the Primary and Secondary Pinches.

  11. Focal Plane Metrology for the LSST Camera

    SciTech Connect

    Rasmussen, Andrew P.; Hale, Layton; Kim, Peter; Lee, Eric; Perl, Martin; Schindler, Rafe; Takacs, Peter; Thurston, Timothy; /SLAC

    2007-01-10

    Meeting the science goals for the Large Synoptic Survey Telescope (LSST) translates into a demanding set of imaging performance requirements for the optical system over a wide (3.5°) field of view. In turn, meeting those imaging requirements necessitates maintaining precise control of the focal plane surface (10 µm P-V) over the entire field of view (640 mm diameter) at the operating temperature (T ≈ -100 °C) and over the operational elevation angle range. We briefly describe the hierarchical design approach for the LSST Camera focal plane and the baseline design for assembling the flat focal plane at room temperature. Preliminary results of gravity load and thermal distortion calculations are provided, and early metrological verification of candidate materials under cold thermal conditions are presented. A detailed, generalized method for stitching together sparse metrology data originating from differential, non-contact metrological data acquisition spanning multiple (non-continuous) sensor surfaces making up the focal plane, is described and demonstrated. Finally, we describe some in situ alignment verification alternatives, some of which may be integrated into the camera's focal plane.

  12. Camera self-calibration method suitable for variant camera constraints

    NASA Astrophysics Data System (ADS)

    Teng, Chin-Hung; Chen, Yung-Sheng; Hsu, Wen-Hsing

    2006-02-01

    This paper presents a self-calibration algorithm that seeks the camera intrinsic parameters to minimize the sum of squared distances between the measured and reprojected image points. By exploiting the constraints provided by the fundamental matrices, the function to be minimized can be directly reduced to a function of the camera intrinsic parameters; thus variant camera constraints such as fixed or varying focal lengths can be easily imposed by controlling the parameters of the resulting function. We employed the simplex method to minimize the resulting function and tested the proposed algorithm on some simulated and real data. The experimental results demonstrate that our algorithm performs well for variant camera constraints and for two-view and multiple-view cases.
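
The core idea, choosing camera intrinsics that minimize the summed squared distance between measured and reprojected image points, can be illustrated on a toy one-parameter problem. The synthetic scene, the true focal length, and the coarse grid search standing in for the paper's simplex method are all assumptions of this sketch.

```python
# Toy intrinsic calibration: recover the focal length f that minimizes the
# sum of squared reprojection errors against "measured" image points.

def project(f, points_3d):
    # Simple pinhole projection (camera at origin, looking down +z).
    return [(f * x / z, f * y / z) for x, y, z in points_3d]

def reprojection_cost(f, points_3d, measured):
    return sum((u - mu) ** 2 + (v - mv) ** 2
               for (u, v), (mu, mv) in zip(project(f, points_3d), measured))

points_3d = [(1.0, 2.0, 10.0), (-3.0, 1.0, 8.0), (2.0, -2.0, 12.0)]
measured = project(500.0, points_3d)     # synthetic noise-free observations

# Coarse 1-D search over candidate focal lengths (400.0 .. 699.5 in 0.5 steps).
best_f = min((f * 0.5 for f in range(800, 1400)),
             key=lambda f: reprojection_cost(f, points_3d, measured))
print(best_f)  # 500.0
```

The paper's setting differs in that the cost is expressed through fundamental-matrix constraints over several intrinsics at once, which is why a derivative-free simplex minimizer is a natural fit there.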

  13. Roll Angle Estimation for Motorcycles: Comparing Video and Inertial Sensor Approaches

    E-print Network

    Schlipsing, Marc

    …such modules to motorcycles, the camera pose has to be taken into account, as e.g. large roll angles produce… (Fig. 1: illustration of the motorcycle coordinate system.)

  14. Dark energy survey and camera

    SciTech Connect

    William Wester

    2004-08-16

    The authors describe the Dark Energy Survey and Camera. The survey will image 5000 sq. deg. in the southern sky to collect 300 million galaxies, 30,000 galaxy clusters and 2000 Type Ia supernovae. They expect to derive a value for the dark energy equation of state parameter, w, to a precision of 5% by combining four distinct measurement techniques. They describe the mosaic camera that will consist of CCDs with enhanced sensitivity in the near infrared. The camera will be mounted at the prime focus of the 4m Blanco telescope.

  15. The importance of craniovertebral and cervicomedullary angles in cervicogenic headache

    PubMed Central

    Çoban, Gökçen; Çöven, İlker; Çifçi, Bilal Egemen; Yıldırım, Erkan; Yazıcı, Ayşe Canan; Horasanlı, Bahriye

    2014-01-01

    PURPOSE Many studies have indicated that cervicogenic headache may originate from the cervical structures innervated by the upper cervical spinal nerves. To date, no study has investigated whether narrowing of the craniovertebral angle (CVA) or cervicomedullary angle (CMA) affects the three upper cervical spinal nerves. The aim of this study was to investigate the effect of CVA and/or CMA narrowing on the occurrence of cervicogenic headache. MATERIALS AND METHODS Two hundred and five patients diagnosed with cervicogenic headache were included in the study. The pain scores of patients were determined using a visual analog scale. The nonheadache control group consisted of 40 volunteers. CVA and CMA values were measured on sagittal T2-weighted magnetic resonance imaging (MRI), on two occasions by two radiologists. Angle values and categorized pain scores were compared statistically between the groups. RESULTS Intraobserver and interobserver agreement was over 97% for all measurements. Pain scores increased with decreasing CVA and CMA values. Mean angle values were significantly different among the pain categories (P < 0.001). The pain score was negatively correlated with CMA (Spearman correlation coefficient, rs, -0.676; P < 0.001) and CVA values (rs, -0.725; P < 0.001). CONCLUSION CVA or CMA narrowing affects the occurrence of cervicogenic headache. There is an inverse relationship between the angle values and pain scores. PMID:24317332
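
The reported negative Spearman correlation can be illustrated with a minimal rank-correlation computation. The angle and pain values below are fabricated toy numbers, not the study's measurements.

```python
# Minimal Spearman rank correlation (no-ties case): rank both variables,
# then apply rho = 1 - 6 * sum(d^2) / (n * (n^2 - 1)).

def ranks(values):
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0] * len(values)
    for rank, i in enumerate(order):
        r[i] = rank + 1
    return r

def spearman(x, y):
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))   # valid when there are no ties

angle = [150, 145, 140, 135, 130]   # narrowing angle (toy values, degrees)
pain = [1, 2, 4, 6, 8]              # pain score rises as the angle narrows

rho = spearman(angle, pain)
print(rho)  # -1.0 (perfectly monotone toy data)
```

Real data with ties, as in categorized pain scores, would use the tie-corrected form instead of this shortcut formula.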

  16. Special Angle Pairs Discovery Activity

    NSDL National Science Digital Library

    Barbara Henry

    2012-04-16

    This lesson uses a discovery approach to identify the special angles formed when a set of parallel lines is cut by a transversal. During this lesson students identify the angle pair and the relationship between the angles. Students use this relationship and special angle pairs to make conjectures about which angle pairs are considered special angles.

  17. Polygon Angle Applet

    NSDL National Science Digital Library

    Nicholas Exner

    2000-05-31

    This interactive Java applet supports the investigation of the relationship between the number of vertices of a polygon and its interior angle sum. Learners choose and locate the vertices, the angle measures are displayed, and then the student can drag the measures into a circle to see them summed relative to 360 degrees.
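
The relationship the applet lets students discover is the interior angle sum formula, which a short function can state directly: a simple n-gon decomposes into n - 2 triangles, each contributing 180 degrees.

```python
# Interior angle sum of a simple polygon with n vertices: (n - 2) * 180 deg.

def interior_angle_sum(n):
    if n < 3:
        raise ValueError("a polygon needs at least 3 vertices")
    return (n - 2) * 180

print(interior_angle_sum(3), interior_angle_sum(4), interior_angle_sum(6))
# 180 360 720
```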

  18. Vision Sensors and Cameras

    NASA Astrophysics Data System (ADS)

    Hoefflinger, Bernd

    Silicon charge-coupled-device (CCD) imagers have been and are a specialty market ruled by a few companies for decades. Based on CMOS technologies, active-pixel sensors (APS) began to appear in 1990 at the 1 µm technology node. These pixels allow random access, global shutters, and they are compatible with focal-plane imaging systems combining sensing and first-level image processing. The progress towards smaller features and towards ultra-low leakage currents has provided reduced dark currents and µm-size pixels. All chips offer Mega-pixel resolution, and many have very high sensitivities equivalent to ASA 12,800. As a result, HDTV video cameras will become a commodity. Because charge-integration sensors suffer from a limited dynamic range, significant processing effort is spent on multiple exposure and piece-wise analog-digital conversion to reach ranges >10,000:1. The fundamental alternative is log-converting pixels with an eye-like response. This offers a range of almost a million to 1, constant contrast sensitivity and constant colors, important features in professional, technical and medical applications. 3D retino-morphic stacking of sensing and processing on top of each other is being revisited with sub-100 nm CMOS circuits and with TSV technology. With sensor outputs directly on top of neurons, neural focal-plane processing will regain momentum, and new levels of intelligent vision will be achieved. The industry push towards thinned wafers and TSV enables backside-illuminated and other pixels with a 100% fill-factor. 3D vision, which relies on stereo or on time-of-flight, high-speed circuitry, will also benefit from scaled-down CMOS technologies both because of their size as well as their higher speed.
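
The contrast between charge-integration and log-converting pixels described above can be shown numerically. The well depth and response curves below are illustrative assumptions, not figures from the chapter.

```python
import math

# Why log pixels extend dynamic range: a linear (charge-integration) pixel
# clips once the well is full, while a logarithmic, eye-like response
# compresses a ~10^6:1 intensity span into a modest output swing.

def linear_pixel(photons, full_well=20000):
    return min(photons, full_well)           # clips at the well capacity

def log_pixel(photons):
    return math.log10(max(photons, 1))       # constant contrast sensitivity

for photons in (10, 10000, 10**7):
    print(linear_pixel(photons), round(log_pixel(photons), 1))
```

In the last row the linear pixel has long since saturated, while the log pixel still distinguishes the signal, which is the "almost a million to 1" range the text refers to.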

  19. A lexicon for Camera Obscura

    E-print Network

    Rosinsky, Robert David

    1984-01-01

    The camera obscura has allowed artists, scientists, and philosophers to view the world as a flat image. Two - dimensional renditions of visual reality seem to be more manageable and easier to grasp than reality itself. A ...

  20. Digital Camera Find out which

    E-print Network

    Rogers, John A.

    design could be used in imaging technology in the field. And while the concept of an electronic eye… ("A Spherical Camera")

  1. The future of consumer cameras

    NASA Astrophysics Data System (ADS)

    Battiato, Sebastiano; Moltisanti, Marco

    2015-03-01

    In the last two decades multimedia devices, and in particular imaging devices (camcorders, tablets, mobile phones, etc.), have spread dramatically. Moreover, their increasing computational performance, combined with higher storage capability, allows them to process large amounts of data. In this paper an overview of the current trends in the consumer camera market and technology is given, with some details about the recent past (from the Digital Still Camera up to today) and forthcoming key issues.

  2. Infants Experience Perceptual Narrowing for Nonprimate Faces

    ERIC Educational Resources Information Center

    Simpson, Elizabeth A.; Varga, Krisztina; Frick, Janet E.; Fragaszy, Dorothy

    2011-01-01

    Perceptual narrowing--a phenomenon in which perception is broad from birth, but narrows as a function of experience--has previously been tested with primate faces. In the first 6 months of life, infants can discriminate among individual human and monkey faces. Though the ability to discriminate monkey faces is lost after about 9 months, infants…

  3. Sensitive IR narrow band optimized microspectrometer

    Microsoft Academic Search

    David L Wetzel

    2002-01-01

    Customization of a standard model confocally masked FT-IR microspectrometer to maximize the signal for a particular narrow band of the spectrum and minimize noise is described. In this case the motivation was to detect minor concentrations of deuterated species in a matrix of tissue. However, the instrumental modifications used for this particular task are applicable to narrow band sensitization in

  4. Fundus Camera Guided Photoacoustic Ophthalmoscopy

    PubMed Central

    Liu, Tan; Li, Hao; Song, Wei; Jiao, Shuliang; Zhang, Hao F.

    2014-01-01

    Purpose To demonstrate the feasibility of a fundus camera guided photoacoustic ophthalmoscopy (PAOM) system and its multimodal imaging capabilities. Methods We integrated PAOM and a fundus camera consisting of a white-light illuminator and a high-sensitivity, high-speed CCD. The fundus camera captures both retinal anatomy and PAOM illumination at the same time to provide real-time feedback when we position the PAOM illuminating light. We applied the integrated system to image rat eyes in vivo and used full-spectrum, visible (VIS), and near-infrared (NIR) illuminations in fundus photography. Results Both albino and pigmented rat eyes were imaged in vivo. During alignment, different trajectories of PAOM laser scanning were successfully visualized by the fundus camera, which reduced the PAOM alignment time from several minutes to 30 s. In albino eyes, in addition to retinal vessels, main choroidal vessels were observed using VIS illumination, which is similar to PAOM images. In pigmented eyes, the radial striations of the retinal nerve fiber layer were visualized by fundus photography using full-spectrum illumination; meanwhile, PAOM imaged both retinal vessels and the retinal pigment epithelium melanin distribution. Conclusions The results demonstrated that PAOM can be well integrated with a fundus camera without affecting its functionality. The fundus camera guidance is faster and easier compared with our previous work. The integrated system also sets the stage for the next-step verification between oximetry methods based on PAOM and fundus photography. PMID:24131226

  5. Versatility of the CFR algorithm for limited angle reconstruction

    SciTech Connect

    Fujieda, I.; Heiskanen, K.; Perez-Mendez, V. (Lawrence Berkeley Lab., CA (USA))

    1990-04-01

    The constrained Fourier reconstruction (CFR) algorithm and the iterative reconstruction-reprojection (IRR) algorithm are evaluated based on their accuracy for three types of limited-angle reconstruction problems. The CFR algorithm performs better for problems such as X-ray CT imaging of a nuclear reactor core with one large data gap due to structural blocking of the source and detector pair. For gated heart imaging by X-ray CT, and for radioisotope distribution imaging by PET or SPECT using a polygonal array of gamma cameras with insensitive gaps between camera boundaries, the IRR algorithm has a slight advantage over the CFR algorithm, but the difference is not significant.
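    The constrained-Fourier idea, alternating between spatial constraints (known support, nonnegativity) and consistency with the Fourier samples that were actually measured, can be sketched on a synthetic missing-wedge problem; this is a minimal Gerchberg-Papoulis-style illustration, not the paper's implementation:

```python
import numpy as np

# Synthetic limited-angle problem: a compact, nonnegative object whose
# Fourier transform is measured everywhere except a 60-degree wedge.
n = 64
y, x = np.mgrid[:n, :n]
obj = (((x - 32) ** 2 + (y - 32) ** 2) < 8 ** 2).astype(float)

fy = np.fft.fftfreq(n)[:, None]
fx = np.fft.fftfreq(n)[None, :]
ang = np.abs(np.degrees(np.arctan2(fy, fx)))    # 0..180 degrees
measured = ~((ang > 60.0) & (ang < 120.0))      # the data gap: a missing wedge

F_true = np.fft.fft2(obj)
support = ((x - 32) ** 2 + (y - 32) ** 2) < 10 ** 2   # known object support

# Alternate between spatial constraints and Fourier data consistency.
est = np.real(np.fft.ifft2(F_true * measured))
err0 = np.linalg.norm(est - obj) / np.linalg.norm(obj)
for _ in range(300):
    est = np.clip(est, 0.0, None) * support     # spatial constraints
    F = np.fft.fft2(est)
    F[measured] = F_true[measured]              # keep the measured samples
    est = np.real(np.fft.ifft2(F))
err = np.linalg.norm(est - obj) / np.linalg.norm(obj)   # well below err0
```

    With enough iterations the support and positivity constraints fill in the missing angular sector, which is the essence of the constrained Fourier approach evaluated in the record above.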

  6. Configuration control of a forklift vehicle using vision system with limited angle of view

    Microsoft Academic Search

    Augie Widyotriatmo; Gi-Yong Hong; Keum-Shik Hong

    2010-01-01

    This paper presents a configuration control of a forklift vehicle using a camera-based vision system with limited angle-of-view (AOV). The configuration (i.e., position and orientation) of the vehicle is transformed to navigation variables, which are the distance left to the goal position, the angle made by the x-axis of the target coordinate and the vehicle-to-target (v-to-t) vector, and the angle

  7. Vector 1 Vector 2 Zone Axis (h k l) d (h k l) d [U V W] Angle

    E-print Network

    Cambridge, University of

    Ratios of d-spacings obtained from two reciprocal lattice vectors, and the acute angle between these vectors. If the camera constant is unknown, then the ratio of the vectors may be used instead
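    For a cubic crystal, the zone axis [U V W] shared by two indexed reflections is the cross product of their (h k l) vectors, and the acute inter-plane angle follows from the dot product; a minimal sketch (cubic symmetry assumed, so plane normals equal their Miller indices):

```python
import math

def zone_axis(hkl1, hkl2):
    """[U V W] = (h1 k1 l1) x (h2 k2 l2), reduced to smallest integers."""
    (h1, k1, l1), (h2, k2, l2) = hkl1, hkl2
    u, v, w = (k1 * l2 - l1 * k2, l1 * h2 - h1 * l2, h1 * k2 - k1 * h2)
    g = math.gcd(math.gcd(abs(u), abs(v)), abs(w)) or 1
    return (u // g, v // g, w // g)

def acute_angle_deg(hkl1, hkl2):
    """Acute angle between two plane normals in a cubic crystal."""
    dot = sum(a * b for a, b in zip(hkl1, hkl2))
    n1 = math.sqrt(sum(a * a for a in hkl1))
    n2 = math.sqrt(sum(a * a for a in hkl2))
    ang = math.degrees(math.acos(dot / (n1 * n2)))
    return min(ang, 180.0 - ang)

# (1 0 0) and (0 1 0) share the [0 0 1] zone axis at 90 degrees:
axis = zone_axis((1, 0, 0), (0, 1, 0))
angle = acute_angle_deg((1, 0, 0), (1, 1, 0))   # 45 degrees
```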

  8. Laser angle sensor development

    NASA Technical Reports Server (NTRS)

    Pond, C. R.; Texeira, P. D.

    1980-01-01

    Electrical and optical parameters were developed for a two-axis (pitch/roll) laser angle sensor. The laser source and detector were mounted in the plenum above the model. Two-axis optical distortion measurements of flow characteristics in a 0.3-m transonic cryogenic tunnel were made with a shearing interferometer. The measurement results provide a basis for estimating the optical parameters of the laser angle sensor. Experimental and analytical information was generated on model windows to cover the reflector. A two-axis breadboard was assembled to evaluate different measurement concepts. The measurement results were used to develop a preliminary design of a laser angle sensor. Schematics and expected performance specifications are included.

  9. Multi-Angle View of the Canary Islands

    NASA Technical Reports Server (NTRS)

    2000-01-01

    A multi-angle view of the Canary Islands in a dust storm, 29 February 2000. At left is a true-color image taken by the Multi-angle Imaging SpectroRadiometer (MISR) instrument on NASA's Terra satellite. This image was captured by the MISR camera looking at a 70.5-degree angle to the surface, ahead of the spacecraft. The middle image was taken by the MISR downward-looking (nadir) camera, and the right image is from the aftward 70.5-degree camera. The images are reproduced using the same radiometric scale, so variations in brightness, color, and contrast represent true variations in surface and atmospheric reflectance with angle. Windblown dust from the Sahara Desert is apparent in all three images, and is much brighter in the oblique views. This illustrates how MISR's oblique imaging capability makes the instrument a sensitive detector of dust and other particles in the atmosphere. Data for all channels are presented in a Space Oblique Mercator map projection to facilitate their co-registration. The images are about 400 km (250 miles) wide, with a spatial resolution of about 1.1 kilometers (1,200 yards). North is toward the top. MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is a division of the California Institute of Technology.

  10. The MC and LFC cameras. [metric camera (MC); large format camera (LFC)

    NASA Technical Reports Server (NTRS)

    Norton, Clarice L.; Schroeder, Manfried; Mollberg, Bernard

    1986-01-01

    The characteristics of the shuttle-borne Large Format Camera are listed. The LFC focal plane format was 23 by 46 cm, double the usual size, thereby acquiring approximately double the ground area. Forward motion compensation was employed. With the stable platform (shuttle) it was possible to use the slow exposure, high resolution Kodak aerial films: 3414 and 3412 black and white, SO-242 color, and SO-131 aerochrome infrared. The camera was designed to maintain stability during varying temperature extremes of space.

  11. Autoconfiguration of a Dynamic Nonoverlapping Camera Network

    Microsoft Academic Search

    Imran N. Junejo; Xiaochun Cao; Hassan Foroosh

    2007-01-01

    In order to monitor sufficiently large areas of interest for surveillance or any event detection, we need to look beyond stationary cameras and employ an automatically configurable network of nonoverlapping cameras. These cameras need not have an overlapping field of view and should be allowed to move freely in space. Moreover, features like zooming in/out, readily available in security cameras

  12. Potentials of large format camera photography

    Microsoft Academic Search

    R. C. Malhotra

    1986-01-01

    From the viewpoint of improved precision, resolution, area coverage, and other terrain mapping considerations, a large format camera of 30-cm focal length and a pair of stellar cameras to determine camera attitude were recommended for Apollo Missions. In this paper, the potentials of Large Format Camera (LFC) photography are explored specifically for the purpose of carrying out photogrammetric control

  13. Binocular Camera Calibration Using Rectification Error

    Microsoft Academic Search

    Derek Bradley; Wolfgang Heidrich

    2010-01-01

    Reprojection error is a commonly used measure for comparing the quality of different camera calibrations, for example when choosing the best calibration from a set. While this measure is suitable for single cameras, we show that we can improve calibrations in a binocular or multi-camera setup by calibrating the cameras in pairs using a rectification error. The rectification error determines

  14. Photometric Calibration of Consumer Video Cameras

    NASA Technical Reports Server (NTRS)

    Suggs, Robert; Swift, Wesley, Jr.

    2007-01-01

    Equipment and techniques have been developed to implement a method of photometric calibration of consumer video cameras for imaging of objects that are sufficiently narrow or sufficiently distant to be optically equivalent to point or line sources. Heretofore, it has been difficult to calibrate consumer video cameras, especially in cases of image saturation, because they exhibit nonlinear responses with dynamic ranges much smaller than those of scientific-grade video cameras. The present method not only takes this difficulty in stride but also makes it possible to extend effective dynamic ranges to several powers of ten beyond saturation levels. The method will likely be primarily useful in astronomical photometry. There are also potential commercial applications in medical and industrial imaging of point or line sources in the presence of saturation. This development was prompted by the need to measure brightnesses of debris in amateur video images of the breakup of the Space Shuttle Columbia. The purpose of these measurements is to use the brightness values to estimate relative masses of debris objects. In most of the images, the brightness of the main body of Columbia was found to exceed the dynamic ranges of the cameras. A similar problem arose a few years ago in the analysis of video images of Leonid meteors. The present method is a refined version of the calibration method developed to solve the Leonid calibration problem. In this method, one performs an end-to-end calibration of the entire imaging system, including not only the imaging optics and imaging photodetector array but also analog tape recording and playback equipment (if used) and any frame grabber or other analog-to-digital converter (if used).
To automatically incorporate the effects of nonlinearity and any other distortions into the calibration, the calibration images are processed in precisely the same manner as are the images of meteors, space-shuttle debris, or other objects that one seeks to analyze. The light source used to generate the calibration images is an artificial variable star comprising a Newtonian collimator illuminated by a light source modulated by a rotating variable neutral- density filter. This source acts as a point source, the brightness of which varies at a known rate. A video camera to be calibrated is aimed at this source. Fixed neutral-density filters are inserted in or removed from the light path as needed to make the video image of the source appear to fluctuate between dark and saturated bright. The resulting video-image data are analyzed by use of custom software that determines the integrated signal in each video frame and determines the system response curve (measured output signal versus input brightness). These determinations constitute the calibration, which is thereafter used in automatic, frame-by-frame processing of the data from the video images to be analyzed.
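    The calibration step, measuring output signal versus known input brightness and then inverting that curve during photometry, can be sketched as follows (the response function and all numbers are stand-ins, not the actual camera data):

```python
import numpy as np

# Hypothetical calibration run: known input brightness from the artificial
# variable star vs. the camera's measured integrated signal (DN), with a
# soft-saturating nonlinear response standing in for the real camera.
brightness = np.linspace(0.0, 10.0, 1001)
measured_dn = 255.0 * (1.0 - np.exp(-brightness / 3.0))

# The tabulated curve *is* the calibration; photometry then inverts it by
# interpolation (measured_dn must be monotonically increasing).
def dn_to_brightness(dn):
    return float(np.interp(dn, measured_dn, brightness))

# A frame reading 90% of saturation still maps back to a finite input
# brightness, here -3*ln(0.1) ~ 6.91 in the synthetic units.
recovered = dn_to_brightness(0.9 * 255.0)
```

    Inverting the measured response rather than assuming linearity is what lets the method extend photometry beyond the level where a naive DN-to-brightness conversion would clip.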

  15. Video camera use at nuclear power plants

    SciTech Connect

    Estabrook, M.L.; Langan, M.O.; Owen, D.E. (ENCORE Technical Resources, Inc., Middletown, PA (USA))

    1990-08-01

    A survey of US nuclear power plants was conducted to evaluate video camera use in plant operations, and determine equipment used and the benefits realized. Basic closed circuit television camera (CCTV) systems are described and video camera operation principles are reviewed. Plant approaches for implementing video camera use are discussed, as are equipment selection issues such as setting task objectives, radiation effects on cameras, and the use of disposable cameras. Specific plant applications are presented and the video equipment used is described. The benefits of video camera use --- mainly reduced radiation exposure and increased productivity --- are discussed and quantified. 15 refs., 6 figs.

  16. Anger perceptually and conceptually narrows cognitive scope.

    PubMed

    Gable, Philip A; Poole, Bryan D; Harmon-Jones, Eddie

    2015-07-01

    For the last 50 years, research investigating the effect of emotions on scope of cognitive processing was based on models proposing that affective valence determined cognitive scope. More recently, our motivational intensity model suggests that this past work had confounded valence with motivational intensity. Research derived from this model supports the idea that motivational intensity, rather than affective valence, explains much of the variance emotions have on cognitive scope. However, the motivational intensity model is limited in that the empirical work has examined only positive affects high in approach and negative affects high in avoidance motivation. Thus, perhaps only approach-positive and avoidance-negative states narrow cognitive scope. The present research was designed to clarify these conceptual issues by examining the effect of anger, a negatively valenced approach-motivated state, on cognitive scope. Results revealed that anger narrowed attentional scope relative to a neutral state and that attentional narrowing to anger was similar to the attentional narrowing caused by high approach-motivated positive affects (Study 1). This narrowing of attention was related to trait approach motivation (Studies 2 and 3). Anger also narrowed conceptual cognitive categorization (Study 4). Narrowing of categorization related to participants' approach motivation toward anger stimuli. Together, these results suggest that anger, an approach-motivated negative affect, narrows perceptual and conceptual cognitive scope. More broadly, these results support the conceptual model that motivational intensity per se, rather than approach-positive and avoidance-negative states, causes a narrowing of cognitive scope. (PsycINFO Database Record) PMID:26011662

  17. Exploring the Moon at High-Resolution: First Results From the Lunar Reconnaissance Orbiter Camera (LROC)

    NASA Astrophysics Data System (ADS)

    Robinson, Mark; Hiesinger, Harald; McEwen, Alfred; Jolliff, Brad; Thomas, Peter C.; Turtle, Elizabeth; Eliason, Eric; Malin, Mike; Ravine, A.; Bowman-Cisneros, Ernest

    The Lunar Reconnaissance Orbiter (LRO) spacecraft was launched on an Atlas V 401 rocket from the Cape Canaveral Air Force Station Launch Complex 41 on June 18, 2009. After spending four days in Earth-Moon transit, the spacecraft entered a three month commissioning phase in an elliptical 30×200 km orbit. On September 15, 2009, LRO began its planned one-year nominal mapping mission in a quasi-circular 50 km orbit. A multi-year extended mission in a fixed 30×200 km orbit is optional. The Lunar Reconnaissance Orbiter Camera (LROC) consists of a Wide Angle Camera (WAC) and two Narrow Angle Cameras (NACs). The WAC is a 7-color push-frame camera, which images the Moon at 100 and 400 m/pixel in the visible and UV, respectively, while the two NACs are monochrome narrow-angle linescan imagers with 0.5 m/pixel spatial resolution. LROC was specifically designed to address two of the primary LRO mission requirements and six other key science objectives, including 1) assessment of meter- and smaller-scale features in order to select safe sites for potential lunar landings near polar resources and elsewhere on the Moon; 2) acquisition of multi-temporal synoptic 100 m/pixel images of the poles during every orbit to unambiguously identify regions of permanent shadow and permanent or near-permanent illumination; 3) meter-scale mapping of regions with permanent or near-permanent illumination of polar massifs; 4) repeat observations of potential landing sites and other regions to derive high resolution topography; 5) global multispectral observations in seven wavelengths to characterize lunar resources, particularly ilmenite; 6) a global 100-m/pixel basemap with incidence angles (60°-80°) favorable for morphological interpretations; 7) sub-meter imaging of a variety of geologic units to characterize their physical properties, the variability of the regolith, and other key science questions; 8) meter-scale coverage overlapping with Apollo-era panoramic images (1-2 m/pixel) to document 
the number of small impacts since 1971-1972. LROC allows us to determine the recent impact rate of bolides in the size range of 0.5 to 10 meters, which is currently not well known. Determining the impact rate at these sizes enables engineering remediation measures for future surface operations and interplanetary travel. The WAC has imaged nearly the entire Moon in seven wavelengths. A preliminary global WAC stereo-based topographic model is in preparation [1] and global color processing is underway [2]. As the mission progresses repeat global coverage will be obtained as lighting conditions change providing a robust photometric dataset. The NACs are revealing a wealth of morphologic features at the meter scale providing the engineering and science constraints needed to support future lunar exploration. All of the Apollo landing sites have been imaged, as well as the majority of robotic landing and impact sites. Through the use of off-nadir slews a collection of stereo pairs is being acquired that enable 5-m scale topographic mapping [3-7]. Impact morphologies (terraces, impact melt, rays, etc.) are preserved in exquisite detail at all Copernican craters and are enabling new studies of impact mechanics and crater size-frequency distribution measurements [8-12]. Other topical studies including, for example, lunar pyroclastics, domes, and tectonics are underway [e.g., 10-17]. The first PDS data release of LROC data will be in March 2010, and will include all images from the commissioning phase and the first 3 months of the mapping phase. [1] Scholten et al. (2010) 41st LPSC, #2111; [2] Denevi et al. (2010a) 41st LPSC, #2263; [3] Beyer et al. (2010) 41st LPSC, #2678; [4] Archinal et al. (2010) 41st LPSC, #2609; [5] Mattson et al. (2010) 41st LPSC, #1871; [6] Tran et al. (2010) 41st LPSC, #2515; [7] Oberst et al. (2010) 41st LPSC, #2051; [8] Bray et al. (2010) 41st LPSC, #2371; [9] Denevi et al. (2010b) 41st LPSC, #2582; [10] Hiesinger et al. (2010a) 41st LPSC, #2278; [11] Hiesinger et al. (2010b) 41st LPSC, #2304; [12] van der Bogert et al. (2010) 41st LPSC, #2165;

  18. Development of filter exchangeable 3CCD camera for multispectral imaging acquisition

    NASA Astrophysics Data System (ADS)

    Lee, Hoyoung; Park, Soo Hyun; Kim, Moon S.; Noh, Sang Ha

    2012-05-01

    There are many methods to acquire multispectral images, but a dynamically band-selective, area-scan multispectral camera has not been developed yet. This research focused on development of a filter exchangeable 3CCD camera, modified from a conventional 3CCD camera. The camera consists of an F-mount lens, an image splitter without dichroic coating, three bandpass filters, three image sensors, a filter exchangeable frame, and an electric circuit for parallel image signal processing. In addition, firmware and application software have been developed. Remarkable improvements compared to a conventional 3CCD camera are its redesigned image splitter and filter exchangeable frame. Computer simulation is required to visualize the path of rays inside the prism when redesigning the image splitter. The dimensions of the splitter are then determined by computer simulation, with options of BK7 glass and non-dichroic coating. These properties have been considered to obtain full-wavelength rays on all film planes. The image splitter is verified by two line lasers with narrow wavebands. The filter exchangeable frame is designed to allow swapping bandpass filters without changing the displacement of the image sensors on the film plane. The developed 3CCD camera is evaluated by application to detection of scab and bruise on Fuji apple. As a result, the filter exchangeable 3CCD camera could provide meaningful functionality for various multispectral applications which need to exchange bandpass filters.
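    The ray-path simulation inside the BK7 splitter reduces to repeated application of Snell's law at each glass interface; a minimal sketch (the 30-degree incidence angle is an arbitrary example, but n = 1.5168 is the standard BK7 index at the 587.6 nm d line):

```python
import math

N_BK7 = 1.5168  # BK7 refractive index at the 587.6 nm d line

def refract_deg(theta_in_deg, n1=1.0, n2=N_BK7):
    """Snell's law n1*sin(t1) = n2*sin(t2); returns the refracted angle."""
    s = n1 * math.sin(math.radians(theta_in_deg)) / n2
    if abs(s) > 1.0:
        raise ValueError("total internal reflection")
    return math.degrees(math.asin(s))

# A ray entering the splitter glass at 30 degrees bends toward the normal:
inside_deg = refract_deg(30.0)                       # about 19.2 degrees

# Beyond the critical angle a ray inside BK7 cannot exit into air, which
# is the kind of constraint a splitter design must respect on every path.
critical_deg = math.degrees(math.asin(1.0 / N_BK7))  # about 41.3 degrees
```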

  19. Wide field camera observations of Baade's Window

    NASA Technical Reports Server (NTRS)

    Holtzman, Jon A.; Light, R. M.; Baum, William A.; Worthey, Guy; Faber, S. M.; Hunter, Deidre A.; O'Neil, Earl J., Jr.; Kreidl, Tobias J.; Groth, E. J.; Westphal, James A.

    1993-01-01

    We have observed a field in Baade's Window using the Wide Field Camera (WFC) of the Hubble Space Telescope (HST) and obtain V- and I-band photometry down to V approximately 22.5. These data go several magnitudes fainter than previously obtained from the ground. The location of the break in the luminosity function suggests that there are a significant number of intermediate age (less than 10 Gyr) stars in the Galactic bulge. This conclusion rests on the assumptions that the extinction towards our field is similar to that seen in other parts of Baade's Window, that the distance to the bulge is approximately 8 kpc, and that we can determine fairly accurate zero points for the HST photometry. Changes in any one of these assumptions could increase the inferred age, but a conspiracy of lower reddening, a shorter distance to the bulge, and/or photometric zero-point errors would be needed to imply a population entirely older than 10 Gyr. We infer an initial mass function slope for the main-sequence stars, and find that it is consistent with that measured in the solar neighborhood; unfortunately, the slope is poorly constrained because we sample only a narrow range of stellar mass and because of uncertainties in the observed luminosity function at the faint end.

  20. IMAGE-BASED PAN-TILT CAMERA CONTROL IN A MULTI-CAMERA SURVEILLANCE ENVIRONMENT

    E-print Network

    Davis, Larry

    IMAGE-BASED PAN-TILT CAMERA CONTROL IN A MULTI-CAMERA SURVEILLANCE ENVIRONMENT Ser-Nam Lim, Ahmed the cameras accurately. Each camera must be able to pan-tilt such that an object detected in the scene camera is assigned a pan-tilt zero-position. Position of an object detected in one camera is related

  1. 'Magic Angle Precession'

    SciTech Connect

    Binder, Bernd [Quanics.com, Germany, 88679 Salem, P.O. Box 1247 (United States)], E-mail: binder@quanics.com

    2008-01-21

    An advanced and exact geometric description of nonlinear precession dynamics modeling very accurately natural and artificial couplings showing Lorentz symmetry is derived. In the linear description it is usually ignored that the geometric phase of relativistic motion couples back to the orbital motion providing for a non-linear recursive precession dynamics. The high coupling strength in the nonlinear case is found to be a gravitomagnetic charge proportional to the precession angle and angular velocity generated by geometric phases, which are induced by high-speed relativistic rotations and are relevant to propulsion technologies but also to basic interactions. In the quantum range some magic precession angles indicating strong coupling in a phase-locked chaotic system are identified, emerging from a discrete time dynamical system known as the cosine map showing bifurcations at special precession angles relevant to heavy nuclei stability. The 'Magic Angle Precession' (MAP) dynamics can be simulated and visualized by cones rolling in or on each other, where the apex and precession angles are indexed by spin, charge or precession quantum numbers, and corresponding magic angles. The most extreme relativistic warping and twisting effect is given by the Dirac spinor half spin constellation with 'Hyperdiamond' MAP, which resembles quark confinement.
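    The "cosine map" named in the abstract is a one-parameter discrete dynamical system; assuming its common form x_{n+1} = r·cos(x_n), a few lines show the low-coupling regime where iterates settle on a fixed point (the bifurcations the abstract associates with special precession angles appear at larger r):

```python
import math

def cosine_map_orbit(r, x0=0.5, n_iter=500):
    """Iterate x -> r*cos(x) and return the final iterate."""
    x = x0
    for _ in range(n_iter):
        x = r * math.cos(x)
    return x

# At low coupling the orbit settles on the fixed point x* = r*cos(x*);
# for r = 1 this is the Dottie number, ~0.7391 rad.  Period-doubling
# bifurcations only appear once r grows past the stability threshold.
x_star = cosine_map_orbit(1.0)
```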

  2. The Clementine longwave infrared camera

    SciTech Connect

    Priest, R.E.; Lewis, I.T.; Sewall, N.R.; Park, H.S.; Shannon, M.J.; Ledebuhr, A.G.; Pleasance, L.D. [Lawrence Livermore National Lab., CA (United States); Massie, M.A. [Pacific Advanced Technology, Solvang, CA (United States); Metschuleit, K. [Amber/A Raytheon Co., Goleta, CA (United States)

    1995-04-01

    The Clementine mission provided the first ever complete, systematic surface mapping of the moon from the ultra-violet to the near-infrared regions. More than 1.7 million images of the moon, earth and space were returned from this mission. The longwave-infrared (LWIR) camera supplemented the UV/Visible and near-infrared mapping cameras providing limited strip coverage of the moon, giving insight to the thermal properties of the soils. This camera provided ~100 m spatial resolution at 400 km periselene, and a 7 km across-track swath. This 2.1 kg camera using a 128 x 128 Mercury-Cadmium-Telluride (MCT) FPA viewed thermal emission of the lunar surface and lunar horizon in the 8.0 to 9.5 µm wavelength region. A description of this light-weight, low power LWIR camera along with a summary of lessons learned is presented. Design goals and preliminary on-orbit performance estimates are addressed in terms of meeting the mission's primary objective for flight qualifying the sensors for future Department of Defense flights.

  3. A wide-field infrared camera for the Observatoire du mont Mégantic

    NASA Astrophysics Data System (ADS)

    Artigau, Etienne; Doyon, Rene; Nadeau, Daniel; Vallee, Philippe; Thibault, Simon

    2003-03-01

    A wide-field near-infrared (0.8-2.4 µm) camera for the 1.6 m telescope of the Observatoire du mont Mégantic (OMM) is currently under construction at the Université de Montréal. The field of view is 30' × 30' and will have very little distortion. The optics comprise 8 spherical cryogenic lenses. The instrument features two filter wheels with provision for 10 filters, including broad-band I, z, J, H, K and other narrow-band filters. The camera is based on a 2048 × 2048 HgCdTe Hawaii-2 detector driven by a 32-output SDSU-II controller operating at ~250 kHz.

  4. Characterization of a PET Camera Optimized for Prostate Imaging

    SciTech Connect

    Huber, Jennifer S.; Choong, Woon-Seng; Moses, William W.; Qi, Jinyi; Hu, Jicun; Wang, G.C.; Wilson, David; Oh, Sang; Huesman, Ronald H.; Derenzo, Stephen E.

    2005-11-11

    We present the characterization of a positron emission tomograph for prostate imaging that centers a patient between a pair of external curved detector banks (ellipse: 45 cm minor, 70 cm major axis). The distance between detector banks adjusts to allow patient access and to position the detectors as closely as possible for maximum sensitivity with patients of various sizes. Each bank is composed of two axial rows of 20 HR+ block detectors for a total of 80 detectors in the camera. The individual detectors are angled in the transaxial plane to point towards the prostate to reduce resolution degradation in that region. The detectors are read out by modified HRRT data acquisition electronics. Compared to a standard whole-body PET camera, our dedicated-prostate camera has the same sensitivity and resolution, less background (less randoms and lower scatter fraction) and a lower cost. We have completed construction of the camera. Characterization data and reconstructed images of several phantoms are shown. Sensitivity of a point source in the center is 946 cps/µCi. Spatial resolution is 4 mm FWHM in the central region.

  5. Measuring BRDF using a single still digital camera

    NASA Astrophysics Data System (ADS)

    Xiong, Hanwei; Xu, Jun; Wang, Jinming

    2011-11-01

    BRDF (Bidirectional Reflectance Distribution Function) is broadly used in many fields, such as physics, heat transfer, remote sensing, and computer graphics. Traditional methods to measure BRDF are expensive for most people, and image-based approaches have become a novel direction. Until now, for such an image-based system, at least a video camera and a still camera have been indispensable, and the operations are not easily carried out under convenient conditions. In this paper, a method using only one still camera is proposed, with the help of a light source, a cylindrical support, and a sphere. The material to be measured is painted on the sphere, which is placed on the cylindrical support painted with a BRDF-known material. Around the support, a simple net of control points is distributed. In the measurement process, the light source and the support are fixed, and the operator goes around the sphere to obtain pictures at different view angles; the rest of the work is finished automatically by a set of programs. The pictures are first processed by a photogrammetric program to get the geometry of the scene, including the positions, directions, and shapes of the light source, the support, the sphere, and the cameras. The BRDF samples are calculated from the image intensities and the obtained geometric relations, and are approximated by a multivariable spline to get a full BRDF description. Three different materials are tested with the method.
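    Computing a BRDF sample from image intensity and geometry follows the standard definition: reflected radiance divided by incident irradiance on the surface. A minimal sketch with synthetic Lambertian data (all values are hypothetical, not from the paper's measurements):

```python
import math

def brdf_sample(radiance_out, irradiance_normal, theta_i_deg):
    """f_r = L_o / (E_n * cos(theta_i)), in 1/sr.

    radiance_out      -- reflected radiance seen by the camera (W/m^2/sr)
    irradiance_normal -- source irradiance on a surface facing the source (W/m^2)
    theta_i_deg       -- angle between the source direction and surface normal
    """
    return radiance_out / (irradiance_normal * math.cos(math.radians(theta_i_deg)))

# Sanity check with a synthetic Lambertian patch of albedo rho, whose
# BRDF is the constant rho/pi at every angle.
rho, e_n, theta = 0.5, 100.0, 40.0
radiance = (rho / math.pi) * e_n * math.cos(math.radians(theta))
f_r = brdf_sample(radiance, e_n, theta)      # recovers rho/pi ~ 0.159
```

    In the actual system, radiance comes from calibrated image intensity and the angles come from the photogrammetric reconstruction of the sphere, support, source, and camera positions.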

  6. Dark Energy Camera for Blanco

    SciTech Connect

    Binder, Gary A.; /Caltech /SLAC

    2010-08-25

    In order to make accurate measurements of dark energy, a system is needed to monitor the focus and alignment of the Dark Energy Camera (DECam) to be located on the Blanco 4m Telescope for the upcoming Dark Energy Survey. One new approach under development is to fit out-of-focus star images to a point spread function from which information about the focus and tilt of the camera can be obtained. As a first test of a new algorithm using this idea, simulated star images produced from a model of DECam in the optics software Zemax were fitted. Then, real images from the Mosaic II imager currently installed on the Blanco telescope were used to investigate the algorithm's capabilities. A number of problems with the algorithm were found, and more work is needed to understand its limitations and improve its capabilities so it can reliably predict camera alignment and focus.

  7. THZ EMISSION SPECTROSCOPY OF NARROW BANDGAP SEMICONDUCTORS

    E-print Network

    Wilke, Ingrid

    THz Emission Spectroscopy of Narrow Bandgap Semiconductors, by Ricardo Ascázubi. A thesis covering optically excited THz emission processes and THz time-domain spectroscopy (THz-TDS).

  8. Geiger-mode ladar cameras

    NASA Astrophysics Data System (ADS)

    Yuan, Ping; Sudharsanan, Rengarajan; Bai, Xiaogang; Boisvert, Joseph; McDonald, Paul; Labios, Eduardo; Morris, Bryan; Nicholson, John P.; Stuart, Gary M.; Danny, Harrison; Van Duyne, Stephen; Pauls, Greg; Gaalema, Stephen

    2011-06-01

    The performance of Geiger-mode LAser Detection and Ranging (LADAR) cameras is primarily defined by individual pixel attributes, such as dark count rate (DCR), photon detection efficiency (PDE), jitter, and crosstalk. However, for the expanding LADAR imaging applications, other factors, such as image uniformity, component tolerance, manufacturability, reliability, and operational features, have to be considered. Recently we have developed new 32×32 and 32×128 Read-Out Integrated Circuits (ROIC) for LADAR applications. With multiple filter and absorber structures, the 50-µm-pitch arrays demonstrate pixel crosstalk less than the 100 ppm level, while maintaining a PDE greater than 40% at 4 V overbias. Besides the improved epitaxial and process uniformity of the APD arrays, the new ROICs implement a Non-uniform Bias (NUB) circuit providing 4-bit bias voltage tunability over a 2.5 V range to individually bias each pixel. All these features greatly increase the performance uniformity of the LADAR camera. Cameras based on these ROICs were integrated with a data acquisition system developed by Boeing DES. The 32×32 version has a range gate of up to 7 µs and can cover a range window of about 1 km with 14-bit and 0.5 ns timing resolution. The 32×128 camera can be operated at a frame rate of up to 20 kHz with 0.3 ns and 14-bit time resolution through a full CameraLink. The performance of the 32×32 LADAR camera has been demonstrated in a series of field tests on various vehicles.
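    The gate and timing numbers quoted above can be checked with the round-trip light-travel relation, range = c·t/2:

```python
C_M_PER_S = 299_792_458.0  # speed of light

def range_window_m(round_trip_s):
    """One-way range spanned by a round-trip timing window: c*t/2."""
    return C_M_PER_S * round_trip_s / 2.0

gate_range_m = range_window_m(7e-6)    # 7 us gate  -> ~1049 m ("about 1 km")
bin_depth_m = range_window_m(0.5e-9)   # 0.5 ns bin -> ~7.5 cm range resolution
```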

  9. Frequency-narrowed diode array bar.

    PubMed

    Babcock, Earl; Chann, Bien; Nelson, Ian A; Walker, Thad G

    2005-05-20

    We describe a method to frequency-narrow multielement high-power diode bars. Using a commercial 60-W, 49-element, 1-cm-long diode array bar at 795 nm running at 45 W, we narrow the linewidth from 1000 to 64 GHz with a loss of only 33% in output power. The resulting laser light is well suited for spin-exchange optical pumping of noble gas nuclei. PMID:15929304

  10. Solder wetting kinetics in narrow V-grooves

    SciTech Connect

    Yost, F.G.; Rye, R.R. [Sandia National Labs., Albuquerque, NM (United States)]; Mann, J.A. Jr. [Case Western Reserve Univ., Cleveland, OH (United States)]

    1997-12-01

    Experiments are performed to observe capillary flow in grooves cut into copper surfaces. Flow kinetics of two liquids, 1-heptanol and eutectic Sn-Pb solder, are modeled with modified Washburn kinetics and compared to flow data. It is shown that both liquids flow parabolically in narrow V-grooves, and the data scale as predicted by the modified Washburn model. The early portions of the flow kinetics are characterized by curvature in the length vs time relationship which is not accounted for in the modified Washburn model. This effect is interpreted in terms of a dynamic contact angle. It is concluded that under conditions of rapid flow, solder spreading can be understood as a simple fluid flow process. Slower kinetics, e.g. solder droplet spreading on flat surfaces, may be affected by subsidiary chemical processes such as reaction.
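
    The modified Washburn kinetics described above predict a wetted length that grows as the square root of time, L(t) = k√t. A minimal sketch of fitting such data (an illustrative least-squares fit through the origin of L² against t; the authors' actual fitting procedure is not specified in the abstract):

```python
import math

def washburn_length(t, k):
    """Modified Washburn kinetics: wetted length grows parabolically,
    L(t) = k * sqrt(t)."""
    return k * math.sqrt(t)

def fit_washburn_k(times, lengths):
    """Least-squares fit of L^2 = k^2 * t through the origin:
    k^2 = sum(t * L^2) / sum(t^2)."""
    num = sum(t * L * L for t, L in zip(times, lengths))
    den = sum(t * t for t in times)
    return math.sqrt(num / den)
```

    Data that "scale as predicted" collapse onto a single line when L² is plotted against t; the early-time curvature the authors mention shows up as a systematic deviation from that line.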

  11. Measurement of small angle using phase shifted Lau interferometry

    NASA Astrophysics Data System (ADS)

    Disawal, Reena; Dhanotia, Jitendra; Prakash, Shashi

    2014-10-01

    An incoherent white-light source illuminates a set of two identical gratings placed in tandem, resulting in the generation of a Fresnel image. This image is projected onto a reflecting object, and the reflected images from the object are projected onto a third grating. The resulting moiré fringes are recorded using a CCD camera. The inclination angle of the object is a function of the interferometric phase, and phase-shifting interferometry has been used to determine that phase. Hence, accurate determination of small tilt angles of the object surface could be successfully undertaken. The technique is automated and provides high precision in measurement.
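
    Phase-shifting interferometry recovers the interferometric phase from several fringe images taken at known phase shifts. A minimal sketch of the common four-step algorithm (the abstract does not state which phase-shifting algorithm was used):

```python
import math

def four_step_phase(i0, i1, i2, i3):
    """Four-step phase-shifting algorithm for frames with phase shifts of
    0, 90, 180 and 270 degrees, i.e. I_k = A + B*cos(phi + k*pi/2).
    Since I3 - I1 = 2B*sin(phi) and I0 - I2 = 2B*cos(phi),
    the wrapped phase is atan2(I3 - I1, I0 - I2)."""
    return math.atan2(i3 - i1, i0 - i2)
```

    The tilt angle then follows from the recovered phase through the geometry of the Lau setup.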

  12. Analysis and protection of stray light for the space camera at geosynchronous orbit

    NASA Astrophysics Data System (ADS)

    Jin, Xiaorui; Lin, Li

    2012-11-01

    Stray light is the general term for all non-normal transmission of light in an optical system, and its influence depends on the optical system's structure. A large-area-array camera at geosynchronous orbit faces a particularly serious stray light problem, especially at small incidence angles of sunlight on the system, so a detailed stray light analysis of the basic form of the optical system is needed. In this paper, the influence of stray light on a space camera and the necessity of eliminating it are presented, and the definitions of the stray light coefficient and PST (point source transmittance) are briefly introduced. In TracePro, the impact of sunlight incident on the space camera at different angles was analyzed, with the stray light coefficient used for quantitative evaluation. The design principles of the inner and outer hoods are presented for the R-C (Ritchey-Chrétien) optical system. On this basis, to reduce stray light interference for the space camera, hoods for the primary and secondary mirrors were designed. Finally, when the incidence angle of sunlight on the space camera exceeds 3°, the stray light coefficient is less than 2%, which meets the engineering requirements.

  13. What Is the Angle?

    NSDL National Science Digital Library

    This activity will help students understand how the angle of the Sun affects temperatures around the globe. After experimenting with a heat lamp and thermometers at differing angles, students apply what they learned to explain temperature variations on Earth. The printable six-page handout includes a series of inquiry-based questions to get students thinking about what they already know about temperature patterns, detailed experiment directions, and a worksheet that will help students use the experiment results to gain a deeper understanding of seasonal temperature changes and why Antarctica is always so cold. The students will explore all the angles of sunlight with a few thermometers and a heat lamp and understand why there is such a dramatic temperature change between the equator and the South Pole.

  14. SuperWASP: Wide Angle Search for Planets

    E-print Network

    R. A. Street; D. L. Pollacco; A. Fitzsimmons; F. P. Keenan; Keith Horne; S. Kane; A. Collier Cameron; T. A. Lister; C. Haswell; A. J. Norton; B. W. Jones; I. Skillen; S. Hodgkin; P. Wheatley; R. West; D. Brett

    2002-08-12

    SuperWASP is a fully robotic, ultra-wide angle survey for planetary transits. Currently under construction, it will consist of 5 cameras, each monitoring a 9.5 x 9.5 deg field of view. The Torus mount and enclosure will be fully automated and linked to a built-in weather station. We aim to begin observations at the beginning of 2003.

  15. Contact Angle Measurements Using a Simplified Experimental Setup

    ERIC Educational Resources Information Center

    Lamour, Guillaume; Hamraoui, Ahmed; Buvailo, Andrii; Xing, Yangjun; Keuleyan, Sean; Prakash, Vivek; Eftekhari-Bafrooei, Ali; Borguet, Eric

    2010-01-01

    A basic and affordable experimental apparatus is described that measures the static contact angle of a liquid drop in contact with a solid. The image of the drop is made with a simple digital camera by taking a picture that is magnified by an optical lens. The profile of the drop is then processed with ImageJ free software. The ImageJ contact…

  16. What's the Angle?

    NSDL National Science Digital Library

    American Museum of Natural History

    2002-01-01

    This activity helps learners understand how the angle of the Sun affects temperatures around the globe. After experimenting with a heat lamp and thermometers at differing angles, learners apply what they learned to explain temperature variations on Earth. The printable six-page handout includes a series of inquiry-based questions to get learners thinking about what they already know about temperature patterns, detailed experiment directions, and a worksheet that helps learners use the experiment results to gain a deeper understanding of seasonal temperature changes and why Antarctica is always cold.

  17. Camera assisted multimodal user interaction

    NASA Astrophysics Data System (ADS)

    Hannuksela, Jari; Silvén, Olli; Ronkainen, Sami; Alenius, Sakari; Vehviläinen, Markku

    2010-01-01

    Since more processing power and new sensing and display technologies are already available in mobile devices, there has been increased interest in building systems that communicate via different modalities such as speech, gesture, expression, and touch. In user interfaces based on context identification, these independent modalities are combined to create new ways for users to interact with hand-helds. While these are unlikely to completely replace traditional interfaces, they will considerably enrich and improve the user experience and task performance. We demonstrate a set of novel user interface concepts that rely on the built-in sensors of modern mobile devices to recognize the context and sequences of actions. In particular, we use the camera to detect whether the user is watching the device, for instance, to decide when to turn on the display backlight. In our approach the motion sensors are first employed to detect handling of the device. Then, based on ambient illumination information provided by a light sensor, the cameras are turned on. The frontal camera is used for face detection, while the back camera provides supplemental contextual information. The subsequent applications triggered by the context can be, for example, image capturing or bar code reading.
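
    The sensing sequence described above (motion first, then ambient light, then frontal-camera face detection) can be sketched as a simple decision chain; the light threshold below is an illustrative assumption, not a value from the paper:

```python
def should_enable_backlight(device_moving, ambient_lux, face_detected):
    """Sketch of the staged context decision described above.

    Stages: (1) motion sensors must indicate the device is being handled,
    (2) the ambient light sensor gates whether the cameras are worth turning
    on, (3) the frontal camera's face detection makes the final call.
    The 5.0 lux cut-off is a hypothetical threshold for illustration only.
    """
    if not device_moving:
        return False          # device is at rest; keep the backlight off
    if ambient_lux < 5.0:     # assumed: too dark for reliable face detection
        return False
    return face_detected      # user is looking at the device
```
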

  18. LROC - Lunar Reconnaissance Orbiter Camera

    Microsoft Academic Search

    M. S. Robinson; E. Bowman-Cisneros; S. M. Brylow; E. Eliason; H. Hiesinger; B. L. Jolliff; A. S. McEwen; M. C. Malin; D. Roberts; P. C. Thomas; E. Turtle

    2006-01-01

    The Lunar Reconnaissance Orbiter Camera (LROC) is designed to address two of the prime LRO measurement requirements. 1) Assess meter and smaller-scale features to facilitate safety analysis for potential lunar landing sites near polar resources, and elsewhere on the Moon. 2) Acquire multi-temporal synoptic imaging of the poles every orbit to characterize the polar illumination environment (100 m scale), identifying

  19. Multi-camera video surveillance

    Microsoft Academic Search

    Tim Ellis

    2002-01-01

    This paper describes the development of a multi-view video surveillance and the algorithms to detect and track objects (generally low densities of pedestrians, cyclists and motor vehicles) moving through an outdoor environment imaged by a network of video surveillance cameras. The system is designed to adapt to the widely varying illumination conditions present in such outdoor scenes, as well as

  20. High-speed pulse camera

    NASA Technical Reports Server (NTRS)

    Lawson, J. R.

    1968-01-01

    Miniaturized, 16 mm high speed pulse camera takes spectral photometric photographs upon instantaneous command. The design includes a low-friction, low-inertia film transport, a very thin beryllium shutter driven by a low-inertia stepper motor for minimum actuation time after a pulse command, and a binary encoder.

  1. TRACKING WITH A PAN-TILT-ZOOM CAMERA FOR AN ACC SYSTEM X. Clady, F. Collange, F. Jurie and P. Martinet

    E-print Network

    Paris-Sud XI, Université de

    TRACKING WITH A PAN-TILT-ZOOM CAMERA FOR AN ACC SYSTEM. X. Clady, F. Collange, F. Jurie and P. Martinet. The problem of [...] frontal view in intelligent cars is considered. A Pan-Tilt-Zoom (PTZ) camera, controlled for zoom and pan-tilt angles, represents the hardware part of the system and is used to track preceding vehicles. It provides image

  2. Applications of visible CCD cameras on the Alcator C-Mod C. J. Boswell, J. L. Terry, B. Lipschultz, J. Stillerman

    E-print Network

    Boswell, Christopher

    a wide-angle view of the tokamak. All five of the CCD cameras are off-the-shelf remote-head "pencil" cameras [...] field coils and magnetic fields of up to 4 T. Fig. 1 shows the location of the cameras in the reentrant

  3. Contribution to the study of narrow low mass hadronic structures

    E-print Network

    B. Tatischeff; E. Tomasi-Gustafsson

    2008-02-01

    New data are presented, concerning narrow exotic structures in mesons, baryons and dibaryons. The sequence of narrow baryons is quite well described starting from the sequence of narrow mesons. In the same way, the sequence of narrow dibaryons is rather well described starting from the sequence of narrow baryons. Lastly it is shown that the masses of these narrow hadronic structures lie on straight line Regge-like trajectories.

  4. In-flight calibration of the EOS/Multi-angle Imaging SpectroRadiometer (MISR)

    Microsoft Academic Search

    Carol J. Bruegge; Brian G. Chafin; David J. Diner; Robert R. Ando

    2001-01-01

    ABSTRACT. The Multi-angle Imaging SpectroRadiometer (MISR) is one of five instruments on the EOS/Terra spacecraft. MISR consists of nine Earth-viewing cameras which continuously acquire global data sets in view perspectives from nadir to 70°. In order to maintain the radiometric calibration of the cameras, the instrument is equipped with an on-board calibrator. Spectralon panels, deployed at bi-monthly intervals, reflect sunlight

  5. Optimal Positioning of Multiple Cameras for Object Recognition using Cramer-Rao Lower Bound

    Microsoft Academic Search

    F. Farshidi; Shahin Sirouspour; Thia Kirubarajan

    2006-01-01

    In this paper the problem of active object recognition/pose estimation is investigated. Principal component analysis is used to produce an observation vector from images captured simultaneously by multiple cameras from different view angles of an object belonging to a set of a priori known objects. Models of occlusion and sensor noise have been incorporated into a probabilistic model of

  6. The effect of illumination on the precision of photogrammetric measurements using Apollo metric camera photographs

    Microsoft Academic Search

    S. S. C. Wu

    1976-01-01

    The effect of illumination conditions on the precision or standard error of measurements using photogrammetric techniques is studied because of the scientific importance of the metric camera photography. Two factors related to illumination conditions are major considerations for selecting and planning lunar photography for photogrammetric purposes: amount of area shadowed and unusable, and effect of high sun angles on image

  7. Casting and Angling.

    ERIC Educational Resources Information Center

    Little, Mildred J.; Bunting, Camille

    The self-contained packet contains background information, lesson plans, 15 transparency and student handout masters, drills and games, 2 objective examinations, and references for teaching a 15-day unit on casting and angling to junior high and senior high school students, either as part of a regular physical education program or as a club…

  8. Ring laser angle encoder

    NASA Technical Reports Server (NTRS)

    Coccoli, J. D.; Lawson, J. R.; Mc Garty, T. P.; Nickles, J. E.

    1969-01-01

    A ring laser angle encoder with a scanning photometer autocollimator and an isolation axis provides continuous digital readout. It measures the angular difference in inertial attitudes of targets (any phenomena generating or reflecting a light beam), taken two at a time relative to target one.

  9. Casting and Angling.

    ERIC Educational Resources Information Center

    Smith, Julian W.

    As part of a series of books and pamphlets on outdoor education, this manual consists of easy-to-follow instructions for fishing activities dealing with casting and angling. The manual may be used as a part of the regular physical education program in schools and colleges or as a club activity for the accomplished weekend fisherman or the…

  10. Robust Estimation of 3-D Camera Motion for Uncalibrated Augmented Reality

    Microsoft Academic Search

    Annie Yao; Andrew Calway

    2002-01-01

    We describe initial work on a system for augmenting video sequences with 3-D graphics or animations so that they appear to be present within the scene. Our aim is to do this in real-time for sequences captured by uncalibrated 'live' cameras, such as a hand-held or wearable. These sequences typically contain jitter and can have narrow baselines between

  11. An assessment of the on-orbit performance of the CALIPSO wide field camera

    Microsoft Academic Search

    Michael C. Pitts; Larry W. Thomason; Yongxiang Hu; David M. Winker

    2007-01-01

    The Wide Field Camera (WFC) is one of three instruments in the CALIPSO science payload, with the other two being the Cloud-Aerosol LIdar with Orthogonal Polarization (CALIOP) and the Infrared Imaging Radiometer (IIR). The WFC is a narrow-band, push-broom imager that provides continuous high-spatial-resolution imagery during the daylight segments of the orbit over a swath centered on the CALIOP footprint.

  12. Boiling Visualization and Critical Heat Flux Phenomena In Narrow Rectangular Gap

    SciTech Connect

    J. J. Kim; Y. H. Kim; S. J. Kim; S. W. Noh; K. Y. Suh; J. Rempe; F. B. Cheung; S. B. Kim

    2004-12-01

    An experimental study was performed to investigate pool boiling critical heat flux (CHF) on one-dimensional inclined rectangular channels with narrow gaps by changing the orientation of a copper test heater assembly. In a pool of saturated water at atmospheric pressure, the test parameters included gap sizes of 1, 2, 5, and 10 mm and surface orientation angles from the downward-facing position (180 degrees) to the vertical position (90 degrees), respectively.

  13. Image quality testing of assembled IR camera modules

    NASA Astrophysics Data System (ADS)

    Winters, Daniel; Erichsen, Patrik

    2013-10-01

    Infrared (IR) camera modules for the LWIR (8-12 µm) that combine IR imaging optics with microbolometer focal plane array (FPA) sensors and readout electronics are increasingly becoming a mass-market product. At the same time, steady improvements in sensor resolution in the higher-priced markets raise the requirements on the imaging performance of objectives and on the proper alignment between objective and FPA. This puts pressure on camera manufacturers and system integrators to assess the image quality of finished camera modules in a cost-efficient and automated way for quality control or during end-of-line testing. In this paper we present recent development work in the field of image quality testing of IR camera modules. This technology provides a wealth of additional information, in contrast to more traditional test methods like the minimum resolvable temperature difference (MRTD), which give only a subjective overall test result. Parameters that can be measured include image quality via the modulation transfer function (MTF), broadband or with various bandpass filters, on- and off-axis, and optical parameters such as effective focal length (EFL) and distortion. If the camera module allows refocusing of the optics, additional parameters like the best focus plane, image plane tilt, auto-focus quality, chief ray angle, etc. can be characterized. Additionally, the homogeneity and response of the sensor with the optics can be characterized in order to calculate the appropriate tables for non-uniformity correction (NUC). The technology can also be used to control active alignment during mechanical assembly of optics to high-resolution sensors. Other important points discussed are the flexibility of the technology to test IR modules with different form factors and electrical interfaces and, last but not least, its suitability for fully automated measurements in mass production.
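
    In the simplest one-dimensional picture, the MTF is the normalised magnitude of the Fourier transform of the measured line spread function (LSF). A stdlib-only sketch of that reduction (real test stations typically use slanted-edge methods, which this does not implement):

```python
import cmath

def mtf_from_lsf(lsf):
    """MTF as the magnitude of the DFT of a sampled line spread function,
    normalised so that MTF at zero spatial frequency equals 1.
    Returns magnitudes for frequencies 0 .. n//2 (the non-redundant half)."""
    n = len(lsf)
    dft = [abs(sum(lsf[k] * cmath.exp(-2j * cmath.pi * f * k / n)
                   for k in range(n)))
           for f in range(n // 2 + 1)]
    return [m / dft[0] for m in dft]
```

    A perfectly sharp (impulse-like) LSF gives MTF = 1 at all frequencies, while a broad LSF suppresses the high-frequency response.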

  14. Silicone Contamination Camera Developed for Shuttle Payloads

    NASA Technical Reports Server (NTRS)

    1996-01-01

    On many shuttle missions, silicone contamination from unknown sources from within or external to the shuttle payload bay has been a chronic problem plaguing experiment payloads. There is currently a wide range of silicone usage on the shuttle. Silicones are used to coat the shuttle tiles to enhance their ability to shed rain, and over 100 kg of RTV 560 silicone is used to seal white tiles to the shuttle surfaces. Silicones are also used in electronic components, potting compounds, and thermal control blankets. Efforts to date to identify and eliminate the sources of silicone contamination have not been highly successful and have created much controversy. To identify the sources of silicone contamination on the space shuttle, the NASA Lewis Research Center developed a contamination camera. This specially designed pinhole camera utilizes low-Earth-orbit atomic oxygen to develop a picture that identifies sources of silicone contamination on shuttle-launched payloads. The volatile silicone species travel through the aperture of the pinhole camera, and since volatile silicone species lose their hydrocarbon functionalities under atomic oxygen attack, the silicone adheres to the substrate as SiO_x. This glassy deposit should be spatially arranged in the image of the sources of silicone contamination. To view the contamination image, one can use ultrasensitive thickness measurement techniques, such as scanning variable-angle ellipsometry, to map the surface topography of the camera's substrate. The demonstration of a functional contamination camera would resolve the controversial debate concerning the amount and location of contamination sources, would allow corrective actions to be taken, and would demonstrate a useful tool for contamination documentation on future shuttle payloads, with near negligible effect on cost and weight.

  15. General linear cameras : theory and applications

    E-print Network

    Yu, Jingyi, 1978-

    2005-01-01

    I present a General Linear Camera (GLC) model that unifies many previous camera models into a single representation. The GLC model describes all perspective (pinhole), orthographic, and many multiperspective (including ...

  16. Collaborative real-time scheduling of multiple PTZ cameras for multiple object tracking in video surveillance

    NASA Astrophysics Data System (ADS)

    Liu, Yu-Che; Huang, Chung-Lin

    2013-03-01

    This paper proposes a multi-PTZ-camera control mechanism to acquire close-up imagery of human objects in a surveillance system. The control algorithm is based on the output of multi-camera, multi-target tracking. The three main concerns of the algorithm are (1) imagery of the human object's face for biometric purposes, (2) optimal video quality of the human objects, and (3) minimum hand-off time. Here, we define an objective function based on expected capture conditions such as the camera-subject distance, pan-tilt angles of capture, face visibility, and others. This objective function serves to effectively balance the number of captures per subject and the quality of captures. In the experiments, we demonstrate the performance of the system, which operates in real time under real-world conditions on three PTZ cameras.
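
    A hypothetical form of such a capture-quality objective, scoring one candidate camera/subject assignment; the weights, ideal distance, and angle cut-off below are illustrative assumptions, not values from the paper:

```python
import math

def capture_score(distance_m, pan_deg, tilt_deg, face_visibility,
                  w_dist=0.4, w_angle=0.3, w_face=0.3,
                  ideal_dist=5.0, max_angle=45.0):
    """Hypothetical weighted objective combining the capture conditions named
    in the abstract: camera-subject distance, pan-tilt angles of capture, and
    face visibility (0..1). All weights and reference values are assumed.
    Returns a score in [0, 1]; higher means a better expected capture."""
    # Distance term peaks at the assumed ideal stand-off distance.
    dist_term = math.exp(-abs(distance_m - ideal_dist) / ideal_dist)
    # Angle penalty grows with total pan+tilt deviation, saturating at 1.
    angle_pen = min(1.0, (abs(pan_deg) + abs(tilt_deg)) / max_angle)
    return (w_dist * dist_term
            + w_angle * (1.0 - angle_pen)
            + w_face * face_visibility)
```

    A scheduler would evaluate this score for every camera/subject pair and hand off subjects so as to maximise the total, which is one way to balance capture count against capture quality.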

  17. Angle Sense: A Valuable Connector.

    ERIC Educational Resources Information Center

    Rubenstein, Rheta N.; And Others

    1993-01-01

    Proposes angle sense as a fundamental connector between mathematical concepts for middle grade students. Introduces the use of pattern blocks and a goniometer, a tool to measure angles, to help students develop angle sense. Discusses connections between angle measurement and the concepts of rational numbers, circles, area, number theory,…

  18. A Different Angle on Perspective

    ERIC Educational Resources Information Center

    Frantz, Marc

    2012-01-01

    When a plane figure is photographed from different viewpoints, lengths and angles appear distorted. Hence it is often assumed that lengths, angles, protractors, and compasses have no place in projective geometry. Here we describe a sense in which certain angles are preserved by projective transformations. These angles can be constructed with…

  19. Measurement of the surface wavelength distribution of narrow-band radiation by a colorimetric method

    SciTech Connect

    Kraiskii, A V; Mironova, T V; Sultanov, T T [P N Lebedev Physical Institute, Russian Academy of Sciences, Moscow (Russian Federation)

    2010-09-10

    A method is suggested for determining the wavelength of narrow-band light from a digital photograph of a radiating surface. The digital camera used should be appropriately calibrated. The accuracy of the wavelength measurement is better than 1 nm. The method was tested on the yellow doublet of mercury spectrum and on the adjacent continuum of the incandescent lamp radiation spectrum. By means of the method suggested the homogeneity of holographic sensor swelling was studied in stationary and transient cases. (laser applications and other topics in quantum electronics)
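
    The idea of mapping a calibrated camera's colour response to wavelength can be sketched as interpolation of hue against a calibration table; the (hue, wavelength) pairs below are invented for illustration and are not real calibration data:

```python
import colorsys

# Hypothetical (hue, wavelength_nm) calibration pairs, hue ascending.
# A real implementation would build this table from the calibrated camera.
CALIBRATION = [
    (0.00, 650.0),   # red
    (0.17, 580.0),   # yellow
    (0.33, 530.0),   # green
    (0.60, 470.0),   # blue
]

def wavelength_from_rgb(r, g, b):
    """Estimate the wavelength of narrow-band light from camera RGB (0..1)
    by linearly interpolating hue against the calibration table."""
    h, _, _ = colorsys.rgb_to_hsv(r, g, b)
    for (h0, w0), (h1, w1) in zip(CALIBRATION, CALIBRATION[1:]):
        if h0 <= h <= h1:
            frac = (h - h0) / (h1 - h0)
            return w0 + frac * (w1 - w0)
    return CALIBRATION[-1][1]  # clamp hues past the last calibration point
```

    The sub-nanometre accuracy reported in the abstract would demand a far denser calibration than this sketch, but the interpolation structure is the same.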

  20. Normal Q-angle in an Adult Nigerian Population

    Microsoft Academic Search

    Bade B. Omololu; Olusegun S. Ogunlade; Vinod K. Gopaldasani

    2009-01-01

    The Q-angle has been studied among the adult Caucasian population with the establishment of reference values. Scientists are beginning to accept the concept of different human races. Physical variability exists between various African ethnic groups and Caucasians, as exemplified by differences in anatomic features such as a flat nose compared with a pointed nose, wide rather than narrow faces, and

  1. Radio properties of narrow-lined Seyfert 1 galaxies

    NASA Technical Reports Server (NTRS)

    Ulvestad, James S.; Antonucci, Robert R. J.; Goodrich, Robert W.

    1995-01-01

    We have observed seven narrow-lined Seyfert 1 (NLS1) galaxies and one high-ionization Seyfert 2 galaxy with the Very Large Array (VLA). Combining these observations with published data, we summarize the radio properties of the NLS1 galaxies for which spectropolarimetry was reported by Goodrich. Fifteen of these 17 objects now have published radio observations of high sensitivity, and only nine of those have been detected. For a Hubble parameter of 75 km/s/Mpc, the 6 cm radio powers range from 10(exp 20) to 10(exp 23) W/Hz, within the range previously found for other types of Seyfert galaxy. The median radio size of the nine VLA-detected galaxies is no larger than 300 pc, similar to the median size found by Ulvestad & Wilson for a distance-limited sample of Seyfert galaxies. Of the six NLS1 galaxies known to have significant intrinsic optical polarization, three have measurable radio axes. Two of those three galaxies have radio major axes close to 90 deg from their polarization position angles, while the third has an inner radio axis that may be nearly parallel to the polarization position angle. The former relationship is expected for a Seyfert 1 in a unified model of Seyfert galaxies, assuming no intrinsic continuum polarization.

  2. A method for measuring the base angle of axicon lens based on chromatic dispersion

    NASA Astrophysics Data System (ADS)

    Zhang, Yunbo; Zeng, Aijun; Wang, Ying; Huang, Huijie

    2015-07-01

    A method for measuring the base angle of an axicon lens is presented. The method utilizes two coaxial laser beams of different wavelengths. When the two laser beams pass through the axicon lens, a small divergence angle arises between them as a result of chromatic dispersion. After being collected by an achromatic lens, the two beams generate two spots on an image camera. The base angle can be determined from the distance between the two spots recorded by the image sensor. Furthermore, this method can also be used to calculate the cone angle of the axicon lens.
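
    Under the small-angle approximation, an axicon of refractive index n deflects a ray by β = (n − 1)·α, so two wavelengths with indices n1 and n2 diverge by Δβ = |n1 − n2|·α and land a distance d = f·Δβ apart in the focal plane of the achromat. A sketch of inverting this for the base angle (our reading of the geometry, not the paper's stated formula):

```python
def axicon_base_angle(spot_sep_m, focal_len_m, n1, n2):
    """Small-angle estimate of the axicon base angle alpha (radians).

    Each wavelength is deflected by beta = (n - 1) * alpha, so the two spots
    behind an achromatic lens of focal length f are separated by
    d = f * |n1 - n2| * alpha.  Solving for alpha:
    """
    return spot_sep_m / (focal_len_m * abs(n1 - n2))
```
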

  3. Camera cooperation for achieving visual attention

    Microsoft Academic Search

    Radu Horaud; David Knossow; Markus Michaelis

    2006-01-01

    In this paper we address the problem of establishing a computational model for visual attention using cooperation between two cameras. More specifically we wish to maintain a visual event within the field of view of a rotating and zooming camera through the understanding and modeling of the geometric and kinematic coupling between a static camera and an

  4. CSc 165 Lecture Note Slides Camera Control

    E-print Network

    Gordon, Scott

    CSc 165 lecture-note slides on camera control: constrained cameras versus full 6-DoF "flight" (pitch + yaw); in games, looking around shouldn't cause roll, so control uses local pitch but global yaw; transformation matrix for local yaw.

  5. Apogee Imaging Systems Camera Installation Guide

    E-print Network

    Kleinfeld, David

    Apogee Imaging Systems Camera Installation Guide, Version 1.6. Disclaimer: Apogee Imaging Systems, Inc. assumes no liability for the use of this product; specifications in this document are subject to change without notice. Support: The Apogee Imaging Systems Camera Installation Guide

  6. Theoretical error analysis with camera parameter calibration

    Microsoft Academic Search

    Takashi Fujimoto; Yoshihiko Nomura; Dili Zhang

    2004-01-01

    The camera calibration for intrinsic parameters such as the principal point and the principal distance is one of the most important techniques for 3-D measurement applications based on cameras' 2D images: the principal point is the intersection of the optical axis of the camera and the image plane, and the principal distance is the distance between the center of lens
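
    The roles of the two intrinsic parameters are easiest to see in the ideal pinhole projection, where the principal distance scales the normalized coordinates X/Z and Y/Z, and the principal point offsets the result on the image plane. A minimal sketch with hypothetical values (a simplification that ignores lens distortion):

```python
def project(point_xyz, principal_distance, cx, cy):
    """Ideal pinhole projection of a 3-D camera-frame point to pixel coords.

    principal_distance scales the normalized coordinates (units: pixels);
    (cx, cy) is the principal point, i.e. where the optical axis meets the
    image plane. Assumes z > 0 (point in front of the camera).
    """
    x, y, z = point_xyz
    return (principal_distance * x / z + cx,
            principal_distance * y / z + cy)

# Hypothetical intrinsics: 800-pixel principal distance, principal point
# at (320, 240) for a 640x480 sensor.
u, v = project((1.0, 2.0, 4.0), 800.0, 320.0, 240.0)
```

    Calibration error analysis asks how errors in (cx, cy) and the principal distance propagate through exactly this mapping into 3-D measurement error.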

  7. Multiple Camera Types Simultaneous Stereo Calibration

    Microsoft Academic Search

    Guillaume Caron; Damien Eynard

    2011-01-01

    Calibration is a classical issue in computer vision needed to retrieve 3D information from image measurements. This work presents a calibration approach for hybrid stereo rig involving multiple central camera types (perspective, fisheye, catadioptric). The paper extends the method of monocular perspective camera calibration using virtual visual servoing. The simultaneous intrinsic and extrinsic calibration of central cameras rig, using different

  8. Advisory Surveillance Cameras Page 1 of 2

    E-print Network

    Liebling, Michael

    Advisory - Surveillance Cameras, May 2008. ADVISORY -- USE OF CAMERAS/VIDEO SURVEILLANCE: balancing the employer's right to video surveillance and the employee's privacy rights. When considering [...] ON CAMPUS: surveillance cameras or devices may be acquired, installed, modified, replaced or removed only

  9. Analysis of Camera Behavior During Tracking

    Microsoft Academic Search

    Swarup Reddi; George Loizou

    1995-01-01

    A camera is mounted on a moving robot and can rotate, relative to the robot, about two axes. We show how the optical flow field can be used to control the camera's motion to keep a target at the center of the camera's field of view, but that this is not always possible when the target lies close to the

  10. Event Photography Setting up the Camera

    E-print Network

    Zhou, Chongwu

    Event Photography Guide. Setting up the Camera: Double-check that the battery is fully charged and the memory card is in the camera. When you turn the camera on, the battery meter will appear in the lower [...] of awards, etc. for writing captions later. Take some test shots and review them to ensure lighting is ok

  11. Camera Self-Calibration: Theory and Experiments

    Microsoft Academic Search

    Olivier D. Faugeras; Quang-tuan Luong; Stephen J. Maybank

    1992-01-01

    The problem of finding the internal orientation of a camera (camera calibration) is extremely important for practical applications. In this paper a complete method for calibrating a camera is presented. In contrast with existing methods it does not require a calibration object with a known 3D shape. The new method requires only point matches from image sequences. It is shown,

  12. Smart Camera Networks in Virtual Reality

    E-print Network

    Qureshi, Faisal Z.

    and handoff, is robust against camera and communication failures, and requires no camera calibration or detailed [...]. Visual sensor networks will rely on smart cameras for sensing, computation, and communication. Smart [...] circuitry, (wireless) communication interfaces, and on-board processing and storage capabilities

  13. Direct readout devices for streak cameras

    Microsoft Academic Search

    J. C. Cheng; S. W. Thomas; E. K. Storm; W. R. McLerran; G. R. Tripp; L. W. Coleman

    1977-01-01

    Two techniques are used to obtain a direct readout of an ultrafast streak camera. The first method uses a linear solid-state Reticon diode array, and the second technique entails the use of a SEC vidicon camera. Both methods use fiber optics to couple the light from the output of the streak camera to the sensor. In addition, the SEC vidicon

  14. Direct readout devices for streak cameras

    Microsoft Academic Search

    J. C. Cheng; S. W. Thomas; E. K. Storm; W. R. McLerran; G. R. Tripp; L. W. Coleman

    1976-01-01

    Two techniques are used to obtain a direct readout of an ultrafast streak camera. The first method uses a linear solid state diode array and the second technique entails the use of a vidicon camera. Both methods use fiber optics to couple the light from the output of the streak camera to the sensor. In addition, the vidicon is interfaced

  15. Pinhole Camera For Viewing Electron Beam Materials Processing

    NASA Astrophysics Data System (ADS)

    Rushford, M. C.; Kuzmenko, P. J.

    1986-10-01

A very rugged, compact (4x4x10 inches), gas-purged "pinhole camera" has been developed for viewing electron beam materials processing (e.g., melting or vaporizing metal). The video image is computer processed on an IBM PC, providing dimensional and temperature measurements of objects within the field of view. The "pinhole camera" concept is similar to a TRW optics system for viewing into a coal combustor through a 2 mm hole. Gas is purged through the hole to repel particulates from optical surfaces. In our system, light from the molten metal passes through the 2 mm "pinhole", reflects off an aluminum-coated glass substrate, and passes through a window into a vacuum-tight container holding the camera and optics at atmospheric pressure. The mirror filters out x rays, which pass through the Al layer and are absorbed in the glass mirror substrate. Since metallic coatings are usually reflective, the image quality is not severely degraded by the small amounts of vapor that overcome the gas purge and reach the mirror; coating thicknesses of up to 2 microns can be tolerated. The mirror is the only element needing occasional servicing. We used a telescope eyepiece as a convenient optical design, but with the traditional optical path reversed: the eyepiece images a scene through a small entrance aperture onto an image plane where a CCD camera is placed. Since the iris of the eyepiece is fixed and the scene intensity varies, it was necessary to employ a variable neutral-density filter for brightness control. Devices used for this purpose include a PLZT light valve from Motorola, mechanically rotated linear polarizer sheets, and nematic liquid crystal light valves. These were placed after the mirror and entrance aperture but before the lens to operate as a voltage-variable neutral-density filter. The molten metal surface temperature being viewed varies from 4000 down to 1200 K; the resultant intensity change (at 488 nm with 10 nm bandwidth) is seven orders of magnitude. This intensity variation is reduced in contrast by observing in a narrow band as far toward the red as high-intensity blooming will allow while still giving an observable picture. A three-eyepiece camera provides an image plane where photogray glass functions as a neutral-density filter only over the high-intensity portion of the image, thus reducing blooming. The system is enclosed in a water-cooled housing that can dissipate 15 W/cm2, keeping the camera below 40 degrees Celsius. Single frames of video output are acquired for feature enhancement and location by a Data Translation DT2803 image-processing board housed in an IBM PC.

  16. Earth elevation map production and high resolution sensing camera imaging analysis

    NASA Astrophysics Data System (ADS)

    Yang, Xiubin; Jin, Guang; Jiang, Li; Dai, Lu; Xu, Kai

    2010-11-01

The Earth's digital elevation data, which affect space camera imaging, have been prepared and their effect on imaging analysed. Based on the image-motion velocity-matching requirement of the TDI CCD's integration stages, a statistical experimental method, the Monte Carlo method, is used to calculate the distribution histogram of the Earth's elevation within an image-motion compensation model that includes satellite attitude changes, orbital angular rate changes, latitude, longitude and orbital inclination changes. Elevation information for the Earth's surface is then read from SRTM data, and the Earth elevation map produced for aerospace electronic cameras is compressed and spliced. Elevation data can then be fetched from flash memory according to the latitude and longitude of the imaging point; when the query falls between two stored data points, linear interpolation is used, which accommodates the changing terrain of rugged mountains and hills. Finally, a deviation framework and camera controller are used to test the behaviour of deviation angle errors, and a TDI CCD camera simulation system based on an object-point to image-point correspondence model is used to analyse the imaging MTF and a cross-correlation similarity measure; the simulation system accumulates TDI CCD exposures with horizontal and vertical offsets beyond the corresponding pixel to simulate camera imaging as satellite attitude stability changes. The process is practical: it effectively controls the camera memory space while meeting the TDI CCD camera's velocity-matching and imaging precision requirements.
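
The interpolation step described above, looking up elevation from a stored grid and interpolating when the query falls between grid points, can be sketched as a bilinear interpolation over a regular latitude/longitude grid. The grid layout and parameter names here are assumptions for illustration, not taken from the paper:

```python
def interp_elevation(grid, lat, lon, lat0, lon0, step):
    """Bilinear interpolation of elevation from a regular grid.

    grid[i][j] holds the elevation (m) at (lat0 + i*step, lon0 + j*step).
    Hypothetical layout; the paper only states that linear interpolation
    is used between stored data points.
    """
    fi = (lat - lat0) / step
    fj = (lon - lon0) / step
    i, j = int(fi), int(fj)
    di, dj = fi - i, fj - j
    # weighted average of the four surrounding grid cells
    return ((1 - di) * (1 - dj) * grid[i][j]
            + (1 - di) * dj * grid[i][j + 1]
            + di * (1 - dj) * grid[i + 1][j]
            + di * dj * grid[i + 1][j + 1])
```

For a query at the center of four cells holding 0, 10, 20 and 30 m, this returns their average, 15 m.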

  17. Advanced camera image data acquisition system for Pi-of-the-Sky

    NASA Astrophysics Data System (ADS)

    Kwiatkowski, Maciej; Kasprowicz, Grzegorz; Pozniak, Krzysztof; Romaniuk, Ryszard; Wrochna, Grzegorz

    2008-11-01

The paper describes a new generation of high-performance, remotely controlled CCD cameras designed for astronomical applications. A completely new camera PCB was designed, manufactured, tested and commissioned. The CCD chip was positioned differently than in the previous design, resulting in better performance of the astronomical video data acquisition system. The camera was built using a low-noise, 4-Mpixel CCD circuit by STA. The electronic circuit of the camera is highly parameterized and reconfigurable, as well as more modular than the first-generation solution, due to the use of open software solutions and an FPGA circuit, an Altera Cyclone EP1C6. New algorithms were implemented in the FPGA chip. The camera system uses the following advanced electronic circuits: a CY7C68013A microcontroller (8051 core) by Cypress, an AD9826 image processor by Analog Devices, an RTL8169s Gigabit Ethernet interface by Realtek, AT45DB642 memory by Atmel, and an ARM926EJ-S-core AT91SAM9260 microprocessor by ARM and Atmel. Software for the camera, its remote control, and image data acquisition is based entirely on open-source platforms, using the ISI and V4L2 image interfaces, the AMBA AHB data bus, and the INDI protocol. The camera will be replicated in 20 pieces and is designed for continuous on-line, wide-angle observations of the sky in the Pi-of-the-Sky research program.

  18. Rover mast calibration, exact camera pointing, and camera handoff for visual target tracking

    NASA Technical Reports Server (NTRS)

    Kim, Won S.; Ansar, Adnan I.; Steele, Robert D.

    2005-01-01

This paper presents three technical elements that we have developed to improve the accuracy of visual target tracking for single-sol approach-and-instrument placement in future Mars rover missions. An accurate, straightforward method of rover mast calibration is achieved by using a total station, a camera calibration target, and four prism targets mounted on the rover. The method was applied to the Rocky8 rover mast calibration and yielded a 1.1-pixel rms residual error. Camera pointing requires inverse kinematic solutions for mast pan and tilt angles such that the target image appears right at the center of the camera image. Two issues arise: mast camera frames are in general not parallel to the masthead base frame, and the optical axis of the camera model in general does not pass through the center of the image. Despite these issues, we managed to derive non-iterative closed-form exact solutions, which were verified with Matlab routines. Actual camera pointing experiments over 50 random target image points yielded less than 1.3-pixel rms pointing error. Finally, a purely geometric method for camera handoff using stereo views of the target has been developed. Experimental test runs show less than 2.5 pixels error on the high-resolution Navcam for Pancam-to-Navcam handoff, and less than 4 pixels error on the lower-resolution Hazcam for Navcam-to-Hazcam handoff.
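
The pointing problem above, choosing pan and tilt so that a target lands at the image center, reduces to simple trigonometry in the idealized case where the camera boresight is aligned with the masthead frame. The paper's contribution is the closed-form solution for the non-ideal case (tilted camera frames, off-center optical axis); the sketch below covers only the idealized geometry, with an assumed frame convention:

```python
import math

def ideal_pan_tilt(x, y, z):
    """Idealized mast pointing: pan/tilt angles (rad) that aim the boresight
    at a target (x, y, z) in the masthead base frame.

    Frame convention assumed here: x forward, y left, z up. Real mast
    kinematics (as in the paper) must also account for camera frames that
    are not parallel to this base frame.
    """
    pan = math.atan2(y, x)                     # rotation about the z axis
    tilt = math.atan2(z, math.hypot(x, y))     # elevation above the xy plane
    return pan, tilt
```

A target straight ahead and to the left at equal distance gives pan = 45 degrees and zero tilt.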

  19. Imaging and radiometric performance simulation for a new high-performance dual-band airborne reconnaissance camera

    NASA Astrophysics Data System (ADS)

    Seong, Sehyun; Yu, Jinhee; Ryu, Dongok; Hong, Jinsuk; Yoon, Jee-Yeon; Kim, Sug-Whan; Lee, Jun-Ho; Shin, Myung-Jin

    2009-05-01

In recent years, high-performance visible and IR cameras have been used widely for tactical airborne reconnaissance. Efficient discrimination and analysis of complex target information from active battlefields requires simultaneous multi-band measurement from airborne platforms at various altitudes. We report a new dual-band airborne camera designed for simultaneous registration of both visible and IR imagery from mid-altitude ranges. The camera design uses a common front-end optical telescope of around 0.3 m in entrance aperture and several relay optical sub-systems capable of delivering both high-spatial-resolution visible and IR images to the detectors. The design benefits from the use of several optical channels packaged in a compact space and the associated freedom to choose between wide (~3 degrees) and narrow (~1 degree) fields of view. In order to investigate both the imaging and radiometric performance of the camera, we generated an array of target scenes with optical properties such as reflection, refraction, scattering, transmission and emission. We then combined the target scenes and the camera optical system into an integrated ray-tracing simulation environment utilizing the Monte Carlo computation technique. Taking realistic atmospheric radiative transfer characteristics into account, both imaging and radiometric performance were then investigated. The simulation results demonstrate that the camera design satisfies the NIIRS 7 detection criterion. The camera concept and details of the performance simulation computation are discussed together with the resulting performance and the future development plan.

  20. Cooling the dark energy camera instrument

    SciTech Connect

    Schmitt, R.L.; Cease, H.; /Fermilab; DePoy, D.; /Ohio State U.; Diehl, H.T.; Estrada, J.; Flaugher, B.; /Fermilab; Kuhlmann, S.; /Ohio State U.; Onal, Birce; Stefanik, A.; /Fermilab

    2008-06-01

DECam, the camera for the Dark Energy Survey (DES), is undergoing general design and component testing. For an overview see DePoy et al. in these proceedings; for a description of the imager, see Cease et al. in these proceedings. The CCD instrument will be mounted at the prime focus of the CTIO Blanco 4m telescope. The instrument temperature will be 173 K with a heat load of 113 W. In similar applications, cooling CCD instruments at the prime focus has been accomplished by three general methods: liquid nitrogen reservoirs have been constructed to operate in any orientation, pulse tube cryocoolers have been used when tilt angles are limited, and Joule-Thomson or Stirling cryocoolers have been used with smaller heat loads. Gifford-McMahon cooling has been used at the Cassegrain focus but not at the prime focus. For DES, the combined requirements of high heat load, temperature stability, low vibration, operation in any orientation, liquid nitrogen cost and limited available space led to the design of a pumped, closed-loop, circulating nitrogen system. At zenith the instrument will be twelve meters above the pump/cryocooler station. This cooling system is expected to have a 10,000 hour maintenance interval. This paper describes the engineering basis, including the thermal model, unbalanced forces, cooldown time, and the single- and two-phase flow models.

  1. Small angle electron diffraction and deflection

    NASA Astrophysics Data System (ADS)

    Koyama, T.; Takayanagi, K.; Togawa, Y.; Mori, S.; Harada, K.

    2012-03-01

An electron optical system was constructed to obtain small-angle diffraction and Lorentz deflection of electrons down to the order of 10^-6 radian in reciprocal space. A long camera length of up to 3000 m is achieved in a conventional transmission electron microscope with a LaB6 thermal-emission gun. A diffraction pattern at 5 × 10^-6 radian is presented for a carbon replica grating with 500 nm lattice spacing, while a magnetic deflection pattern at 2 × 10^-5 radian is exhibited for Permalloy elements. Simultaneous recording of electron diffraction and Lorentz deflection is also demonstrated in the 180-degree striped magnetic domains of La0.825Sr0.175MnO3.
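
The quoted 5 × 10^-6 radian for a 500 nm grating is consistent with the standard small-angle estimate θ ≈ λ/d, using the relativistically corrected electron wavelength. The sketch below assumes a typical 200 kV accelerating voltage, which the abstract does not state:

```python
import math

# Physical constants (SI)
H = 6.62607015e-34   # Planck constant, J s
M0 = 9.1093837e-31   # electron rest mass, kg
E = 1.602176634e-19  # elementary charge, C
C = 2.99792458e8     # speed of light, m/s

def electron_wavelength(volts):
    """Relativistically corrected de Broglie wavelength (m) of an electron
    accelerated through `volts` volts."""
    p = math.sqrt(2 * M0 * E * volts * (1 + E * volts / (2 * M0 * C ** 2)))
    return H / p

# First-order diffraction angle θ ≈ λ/d for a 500 nm grating at an
# assumed 200 kV; comes out near the 5e-6 rad quoted in the abstract.
theta = electron_wavelength(200e3) / 500e-9
```

At 200 kV the wavelength is about 2.5 pm, so θ ≈ 2.5e-12 / 5e-7 ≈ 5e-6 rad, which is why camera lengths of kilometers are needed to resolve such patterns.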

  2. Digital camera focus assessment using a camera flange-mounted fiber optic probe

    Microsoft Academic Search

    Michael A. Marcus; Jiann-Rong Lee; Stanley Gross; T. Trembley

    1999-01-01

During the assembly of high-end digital cameras, it is necessary to determine the location and orientation of the imager plane in order to assess the camera's focusing capability. An apparatus based on non-coherent light interferometry has been developed, which performs these tests immediately after the digital imager is installed into the camera body. The instrument includes a camera lens flange-

  3. Search for a narrow resonant transfer and excitation resonance of titanium projectiles channeled in a gold crystal

    SciTech Connect

    Dittner, P.F.; Vane, C.R.; Krause, H.F.; Gomez del Campo, J.; Jones, N.L.; Zeijlmans van Emmichoven, P.A.; Bechthold, U.; Datz, S. (Physics Division, Oak Ridge National Laboratory, Oak Ridge, Tennessee 37831 (United States))

    1992-03-01

Transfer and excitation, resulting from simultaneous electron capture and K-shell excitation in a single collision, has been measured for 280-310-MeV Ti^20+ ions channeled along the ⟨100⟩ axis of a thin Au single crystal. The 19+ charge-state fraction of the Ti ions exiting the Au crystal was measured as a function of ion energy and showed no narrow peak attributable to resonant transfer and excitation (RTE). The number of Ti K-alpha x rays emitted by the Ti ions due to RTE, in coincidence with Ti^19+, was also measured at two energies, on and off the previously reported narrow resonance, and again no evidence for a narrow resonance was observed.

  4. Hubble Space Telescope, Wide Field Planetary Camera

    NASA Technical Reports Server (NTRS)

    1981-01-01

This illustration is a diagram of the Hubble Space Telescope's (HST's) Wide Field Planetary Camera (WF/PC), one of the five Scientific Instruments. The WF/PC uses a four-sided pyramid mirror to split a light image into quarters. It then focuses each quadrant onto one of two sets of four sensors. The sensors are charge-coupled detectors and function as the electronic equivalent of extremely sensitive photographic plates. The WF/PC operates in two modes: the Wide-Field mode, which views 7.2-arcmin sections of the sky, and the Planetary mode, which looks at narrower fields of view, such as planets or areas within other galaxies. The purpose of the HST, the most complex and sensitive optical telescope ever made, is to study the cosmos from a low-Earth orbit. By placing the telescope in space, astronomers are able to collect data that is free of the Earth's atmosphere. The HST detects objects 25 times fainter than the dimmest objects seen from Earth and provides astronomers with an observable universe 250 times larger than that visible from ground-based telescopes, perhaps as far away as 14 billion light-years. The HST views galaxies, stars, planets, comets, possibly other solar systems, and even unusual phenomena such as quasars, with 10 times the clarity of ground-based telescopes. The HST was deployed from the Space Shuttle Discovery (STS-31 mission) into Earth orbit in April 1990. The Marshall Space Flight Center had responsibility for design, development, and construction of the HST. The Perkin-Elmer Corporation, in Danbury, Connecticut, developed the optical system and guidance sensors.

  5. ANIR : Atacama Near-Infrared Camera for the 1.0-m miniTAO Telescope

    E-print Network

    Konishi, Masahiro; Tateuchi, Ken; Takahashi, Hidenori; Kitagawa, Yutaro; Kato, Natsuko; Sako, Shigeyuki; Uchimoto, Yuka K; Toshikawa, Koji; Ohsawa, Ryou; Yamamuro, Tomoyasu; Asano, Kentaro; Ita, Yoshifusa; Kamizuka, Takafumi; Komugi, Shinya; Koshida, Shintaro; Manabe, Sho; Matsunaga, Noriyuki; Minezaki, Takeo; Morokuma, Tomoki; Nakashima, Asami; Takagi, Toshinobu; Tanabé, Toshihiko; Uchiyama, Mizuho; Aoki, Tsutomu; Doi, Mamoru; Handa, Toshihiro; Kato, Daisuke; Kawara, Kimiaki; Kohno, Kotaro; Miyata, Takashi; Nakamura, Tomohiko; Okada, Kazushi; Soyano, Takao; Tamura, Yoichi; Tanaka, Masuo; Tarusawa, Ken'ichi; Yoshii, Yuzuru

    2015-01-01

We have developed a near-infrared camera called ANIR (Atacama Near-InfraRed camera) for the University of Tokyo Atacama Observatory 1.0m telescope (miniTAO) installed at the summit of Cerro Chajnantor (5640 m above sea level) in northern Chile. The camera provides a field of view of 5'.1 $\times$ 5'.1 with a spatial resolution of 0".298/pixel in the wavelength range of 0.95 to 2.4 $\mu$m. Taking advantage of the dry site, the camera is capable of hydrogen Paschen-$\alpha$ (Pa$\alpha$, $\lambda=$1.8751 $\mu$m in air) narrow-band imaging observations, at which wavelength ground-based observations have been quite difficult due to deep atmospheric absorption mainly from water vapor. We have been successfully obtaining Pa$\alpha$ images of Galactic objects and nearby galaxies since the first-light observation in 2009 with ANIR. The throughputs at the narrow-band filters ($N1875$, $N191$) including the atmospheric absorption show larger dispersion (~10%) than those at broad-band filters (a few %), indicating that ...

  6. Development of Nikon Space Camera

    NASA Astrophysics Data System (ADS)

    Goto, Tetsuro

    After Soviet cosmonaut Gagarin succeeded as the first human to orbit the Earth in 1961, American astronaut Glenn succeeded in a similar mission the following year, 1962, aboard the Friendship 7 spacecraft for the Mercury-Atlas 6 mission. Since before this event, the National Aeronautics and Space Administration (NASA) has used a large amount of imaging equipment to successfully record major astronomical phenomena and acquire analysis data. Nikon has made significant contributions to the American space program since the Apollo Program by continuously providing NASA with space cameras that meet their strict demands in terms of reliability, quality and durability under the most extreme conditions. The following details our achievements and specifics regarding modifications necessary for use in space, and also touches on space cameras provided by manufacturers other than Nikon, for which information may be quite limited.

  7. The Dark Energy Camera (DECam)

    NASA Astrophysics Data System (ADS)

    DePoy, D. L.; Abbott, T.; Annis, J.; Antonik, M.; Barceló, M.; Bernstein, R.; Bigelow, B.; Brooks, D.; Buckley-Geer, E.; Campa, J.; Cardiel, L.; Castander, F.; Castilla, J.; Cease, H.; Chappa, S.; Dede, E.; Derylo, G.; Diehl, H. T.; Doel, P.; DeVicente, J.; Estrada, J.; Finley, D.; Flaugher, B.; Gaztanaga, E.; Gerdes, D.; Gladders, M.; Guarino, V.; Gutierrez, G.; Hamilton, J.; Haney, M.; Holland, S.; Honscheid, K.; Huffman, D.; Karliner, I.; Kau, D.; Kent, S.; Kozlovsky, M.; Kubik, D.; Kuehn, K.; Kuhlmann, S.; Kuk, K.; Leger, F.; Lin, H.; Martinez, G.; Martinez, M.; Merritt, W.; Mohr, J.; Moore, P.; Moore, T.; Nord, B.; Ogando, R.; Olsen, J.; Onal, B.; Peoples, J.; Qian, T.; Roe, N.; Sanchez, E.; Scarpine, V.; Schmidt, R.; Schmitt, R.; Schubnell, M.; Schultz, K.; Selen, M.; Shaw, T.; Simaitis, V.; Slaughter, J.; Smith, C.; Spinka, H.; Stefanik, A.; Stuermer, W.; Talaga, R.; Tarle, G.; Thaler, J.; Tucker, D.; Walker, A.; Worswick, S.; Zhao, A.

    2008-07-01

We describe the Dark Energy Camera (DECam), which will be the primary instrument used in the Dark Energy Survey. DECam will be a 3 sq. deg. mosaic camera mounted at the prime focus of the Blanco 4m telescope at the Cerro Tololo Inter-American Observatory (CTIO). DECam includes a large mosaic CCD focal plane, a five element optical corrector, five filters (g,r,i,z,Y), and the associated infrastructure for operation in the prime focus cage. The focal plane consists of 62 2K x 4K CCD modules (0.27"/pixel) arranged in a hexagon inscribed within the roughly 2.2 degree diameter field of view. The CCDs will be 250 micron thick fully-depleted CCDs that have been developed at the Lawrence Berkeley National Laboratory (LBNL). Production of the CCDs and fabrication of the optics, mechanical structure, mechanisms, and control system for DECam are underway; delivery of the instrument to CTIO is scheduled for 2010.

  8. Camera calibration using identical objects

    Microsoft Academic Search

    Ruiyan Wang; Guang Jiang; Long Quan; Chengke Wu

This paper describes a method for camera calibration using identical products. In this paper, we postulate an imaginary rigid motion between any two identical products, and this imaginary rigid motion can offer a pair of circular points. As is known, three pairs of projections of the circular points are needed to obtain the closed-form solution for calibration. In our

  9. Digital camera based fingerprint recognition

    Microsoft Academic Search

    B. Y. Hiew; B. J. Andrew; Y. H. Pang

    2007-01-01

Touch-less fingerprint recognition deserves increasing attention as it avoids the problems of deformation, maintenance, latent fingerprints and so on that still exist in touch-based fingerprint technology. However, problems such as low ridge-valley contrast in the fingerprint images, defocus and motion blur arise when developing a digital camera based fingerprint recognition system. The system comprises preprocessing,

  10. Far UV camera/spectrograph

    NASA Technical Reports Server (NTRS)

    Carruthers, G. R.; Page, T.

    1972-01-01

    The far UV camera/spectrograph deployed in the Apollo 16 mission recorded light in the invisible band of wavelengths between 50 and 160 nm, approximately one-third the wavelength that can penetrate the atmosphere of the earth to ground based telescopes. The photographs obtained show hydrogen and other gases in the solar wind and interplanetary media, and provide new data on stars, nebulae, and galaxies. The instrument is described, the experimental goals outlined, and the preliminary results discussed.

  11. Graphic design of pinhole cameras

    NASA Technical Reports Server (NTRS)

    Edwards, H. B.; Chu, W. P.

    1979-01-01

    The paper describes a graphic technique for the analysis and optimization of pinhole size and focal length. The technique is based on the use of the transfer function of optical elements described by Scott (1959) to construct the transfer function of a circular pinhole camera. This transfer function is the response of a component or system to a pattern of lines having a sinusoidally varying radiance at varying spatial frequencies. Some specific examples of graphic design are presented.

  12. Superconducting cameras for optical astronomy

    Microsoft Academic Search

    D. D. E. Martin; P. Verhoeve; J. H. J. de Bruijne; A. P. Reynolds; A. Van Dordrecht; J. Verveer; J. Page; N. Rando; A. Peacock

    2002-01-01

    Superconducting Tunnel Junctions (STJs) have been extensively investigated as photon detectors covering the range from near-infrared to X-ray energies. A 6x6 array of Tantalum junctions has already been used in an optical spectro-photometer. With this camera, the European Space Agency has performed multiple astronomical observations of optical sources using the William Herschel 4.2m telescope at La Palma. Following the success

  13. Head-Free, Remote Eye-Gaze Detection System with Easy Calibration Using Stereo-Calibrated Two Video Cameras

    Microsoft Academic Search

    Yoshinobu Ebisawa; Kazuki Abo; Kiyotaka Fukumoto

    \\u000a The video-based, head-free, remote eye-gaze detection system based on detection of the pupil and the corneal reflection was\\u000a developed using stereocalibrated two cameras. The gaze detection theory assumed the linear relationship; ??=?k|r?|. Here, ? is the angle between the line of sight and the line connecting between the pupil and the camera, and |r’| indicates the size\\u000a of the corneal

  14. Angling hydraulic jumps

    NASA Astrophysics Data System (ADS)

    Belmonte, Andrew; Thiffeault, Jean-Luc

    2008-11-01

    We present an experimental and mathematical study of the normal impact of a jet onto an inclined solid surface, focusing on the characteristics of the hydraulic jump. The angle of the surface is varied between vertical and horizontal positions, using both flat and curved (patterned) surfaces. Comparisons of the outer envelope of the hydraulic jump are made with the ballistic theory and the model of Edwards, Howison, Ockendon, & Ockendon.

  15. Laser angle sensor

    NASA Technical Reports Server (NTRS)

    Pond, C. R.; Texeira, P. D.

    1985-01-01

    A laser angle measurement system was designed and fabricated for NASA Langley Research Center. The instrument is a fringe counting interferometer that monitors the pitch attitude of a model in a wind tunnel. A laser source and detector are mounted above the model. Interference fringes are generated by a small passive element on the model. The fringe count is accumulated and displayed by a processor in the wind tunnel control room. This report includes optical and electrical schematics, system maintenance and operation procedures.

  16. Small Angle Neutron Scattering

    SciTech Connect

    Urban, Volker S [ORNL

    2012-01-01

Small Angle Neutron Scattering (SANS) probes structural details at the nanometer scale in a non-destructive way. This article gives an introduction to scientists who have no prior small-angle scattering knowledge, but who seek a technique that allows elucidating structural information in challenging situations that thwart approaches by other methods. SANS is applicable to a wide variety of materials including metals and alloys, ceramics, concrete, glasses, polymers, composites and biological materials. Isotope and magnetic interactions provide unique methods for labeling and contrast variation to highlight specific structural features of interest. In situ studies of a material's responses to temperature, pressure, shear, magnetic and electric fields, etc., are feasible as a result of the high penetrating power of neutrons. SANS provides statistical information on significant structural features averaged over the probed sample volume, and one can use SANS to quantify with high precision the structural details that are observed, for example, in electron microscopy. Neutron scattering is non-destructive; there is no need to cut specimens into thin sections, and neutrons penetrate deeply, providing information on the bulk material, free from surface effects. The basic principles of a SANS experiment are fairly simple, but the measurement, analysis and interpretation of small angle scattering data involves theoretical concepts that are unique to the technique and that are not widely known. This article includes a concise description of the basics, as well as practical know-how that is essential for a successful SANS experiment.
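
The connection between scattering angle and probed length scale follows from the standard small-angle relations q = (4π/λ) sin(θ/2) and d ≈ 2π/q. These are general formulas, not specific to the instrument described in the record; the numbers in the usage note are illustrative:

```python
import math

def sans_length_scale(theta_rad, wavelength_nm):
    """Momentum transfer q (nm^-1) and the real-space length scale d = 2*pi/q (nm)
    probed at scattering angle theta (rad) with neutrons of the given wavelength."""
    q = 4 * math.pi / wavelength_nm * math.sin(theta_rad / 2)
    return q, 2 * math.pi / q
```

For example, 0.6 nm neutrons scattered through 10 mrad probe structures of roughly 60 nm, which is why small angles reach nanometer-to-submicron structure.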

  17. Unassisted 3D camera calibration

    NASA Astrophysics Data System (ADS)

    Atanassov, Kalin; Ramachandra, Vikas; Nash, James; Goma, Sergio R.

    2012-03-01

With the rapid growth of 3D technology, 3D image capture has become a critical part of the 3D feature set on mobile phones. 3D image quality is affected by the scene geometry as well as on-the-device processing. An automatic 3D system usually assumes known camera poses accomplished by factory calibration using a special chart. In real life settings, pose parameters estimated by factory calibration can be negatively impacted by movements of the lens barrel due to shaking, focusing, or camera drop. If any of these factors displaces the optical axes of either or both cameras, vertical disparity might exceed the maximum tolerable margin and the 3D user may experience eye strain or headaches. To make 3D capture more practical, one needs to consider unassisted (on arbitrary scenes) calibration. In this paper, we propose an algorithm that relies on detection and matching of keypoints between left and right images. Frames containing erroneous matches, along with frames with insufficiently rich keypoint constellations, are detected and discarded. Roll, pitch, yaw, and scale differences between left and right frames are then estimated. The algorithm performance is evaluated in terms of the remaining vertical disparity as compared to the maximum tolerable vertical disparity.
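
The frame-screening step described in the abstract, discarding frames with sparse keypoint constellations or implausible matches before estimating vertical disparity, can be sketched as below. The function, thresholds and median-based estimate are illustrative assumptions, not the paper's algorithm:

```python
from statistics import median

def frame_vertical_disparity(matches, max_allowed=2.0, min_matches=8):
    """Screen one stereo frame pair and estimate its vertical disparity.

    matches: list of ((xl, yl), (xr, yr)) matched keypoint coordinates
    between the left and right images. Returns the median vertical
    disparity in pixels, or None if the frame should be discarded.
    Threshold names and values are hypothetical.
    """
    if len(matches) < min_matches:
        return None  # insufficiently rich keypoint constellation
    dy = [yl - yr for (xl, yl), (xr, yr) in matches]
    m = median(dy)  # median is robust to a few erroneous matches
    if abs(m) > max_allowed:
        return None  # implausibly large offset: likely bad matching
    return m
```

The median keeps a handful of outlier matches from corrupting the estimate; a full implementation would go on to fit roll, pitch, yaw and scale from the surviving matches.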

  18. The Spacelab Wide Angle Telescope (SWAT)

    NASA Technical Reports Server (NTRS)

    West, R. M.; Gull, T. R.; Henize, K. G.; Bertola, F.

    1979-01-01

A fast wide angle telescope that will be capable of imaging to the darker sky limit and in the ultraviolet wavelength region available above the atmosphere is described. The telescope (SWAT) has a resolution comparable to that of the large ground-based Schmidt telescope and a field of at least five degrees. A number of astrophysically important investigations can only be accomplished with such a telescope, e.g., detection of hidden, hot objects like hot white dwarfs and subdwarfs in stellar binary systems, and energetic regions in globular clusters and galaxy nuclei. It permits unique studies of the UV-morphology of extended objects and allows discovery of very faint extensions, halos, jets, and filaments in galaxies. It can contribute to the investigation of dust in the Milky Way and in other galaxies and, with an objective prism, spectra of very faint objects can be obtained. The SWAT will localize objects for further study with the narrow-field Space Telescope.

  19. Narrow-Band Applications of Communications Satellites.

    ERIC Educational Resources Information Center

    Cowlan, Bert; Horowitz, Andrew

    This paper attempts to describe the advantages of "narrow-band" applications of communications satellites for education. It begins by discussing the general controversy surrounding the use of satellites in education, by placing the concern within the larger context of the general debate over the uses of new technologies in education, and by…

  20. Adaptive suppression of narrow-band vibrations

    Microsoft Academic Search

    E. Bertran; G. Montoro

    1998-01-01

The aim of the paper is to develop a theoretical analysis and implementation of an active control canceller designed to eliminate narrow-band vibrations in rotary machines. The proposed system uses a bank of digital adaptive notch filters, each of them adjusted by an LMS algorithm. The theoretical results are applied to a vibrating DC motor considered as a benchmark problem.
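
A single element of such a bank can be sketched as the classic two-weight LMS adaptive notch: a quadrature reference at the vibration frequency is scaled by two adaptive weights to synthesize the tone, and subtracting it from the input leaves the cancelled residual. This is a generic textbook structure, assumed here for illustration rather than taken from the paper:

```python
import math

def lms_notch(signal, f0, fs, mu=0.01):
    """Two-weight LMS adaptive notch filter.

    Cancels a narrow-band tone near frequency f0 (Hz) from `signal`
    sampled at fs (Hz). Returns the residual (error) sequence.
    """
    w1 = w2 = 0.0
    out = []
    for n, d in enumerate(signal):
        # quadrature reference at the disturbance frequency
        x1 = math.cos(2 * math.pi * f0 * n / fs)
        x2 = math.sin(2 * math.pi * f0 * n / fs)
        y = w1 * x1 + w2 * x2   # current estimate of the tone
        e = d - y               # residual after cancellation
        # LMS weight update
        w1 += 2 * mu * e * x1
        w2 += 2 * mu * e * x2
        out.append(e)
    return out
```

Feeding it a pure 50 Hz tone at fs = 1000 Hz, the residual energy in the tail of the output drops far below the input energy once the weights converge; a bank of these filters, one per disturbance frequency, mirrors the structure described in the abstract.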

  1. Adverse effects of prohibiting narrow provider networks.

    PubMed

    Howard, David H

    2014-08-14

    Many insurers participating in the new insurance exchanges are controlling costs by offering plans with narrow provider networks. Proposed regulations would promote network adequacy, but a pro-provider stance may not be inherently pro-consumer or even pro-patient. PMID:25119604

  2. Assessing camera performance for quantitative microscopy.

    PubMed

    Lambert, Talley J; Waters, Jennifer C

    2014-01-01

    Charge-coupled device and, increasingly, scientific complementary metal oxide semiconductor cameras are the most common digital detectors used for quantitative microscopy applications. Manufacturers provide technical specification data on the average or expected performance characteristics for each model of camera. However, the performance of individual cameras may vary, and many of the characteristics that are important for quantitation can be easily measured. Though it may seem obvious, it is important to remember that the digitized image you collect is merely a representation of the sample itself--and no camera can capture a perfect representation of an optical image. A clear understanding and characterization of the sources of noise and imprecision in your camera are important for rigorous quantitative analysis of digital images. In this chapter, we review the camera performance characteristics that are most critical for generating accurate and precise quantitative data and provide a step-by-step protocol for measuring these characteristics in your camera. PMID:24974021
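
One classic measurement of the kind the abstract advocates is the mean-variance (photon-transfer) estimate of camera gain: shot noise is Poisson, so in digitizer units the pixel variance grows linearly with the mean, with slope equal to the gain in ADU/e-. The sketch below uses synthetic data (Gaussian approximation to Poisson counts, valid at high signal) purely for illustration; it is not the chapter's protocol:

```python
import math
import random

def photon_transfer_gain(mean_electrons, adu_per_e, n=20000, seed=0):
    """Estimate camera gain (ADU/e-) from the mean-variance relation.

    Simulates n pixel samples at a uniform illumination of
    `mean_electrons` photoelectrons, digitized at `adu_per_e` ADU per
    electron, then recovers the gain as variance/mean in ADU.
    """
    rng = random.Random(seed)
    sigma = math.sqrt(mean_electrons)  # shot-noise standard deviation, e-
    adu = [adu_per_e * rng.gauss(mean_electrons, sigma) for _ in range(n)]
    m = sum(adu) / n
    var = sum((a - m) ** 2 for a in adu) / (n - 1)
    return var / m  # for Poisson-limited signal, variance/mean = gain
```

With a true gain of 0.5 ADU/e- at 1000 e- mean signal, the estimate comes back near 0.5; real measurements would use flat-field frame pairs and subtract read noise and offset first.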

  3. Optimum Projection Angle for Attaining Maximum Distance in a Soccer Punt Kick

    PubMed Central

    Linthorne, Nicholas P.; Patel, Dipesh S.

    2011-01-01

    To produce the greatest horizontal distance in a punt kick the ball must be projected at an appropriate angle. Here, we investigated the optimum projection angle that maximises the distance attained in a punt kick by a soccer goalkeeper. Two male players performed many maximum-effort kicks using projection angles of between 10° and 90°. The kicks were recorded by a video camera at 100 Hz and a 2D biomechanical analysis was conducted to obtain measures of the projection velocity, projection angle, projection height, ball spin rate, and foot velocity at impact. The players' optimum projection angles were calculated by substituting mathematical equations for the relationships between the projection variables into the equations for the aerodynamic flight of a soccer ball. The calculated optimum projection angles were in agreement with the players' preferred projection angles (40° and 44°). In projectile sports even a small dependence of projection velocity on projection angle is sufficient to produce a substantial shift in the optimum projection angle away from 45°. In the punt kicks studied here, the optimum projection angle was close to 45° because the projection velocity of the ball remained almost constant across all projection angles. This result is in contrast to throwing and jumping for maximum distance, where the projection velocity the athlete is able to achieve decreases substantially with increasing projection angle and so the optimum projection angle is well below 45°. Key points: The optimum projection angle that maximises the distance of a punt kick by a soccer goalkeeper is about 45°. The optimum projection angle is close to 45° because the projection velocity of the ball is almost the same at all projection angles. This result is in contrast to throwing and jumping for maximum distance, where the optimum projection angle is well below 45° because the projection velocity the athlete is able to achieve decreases substantially with increasing projection angle. PMID:24149315
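The core argument can be reproduced with a simple no-drag projectile sketch: when projection speed is constant across angles the optimum stays at 45°, and when speed falls with angle the optimum drops well below 45°. The linear speed model below is an assumed illustration, not the authors' fitted equations, and real ball flight also includes the aerodynamic effects they modelled.

```python
import numpy as np

g = 9.81  # gravitational acceleration [m/s^2]

def best_angle(v_of_theta):
    """Projection angle [deg] maximizing no-drag range for a speed model v(theta)."""
    thetas = np.radians(np.arange(1.0, 90.0, 0.1))
    v = v_of_theta(thetas)
    ranges = v ** 2 * np.sin(2.0 * thetas) / g   # flat-ground projectile range
    return np.degrees(thetas[np.argmax(ranges)])

# Constant projection speed (as measured for the punt kick): optimum ~45 deg
opt_const = best_angle(lambda th: 25.0 + 0.0 * th)
# Speed decreasing with angle (as in throwing and jumping): optimum < 45 deg
opt_decr = best_angle(lambda th: 25.0 - 8.0 * th / (np.pi / 2.0))
print(opt_const, opt_decr)
```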

  4. The effects of orientation angle, subcooling, heat flux, mass flux, and pressure on bubble growth and detachment in subcooled flow boiling

    E-print Network

    Sugrue, Rosemary M

    2012-01-01

    The effects of orientation angle, subcooling, heat flux, mass flux, and pressure on bubble growth and detachment in subcooled flow boiling were studied using a high-speed video camera in conjunction with a two-phase flow ...

  5. Three-dimensional location and attitude evaluation for rendezvous and docking operation using a single camera

    NASA Astrophysics Data System (ADS)

    Wang, Zhiling; Losito, S.; Mugnuolo, Raffaele; Pasquariello, Guido

    1993-01-01

    In the automatic rendezvous and docking manoeuvre (RVD) of space activity, determining the 3-D location and attitude between two vehicles is most important. A vision system to perform the docking manipulation in RVD is described in this paper. An improved algorithm is used for calibrating the geometric parameters of a camera fixed on the tracking vehicle off-line. Because the line-of-sight angles of four markers on the target vehicle to the lens center of the camera can be computed according to optical principles and vector theory, the location of the vehicle is obtained by solving a set of nonlinear equations derived from triangulation. The attitude angles of the vehicles are obtained from the transformation matrix between the target frame and the vehicle frame. As the vehicle closes in on the target, sets of markers with different distance intervals, or a list of calibration parameters for cameras with different fields of view, are selected at the proper moment to handle the situation where at least one of the markers leaves the camera's field of view. A series of experiments is presented. The vision system runs on a SUN-4/330 Sparc station equipped with one IT-151 image board and a CCD TV camera. All software is written in the C language.
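The first step of such a solution, back-projecting marker pixels to line-of-sight directions through the lens center, can be sketched with a simple pinhole model. The intrinsic parameters below (focal length in pixels, principal point) are illustrative assumptions, not the paper's calibration results.

```python
import numpy as np

# Back-project marker pixels to unit line-of-sight rays in the camera frame,
# then measure the angle between two sight lines. Intrinsics are assumed.
fx = fy = 800.0          # focal lengths [pixels]
cx, cy = 320.0, 240.0    # principal point [pixels]

def los_ray(u, v):
    """Unit direction of the line of sight for pixel (u, v)."""
    d = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
    return d / np.linalg.norm(d)

def los_angle(p1, p2):
    """Angle [rad] between the sight lines of two marker images."""
    return np.arccos(np.clip(np.dot(los_ray(*p1), los_ray(*p2)), -1.0, 1.0))

# Two markers imaged symmetrically about the principal point
ang = los_angle((420.0, 240.0), (220.0, 240.0))
print(np.degrees(ang))   # 2 * atan(100/800), i.e. ~14.25 degrees
```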

  6. A state observer for using a slow camera as a sensor for fast control applications

    NASA Astrophysics Data System (ADS)

    Gahleitner, Reinhard; Schagerl, Martin

    2013-03-01

    This contribution concerns a problem that often arises in vision-based control when a camera is used as a sensor for fast control applications, or more precisely, when the sample rate of the control loop is higher than the frame rate of the camera. In control applications for mechanical axes, e.g. in robotics or automated production, a camera and some image processing can be used as a sensor to detect positions or angles. The sample time in these applications is typically in the range of a few milliseconds or less, which demands a camera with a high frame rate of up to 1000 fps. The presented solution is a special state observer that can work with a slower, and therefore cheaper, camera to estimate the state variables at the higher sample rate of the control loop. To simplify the image processing for the determination of positions or angles and make it more robust, LED markers are applied to the plant. Simulation and experimental results show that the concept can be used even if the plant is unstable, like the inverted pendulum.
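The observer idea can be sketched as a multirate predictor-corrector: the state estimate is propagated at every fast control step, and corrected only when a camera frame arrives. The double-integrator plant and the correction gain below are illustrative assumptions, not the paper's design.

```python
import numpy as np

# Multirate observer sketch: 1 kHz prediction, correction at 100 fps.
dt, ratio = 0.001, 10
A = np.array([[1.0, dt], [0.0, 1.0]])   # position/velocity dynamics
C = np.array([[1.0, 0.0]])              # camera measures position only
L = np.array([[0.5], [5.0]])            # observer correction gain (assumed)

x = np.array([[1.0], [0.5]])    # true state: position, velocity
xh = np.zeros((2, 1))           # observer estimate, deliberately wrong at start

for k in range(3000):
    x = A @ x                    # plant evolves at every fast step
    xh = A @ xh                  # observer predicts at every fast step
    if k % ratio == 0:           # camera frame available: correct the estimate
        y = C @ x                # (noise-free measurement for clarity)
        xh = xh + L @ (y - C @ xh)

err = np.linalg.norm(x - xh)
print(err)   # estimation error decays despite the slow measurements
```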

  7. [The influence of camera-to-object distance and focal length on the representation of faces].

    PubMed

    Verhoff, Marcel A; Witzel, Carsten; Ramsthaler, Frank; Kreutz, Kerstin

    2007-01-01

    When one thinks of the so-called barrel or wide-angle distortion, grotesquely warped faces may come to mind. For less extreme cases with primarily inconspicuous facial proportions, the question, however, still arises whether there may be a resulting impact on the identification of faces. In the first experiment, 3 test persons were photographed at a fixed camera-to-object distance of 2 m. In the second experiment, 18 test persons were each photographed at a distance of 0.5 m and 2.0 m. For both experiments photographs were taken from a fixed angle of view in alignment with the Frankfurt Plane. An isolated effect of the focal length on facial proportions could not be demonstrated. On the other hand, changes in the camera-to-object distance clearly influenced facial proportions and shape. A standardized camera-to-object distance for passport photos, as well as reconstruction of the camera-to-object distance from crime scene photos and the use of this same distance in taking photographs for comparison of suspects are called for. A proposal to refer to wide-angle distortion as the nearness effect is put forward. PMID:17879705
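The distance dependence (but not focal-length dependence) of facial proportions follows directly from the pinhole model: features at different depths are magnified differently, and the mismatch shrinks as the camera moves away. A minimal sketch, assuming a hypothetical 10 cm depth difference between the nose tip and the ear plane:

```python
# Pinhole-camera sketch of the "nearness effect": relative magnification of a
# near feature (nose tip) vs. a far feature (ears). The 0.10 m depth offset
# is an illustrative assumption, not a value from the study.
depth_offset = 0.10   # nose tip sits ~0.10 m in front of the ear plane [m]

def magnification_ratio(camera_distance):
    """Image magnification of the nose relative to the ears (pinhole model)."""
    return camera_distance / (camera_distance - depth_offset)

print(round(magnification_ratio(0.5), 3))   # at 0.5 m the nose is enlarged ~25%
print(round(magnification_ratio(2.0), 3))   # at 2.0 m the mismatch is only ~5%
```

This is why the study's 0.5 m and 2.0 m photographs show clearly different facial proportions even though focal length alone had no isolated effect.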

  8. Fruit Detectability Analysis for Different Camera Positions in Sweet-Pepper †

    PubMed Central

    Hemming, Jochen; Ruizendaal, Jos; Hofstee, Jan Willem; van Henten, Eldert J.

    2014-01-01

    For robotic harvesting of sweet-pepper fruits in greenhouses a sensor system is required to detect and localize the fruits on the plants. Due to the complex structure of the plant, most fruits are (partially) occluded when an image is taken from one viewpoint only. In this research the effect of multiple camera positions and viewing angles on fruit visibility and detectability was investigated. A recording device was built which allowed the camera to be placed at different azimuth and zenith angles and to be moved horizontally along the crop row. Fourteen camera positions were chosen and the fruit visibility in the recorded images was manually determined for each position. For images taken from one position only, with the criterion of maximum 50% occlusion per fruit, the fruit detectability (FD) was in no case higher than 69%. The best single positions were the front views and looking upwards at a zenith angle of 60°. The FD increased when multiple viewpoint positions were combined. With a combination of the five best positions the maximum FD was 90%. PMID:24681670

  9. Angle-resolved scattering spectroscopy of explosives using an external cavity quantum cascade laser

    SciTech Connect

    Suter, Jonathan D.; Bernacki, Bruce E.; Phillips, Mark C.

    2012-04-01

    We investigated angle-resolved scattering from solid explosives residues on a car door for non-contact sensing geometries. Illumination from a mid-infrared external cavity quantum cascade laser tuning between 7 and 8 microns was detected with both a sensitive single-point detector and a hyperspectral imaging camera. Spectral scattering phenomena were discussed and possibilities for hyperspectral imaging at large scattering angles were outlined.

  10. The multi-angle view of MISR detects oil slicks under sun glitter conditions

    Microsoft Academic Search

    Guillem Chust; Yolanda Sagarminaga

    2007-01-01

    We tested the use of the Multi-angle Imaging SpectroRadiometer (MISR) for detecting oil spills in Lake Maracaibo, Venezuela, that were caused by a series of accidents between December 2002 and March 2003. The MISR sensor, onboard the Terra satellite, utilises nine cameras pointed at fixed angles ranging from nadir to ±70.5°. Based upon the Bidirectional Reflectance Factor, a contrast

  11. Laser angle measurement system

    NASA Technical Reports Server (NTRS)

    Pond, C. R.; Texeira, P. D.; Wilbert, R. E.

    1980-01-01

    The design and fabrication of a laser angle measurement system is described. The instrument is a fringe counting interferometer that monitors the pitch attitude of a model in a wind tunnel. A laser source and detector are mounted above the model. Interference fringes are generated by a small passive element on the model. The fringe count is accumulated and displayed by a processor in the wind tunnel control room. Optical and electrical schematics, system maintenance and operation procedures are included, and the results of a demonstration test are given.

  12. Shapes and Angles

    NSDL National Science Digital Library

    NASA

    2012-05-08

    In this activity (page 7 of PDF), learners will identify the general two-dimensional geometric shape of the uppermost cross section of an impact crater. They will also draw connections between the general two-dimensional geometric shape of an impact crater and the projectile's angle of impact. There are two versions of this activity: Challenge, where students construct a launcher and create their own craters; and Non-Challenge where students analyze pictures of craters. The Moon Math: Craters! guide follows a 5E approach, applying concepts of geometry, modeling, data analysis to the NASA lunar spacecraft mission, LCROSS.

  13. Texture Components and Euler Angles

    E-print Network

    Rollett, Anthony D.

    Texture Components and Euler Angles. 27-750 Texture, Microstructure. Covers texture components and their associated Euler angles. The overall aim is to be able to describe a texture component by a single point (in orientation space).

  14. An attentive multi-camera system

    NASA Astrophysics Data System (ADS)

    Napoletano, Paolo; Tisato, Francesco

    2014-03-01

    Intelligent multi-camera systems that integrate computer vision algorithms are not error free, and thus both false positive and false negative detections need to be revised by a specialized human operator. Traditional multi-camera systems usually include a control center with a wall of monitors displaying videos from each camera of the network. Nevertheless, as the number of cameras increases, switching from one camera to another becomes hard for a human operator. In this work we propose a new method that dynamically selects and displays the content of a video camera from all the available contents in the multi-camera system. The proposed method is based on a computational model of human visual attention that integrates top-down and bottom-up cues. We believe that this is the first work that tries to use a model of human visual attention for the dynamic selection of the camera view of a multi-camera system. The proposed method has been evaluated in a given scenario and has demonstrated its effectiveness with respect to other methods and a manually generated ground truth. The effectiveness has been evaluated in terms of the number of correct best views generated by the method with respect to the camera views manually generated by a human operator.

  15. Radiometric calibration for MWIR cameras

    NASA Astrophysics Data System (ADS)

    Yang, Hyunjin; Chun, Joohwan; Seo, Doo Chun; Yang, Jiyeon

    2012-06-01

    Korean Multi-purpose Satellite-3A (KOMPSAT-3A), which weighs about 1,000 kg, is scheduled to be launched in 2013 and will be located in a sun-synchronous orbit (SSO) at 530 km altitude. This is Korea's first satellite to orbit with a mid-wave infrared (MWIR) image sensor, which is currently being developed at Korea Aerospace Research Institute (KARI). The missions envisioned include forest fire surveillance, measurement of the ocean surface temperature, national defense and crop harvest estimation. In this paper, we explain the MWIR scene generation software and atmospheric compensation techniques for the infrared (IR) camera that we are currently developing. The MWIR scene generation software we have developed takes into account sky thermal emission, path emission, target emission, sky solar scattering and ground reflection based on MODTRAN data. This software will be used for generating the radiation image at the satellite camera, which requires an atmospheric compensation algorithm and validation of the accuracy of the temperature obtained in our result. The image visibility restoration algorithm is a method for removing the effect of the atmosphere between the camera and an object. This algorithm works between the satellite and the Earth to predict the object temperature contaminated by the Earth's atmosphere and solar radiation. Commonly, to compensate for the atmospheric effect, software such as MODTRAN is used for modeling the atmosphere. Our algorithm does not require additional software to obtain the surface temperature. However, it needs adjustment of the visibility restoration parameters, and the precision of the result still needs to be studied.
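The emission terms in such a scene model are built on blackbody radiance. As plain physics background (not the authors' MODTRAN-based code), the Planck spectral radiance in the MWIR band can be computed as:

```python
import math

# Planck spectral radiance of a blackbody, the basic ingredient of the
# target/path/sky emission terms in an MWIR scene model.
H, C, KB = 6.62607015e-34, 2.99792458e8, 1.380649e-23  # SI constants

def planck_radiance(wavelength_m, temp_k):
    """Spectral radiance [W / (m^2 sr m)] of a blackbody."""
    a = 2.0 * H * C ** 2 / wavelength_m ** 5
    return a / (math.exp(H * C / (wavelength_m * KB * temp_k)) - 1.0)

# Warmer scene pixels are brighter in the MWIR band (4 um shown here)
print(planck_radiance(4e-6, 300.0) < planck_radiance(4e-6, 320.0))   # True
```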

  16. Optimising camera traps for monitoring small mammals.

    PubMed

    Glen, Alistair S; Cockburn, Stuart; Nichols, Margaret; Ekanayake, Jagath; Warburton, Bruce

    2013-01-01

    Practical techniques are required to monitor invasive animals, which are often cryptic and occur at low density. Camera traps have potential for this purpose, but may have problems detecting and identifying small species. A further challenge is how to standardise the size of each camera's field of view so capture rates are comparable between different places and times. We investigated the optimal specifications for a low-cost camera trap for small mammals. The factors tested were 1) trigger speed, 2) passive infrared vs. microwave sensor, 3) white vs. infrared flash, and 4) still photographs vs. video. We also tested a new approach to standardise each camera's field of view. We compared the success rates of four camera trap designs in detecting and taking recognisable photographs of captive stoats (Mustela erminea), feral cats (Felis catus) and hedgehogs (Erinaceus europaeus). Trigger speeds of 0.2-2.1 s captured photographs of all three target species unless the animal was running at high speed. The camera with a microwave sensor was prone to false triggers, and often failed to trigger when an animal moved in front of it. A white flash produced photographs that were more readily identified to species than those obtained under infrared light. However, a white flash may be more likely to frighten target animals, potentially affecting detection probabilities. Video footage achieved similar success rates to still cameras but required more processing time and computer memory. Placing two camera traps side by side achieved a higher success rate than using a single camera. Camera traps show considerable promise for monitoring invasive mammal control operations. Further research should address how best to standardise the size of each camera's field of view, maximise the probability that an animal encountering a camera trap will be detected, and eliminate visible or audible cues emitted by camera traps. PMID:23840790

  17. Second-Generation Multi-Angle Imaging Spectroradiometer

    NASA Technical Reports Server (NTRS)

    Macenka, Steven; Hovland, Larry; Preston, Daniel; Zellers, Brian; Downing, Kevin

    2004-01-01

    A report discusses an early phase in the development of the MISR-2, a second, improved version of the Multi-angle Imaging SpectroRadiometer (MISR), which has been in orbit around the Earth aboard NASA's Terra spacecraft since 1999. Like the MISR, the MISR-2 would contain a pushbroom array of nine charge-coupled-device (CCD) cameras, one aimed at the nadir and the others aimed at different angles sideways from the nadir. The major improvements embodied in the MISR-2 would be the following: A new folded-reflective-optics design would render the MISR-2 only a third as massive as the MISR. Smaller filters and electronic circuits would enable a reduction in volume to a sixth of that of the MISR. The MISR-2 would generate images in two infrared spectral bands in addition to the blue, green, red, and near-infrared spectral bands of the MISR. Miniature polarization filters would be incorporated to add a polarization-sensing capability. Calibration would be performed nonintrusively by use of a gimbaled tenth camera. The main accomplishment thus far has been the construction of an extremely compact all-reflective-optics CCD camera to demonstrate feasibility.

  18. Triangles: Finding Interior Angle Measures

    NSDL National Science Digital Library

    2012-11-25

    In this lesson plan, students will start with a hands-on activity and then experiment with a GeoGebra-based computer model to investigate and discover the Triangle Angle Sum Theorem. Then they will use the Triangle Angle Sum Theorem to write and solve equations and find missing angle measures in a variety of examples.

  19. Synchronous generator load angle estimation

    Microsoft Academic Search

    H. Cucek; D. Sumina; N. Svigir

    2010-01-01

    This paper proposes a load angle estimation method for synchronous generators. The estimation method is based on the synchronous generator's voltage-current vector diagram and the parameters of the generator, transformer and transmission lines. In addition, measurement of the load angle is presented. The estimation results were compared with the measured ones. The estimation method gives satisfactory accuracy for load angles
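For a round-rotor machine the vector-diagram construction reduces to taking the angle of the internal EMF E = V + jXI relative to the terminal voltage. The per-unit values below are illustrative assumptions, and a practical estimator such as the paper's must also fold in the transformer and line impedances.

```python
import cmath
import math

# Vector-diagram load angle sketch (round-rotor machine, per-unit quantities).
# All numeric values here are illustrative assumptions.
def load_angle(p, q, v, x):
    """Load angle [rad] from active/reactive power, terminal voltage, reactance."""
    i = (p - 1j * q) / v        # current phasor with V on the real axis (S = V I*)
    e = v + 1j * x * i          # internal EMF behind the synchronous reactance
    return cmath.phase(e)

delta = load_angle(p=0.8, q=0.2, v=1.0, x=1.2)
print(round(math.degrees(delta), 2))   # ~37.7 degrees for this operating point
```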

  20. A grazing incidence x-ray streak camera for ultrafast, single-shot measurements

    SciTech Connect

    Feng, Jun; Engelhorn, K.; Cho, B.I.; Lee, H.J.; Greaves, M.; Weber, C.P.; Falcone, R.W.; Padmore, H. A.; Heimann, P.A.

    2010-02-18

    An ultrafast x-ray streak camera has been realized using a grazing incidence reflection photocathode. X-rays are incident on a gold photocathode at a grazing angle of 20 degrees and photoemitted electrons are focused by a large aperture magnetic solenoid lens. The streak camera has high quantum efficiency, 600 fs temporal resolution, and a 6 mm imaging length in the spectral direction. Its single-shot capability eliminates temporal smearing due to sweep jitter, and allows recording of the ultrafast dynamics of samples that undergo non-reversible changes.

  1. Advantages of improved timing accuracy in PET cameras using LSOscintillator

    SciTech Connect

    Moses, William W.

    2002-12-02

    PET scanners based on LSO have the potential for significantly better coincidence timing resolution than the 6 ns fwhm typically achieved with BGO. This study analyzes the performance enhancements made possible by improved timing as a function of the coincidence time resolution. If 500 ps fwhm coincidence timing resolution can be achieved in a complete PET camera, the following four benefits can be realized for whole-body FDG imaging: 1) The random event rate can be reduced by using a narrower coincidence timing window, increasing the peak NECR by ~50 percent. 2) Using time-of-flight in the reconstruction algorithm will reduce the noise variance by a factor of 5. 3) Emission and transmission data can be acquired simultaneously, reducing the total scan time. 4) Axial blurring can be reduced by using time-of-flight to determine the correct axial plane that each event originated from. While time-of-flight was extensively studied in the 1980's, practical factors limited its effectiveness at that time and little attention has been paid to timing in PET since then. As these potential improvements are substantial and the advent of LSO PET cameras gives us the means to obtain them without other sacrifices, efforts to improve PET timing should resume after their long dormancy.
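The time-of-flight benefit follows from a back-of-envelope localization argument: a coincidence timing resolution Δt localizes each event to Δx = cΔt/2 along the line of response, and the noise-variance reduction is roughly D/Δx for an effective patient diameter D (the 0.40 m used here is an assumed illustrative value):

```python
# Back-of-envelope check of the quoted time-of-flight variance reduction.
C = 2.99792458e8        # speed of light [m/s]
coinc_fwhm = 500e-12    # coincidence timing resolution [s]
D = 0.40                # assumed effective patient diameter [m]

delta_x = C * coinc_fwhm / 2.0   # positional uncertainty along the LOR [m]
reduction = D / delta_x          # approximate noise-variance reduction factor
print(round(delta_x * 100, 1))   # ~7.5 cm localization
print(round(reduction, 1))       # ~5.3, consistent with "a factor of 5"
```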

  2. Electromechanical Model of Electrically Actuated Narrow Microbeams

    Microsoft Academic Search

    Romesh C. Batra; Maurizio Porfiri; Davide Spinello

    2006-01-01

    A consistent one-dimensional distributed electromechanical model of an electrically actuated narrow microbeam with width/height ratio between 0.5 and 2.0 is derived, and the needed pull-in parameters are extracted with different methods. The model accounts for the position-dependent electrostatic loading, the fringing field effects due to both the finite width and the finite thickness of the microbeam, the mid-plane stretching, the mechanical distributed stiffness,

  3. Thermal tuning On narrow linewidth fiber laser

    Microsoft Academic Search

    Peiqi Han; Tianshan Liu; Xincun Gao; Shiwei Ren

    2010-01-01

    At present, much work has been dedicated to high-speed, large-capacity optical fiber communication systems. Studies have shown that optical wavelength division multiplexing (WDM) technology is an effective means of increasing channel capacity. Tunable lasers have very important applications in high-speed, large-capacity optical communications and distributed sensing, as they can provide a narrow-linewidth, tunable source for high-speed

  4. Challenges of low-angle metal surface (crosshead) inspection

    NASA Astrophysics Data System (ADS)

    Sacha, Jaroslaw P.; Luster, Spencer D.; Shabestari, Behrouz N.; Miller, John W. V.; Hamel, P.

    1999-08-01

    The Crosshead Inspection System, CIS, utilizes machine vision technology for on-line inspection of a diesel engine component - a crosshead. The system includes three functional modules. 1) Part handling subsystem - presents parts for inspection and accepts or rejects them based on signals from the image analysis software. 2) Image acquisition hardware - optics, light sources and two video cameras collect images of inspected parts. 3) Image analysis software - analyzes the images and sends pass/fail decision signals to the handling subsystem. The CIS acquires and inspects two images of each part. The upper camera generates an image of the part's top surface, while the lower camera generates an image of the so-called 'pockets' of the lower half. Both images are acquired when a part-in-place signal is received from the handling system. The surface inspection camera and light source are positioned at opposed low angles relative to the surface. Irregularities manifest themselves as shadows in the surface image. These shadows are detected, measured and compared to user specifications. The pocket inspection detects the presence of tumbler stones. The contrast of these stones is enhanced with circularly polarized lighting and imaging. The graphical user interface of the CIS provides easy setup and debugging of the image processing algorithms. A database module collects, archives and presents part inspection statistics to the user. The inspection rate is sixty parts per minute.

  5. Electron correlation in narrow band systems

    NASA Astrophysics Data System (ADS)

    Kishore, R.

    1983-03-01

    The effects of electron correlations in narrow bands, such as the d(f) bands in the transition (rare earth) metals and their compounds and the impurity bands in doped semiconductors, are studied. The narrow band systems are described by the Hubbard Hamiltonian. By proposing a local self-energy for the interacting electrons, we find that our results are exact in both the atomic and band limits and reduce to the Hartree-Fock results as U/delta → 0, where U is the intra-atomic Coulomb interaction and delta is the bandwidth of the noninteracting electrons. For the Lorentzian form of the density of states of the noninteracting electrons, our approximation turns out to be equivalent to the third Hubbard approximation. A simple argument, based on the mean free path obtained from the imaginary part of the self-energy, shows how the electron correlations can give rise to a discontinuous metal-nonmetal transition as proposed by Mott. The band narrowing and the existence of the satellite below the Fermi energy in Ni, found in photoemission experiments, can also be understood.

  6. Multi-camera calibration based on openCV and multi-view registration

    NASA Astrophysics Data System (ADS)

    Deng, Xiao-ming; Wan, Xiong; Zhang, Zhi-min; Leng, Bi-yan; Lou, Ning-ning; He, Shuai

    2010-10-01

    For multi-camera calibration systems, a method combining OpenCV-based calibration with multi-view registration is proposed. First, using a Zhang's calibration plate (8×8 chessboard pattern), a number of cameras (three industrial-grade CCDs) shoot 9 groups of images from different angles, and OpenCV is used to quickly calibrate the intrinsic parameters of each camera. Secondly, based on the corresponding relationship between the camera views, the computation of the rotation matrix and translation matrix is formulated as a constrained optimization problem. According to the Kuhn-Tucker theorem and the properties of the derivative of a matrix-valued function, the formulae for the rotation matrix and translation matrix are deduced using a singular value decomposition algorithm. Afterwards an iterative method is utilized to obtain the entire coordinate transformation of pair-wise views; thus the precise multi-view registration can be conveniently achieved and the relative positions of the cameras (the extrinsic parameters) obtained. Experimental results show that the method is practical for multi-camera calibration.
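The pair-wise registration step has a well-known closed form via the SVD (the Kabsch solution): given matched 3-D points seen in two views, it recovers the rotation R and translation t minimizing ||Rp + t − q||. The sketch below illustrates it on synthetic matched points; it is not the paper's constrained-optimization derivation.

```python
import numpy as np

# SVD (Kabsch) rigid registration of matched 3-D point sets.
def register(p, q):
    """p, q: (N, 3) matched point sets. Returns R (3x3), t (3,) with q ~ R p + t."""
    pc, qc = p.mean(axis=0), q.mean(axis=0)
    h = (p - pc).T @ (q - qc)                 # cross-covariance matrix
    u, _, vt = np.linalg.svd(h)
    d = np.sign(np.linalg.det(vt.T @ u.T))    # guard against reflections
    r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
    return r, qc - r @ pc

# Check on synthetic data: rotate/translate a point cloud and recover the pose
rng = np.random.default_rng(0)
p = rng.normal(size=(20, 3))
ang = np.radians(30.0)
r_true = np.array([[np.cos(ang), -np.sin(ang), 0.0],
                   [np.sin(ang),  np.cos(ang), 0.0],
                   [0.0,          0.0,         1.0]])
t_true = np.array([0.5, -1.0, 2.0])
q = p @ r_true.T + t_true

r_est, t_est = register(p, q)
print(np.allclose(r_est, r_true), np.allclose(t_est, t_true))   # True True
```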

  7. Dynamic-angle spinning and double rotation of quadrupolar nuclei

    SciTech Connect

    Mueller, K.T. (Lawrence Berkeley Lab., CA (United States) California Univ., Berkeley, CA (United States). Dept. of Chemistry)

    1991-07-01

    Nuclear magnetic resonance (NMR) spectroscopy of quadrupolar nuclei is complicated by the coupling of the electric quadrupole moment of the nucleus to local variations in the electric field. The quadrupolar interaction is a useful source of information about local molecular structure in solids, but it tends to broaden resonance lines causing crowding and overlap in NMR spectra. Magic-angle spinning, which is routinely used to produce high resolution spectra of spin-{1/2} nuclei like carbon-13 and silicon-29, is incapable of fully narrowing resonances from quadrupolar nuclei when anisotropic second-order quadrupolar interactions are present. Two new sample-spinning techniques are introduced here that completely average the second-order quadrupolar coupling. Narrow resonance lines are obtained and individual resonances from distinct nuclear sites are identified. In dynamic-angle spinning (DAS) a rotor containing a powdered sample is reoriented between discrete angles with respect to high magnetic field. Evolution under anisotropic interactions at the different angles cancels, leaving only the isotropic evolution of the spin system. In the second technique, double rotation (DOR), a small rotor spins within a larger rotor so that the sample traces out a complicated trajectory in space. The relative orientation of the rotors and the orientation of the larger rotor within the magnetic field are selected to average both first- and second-order anisotropic broadening. The theory of quadrupolar interactions, coherent averaging theory, and motional narrowing by sample reorientation are reviewed with emphasis on the chemical shift anisotropy and second-order quadrupolar interactions experienced by half-odd integer spin quadrupolar nuclei. The DAS and DOR techniques are introduced and illustrated with application to common quadrupolar systems such as sodium-23 and oxygen-17 nuclei in solids.
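The spinning angles involved can be checked numerically: magic-angle spinning uses the root of the second Legendre polynomial P2(cos θ), while DOR nests two rotors at angles that zero P2 and P4 simultaneously. These are standard published values, computed here from the polynomial roots rather than taken from this report.

```python
import numpy as np

# Magic angle: root of P2(x) = (3 x^2 - 1) / 2, i.e. cos(theta) = 1/sqrt(3)
magic = np.degrees(np.arccos(1.0 / np.sqrt(3.0)))

# Second DOR angle: a root of P4(x) = (35 x^4 - 30 x^2 + 3) / 8.
# Solving the quadratic in x^2 and taking the larger root:
x2 = (30.0 + np.sqrt(900.0 - 420.0)) / 70.0
inner = np.degrees(np.arccos(np.sqrt(x2)))

print(round(magic, 2))   # 54.74 degrees (MAS, and the outer DOR rotor)
print(round(inner, 2))   # 30.56 degrees (the inner DOR rotor)
```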

  8. Television camera video level control system

    NASA Technical Reports Server (NTRS)

    Kravitz, M.; Freedman, L. A.; Fredd, E. H.; Denef, D. E. (inventors)

    1985-01-01

    A video level control system is provided which generates a normalized video signal for a camera processing circuit. The video level control system includes a lens iris which provides a controlled light signal to a camera tube. The camera tube converts the light signal provided by the lens iris into electrical signals. A feedback circuit, in response to the electrical signals generated by the camera tube, provides feedback signals to the lens iris and the camera tube. This assures that a normalized video signal is provided in a first illumination range. An automatic gain control loop, which is also responsive to the electrical signals generated by the camera tube, operates in tandem with the feedback circuit. This assures that the normalized video signal is maintained in a second illumination range.

  9. Toward the camera rain gauge

    NASA Astrophysics Data System (ADS)

    Allamano, P.; Croci, A.; Laio, F.

    2015-03-01

    We propose a novel technique based on the quantitative detection of rain intensity from images, i.e., from pictures taken in rainy conditions. The method is fully analytical and based on the fundamentals of camera optics. A rigorous statistical framing of the technique allows one to obtain the rain rate estimates in terms of expected values and associated uncertainty. We show that the method can be profitably applied to real rain events, and we obtain promising results with errors of the order of ±25%. A precise quantification of the method's accuracy will require a more systematic and long-term comparison with benchmark measures. The significant step forward with respect to standard rain gauges resides in the possibility to retrieve measures at very high temporal resolution (e.g., 30 measures per minute) at a very low cost. Perspective applications include the possibility to dramatically increase the spatial density of rain observations by exporting the technique to crowdsourced pictures of rain acquired with cameras and smartphones.

  10. The Dark Energy Camera (DECam)

    E-print Network

    Honscheid, K; Abbott, T; Annis, J; Antonik, M; Barcel, M; Bernstein, R; Bigelow, B; Brooks, D; Buckley-Geer, E; Campa, J; Cardiel, L; Castander, F; Castilla, J; Cease, H; Chappa, S; Dede, E; Derylo, G; Diehl, T; Doel, P; De Vicente, J; Eiting, J; Estrada, J; Finley, D; Flaugher, B; Gaztañaga, E; Gerdes, D; Gladders, M; Guarino, V; Gutíerrez, G; Hamilton, J; Haney, M; Holland, S; Huffman, D; Karliner, I; Kau, D; Kent, S; Kozlovsky, M; Kubik, D; Kühn, K; Kuhlmann, S; Kuk, K; Leger, F; Lin, H; Martínez, G; Martínez, M; Merritt, W; Mohr, J; Moore, P; Moore, T; Nord, B; Ogando, R; Olsen, J; Onal, B; Peoples, J; Qian, T; Roe, N; Sánchez, E; Scarpine, V; Schmidt, R; Schmitt, R; Schubnell, M; Schultz, K; Selen, M; Shaw, T; Simaitis, V; Slaughter, J; Smith, C; Spinka, H; Stefanik, A; Stuermer, W; Talaga, R; Tarle, G; Thaler, J; Tucker, D; Walker, A; Worswick, S; Zhao, A

    2008-01-01

    In this paper we describe the Dark Energy Camera (DECam), which will be the primary instrument used in the Dark Energy Survey. DECam will be a 3 sq. deg. mosaic camera mounted at the prime focus of the Blanco 4m telescope at the Cerro Tololo Inter-American Observatory (CTIO). It consists of a large mosaic CCD focal plane, a five element optical corrector, five filters (g,r,i,z,Y), a modern data acquisition and control system and the associated infrastructure for operation in the prime focus cage. The focal plane includes 62 2K x 4K CCD modules (0.27"/pixel) arranged in a hexagon inscribed within the roughly 2.2 degree diameter field of view and 12 smaller 2K x 2K CCDs for guiding, focus and alignment. The CCDs will be 250 micron thick fully-depleted CCDs that have been developed at the Lawrence Berkeley National Laboratory (LBNL). Production of the CCDs and fabrication of the optics, mechanical structure, mechanisms, and control system for DECam are underway; delivery of the instrument to CTIO is scheduled ...

  11. Vehicles' Motion Parameters Detection in Intersection Based on Videos of Low View Angle

    Microsoft Academic Search

    Lu Guangquan; Liu Miaomiao; Xia Shaojun; Deng Cheng

    2010-01-01

    By taking Visual C++ and OpenCV as tools and processing video image sequences gathered by one camera at a low view angle, a suitable foreground detection method for the intersection was selected. By improving the blob-tracking system provided by OpenCV, a program was developed to detect the motion parameters of vehicles in the intersection. The motion parameters were smoothed by a Kalman filter. It made the
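
    The smoothing step mentioned above can be sketched independently of OpenCV's blob tracker. Below is a minimal constant-velocity Kalman filter in numpy; the state layout and noise settings are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def kalman_smooth_track(zs, dt=1.0, q=1e-3, r=1.0):
    """Filter noisy 2D centroid measurements (e.g. blob-tracking output)
    with a constant-velocity Kalman filter.
    State: [x, y, vx, vy]; measurement: [x, y]."""
    F = np.eye(4)
    F[0, 2] = F[1, 3] = dt                       # position += velocity*dt
    H = np.zeros((2, 4))
    H[0, 0] = H[1, 1] = 1.0                      # observe position only
    Q = q * np.eye(4)                            # process noise covariance
    R = r * np.eye(2)                            # measurement noise covariance
    x = np.array([zs[0][0], zs[0][1], 0.0, 0.0])
    P = np.eye(4)
    out = []
    for z in zs:
        x = F @ x                                # predict
        P = F @ P @ F.T + Q
        S = H @ P @ H.T + R                      # update
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (np.asarray(z, float) - H @ x)
        P = (np.eye(4) - K @ H) @ P
        out.append(x[:2].copy())
    return np.array(out)
```

    Feeding the per-frame (x, y) blob centroids through this filter yields smoothed positions; q and r trade responsiveness against smoothness.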

  12. Investigating at the Moon With new Eyes: The Lunar Reconnaissance Orbiter Mission Camera (LROC)

    NASA Astrophysics Data System (ADS)

    Hiesinger, H.; Robinson, M. S.; McEwen, A. S.; Turtle, E. P.; Eliason, E. M.; Jolliff, B. L.; Malin, M. C.; Thomas, P. C.

    The Lunar Reconnaissance Orbiter Mission Camera (LROC) H. Hiesinger (1,2), M.S. Robinson (3), A.S. McEwen (4), E.P. Turtle (4), E.M. Eliason (4), B.L. Jolliff (5), M.C. Malin (6), and P.C. Thomas (7) (1) Brown Univ., Dept. of Geological Sciences, Providence RI 02912, Harald_Hiesinger@brown.edu, (2) Westfaelische Wilhelms-University, (3) Northwestern Univ., (4) LPL, Univ. of Arizona, (5) Washington Univ., (6) Malin Space Science Systems, (7) Cornell Univ. The Lunar Reconnaissance Orbiter (LRO) mission is scheduled for launch in October 2008 as a first step to return humans to the Moon by 2018. The main goals of the Lunar Reconnaissance Orbiter Camera (LROC) are to: 1) assess meter and smaller- scale features for safety analyses for potential lunar landing sites near polar resources, and elsewhere on the Moon; and 2) acquire multi-temporal images of the poles to characterize the polar illumination environment (100 m scale), identifying regions of permanent shadow and permanent or near permanent illumination over a full lunar year. In addition, LROC will return six high-value datasets such as 1) meter-scale maps of regions of permanent or near permanent illumination of polar massifs; 2) high resolution topography through stereogrammetric and photometric stereo analyses for potential landing sites; 3) a global multispectral map in 7 wavelengths (300-680 nm) to characterize lunar resources, in particular ilmenite; 4) a global 100-m/pixel basemap with incidence angles (60-80 degree) favorable for morphologic interpretations; 5) images of a variety of geologic units at sub-meter resolution to investigate physical properties and regolith variability; and 6) meter-scale coverage overlapping with Apollo Panoramic images (1-2 m/pixel) to document the number of small impacts since 1971-1972, to estimate hazards for future surface operations. 
LROC consists of two narrow-angle cameras (NACs) which will provide 0.5-m scale panchromatic images over a 5-km swath, a wide-angle camera (WAC) to acquire images at about 100 m/pixel in seven color bands over a 100-km swath, and a common Sequence and Compressor System (SCS). Each NAC has a 700-mm-focal-length optic that images onto a 5000-pixel CCD line-array, providing a cross-track field-of-view (FOV) of 2.86 degree. The NAC readout noise is better than 100 e-, and the data are sampled at 12 bits. Its internal buffer holds 256 MB of uncompressed data, enough for a full-swath image 25-km long or a 2x2 binned image 100-km long. The WAC has two 6-mm-focal-length lenses imaging onto the same 1000 x 1000 pixel, electronically shuttered CCD area-array, one imaging in the visible/near IR, and the other in the UV. Each has a cross-track FOV of 90 degree. From the nominal 50-km orbit, the WAC will have a resolution of 100 m/pixel in the visible, and a swath width of ˜100 km. The seven-band color capability of the WAC is achieved by color filters mounted directly over the detector, providing different sections of the CCD with different filters [1]. The readout noise is less than 40 e-, and, as with the NAC, pixel values are digitized to 12 bits and may be subsequently converted to 8-bit values. The total mass of the LROC system is about 12 kg; the total LROC power consumption averages 22 W (30 W peak). Assuming a downlink with lossless compression, LRO will produce a total of 20 TeraBytes (TB) of raw data. Production of higher-level data products will result in a total of 70 TB for Planetary Data System (PDS) archiving, 100 times larger than any previous mission. [1] Malin et al., JGR, 106, 17651-17672, 2001.
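
    As a quick consistency check of the numbers quoted above, the 256 MB NAC buffer matches both imaging modes if one byte is stored per pixel after the 12-to-8-bit conversion (an assumption; the abstract does not state the stored bit depth):

```python
# 5000-pixel lines at 0.5 m/pixel; a full-swath image is 25 km long.
pixels_per_line = 5000
lines_full_res = int(25_000 / 0.5)          # 50,000 lines
full_res_bytes = pixels_per_line * lines_full_res

# 2x2 binning halves the line width and doubles the ground sample,
# allowing a 100-km-long image in the same buffer.
lines_binned = int(100_000 / 1.0)           # 100,000 lines at 1 m/pixel
binned_bytes = (pixels_per_line // 2) * lines_binned

buffer_bytes = 256 * 1024**2
print(full_res_bytes, binned_bytes, buffer_bytes)  # 250000000 250000000 268435456
```

    Both modes come to 250 MB, just under the 256 MB buffer, consistent with the quoted image lengths.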

  13. Stationary Camera Aims And Zooms Electronically

    NASA Technical Reports Server (NTRS)

    Zimmermann, Steven D.

    1994-01-01

    Microprocessors select, correct, and orient portions of hemispherical field of view. Video camera pans, tilts, zooms, and provides rotations of images of objects of field of view, all without moving parts. Used for surveillance in areas where movement of camera conspicuous or constrained by obstructions. Also used for closeup tracking of multiple objects in field of view or to break image into sectors for simultaneous viewing, thereby replacing several cameras.

  14. Electrostatic camera system functional design study

    NASA Technical Reports Server (NTRS)

    Botticelli, R. A.; Cook, F. J.; Moore, R. F.

    1972-01-01

    A functional design study for an electrostatic camera system for application to planetary missions is presented. The electrostatic camera can produce and store a large number of pictures and provide for transmission of the stored information at arbitrary times after exposure. Preliminary configuration drawings and circuit diagrams for the system are illustrated. The camera system's size, weight, power consumption, and performance are characterized. Tradeoffs between system weight, power, and storage capacity are identified.

  15. Wide-Brightness-Range Video Camera

    NASA Technical Reports Server (NTRS)

    Craig, G. D.

    1986-01-01

    Television camera selectively attenuates bright areas in scene without affecting dim areas. Camera views scenes containing extremes of light and dark without overexposing light areas and underexposing dark ones. Camera uses liquid-crystal light valve for selective attenuation. Feedback cathode-ray tube locally alters reflection characteristics of liquid-crystal light valve. Results in point-to-point optoelectronic automatic gain control to enable viewing of both dark and very bright areas within scene.

  16. A Flexible New Technique for Camera Calibration

    Microsoft Academic Search

    Zhengyou Zhang

    2000-01-01

    Abstract—We propose a flexible new technique to easily calibrate a camera. It only requires the camera to observe a planar pattern shown at a few (at least two) different orientations. Either the camera or the planar pattern can be freely moved. The motion need not be known. Radial lens distortion is modeled. The proposed procedure consists of a closed-form solution, followed by a nonlinear refinement based on the maximum likelihood
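
    Zhang's closed-form solution begins with a homography estimated for each view of the planar pattern. A minimal numpy sketch of that first step via the direct linear transform (the intrinsic-parameter extraction and the maximum-likelihood refinement are omitted):

```python
import numpy as np

def homography_dlt(pts_plane, pts_image):
    """Direct Linear Transform estimate of the 3x3 homography mapping
    planar pattern points (x, y) to image points (u, v). This is the
    per-view building block of Zhang-style calibration."""
    A = []
    for (x, y), (u, v) in zip(pts_plane, pts_image):
        # Each correspondence contributes two rows of the constraint A h = 0.
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The solution is the right singular vector with smallest singular value.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]
```

    At least four non-collinear correspondences are needed; in practice many corners of the planar pattern are used per view.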

  17. Equilibrium contact angle or the most-stable contact angle?

    PubMed

    Montes Ruiz-Cabello, F J; Rodríguez-Valverde, M A; Cabrerizo-Vílchez, M A

    2014-04-01

    It is well established that the equilibrium contact angle in a thermodynamic framework is an "unattainable" contact angle. Instead, the most-stable contact angle obtained from mechanical stimuli of the system is indeed experimentally accessible. Monitoring the susceptibility of a sessile drop to a mechanical stimulus makes it possible to identify the most stable drop configuration within the practical range of contact angle hysteresis. Two different stimuli may be used with sessile drops: mechanical vibration and tilting. The most stable drop against vibration should reveal the unchanging contact angle, while against gravity it should reveal the highest resistance to sliding down. After the corresponding mechanical stimulus, once the excited drop configuration is examined, the focus will be on the contact angle of the initial drop configuration. This methodology requires a significant mapping of the static drop configurations with different stable contact angles. The most-stable contact angle, together with the advancing and receding contact angles, completes the description of physically realizable configurations of a solid-liquid system. Since the most-stable contact angle is energetically significant, it may be used in the Wenzel, Cassie or Cassie-Baxter equations accordingly, or for surface energy evaluation. PMID:24140073

  18. Cerebellopontine Angle Lipoma

    PubMed Central

    Schuhmann, Martin U.; Lüdemann, Wolf O.; Schreiber, Hartwig; Samii, Madjid

    1997-01-01

    Intracranial lipomas in an infratentorial and extra-axial location are extremely rare. The presented case of an extensive lipoma of the cerebellopontine angle (CPA) represents 0.05% of all CPA tumors operated on in our department from 1978 to 1996. The lipoma constitutes an important differential diagnosis because the clinical management differs significantly from other CPA lesions. The clinical presentation and management of the presented case are analyzed in comparison to all previously described cases of CPA lipomas. The etiology and the radiological features of CPA lipomas are reviewed and discussed. CPA lipomas are maldevelopmental lesions that may cause slowly progressive symptoms. Neuroradiology enables a reliable preoperative diagnosis. Attempts of complete lipoma resection usually result in severe neurological deficits. Therefore, we recommend a conservative approach in managing these patients. Limited surgery is indicated if the patient has an associated vascular compression syndrome or suffers from disabling vertigo. ImagesFigure 1Figure 2Figure 3Figure 4 PMID:17171031

  19. True-color night vision cameras

    NASA Astrophysics Data System (ADS)

    Kriesel, Jason; Gat, Nahum

    2007-04-01

    This paper describes True-Color Night Vision cameras that are sensitive to the visible to near-infrared (V-NIR) portion of the spectrum allowing for the "true-color" of scenes and objects to be displayed and recorded under low-light-level conditions. As compared to traditional monochrome (gray or green) night vision imagery, color imagery has increased information content and has proven to enable better situational awareness, faster response time, and more accurate target identification. Urban combat environments, where rapid situational awareness is vital, and marine operations, where there is inherent information in the color of markings and lights, are example applications that can benefit from True-Color Night Vision technology. Two different prototype cameras, employing two different true-color night vision technological approaches, are described and compared in this paper. One camera uses a fast-switching liquid crystal filter in front of a custom Gen-III image intensified camera, and the second camera is based around an EMCCD sensor with a mosaic filter applied directly to the sensor. In addition to visible light, both cameras utilize NIR to (1) increase the signal and (2) enable the viewing of laser aiming devices. The performance of the true-color cameras, along with the performance of standard (monochrome) night vision cameras, are reported and compared under various operating conditions in the lab and the field. In addition to subjective criterion, figures of merit designed specifically for the objective assessment of such cameras are used in this analysis.

  20. Mirror pendulum pose measurement by camera calibration

    NASA Astrophysics Data System (ADS)

    Li, Lulu; Zhao, Wenchuan; Wu, Fan; Liu, Yong

    2014-09-01

    A simple method for planar mirror pendulum pose measurement is proposed. The method only needs an LCD screen and a CCD camera. The LCD screen displays calibration patterns, and the virtual images (VIs) reflected by the mirror are captured by the CCD camera. By camera calibration, the pose relationships between the camera and VI coordinate systems can be determined. The pendulum poses of the mirror are then obtained according to coordinate transformation and the reflection principle. This method is simple and convenient, and has great application potential in mirror pendulum pose measurement.
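
    The reflection principle the method relies on can be made concrete: a planar mirror with unit normal n and offset d maps every real point to its virtual image through a Householder reflection. A small illustrative sketch, not the authors' code:

```python
import numpy as np

def reflect(points, n, d):
    """Reflect 3-D points across the mirror plane n . x = d (unit normal n).
    The virtual image seen through the mirror is the real pattern mapped
    by this rigid (orientation-reversing) transform."""
    n = np.asarray(n, float) / np.linalg.norm(n)
    R = np.eye(3) - 2.0 * np.outer(n, n)   # Householder reflection matrix
    t = 2.0 * d * n                        # translation along the normal
    return np.asarray(points, float) @ R.T + t
```

    Points on the mirror plane are fixed, and reflecting twice recovers the original points, which is why the mirror pose can be solved from the camera-to-virtual-image pose.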

  1. [Comparative studies of the width of the anterior chamber angle using echography and gonioscopy].

    PubMed

    Makabe, R

    1989-01-01

    Using a thin B-scan probe, the anterior chamber angle was studied ultrasonographically using a contact eye cup filled with saline. The width of the chamber angle was measured on the display video copy and compared with the gonioscopic findings. Sixty-four eyes with primary glaucoma and 53 nonglaucomatous eyes were examined. In eyes with a gonioscopically narrow angle, the real ultrasonographic width of the chamber angle varied considerably from case to case. In eyes where the chamber angle was not visible by gonioscopy, it was not always closed. The correlation between the width of the angle and the depth of the anterior chamber was weak. There was often a plateau iris, which was demonstrated clearly by B-scan ultrasonography. PMID:2651788

  2. Angle-sensitive pixels: a new paradigm for low-power, low-cost 2D and 3D sensing

    NASA Astrophysics Data System (ADS)

    Wang, Albert; Hemami, Sheila S.; Molnar, Alyosha

    2012-03-01

    Angle-sensitive pixels are micro-scale devices which capture information about both the intensity and incident angle of the light they see. These pixels acquire a richer description of incident light than conventional intensity-sensitive pixels. We provide a mathematical framework for analyzing the imaging capability of these pixels and demonstrate that they provide a response similar to one component of a 2D Hartley transform in angle, with a distinct frequency and orientation. By using several different kinds of pixels throughout an image sensor, we obtain a full, low-order Hartley transform of local angle, which is mapped to a local, spatial Hartley transform by a conventional camera lens. Based on these principles, we demonstrate a light-field camera using an image sensor composed of angle-sensitive pixels and a conventional camera lens. Single images captured by our camera can be directly used for both computational refocus for enhanced depth of field and depth map generation. The algorithms used for these tasks are simple and take advantage of the transform-based nature of angle-sensitive-pixel-based image capture.
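
    The 2D discrete Hartley transform referred to above uses the cas(θ) = cos(θ) + sin(θ) kernel and can be computed from the FFT. A sketch (illustrative; the paper's pixels sample only a few low-order components of the angular spectrum):

```python
import numpy as np

def hartley_2d(img):
    """2D discrete Hartley transform,
    H[u, v] = sum over (x, y) of img[x, y] * cas(2*pi*(u*x/N + v*y/M)),
    computed via the DFT identity H = Re(F) - Im(F)."""
    F = np.fft.fft2(img)
    return F.real - F.imag
```

    For a real-valued image the Hartley transform is itself real, which is one reason it is convenient as a description of an analog pixel response.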

  3. Theodolite with CCD Camera for Safe Measurement of Laser-Beam Pointing

    NASA Technical Reports Server (NTRS)

    Crooke, Julie A.

    2003-01-01

    The simple addition of a charge-coupled-device (CCD) camera to a theodolite makes it safe to measure the pointing direction of a laser beam. The present state of the art requires this to be a custom addition because theodolites are manufactured without CCD cameras as standard or even optional equipment. A theodolite is an alignment telescope equipped with mechanisms to measure the azimuth and elevation angles to the sub-arcsecond level. When measuring the angular pointing direction of a Class II laser with a theodolite, one could place a calculated amount of neutral density (ND) filters in front of the theodolite's telescope. One could then safely view and measure the laser's boresight looking through the theodolite's telescope without great risk to one's eyes. This method for a Class II visible wavelength laser is not acceptable even to attempt for a Class IV laser, and is not applicable for an infrared (IR) laser. If one chooses insufficient attenuation or forgets to use the filters, then looking at the laser beam through the theodolite could cause instant blindness. The CCD camera is already commercially available. It is a small, inexpensive, black-and-white CCD circuit-board-level camera. An interface adaptor was designed and fabricated to mount the camera onto the eyepiece of the specific theodolite's viewing telescope. Other equipment needed for operation of the camera are power supplies, cables, and a black-and-white television monitor. The picture displayed on the monitor is equivalent to what one would see when looking directly through the theodolite. Again, the additional advantage afforded by a cheap black-and-white CCD camera is that it is sensitive to infrared as well as to visible light. Hence, one can use the camera coupled to a theodolite to measure the pointing of an infrared as well as a visible laser.

  4. Narrow Iron Kα Lines in Active Galactic Nuclei: Evolving Populations?

    E-print Network

    Xin-Lin Zhou; Jian-Min Wang

    2004-12-03

    We assemble a sample consisting of 66 active galactic nuclei (AGNs) from the literature and the XMM-Newton archive in order to investigate the origin of the 6.4 keV narrow iron Kα line (NIKAL). The X-ray Baldwin effect of the NIKAL is confirmed in this sample. We find the equivalent width (EW) of the NIKAL is more strongly inversely correlated with the Eddington ratio than with the 2-10 keV X-ray luminosity. Our sample favors an origin from the dusty torus, with the X-ray Baldwin effect caused by the changing opening angle of the dusty torus. The EW-Eddington ratio relation can be derived from a toy model of the dusty torus. If the unification scheme is valid in all AGNs, we can derive the Baldwin effect from the fraction of type II AGNs in the total population given by the Chandra and Hubble deep surveys. Thus the evolution of populations could be reflected by the NIKAL's Baldwin effect.

  5. Mountain glaciers caught on camera

    NASA Astrophysics Data System (ADS)

    Balcerak, Ernie

    2011-12-01

    Many glaciers around the world are melting, and new research is showing some of the dramatic details. Ulyana Horodyskyj, a graduate student at the Cooperative Institute for Research in Environmental Sciences (CIRES), University of Colorado at Boulder, set up cameras to take time-lapse photographs of three lakes on a glacier in Nepal. This allowed her and her colleagues to see the supraglacial lake drain in real time for the first time, making it possible to estimate how much water was involved and how long it took for the lake to drain and refill. Horodyskyj said in a press conference at the AGU Fall Meeting that such observations of supraglacial lakes are valuable because in a warming climate, melting glaciers can lead to formation of supraglacial lakes.

  6. Approximations to camera sensor noise

    NASA Astrophysics Data System (ADS)

    Jin, Xiaodan; Hirakawa, Keigo

    2013-02-01

    Noise is present in all image sensor data. The Poisson distribution is said to model the stochastic nature of the photon arrival process, while it is common to approximate readout/thermal noise by additive white Gaussian noise (AWGN). Other sources of signal-dependent noise such as Fano and quantization noise also contribute to the overall noise profile. Questions remain, however, about how best to model the combined sensor noise. Though additive Gaussian noise with signal-dependent noise variance (SD-AWGN) and Poisson corruption are two widely used models to approximate the actual sensor noise distribution, the justifications given for these models are based on limited evidence. The goal of this paper is to provide a more comprehensive characterization of random noise. We conclude by presenting concrete evidence that the Poisson model is a better approximation to real camera noise than SD-AWGN. We suggest further modifications to the Poisson model that may improve the noise model.
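
    The two competing models are easy to compare numerically. A small simulation (with illustrative parameter choices, not the paper's data) showing that Poisson and SD-AWGN agree in mean and variance but differ in shape at low signal, which is where the distinction matters:

```python
import numpy as np

def simulate(flux, n=200_000, seed=0):
    """Draw n pixel values under the two noise models at mean signal
    `flux` (photo-electrons): Poisson shot noise vs. SD-AWGN, a Gaussian
    whose variance is set equal to the signal level."""
    rng = np.random.default_rng(seed)
    shot = rng.poisson(flux, n).astype(float)            # Poisson model
    sd_awgn = flux + rng.normal(0.0, np.sqrt(flux), n)   # SD-AWGN model
    return shot, sd_awgn

def skewness(x):
    d = x - x.mean()
    return (d ** 3).mean() / d.std() ** 3

# At low flux the models share mean and variance but differ in shape:
shot, gauss = simulate(2.0)
print(shot.var(), gauss.var())           # both close to 2.0
print(skewness(shot), skewness(gauss))   # ~0.7 vs ~0.0
```

    At high flux the Poisson distribution approaches a Gaussian and the two models become hard to distinguish, which is why the question only resolves with careful low-signal characterization.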

  7. Sensitivity of seismic waves to structure: Wide-angle broad-band sensitivity packets

    E-print Network

    Cerveny, Vlastislav

    Sensitivity of seismic waves to structure: wide-angle broad-band sensitivity packets (Ludek Klimes). The sensitivity of seismic waves to structure is represented in the frequency domain as sensitivity beams, and in the time domain as sensitivity packets. The sensitivity packets are mostly represented by the narrow-band Gaussian sensitivity packets studied in the previous paper

  8. Limited Angle Tomography of Sparse Images from Noisy Data using TLS MUSIC Algorithm

    E-print Network

    Yagle, Andrew E.

    Limited-angle tomography reconstructs an image from projections taken over a limited range of angles. It has applications in medical imaging and synthetic aperture radar, where data are available only over a narrow range of angles due to physical constraints. In synthetic aperture radar the data lie in a segment of an annulus in the Fourier plane [2]. Multistatic, spotlight-mode, and strip-map synthetic

  9. ZEN2: a narrow J-band search for z ~ 9 Lyα emitting galaxies directed towards three lensing clusters

    NASA Astrophysics Data System (ADS)

    Willis, J. P.; Courbin, F.; Kneib, J.-P.; Minniti, D.

    2008-03-01

    We present the results of a continuing survey to detect Lyα emitting galaxies at redshifts z ~ 9: the `z equals nine' (ZEN) survey. We have obtained deep VLT Infrared Spectrometer and Array Camera observations in the narrow J-band filter NB119 directed towards three massive lensing clusters: Abell clusters 1689, 1835 and 114. The foreground clusters provide a magnified view of the distant Universe and permit a sensitive test for the presence of very high redshift galaxies. We search for z ~ 9 Lyα emitting galaxies displaying a significant narrow-band excess relative to accompanying J-band observations that remain undetected in Hubble Space Telescope (HST)/Advanced Camera for Surveys (ACS) optical images of each field. No sources consistent with this criterion are detected above the unlensed 90 per cent point-source flux limit of the narrow-band image, F_NB = 3.7 × 10^-18 erg s^-1 cm^-2. To date, the total coverage of the ZEN survey has sampled a volume at z ~ 9 of approximately 1700 comoving Mpc^3 to a Lyα emission luminosity of 10^43 erg s^-1. We conclude by considering the prospects for detecting z ~ 9 Lyα emitting galaxies in light of both observed galaxy properties at z < 7 and simulated populations at z > 7.

  10. Comparative analysis of the narrow resonances

    NASA Astrophysics Data System (ADS)

    Vladimirsky, V. V.; Grigor'ev, V. K.; Erofeev, I. A.; Erofeeva, O. N.; Zaitsev, A. P.; Katinov, Yu. V.; Lisin, V. I.; Luzin, V. N.; Nozdrachev, V. N.; Sokolovsky, V. V.; Fadeeva, E. A.; Shkurenko, Yu. P.

    2008-12-01

    Several narrow resonance features in the system of two K_S mesons were found on the basis of experimental data obtained with the 6-m spectrometer of the Institute of Theoretical and Experimental Physics (ITEP, Moscow). Three resonances, X(1070), X(1545), and f_J(2220), are considered in the present study. An interference with the background is observed for the X(1545) resonance. It is shown that the X(1070) and f_J(2220) resonances are produced under nearly identical kinematical conditions characterized by high transverse momenta and the production of accompanying extra pions.

  11. Narrow-band nonlinear sea waves

    NASA Technical Reports Server (NTRS)

    Tayfun, M. A.

    1980-01-01

    Probabilistic description of nonlinear waves with a narrow-band spectrum is simplified to a form in which each realization of the surface displacement becomes an amplitude-modulated Stokes wave with a mean frequency and random phase. Under appropriate conditions this simplification provides a convenient yet rigorous means of describing nonlinear effects on sea surface properties in a semiclosed or closed form. In particular, it is shown that surface displacements are non-Gaussian and skewed, as was previously predicted by the Gram-Charlier approximation; that wave heights are Rayleigh distributed, just as in the linear case; and that crests are non-Rayleigh.

  12. f-band narrowing in uranium intermetallics

    SciTech Connect

    Dunlap, B.D.; Litterst, F.J.; Malik, S.K.; Kierstead, H.A.; Crabtree, G.W.; Kwok, W.; Lam, D.J.; Mitchell, A.W.

    1987-01-01

    Although the discovery of heavy fermion behavior in uranium compounds has attracted a great deal of attention, relatively little work has been done which is sufficiently systematic to allow an assessment of the relationship of such behavior to more common phenomena, such as mixed valence, narrow-band effects, etc. In this paper we report bulk property measurements for a number of alloys which form a part of such a systematic study. The approach has been to take relatively simple and well-understood materials and alter their behavior by alloying to produce heavy fermion or Kondo behavior in a controlled way.

  13. Geometric Model of a Narrow Tilting CAR using Robotics formalism

    E-print Network

    Boyer, Edmond

    Geometric Model of a Narrow Tilting CAR using Robotics formalism (Salim Maakaroun, Wisama Khalil). The use of an electrical narrow tilting car instead of a large gasoline car should dramatically decrease traffic congestion. This paper addresses the modeling of a new narrow tilting car. The modeling is based on the modified Denavit-Hartenberg

  14. 2. Photocopied July 1971 from photostat Jordan Narrows Folder #1, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. Photocopied July 1971 from photostat Jordan Narrows Folder #1, Engineering Department, Utah Power and Light Co., Salt Lake City, Utah. JORDAN NARROWS STATION. PLAN AND SECTION. - Salt Lake City Water & Electrical Power Company, Jordan Narrows Hydroelectric Plant, Jordan River, Riverton, Salt Lake County, UT

  15. MRI of surgically created pulmonary artery narrowing in the dog

    Microsoft Academic Search

    R. J. Hernandez; A. P. Rocchini; E. L. Bove; T. L. Chenevert; B. Gubin

    1989-01-01

    Narrowing of the pulmonary arteries was created surgically in twelve dogs. In six of the dogs the narrowing was central (main pulmonary artery), and in the remaining six the narrowing was located peripherally at the hilar level of the right pulmonary artery beyond the pericardial reflection. MRI and angiography were performed in all dogs. MRI clearly delineated the site of

  16. Two-stage multiple cameras calibration

    Microsoft Academic Search

    Andrei Y. Kargashin; Eugeny I. Kugushev; E. L. Starostin

    1995-01-01

    The problem of determining spatial position and orientation of several cameras, knowing corresponding coordinates obtained by perspective projections onto the camera planes, is considered. Input data for calibration also include distances between some points in space. The calibration is carried out in two stages. In the first stage, position and orientation for pairs of images (stereo pairs) are determined. Every

  17. Accurate Camera Calibration with New Minimizing Function

    Microsoft Academic Search

    Qiaoyu Xu; Dong Ye; Rensheng Che; Yan Huang

    2006-01-01

    Camera calibration has been studied extensively in computer vision and photogrammetry. But almost all the camera calibration techniques iterate with the general minimizing function by minimizing the discrepancy between the real position in pixels of a 2D image point and the calculated projection of the 3D object point on the image plane. Though the imaging distance errors are equal, the

  18. Upgrading a CCD camera for astronomical use 

    E-print Network

    Lamecker, James Frank

    1993-01-01

    Existing charge-coupled device (CCD) video cameras have been modified to be used for astronomical imaging on telescopes in order to improve imaging times over those of photography. An astronomical CCD camera at the Texas A&M Observatory would...

  19. Computational Cameras: Approaches, Benefits and Limits

    E-print Network

    Nayar, Shree K.

    A computational camera uses a combination of optics and software to produce images that cannot be taken with a conventional camera. This work examines the benefits and limits of computational imaging, and discusses how it is related to the adjacent and overlapping

  20. Controlled Impact Demonstration (CID) tail camera video

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The Controlled Impact Demonstration (CID) was a joint research project by NASA and the FAA to test a survivable aircraft impact using a remotely piloted Boeing 720 aircraft. The tail camera movie is one shot running 27 seconds. It shows the impact from the perspective of a camera mounted high on the vertical stabilizer, looking forward over the fuselage and wings.

  1. Thermal Cameras in School Laboratory Activities

    ERIC Educational Resources Information Center

    Haglund, Jesper; Jeppsson, Fredrik; Hedberg, David; Schönborn, Konrad J.

    2015-01-01

    Thermal cameras offer real-time visual access to otherwise invisible thermal phenomena, which are conceptually demanding for learners during traditional teaching. We present three studies of students' conduct of laboratory activities that employ thermal cameras to teach challenging thermal concepts in grades 4, 7 and 10-12. Visualization of…

  2. Uncertainty and sensitivity analysis for camera calibration

    Microsoft Academic Search

    LiMin Zhu; HongGen Luo; Xu Zhang

    2009-01-01

    Purpose – The purpose of this paper is to present a unified approach to uncertainty and sensitivity analysis for camera calibration. Design/methodology/approach – The approach is based on the fact that camera calibration is a problem of parameter estimation and the parameters of interest are given by the optimal solution of a least-squares problem. Findings – A system of linear

  3. Calibration of detector sensitivity in positron cameras

    Microsoft Academic Search

    D. A. Chesler; C. W. Stearns

    1990-01-01

    An improved method for calibrating detector sensitivities in a positron camera has been developed. The calibration phantom is a cylinder of activity placed near the center of the camera and fully within the field of view. The calibration data are processed in such a manner that the following two important properties are achieved: (1) the estimate of detector sensitivity is

  4. Digital Cameras in the K-12 Classroom.

    ERIC Educational Resources Information Center

    Clark, Kenneth; Hosticka, Alice; Bedell, Jacqueline

    This paper discusses the use of digital cameras in K-12 education. Examples are provided of the integration of the digital camera and visual images into: reading and writing; science, social studies, and mathematics; projects; scientific experiments; desktop publishing; visual arts; data analysis; computer literacy; classroom atmosphere; and…

  5. Particle Filtering for Robust Single Camera Localisation

    Microsoft Academic Search

    Mark Pupilli; Andrew Calway

    This paper summarises recent work on vision based localisation of a moving camera using particle filtering. We are interested in real-time operation for applications in mobile and wearable computing, in which the camera is worn or held by a user. Specifically, we aim for localisation algorithms which are robust to the real-life motions associated with human activity and to

  6. ISOCAM - An infrared camera for ISO

    Microsoft Academic Search

    Francois Sibille; C. Cesarsky; S. Cazes; D. Cesarsky; A. Chedin

    1986-01-01

    This paper presents the project ISOCAM for an infrared camera which will be one of the four focal plane instruments on ISO. The camera contains two optical channels, one with an InSb CID array (3 to 5 microns), the other with a Si:Ga DVR array (5 to 17 microns). Interference filters and CVF's provide spectral resolutions between 2 and 50.

  7. Matching image color from different cameras

    NASA Astrophysics Data System (ADS)

    Fairchild, Mark D.; Wyble, David R.; Johnson, Garrett M.

    2008-01-01

    Can images from professional digital SLR cameras be made equivalent in color using simple colorimetric characterization? Two cameras were characterized, these characterizations were implemented on a variety of images, and the results were evaluated both colorimetrically and psychophysically. A Nikon D2x and a Canon 5D were used. The colorimetric analyses indicated that accurate reproductions were obtained. The median CIELAB color differences between the measured ColorChecker SG and the reproduced image were 4.0 and 6.1 for the Canon (chart and spectral respectively) and 5.9 and 6.9 for the Nikon. The median differences between cameras were 2.8 and 3.4 for the chart and spectral characterizations, near the expected threshold for reliable image difference perception. Eight scenes were evaluated psychophysically in three forced-choice experiments in which a reference image from one of the cameras was shown to observers in comparison with a pair of images, one from each camera. The three experiments were (1) a comparison of the two cameras with the chart-based characterizations, (2) a comparison with the spectral characterizations, and (3) a comparison of chart vs. spectral characterization within and across cameras. The results for the three experiments are 64%, 64%, and 55% correct respectively. Careful and simple colorimetric characterization of digital SLR cameras can result in visually equivalent color reproduction.
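
    The median color differences quoted above are CIELAB distances. If the CIE 1976 formula ΔE*ab is meant (the abstract does not name the formula, so this is an assumption), the computation is the Euclidean distance in Lab space:

```python
import numpy as np

def delta_e_76(lab1, lab2):
    """CIE 1976 color difference: Euclidean distance between CIELAB
    triples (L*, a*, b*). Accepts single triples or batches."""
    diff = np.asarray(lab1, float) - np.asarray(lab2, float)
    return np.linalg.norm(diff, axis=-1)

print(delta_e_76([50, 0, 0], [50, 3, 4]))  # 5.0
```

    Differences around 2-3 are commonly taken as near the perceptibility threshold for complex images, consistent with the abstract's remark about the inter-camera medians.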

  8. Matching image color from different cameras

    Microsoft Academic Search

    Mark D. Fairchild; David R. Wyble; Garrett M. Johnson

    2008-01-01

    Can images from professional digital SLR cameras be made equivalent in color using simple colorimetric characterization? Two cameras were characterized, these characterizations were implemented on a variety of images, and the results were evaluated both colorimetrically and psychophysically. A Nikon D2x and a Canon 5D were used. The colorimetric analyses indicated that accurate reproductions were obtained. The median CIELAB color

  9. Detonation phenomena observed with a CCD camera

    Microsoft Academic Search

    Manfred Held

    1995-01-01

    With an appropriate test set up, the Hadland Photonics Ballistic Range Camera (SVR), designed primarily for exterior and terminal ballistics, can also be used very well for studying initiation events and analyzing a variety of detonation phenomena. This paper explains in detail the test set up of one interesting detonic experiment, observed with the Ballistic Range Camera, and the analysis

  10. Making a room-sized camera obscura

    NASA Astrophysics Data System (ADS)

    Flynt, Halima; Ruiz, Michael J.

    2015-01-01

    We describe how to convert a room into a camera obscura as a project for introductory geometrical optics. The view for our camera obscura is a busy street scene set against a beautiful mountain skyline. We include a short video with project instructions, ray diagrams and delightful moving images of cars driving on the road outside.
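    The geometry behind the projected street scene is similar triangles through the pinhole; a quick sketch (the object sizes and distances are invented for illustration):

```python
def image_height(object_height, object_distance, room_depth):
    """Similar triangles through a pinhole: h'/d' = h/d,
    so h' = h * d'/d (image is inverted)."""
    return object_height * room_depth / object_distance

# A 3 m tall car 30 m from the pinhole, projected across a 5 m deep room:
print(image_height(3.0, 30.0, 5.0))  # 0.5 m tall, inverted
```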

  11. Processing the Viking lander camera data

    Microsoft Academic Search

    Elliott C. Levinthal; William Green; Kenneth L. Jones; Robert Tucker

    1977-01-01

    Over 1000 camera events were returned from the two Viking landers during the Primary Mission. A system was devised for processing camera data as they were received, in real time, from the Deep Space Network. This system provided a flexible choice of parameters for three computer-enhanced versions of the data for display or hard-copy generation. Software systems allowed all but

  12. Creating and Using a Camera Obscura

    ERIC Educational Resources Information Center

    Quinnell, Justin

    2012-01-01

    The camera obscura (Latin for "darkened room") is the earliest optical device and goes back over 2500 years. The small pinhole or lens at the front of the room allows light to enter and this is then "projected" onto a screen inside the room. This differs from a camera, which projects its image onto light-sensitive material. Originally images were…

  13. Phase resolved electroluminescence measurements in thin films of low density polyethylene using a charge coupled device camera

    Microsoft Academic Search

    S. J. Dodd; P. L. Lewin; K. I. Wong

    2006-01-01

    Electroluminescence (EL) produced by a commercially available additive-free low density polyethylene film has been investigated under a 50 Hz AC electrical stress. The spatial distribution, spectral characteristics and the phase angle relationship of EL with respect to the 50 Hz applied AC voltage were studied using a sensitive Peltier-cooled charge-coupled device (CCD) camera. The experimental results from several

  14. 8.G Find the Angle

    NSDL National Science Digital Library

    2012-05-01

    This is a task from the Illustrative Mathematics website that is one part of a complete illustration of the standard to which it is aligned. Each task has at least one solution and some commentary that addresses important aspects of the task and its potential use. Here are the first few lines of the commentary for this task: In triangle $\Delta ABC$, point $M$ is the point of intersection of the bisectors of angles $\angle BAC$, $\angle ABC$, and $\angle ACB$. The measure o...

  15. Maximum likelihood positioning in the scintillation camera using depth of interaction

    SciTech Connect

    Gagnon, D.; Pouliot, N.; Laperriere, L.; Therrien, M.; Olivier, P. (Montreal Heart Inst., Quebec (Canada). Biomedical Engineering)

    1993-03-01

    The spatial (X and Y) dependence of the photomultiplier (PM) response in the Anger gamma camera has been thoroughly described in the past. The light distribution to individual PMs in gamma cameras--the solid angle seen by each photocathode--being a truly three-dimensional problem, the depth of interaction (DOI) has to be included in the analysis of the PM output. Furthermore, DOI being a stochastic process, it has to be considered explicitly, on an event-by-event basis, while evaluating both position and energy. Specific effects of the DOI on the PM response have been quantified. The method was implemented and tested on a Monte Carlo simulator with special care given to the noise modeling. Two models were developed: a first one considering only the geometric aspects of the camera, used for comparison, and a second one describing a more realistic camera environment. In a typical camera configuration with 140 keV photons, the DOI alone can account for a 6.4 mm discrepancy in position and 12% in energy between two scintillations. Variation of the DOI can bring additional distortions when photons do not enter the crystal perpendicularly, as in slant-hole, cone-beam and other focusing collimators. With a 0.95 cm crystal and a 30° slant angle, the obliquity factor can be responsible for a 5.5 mm variation in the event position. Results indicate that both geometrical and stochastic effects of the DOI definitely reduce camera performance and should be included in the image-formation process.
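    The 5.5 mm obliquity figure quoted in the abstract follows directly from the crystal thickness and slant angle; a quick check (the function name is ours):

```python
import math

def obliquity_shift_mm(crystal_thickness_cm, slant_angle_deg):
    """Lateral spread (in mm) of possible interaction points across the
    crystal thickness for photons entering at a slant angle."""
    return 10.0 * crystal_thickness_cm * math.tan(math.radians(slant_angle_deg))

# 0.95 cm crystal, 30 degree slant-hole collimator:
print(round(obliquity_shift_mm(0.95, 30.0), 1))  # 5.5 mm, matching the abstract
```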

  16. Architecture of PAU survey camera readout electronics

    NASA Astrophysics Data System (ADS)

    Castilla, Javier; Cardiel-Sas, Laia; De Vicente, Juan; Illa, Joseph; Jimenez, Jorge; Maiorino, Marino; Martinez, Gustavo

    2012-07-01

    PAUCam is a new camera for studying the physics of the accelerating universe. The camera will consist of eighteen 2Kx4K HPK CCDs: sixteen for science and two for guiding. The camera will be installed at the prime focus of the WHT (William Herschel Telescope). In this contribution, the architecture of the readout electronics system is presented. Back-End and Front-End electronics are described. The Back-End consists of clock, bias and video processing boards, mounted on Monsoon crates. The Front-End is based on patch panel boards. These boards are plugged outside the camera feed-through panel for signal distribution. Inside the camera, individual preamplifier boards plus kapton cables complete the path to each CCD. The overall signal distribution and grounding scheme is shown in this paper.

  17. ARNICA, the Arcetri Near-Infrared Camera

    NASA Astrophysics Data System (ADS)

    Lisi, F.; Baffa, C.; Bilotti, V.; Bonaccini, D.; del Vecchio, C.; Gennari, S.; Hunt, L. K.; Marcucci, G.; Stanga, R.

    1996-04-01

    ARNICA (ARcetri Near-Infrared CAmera) is the imaging camera for the near-infrared bands between 1.0 and 2.5 microns that the Arcetri Observatory has designed and built for the Infrared Telescope TIRGO located at Gornergrat, Switzerland. We describe the mechanical and optical design of the camera, and report on the astronomical performance of ARNICA as measured during the commissioning runs at the TIRGO (December, 1992 to December 1993), and an observing run at the William Herschel Telescope, Canary Islands (December, 1993). System performance is defined in terms of efficiency of the camera+telescope system and camera sensitivity for extended and point-like sources. (SECTION: Astronomical Instrumentation)

  18. Incremental activity modeling in multiple disjoint cameras.

    PubMed

    Loy, Chen Change; Xiang, Tao; Gong, Shaogang

    2012-09-01

    Activity modeling and unusual event detection in a network of cameras is challenging, particularly when the camera views do not overlap. We show that it is possible to detect unusual events in multiple disjoint cameras as context-incoherent patterns through incremental learning of time delayed dependencies between distributed local activities observed within and across camera views. Specifically, we model multicamera activities using a Time Delayed Probabilistic Graphical Model (TD-PGM) with different nodes representing activities in different decomposed regions from different views and the directed links between nodes encoding their time delayed dependencies. To deal with visual context changes, we formulate a novel incremental learning method for modeling time delayed dependencies that change over time. We validate the effectiveness of the proposed approach using a synthetic data set and videos captured from a camera network installed at a busy underground station. PMID:22184260

  19. The Critical Angle Can Override the Brewster Angle

    ERIC Educational Resources Information Center

    Froehle, Peter H.

    2009-01-01

    As a culminating activity in their study of optics, my students investigate polarized light and the Brewster angle. In this exercise they encounter a situation in which it is impossible to measure the Brewster angle for light reflecting from a particular surface. This paper describes the activity and explains the students' observations.
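    The two angles in question can be compared numerically. For internal reflection at a glass/air interface (a common classroom case; the indices here are our assumption, not the paper's surface), the Brewster and critical angles are:

```python
import math

def brewster_deg(n1, n2):
    """Brewster angle for light in medium n1 reflecting off medium n2."""
    return math.degrees(math.atan(n2 / n1))

def critical_deg(n1, n2):
    """Critical angle for total internal reflection (requires n1 > n2)."""
    return math.degrees(math.asin(n2 / n1))

# Light inside glass (n = 1.5) reflecting at a glass/air interface:
print(round(brewster_deg(1.5, 1.0), 1))  # 33.7 degrees
print(round(critical_deg(1.5, 1.0), 1))  # 41.8 degrees
```

    Since arctan(x) < arcsin(x) for 0 < x < 1, the internal-reflection Brewster angle always lies below the critical angle for the same index ratio.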

  20. Studies of narrow autoionizing resonances in gadolinium

    SciTech Connect

    Bushaw, Bruce A.; Nortershauser, W.; Blaum, K.; Wendt, Klaus

    2003-06-30

    The autoionization (AI) spectrum of gadolinium between the first and second limits has been investigated by triple-resonance excitation with high-resolution cw lasers. A large number of narrow AI resonances have been observed and assigned total angular momentum J values. The resonances are further divided into members of AI Rydberg series converging to the second limit or other "interloping" levels. Fine structure in the Rydberg series has been identified and interpreted in terms of Jc-j coupling. A number of detailed studies have been performed on the interloping resonances: these include lifetime determination by lineshape analysis, isotope shifts, hyperfine structure, and photoionization saturation parameters. The electronic structure of the interloping levels is discussed in terms of these studies. Linewidths generally decrease with increasing total angular momentum and the J = 7 resonances are extremely narrow, with Lorentzian widths ranging from < 1 MHz up to 157 MHz. The strongest resonances are found to have cross-sections of ~10⁻¹² cm² and photoionization can be saturated with powers available from cw diode lasers.

  1. Modeling of a slanted-hole collimator in a compact endo-cavity gamma camera.

    NASA Astrophysics Data System (ADS)

    Kamuda, Mark; Cui, Yonggang; Lall, Terry; Ionson, Jim; Camarda, Giuseppe S.; Hossain, Anwar; Yang, Ge; Roy, Utpal N.; James, Ralph B.

    2013-09-01

    Having the ability to take an accurate 3D image of a tumor greatly helps doctors diagnose it and then create a treatment plan for a patient. One way to accomplish molecular imaging is to inject a radioactive tracer into a patient and then measure the gamma rays emitted from regions with high uptake of the tracer, viz., the cancerous tissues. In large, expensive PET- or SPECT-imaging systems, the 3D imaging is easily accomplished by rotating the gamma-ray detectors and then employing software to reconstruct the 3D images from the multiple 2D projections at different angles of view. However, this method is impractical in a very compact imaging system due to anatomical considerations, e.g., the transrectal gamma camera under development at Brookhaven National Laboratory (BNL) for detection of intra-prostatic tumors. The camera uses pixelated cadmium zinc telluride (CdZnTe or CZT) detectors with a matched parallel-hole collimator. Our research investigated the possibility of using a collimator with slanted holes to create 3D pictures of a radioactive source. The underlying concept is to take 2D projection images at different angles of view by adjusting the slant angle of the collimator, then using the 2D projection images to reconstruct the 3D image. To do this, we first simulated the response of a pixelated CZT detector to radiation sources placed in the field of view of the camera. Then, we formulated an algorithm to use the simulation results as prior knowledge and estimate the distribution of a shaped source from its 2D projection images. From the results of the simulation, we measured the spatial resolution of the camera as ~7 mm at a depth of 13.85 mm when using a detector with 2.46 mm pixel pitch and a collimator with a 60° slant angle.

  2. Visualization of explosion phenomena using a high-speed video camera with an uncoupled objective lens by fiber-optic

    NASA Astrophysics Data System (ADS)

    Tokuoka, Nobuyuki; Miyoshi, Hitoshi; Kusano, Hideaki; Hata, Hidehiro; Hiroe, Tetsuyuki; Fujiwara, Kazuhito; Yasushi, Kondo

    2008-11-01

    Visualization of explosion phenomena is very important and essential to evaluate the performance of explosive effects. The phenomena, however, generate blast waves and fragments from cases, so we must protect our visualizing equipment from any form of impact. In the tests described here, the front lens was separated from the camera head by means of a fiber-optic cable in order to be able to use the camera, a Shimadzu Hypervision HPV-1, for tests in a severe blast environment, including the filming of explosions. It was possible to obtain clear images of the explosion that were not inferior to the images taken with the lens directly coupled to the camera head. It could be confirmed that this system is very useful for the visualization of dangerous events, e.g., at an explosion site, and for visualizations at angles that would be unachievable under normal circumstances.

  3. An interpretation of the narrow positron annihilation feature from X-ray nova Muscae 1991

    NASA Technical Reports Server (NTRS)

    Chen, Wan; Gehrels, Neil; Cheng, F. H.

    1993-01-01

    The physical mechanism responsible for the narrow redshifted positron annihilation gamma-ray line from the X-ray nova Muscae 1991 is studied. The orbital inclination angle of the system is estimated and its black hole mass is constrained under the assumptions that the annihilation line centroid redshift is purely gravitational and that the line width is due to the combined effect of temperature broadening and disk rotation. The large black hole mass lower limit of 8 solar masses and the high binary mass ratio it implies raise a serious challenge to theoretical models of the formation and evolution of massive binaries.

  4. New design of a gamma camera detector with reduced edge effect for breast imaging

    NASA Astrophysics Data System (ADS)

    Yeon Hwang, Ji; Lee, Seung-Jae; Baek, Cheol-Ha; Hyun Kim, Kwang; Hyun Chung, Yong

    2011-05-01

    In recent years, there has been a growing interest in developing small gamma cameras dedicated to breast imaging. We designed a new detector with trapezoidal shape to expand the field of view (FOV) of camera without increasing its dimensions. To find optimal parameters, images of point sources at the edge area as functions of the angle and optical treatment of crystal side surface were simulated by using a DETECT2000. Our detector employs monolithic CsI(Tl) with dimensions of 48.0×48.0×6.0 mm coupled to an array of photo-sensors. Side surfaces of crystal were treated with three different surface finishes: black absorber, metal reflector and white reflector. The trapezoidal angle varied from 45° to 90° in steps of 15°. Gamma events were generated on 15 evenly spaced points with 1.0 mm spacing in the X-axis starting 1.0 mm away from the side surface. Ten thousand gamma events were simulated at each location and images were formed by calculating the Anger-logic. The results demonstrated that all the 15 points could be identified only for the crystal with trapezoidal shape having 45° angle and white reflector on the side surface. In conclusion, our new detector proved to be a reliable design to expand the FOV of small gamma camera for breast imaging.
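    The Anger logic used above to form images from the photosensor array is a signal-weighted centroid of the sensor positions; a minimal sketch (the sensor layout and signal values are invented for illustration):

```python
def anger_position(signals, positions):
    """Anger logic: estimate the event position as the signal-weighted
    centroid of the photosensor positions."""
    total = sum(signals)
    x = sum(s * p[0] for s, p in zip(signals, positions)) / total
    y = sum(s * p[1] for s, p in zip(signals, positions)) / total
    return x, y

# Four sensors at the corners of a 48 mm square; light mostly on the left pair:
sensors = [(0.0, 0.0), (48.0, 0.0), (0.0, 48.0), (48.0, 48.0)]
signals = [300.0, 100.0, 300.0, 100.0]
print(anger_position(signals, sensors))  # (12.0, 24.0)
```

    The edge effect the paper addresses arises because this centroid is biased toward the detector center when part of the light distribution is cut off at the crystal boundary.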

  5. Ultralow-angle dynamic light scattering with a charge coupled device camera based multispeckle, multitau correlator

    E-print Network

    rather than linear. A detailed analysis is presented of the effects of dark noise, stray light [...] a viable alternative for a DLS detector. We test the apparatus on a dilute suspension of colloidal [...] a laser source, a goniometer, and a detector, usually a photomultiplier tube, whose signal is fed [...]

  6. Design and analysis of a two-dimensional camera array

    E-print Network

    Yang, Jason C. (Jason Chieh-Sheng), 1977-

    2005-01-01

    I present the design and analysis of a two-dimensional camera array for virtual studio applications. It is possible to substitute conventional cameras and motion control devices with a real-time, light field camera array. ...

  7. Mitsubishi Electric Research Labs (MERL) Computational Cameras Amit Agrawal

    E-print Network

    Agrawal, Amit

    Computational Cameras: Exploiting Spatial-Angular-Temporal Tradeoffs in Photography. Mitsubishi Electric Research Labs (MERL), Cambridge, MA, USA.

  8. Solar elevation angle probability distribution

    Microsoft Academic Search

    D. C. Larson; C. R. Acquista

    1980-01-01

    The probability distribution of solar elevation angles is determined, and the importance of this distribution on concentrator design is discussed. It is concluded that the time probability function for the solar elevation angle is important when considering alternative low-concentration systems for year-round or seasonal applications.
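    The solar elevation angle whose time-probability distribution the paper studies follows from the standard spherical-astronomy relation sin h = sin φ sin δ + cos φ cos δ cos H (latitude φ, solar declination δ, hour angle H); a sketch:

```python
import math

def solar_elevation_deg(latitude_deg, declination_deg, hour_angle_deg):
    """Solar elevation h from sin(h) = sin(phi)sin(delta)
    + cos(phi)cos(delta)cos(H)."""
    phi, delta, H = (math.radians(x) for x in
                     (latitude_deg, declination_deg, hour_angle_deg))
    sin_h = (math.sin(phi) * math.sin(delta)
             + math.cos(phi) * math.cos(delta) * math.cos(H))
    return math.degrees(math.asin(sin_h))

# Solar noon (H = 0) at 40 N on an equinox (delta = 0): elevation = 90 - 40
print(round(solar_elevation_deg(40.0, 0.0, 0.0), 1))  # 50.0
```

    Sampling this function uniformly over the day and year gives the elevation-angle probability distribution the abstract refers to.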

  9. Measuring Angles in Physical Therapy.

    ERIC Educational Resources Information Center

    Greeley, Nansee; Offerman, Theresa Reardon

    1997-01-01

    Features articles about physical therapy and its history as related to geometry through measurement of body angles. Includes open-ended worksheets for mathematics activities that introduce students to angle measurement, data analysis, and mathematical tools. Activities include: (1) Making Your Own Goniometer; (2) Range of Motion; (3) Active versus…

  10. Automatic calibration of laser range cameras using arbitrary planar surfaces

    SciTech Connect

    Baker, J.E.

    1994-06-01

    Laser Range Cameras (LRCs) are powerful tools for many robotic/computer perception activities. They can provide accurate range images and perfectly registered reflectance images of the target scene, useful for constructing reliably detailed 3-D world maps and target characterizations. An LRC's output is an array of distances obtained by scanning a laser over the scene. To accurately interpret this data, the angular definition of each pixel, i.e., the 3-D direction corresponding to each distance measurement, must be known. This angular definition is a function of the camera's intrinsic design and unique implementation characteristics, e.g., actual mirror positions, axes of rotation, angular velocities, etc. Typically, the range data is converted to Cartesian coordinates by calibration-parameterized, non-linear transformation equations. Unfortunately, typical LRC calibration techniques are manual, intensive, and inaccurate. Common techniques involve imaging carefully orchestrated artificial targets and manually measuring actual distances and relative angles to infer the correct calibration parameter values. This paper presents an automated method which uses Genetic Algorithms to search for calibration parameter values and possible transformation equations which combine to maximize the planarity of user-specified sub-regions of the image(s). This method permits calibration to be based on an arbitrary plane, without precise knowledge of the LRC's mechanical precision, intrinsic design, or its relative positioning to the target. Furthermore, this method permits rapid, remote, and on-line recalibration - important capabilities for many robotic systems. Empirical validation of this system has been performed using two different LRC systems and has led to significant improvement in image accuracy while reducing the calibration time by orders of magnitude.

  11. Supercritical angle fluorescence (SAF) microscopy.

    PubMed

    Ruckstuhl, Thomas; Verdes, Dorinel

    2004-09-01

    We explore a new confocal microscope for the detection of surface-generated fluorescence. The instrument is designed for high resolution imaging as well as for the readout of large biochips. A special feature is the separate collection of two different fluorescence emission modes. One optical path covers the emission into the glass at low surface angles, the other captures high angles, exceeding the critical angle of the water/glass interface. Due to the collection of the supercritical angle fluorescence (SAF), the confocal detection volume is strictly confined to the interface, whereas the low angles collect from much deeper in the aqueous analyte solution. Hence the system can deliver information about the surface-bound and unbound fractions of fluorescent analyte simultaneously. PMID:19483970
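    The supercritical threshold referred to above is the critical angle of the water/glass interface; with textbook refractive indices (n_water ≈ 1.33, n_glass ≈ 1.52, our assumption, not values from the paper):

```python
import math

def critical_angle_deg(n_sample, n_glass):
    """Critical angle of the sample/glass interface; fluorescence collected
    into the glass above this angle originates only at the surface."""
    return math.degrees(math.asin(n_sample / n_glass))

# Water (n = 1.33) on a glass substrate (n = 1.52):
print(round(critical_angle_deg(1.33, 1.52), 1))  # about 61 degrees
```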

  12. Shear angle of magnetic fields.

    NASA Astrophysics Data System (ADS)

    Yanping, Lü; Wang, Jingxiu; Wang, Huaning

    1993-11-01

    The authors introduce a new parameter, the shear angle of vector magnetic fields, Δψ, to describe the non-potentiality of magnetic fields in active regions, which is defined as the angle between the observed vector magnetic field and its corresponding current-free field. In the case of highly inclined field configurations, this angle is approximately equal to the "angular shear", Δφ, defined by Hagyard et al. (1984). Δφ can be considered as the projection of the shear angle, Δψ, on the photosphere. For the active region studied, the shear angle, Δψ, seems to have a better and neater correspondence with flare activity than does Δφ. It gives a clearer explanation of the non-potentiality of magnetic fields. It is a better measure of the deviation of the observed magnetic field from a potential field, and is directly related to the magnetic free energy stored in non-potential fields.
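    The shear angle as defined here is the ordinary angle between two vectors, the observed field and the computed potential field at the same point; a minimal sketch with invented field values:

```python
import math

def shear_angle_deg(b_obs, b_pot):
    """Angle between the observed field vector and the potential
    (current-free) field vector at the same point."""
    dot = sum(o * p for o, p in zip(b_obs, b_pot))
    norm = (math.sqrt(sum(o * o for o in b_obs))
            * math.sqrt(sum(p * p for p in b_pot)))
    return math.degrees(math.acos(dot / norm))

# Horizontal field rotated 45 degrees away from the potential direction:
print(round(shear_angle_deg((1.0, 1.0, 0.0), (1.0, 0.0, 0.0)), 1))  # 45.0
```

    Dropping the vertical components before taking the angle gives the photospheric projection, i.e. the Hagyard-style angular shear.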

  13. Spinning angle optical calibration apparatus

    DOEpatents

    Beer, Stephen K. (Morgantown, WV); Pratt, II, Harold R. (Morgantown, WV)

    1991-01-01

    An optical calibration apparatus is provided for calibrating and reproducing spinning angles in cross-polarization, nuclear magnetic resonance spectroscopy. An illuminated magnifying apparatus enables optically setting and accurately reproducing spinning "magic angles" in cross-polarization, nuclear magnetic resonance spectroscopy experiments. A reference mark scribed on an edge of a spinning angle test sample holder is illuminated by a light source and viewed through a magnifying scope. When the "magic angle" of a sample material used as a standard is attained by varying the angular position of the sample holder, the coordinate position of the reference mark relative to a graduation or graduations on a reticle in the magnifying scope is noted. Thereafter, the spinning "magic angle" of a test material having similar nuclear properties to the standard is attained by returning the sample holder to the originally noted coordinate position.

  14. Development of gamma ray imaging cameras

    SciTech Connect

    Wehe, D.K.; Knoll, G.F.

    1992-05-28

    In January 1990, the Department of Energy initiated this project with the objective to develop the technology for general purpose, portable gamma ray imaging cameras useful to the nuclear industry. The ultimate goal of this R&D initiative is to develop the analog of the color television camera, where the camera would respond to gamma rays instead of visible photons. The two-dimensional image would be displayed in real time and would indicate the geometric location of the radiation relative to the camera's orientation, while the brightness and "color" would indicate the intensity and energy of the radiation (and hence identify the emitting isotope). There is a strong motivation for developing such a device for applications within the nuclear industry, for both high- and low-level waste repositories, for environmental restoration problems, and for space and fusion applications. At present, there are no general purpose radiation cameras capable of producing spectral images for such practical applications. At the time of this writing, work on this project has been underway for almost 18 months. Substantial progress has been made in the project's two primary areas: mechanically-collimated camera (MCC) and electronically-collimated camera (ECC) designs. We present developments covering the mechanically-collimated design, and then discuss the efforts on the electronically-collimated camera. The renewal proposal addresses the continuing R&D efforts for the third year effort. 8 refs.

  15. A new micro laser camera

    NASA Astrophysics Data System (ADS)

    Drabe, Christian; James, Richard; Klose, Thomas; Wolter, Alexander; Schenk, Harald; Lakner, Hubert

    2007-01-01

    A new two-dimensional and resonantly driven scanning micro mirror has been simulated, fabricated and characterized. Features are a small chip size of 2900 μm x 2350 μm with a frame oscillating at frequencies in the range of 1 kHz. The frame carries a mirror of 500 μm diameter in a gimbal mounting oscillating at frequencies in the range of 16 kHz. The characteristic mechanical amplitudes are 21° and 28° respectively. Voltages of 60 V and less than 140 V were necessary to accomplish this. Much higher amplitudes have been achieved on the mirror axis without breaking the torsion bars. Initial difficulties in realizing the high amplitudes have been overcome by improving the geometry of the suspension. The initial design is presented as well as the measurement results of the initial and improved design. The device was used to develop a micro laser camera with high depth of focus. Pictures taken with the system are presented revealing the excellent resolution.

  16. Smart Camera Technology Increases Quality

    NASA Technical Reports Server (NTRS)

    2004-01-01

    When it comes to real-time image processing, everyone is an expert. People begin processing images at birth and rapidly learn to control their responses through the real-time processing of the human visual system. The human eye captures an enormous amount of information in the form of light images. In order to keep the brain from becoming overloaded with all the data, portions of an image are processed at a higher resolution than others, such as a traffic light changing colors. In the same manner, image processing products strive to extract the information stored in light in the most efficient way possible. Digital cameras available today capture millions of pixels worth of information from incident light. However, at frame rates of more than a few per second, existing digital interfaces are overwhelmed. All the user can do is store several frames to memory until that memory is full, after which subsequent information is lost. New technology pairs existing digital interface technology with an off-the-shelf complementary metal oxide semiconductor (CMOS) imager to provide more than 500 frames per second of specialty image processing. The result is a cost-effective detection system unlike any other.

  17. Cloud Computing with Context Cameras

    E-print Network

    Pickles, A J

    2013-01-01

    We summarize methods and plans to monitor and calibrate photometric observations with our autonomous, robotic network of 2m, 1m and 40cm telescopes. These are sited globally to optimize our ability to observe time-variable sources. Wide field "context" cameras are aligned with our network telescopes and cycle every 2 minutes through BVriz filters, spanning our optical range. We measure instantaneous zero-point offsets and transparency (throughput) against calibrators in the 5-12m range from the all-sky Tycho2 catalog, and periodically against primary standards. Similar measurements are made for all our science images, with typical fields of view of 0.5 degrees. These are matched against Landolt, Stetson and Sloan standards, and against calibrators in the 10-17m range from the all-sky APASS catalog. Such measurements provide pretty good instantaneous flux calibration, often to better than 5%, even in cloudy conditions. Zero-point and transparency measurements can be used to characterize, monitor and inter-comp...

  18. A Prototype Si/CdTe Compton Camera and the Polarization Measurement

    E-print Network

    Takefumi Mitani; Takaaki Tanaka; Kazuhiro Nakazawa; Tadayuki Takahashi; Takeshi Takashima; Hiroyasu Tajima; Hidehito Nakamura; Masaharu Nomachi; Tatsuya Nakamoto; Yasushi Fukazawa

    2004-10-05

    A Compton camera is the most promising approach for gamma-ray detection in the energy region from several hundred keV to MeV, especially for application in high energy astrophysics. In order to obtain good angular resolution, semiconductor detectors such as silicon, germanium and cadmium telluride (CdTe) have several advantages over scintillation detectors, which have been used so far. Based on the recent advances of high resolution CdTe and silicon imaging detectors, we are working on a Si/CdTe Compton camera. We have developed 64-pixel CdTe detectors with a pixel size of 2 mm x 2 mm and double-sided Si strip detectors (DSSDs) with a position resolution of 800 microns. As a prototype Si/CdTe Compton camera, we use a DSSD as a scatterer and two CdTe pixel detectors as an absorber. In order to verify its performance, we irradiate the camera with 100% linearly polarised 170 keV gamma-rays and demonstrate the system works properly as a Compton camera. The resolution of the reconstructed scattering angle is 22 degrees (FWHM). Measurement of polarization is also reported. The polarimetric modulation factor is obtained to be 43%, which is consistent with the prediction of Monte Carlo simulations.
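    The reconstructed scattering angle comes from Compton kinematics applied to the energies measured in the scatterer and absorber; a sketch (the 50/120 keV energy split below is illustrative, not from the paper):

```python
import math

M_E_C2 = 511.0  # electron rest energy in keV

def scatter_angle_deg(e_total_kev, e_scattered_kev):
    """Compton kinematics: cos(theta) = 1 - m_e c^2 (1/E' - 1/E),
    where E is the incident and E' the scattered photon energy."""
    cos_theta = 1.0 - M_E_C2 * (1.0 / e_scattered_kev - 1.0 / e_total_kev)
    return math.degrees(math.acos(cos_theta))

# A 170 keV photon depositing 50 keV in the Si scatterer leaves
# 120 keV for the CdTe absorber:
print(round(scatter_angle_deg(170.0, 120.0), 1))
```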

  19. Robust gaze-tracking method by using frontal-viewing and eye-tracking cameras

    NASA Astrophysics Data System (ADS)

    Cho, Chul Woo; Lee, Ji Woo; Lee, Eui Chul; Park, Kang Ryoung

    2009-12-01

    Gaze-tracking technology is used to obtain the position of a user's viewpoint, and a new gaze-tracking method is proposed based on a wearable goggle-type device, which includes an eye-tracking camera and a frontal-viewing camera. The proposed method is novel in five ways compared to previous research. First, it can track the user's gazing position, allowing for natural facial and eye movements, by using a frontal-viewing camera and an eye-tracking camera. Second, an eye gaze position is calculated using a geometric transform, based on the mapping function among three rectangular regions. These are a rectangular region defined by the four pupil centers detected when a user gazes at the four corners of a monitor, a distorted monitor region observed by the frontal-viewing camera, and the actual monitor region, respectively. Third, a facial gaze position is estimated based on the geometric center and the four internal angles of the monitor region detected by the frontal-viewing camera. Fourth, a final gaze position is obtained by using the weighted summation of the eye and the facial gazing positions. Fifth, since a simple 2-D method is used to obtain the gazing position instead of a complicated 3-D method, the proposed method can be operated at real-time speeds. Experimental results show that the root mean square (rms) error of gaze estimation is less than 1 deg.
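    The mapping and fusion steps can be sketched as follows, simplifying the paper's geometric transform to an axis-aligned calibration rectangle (all coordinates and the 0.7 weight are invented for illustration):

```python
def map_gaze(pupil, cal_rect, screen_w, screen_h):
    """Map a pupil center inside the calibration rectangle (recorded while
    gazing at the four monitor corners) to monitor pixel coordinates.
    Simplified to an axis-aligned rectangle; the paper uses a full
    geometric transform between quadrilaterals."""
    (x0, y0), (x1, y1) = cal_rect
    u = (pupil[0] - x0) / (x1 - x0)  # normalized horizontal position
    v = (pupil[1] - y0) / (y1 - y0)  # normalized vertical position
    return u * screen_w, v * screen_h

def fuse(eye_gaze, face_gaze, w_eye=0.7):
    """Final gaze as a weighted sum of the eye and facial estimates."""
    return tuple(w_eye * e + (1.0 - w_eye) * f
                 for e, f in zip(eye_gaze, face_gaze))

eye = map_gaze((35.0, 25.0), ((20.0, 10.0), (60.0, 40.0)), 1920, 1080)
print(eye)  # (720.0, 540.0)
print(fuse(eye, (800.0, 500.0)))
```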

  20. Characterization of a CCD-camera-based system for measurement of the solar radial energy distribution

    NASA Astrophysics Data System (ADS)

    Gambardella, A.; Galleano, R.

    2011-10-01

    Charge-coupled device (CCD)-camera-based measurement systems offer the possibility to gather information on the solar radial energy distribution (sunshape). Sunshape measurements are very useful in designing high concentration photovoltaic systems and heliostats as they collect light only within a narrow field of view, the dimension of which has to be defined in the context of several different system design parameters. However, in this regard the CCD camera response needs to be adequately characterized. In this paper, uncertainty components for optical and other CCD-specific sources have been evaluated using indoor test procedures. We have considered CCD linearity and background noise, blooming, lens aberration, exposure time linearity and quantization error. Uncertainty calculation showed that a 0.94% (k = 2) combined expanded uncertainty on the solar radial energy distribution can be assumed.

  1. [Research Award providing funds for a tracking video camera

    NASA Technical Reports Server (NTRS)

    Collett, Thomas

    2000-01-01

    The award provided funds for a tracking video camera. The camera has been installed and the system calibrated. It has enabled us to follow in real time the tracks of individual wood ants (Formica rufa) within a 3m square arena as they navigate singly in-doors guided by visual cues. To date we have been using the system on two projects. The first is an analysis of the navigational strategies that ants use when guided by an extended landmark (a low wall) to a feeding site. After a brief training period, ants are able to keep a defined distance and angle from the wall, using their memory of the wall's height on the retina as a controlling parameter. By training with walls of one height and length and testing with walls of different heights and lengths, we can show that ants adjust their distance from the wall so as to keep the wall at the height that they learned during training. Thus, their distance from the base of a tall wall is further than it is from the training wall, and the distance is shorter when the wall is low. The stopping point of the trajectory is defined precisely by the angle that the far end of the wall makes with the trajectory. Thus, ants walk further if the wall is extended in length and not so far if the wall is shortened. These experiments represent the first case in which the controlling parameters of an extended trajectory can be defined with some certainty. It raises many questions for future research that we are now pursuing.

  2. Marine Boundary Layer Aerosol Profiling with a Camera Lidar

    NASA Astrophysics Data System (ADS)

    Barnes, J. E.; Parikh Sharma, N. C.; Kaplan, T.; Clarke, A. D.

    2010-12-01

    Aerosol measurements at a coastal site on the Big Island of Hawaii were made to assess the usefulness of a new aerosol profiling technique called CLidar (camera lidar). A scientific-grade digital camera was used with a wide-angle lens (>100 deg) to image a vertically-pointed laser beam which was 122 meters away. The image was then analyzed for aerosol scatter much in the same way a monostatic lidar signal is analyzed except that the altitude information is determined by the geometry. The technique has sub-meter altitude resolution near the ground and can be directly compared with in-situ instruments. Aerosol profiles can be acquired through the boundary-layer with lower altitude resolution. CLidar aerosol measurements were made on two evenings where coastal breaking waves from about 400 meters away were added to the background marine boundary layer. A NASA/AERONET aerosol phase function, as well as a previously directly-measured phase function, were used to convert the single-angle CLidar scatter into extinction. A large gradient in aerosols with altitude was found for the first 35 meters with a lower gradient up to 200 meters. This was probably the region affected by the breaking waves. This is a useful result in characterizing the sampling environment. Nephelometers with intakes at 7 and 25 meters were directly compared with the CLidar results. Agreement was better with the directly-measured phase function on the evening with higher wind, possibly indicating the breaking-wave aerosol was changing during the longer transit time of the other evening. Aerosol optical depths (AOD) were calculated with the CLidar data by integrating though the boundary layer. The first evening was clear and agreed, within error bars, with the NASA/MODIS overpass. The CLidar AOD on the second evening, which was partly cloudy, was significantly lower than the MODIS value possibly because of an over estimate of the MODIS instrument near clouds.

  3. A slanting light-guide analog decoding high resolution detector for positron emission tomography camera

    SciTech Connect

    Wong, W.H.; Jing, M.; Bendriem, B.; Hartz, R.; Mullani, N.; Gould, K.L.; Michel, C.

    1987-02-01

    Current high resolution PET cameras require the scintillation crystals to be much narrower than the smallest available photomultipliers. In addition, the large number of photomultiplier channels constitutes the major component cost in the camera. Recent new designs use the Anger camera type of analog decoding method to obtain higher resolution and lower cost by using the relatively large photomultipliers. An alternative approach to improve the resolution and cost factors has been proposed, with a system of slanting light-guides between the scintillators and the photomultipliers. In the Anger camera schemes, the scintillation light is distributed to several neighboring photomultipliers which then determine the scintillation location. In the slanting light-guide design, the scintillation is metered and channeled to only two photomultipliers for the decision making. This paper presents the feasibility and performance achievable with the slanting light-guide detectors. With a crystal/photomultiplier ratio of 6/1, the intrinsic resolution was found to be 4.0 mm using the first non-optimized prototype light-guides on BGO crystals. The axial resolution will be about 5-6 mm.

  4. Differentiating Biological Colours with Few and Many Sensors: Spectral Reconstruction with RGB and Hyperspectral Cameras

    PubMed Central

    Garcia, Jair E.; Girard, Madeline B.; Kasumovic, Michael; Petersen, Phred; Wilksch, Philip A.; Dyer, Adrian G.

    2015-01-01

    Background The ability to discriminate between two similar or progressively dissimilar colours is important for many animals as it allows for accurately interpreting visual signals produced by key target stimuli or distractor information. Spectrophotometry objectively measures the spectral characteristics of these signals, but is often limited to point samples that could underestimate spectral variability within a single sample. Algorithms for RGB images and digital imaging devices with many more than three channels, hyperspectral cameras, have been recently developed to produce image spectrophotometers to recover reflectance spectra at individual pixel locations. We compare a linearised RGB and a hyperspectral camera in terms of their individual capacities to discriminate between colour targets of varying perceptual similarity for a human observer. Main Findings (1) The colour discrimination power of the RGB device is dependent on colour similarity between the samples whilst the hyperspectral device enables the reconstruction of a unique spectrum for each sampled pixel location independently from their chromatic appearance. (2) Uncertainty associated with spectral reconstruction from RGB responses results from the joint effect of metamerism and spectral variability within a single sample. Conclusion (1) RGB devices give a valuable insight into the limitations of colour discrimination with a low number of photoreceptors, as the principles involved in the interpretation of photoreceptor signals in trichromatic animals also apply to RGB camera responses. (2) The hyperspectral camera architecture provides means to explore other important aspects of colour vision like the perception of certain types of camouflage and colour constancy where multiple, narrow-band sensors increase resolution. PMID:25965264

  5. X-ray imaging using a consumer-grade digital camera

    NASA Astrophysics Data System (ADS)

    Winch, N. M.; Edgar, A.

    2011-10-01

    The recent advancements in consumer-grade digital camera technology and the introduction of high-resolution, high sensitivity CsBr:Eu 2+ storage phosphor imaging plates make possible a new cost-effective technique for X-ray imaging. The imaging plate is bathed with red stimulating light by high-intensity light-emitting diodes, and the photostimulated image is captured with a digital single-lens reflex (SLR) camera. A blue band-pass optical filter blocks the stimulating red light but transmits the blue photostimulated luminescence. Using a Canon D5 Mk II camera and an f1.4 wide-angle lens, the optical image of a 240×180 mm 2 Konica CsBr:Eu 2+ imaging plate from a position 230 mm in front of the camera lens can be focussed so as to laterally fill the 35×23.3 mm 2 camera sensor, and recorded in 2808×1872 pixel elements, corresponding to an equivalent pixel size on the plate of 88 ?m. The analogue-to-digital conversion from the camera electronics is 13 bits, but the dynamic range of the imaging system as a whole is limited in practice by noise to about 2.5 orders of magnitude. The modulation transfer function falls to 0.2 at a spatial frequency of 2.2 line pairs/mm. The limiting factor of the spatial resolution is light scattering in the plate rather than the camera optics. The limiting factors for signal-to-noise ratio are shot noise in the light, and dark noise in the CMOS sensor. Good quality images of high-contrast objects can be recorded with doses of approximately 1 mGy. The CsBr:Eu 2+ plate has approximately three times the readout sensitivity of a similar BaFBr:Eu 2+ plate.

  6. Indium nitride: A narrow gap semiconductor

    SciTech Connect

    Wu, J.; Walukiewicz, W.; Yu, K.M.; Ager III, J.W.; Haller, E.E.; Lu, H.; Schaff, W.J.

    2002-08-14

    The optical properties of wurtzite InN grown on sapphire substrates by molecular-beam epitaxy have been characterized by optical absorption, photoluminescence, and photomodulated reflectance techniques. All these three characterization techniques show an energy gap for InN between 0.7 and 0.8 eV, much lower than the commonly accepted value of 1.9 eV. The photoluminescence peak energy is found to be sensitive to the free electron concentration of the sample. The peak energy exhibits a very weak hydrostatic pressure dependence and a small, anomalous blueshift with increasing temperature. The bandgap energies of In-rich InGaN alloys were found to be consistent with the narrow gap of InN. The bandgap bowing parameter was determined to be 1.43 eV in InGaN.

  7. Robotic chair at steep and narrow stairways

    NASA Astrophysics Data System (ADS)

    Imazato, Masahiro; Yamaguchi, Masahiro; Moromugi, Shunji; Ishimatsu, Takakazu

    2007-12-01

    A robotic chair is developed to support mobility of elderly and disabled people living in the house where steep and narrow stairways are installed. In order to deal with such mobility problem the developed robotic chair has a compact original configuration. The robotic chair vertically moves by actuation of electric cylinders and horizontally moves by push-pull operation given by a care-giver. In order to navigate safely every action of the chair is checked by the operator. Up-and-down motions of the robotic chair on the stairway are executed through combinations of motor and cylinder actuations. Performance of the robotic chair was evaluated through two kinds of experiments. The excellent ability of the robotic chair could be confirmed through these experiments.

  8. Bioequivalence and narrow therapeutic index drugs.

    PubMed

    Benet, L Z; Goyan, J E

    1995-01-01

    Every prescription written for a generic drug requires an act of faith by the prescriber that any one of the several available products will be therapeutically equivalent to the innovator (brand name) products. Concerns about this act of faith have been expressed for many years, particularly in the wake of the generic scandals that occurred in 1989-1990, and especially relative to the drugs with a narrow therapeutic range. We contend that these drugs are actually the least likely to pose problems in ensuring therapeutic equivalence, but that new criteria must be established for bioequivalence because the present system is wasteful and is stifling innovation in the industry. We propose four suggestions to the scientific and regulatory communities that we believe could assist in modifying the process such that innovation is encouraged and practitioners are reassured relative to the appropriateness of using generic drugs. PMID:7479195

  9. Production of narrow weakly divergent beams

    SciTech Connect

    Karpel'son, A.E.

    1989-02-01

    Experimental results are presented from an ultrasonic field study of piezoelectric transducers on whose operating surfaces there has been created an acoustic-pressure distribution in the form of a Bessel function of the first kind and zero order. The transducers were comprised of a system of ring electrodes on both sides of a piezoelectric plate. It was established that a transducer having on its operating surface an acoustic pressure distribution according to the Bessel function used will form throughout the irradiated zone the narrowest weakly divergent beam and will produce the greatest signal amplitude. The larger the number of ring-electrode pairs disposed on its surface and the larger its diameter, the narrower will be the main lobe of the directivity pattern it forms and the lower will be the level of its side lobes.

  10. An Instability in Narrow Planetary Rings

    NASA Astrophysics Data System (ADS)

    Weiss, J. W.; Stewart, G. R.

    2003-08-01

    We will present our work investigating the behavior of narrow planetary rings with low dispersion velocities. Such narrow a ring will be initially unstable to self-gravitational collapse. After the collapse, the ring is collisionally very dense. At this stage, it is subject to a new instability. Waves appear on the inner and outer edges of the ring within half of an orbital period. The ring then breaks apart radially, taking approximately a quarter of an orbital period of do so. As clumps of ring particles expand radially away from the dense ring, Kepler shear causes these clumps to stretch out azimuthally, and eventually collapse into a new set of dense rings. Small-scale repetitions of the original instability in these new rings eventually leads to a stabilized broad ring with higher dispersion velocities than the initial ring. Preliminary results indicate that this instability may be operating on small scales in broad rings in the wake-like features seen by Salo and others. Some intriguing properties have been observed during this instability. The most significant is a coherence in the epicyclic phases of the particles. Both self-gravity and collisions in the ring operated to create and enforce this coherence. The coherence might also be responsible for the instability to radial expansion. We also observe that guiding centers of the particles do not migrate to the center of the ring during the collapse phase of the ring. In fact, guiding centers move radially away from the core of the ring during this phase, consistent with global conservation of angular momentum. We will show the results of our simulations to date, including movies of the evolution of various parameters. (Audiences members wanting popcorn are advised to bring their own.) This work is supported by a NASA Graduate Student Research Program grant and by the Cassini mission.

  11. Determining camera parameters for round glassware measurements

    NASA Astrophysics Data System (ADS)

    Baldner, F. O.; Costa, P. B.; Gomes, J. F. S.; Filho, D. M. E. S.; Leta, F. R.

    2015-01-01

    Nowadays there are many types of accessible cameras, including digital single lens reflex ones. Although these cameras are not usually employed in machine vision applications, they can be an interesting choice. However, these cameras have many available parameters to be chosen by the user and it may be difficult to select the best of these in order to acquire images with the needed metrological quality. This paper proposes a methodology to select a set of parameters that will supply a machine vision system with the needed quality image, considering the measurement required of a laboratory glassware.

  12. Print spectral reflectance estimation using trichromatic camera

    NASA Astrophysics Data System (ADS)

    Harouna S., Aboubacar; Bringier, Benjamin; Khoudeir, Majdi

    2015-04-01

    This paper deals with print quality control through a spectral color measurement. The aim is to estimate the spectral reflectance curve of each pixel of a printed sheet for a spectral matching with the reference image. The proposed method consists to perform a spectral characterization of the complete chain which includes the printing system and a digital trichromatic camera. First, the spectral printer model is presented and verified by experiments. Then, the camera spectral sensitivity curves are estimated through the capture of a color chart whose spectral reflectance curves have been previously measured. Finally, the spectral printer model is used to estimate the print spectral reflectance curves from camera responses.

  13. New Modular Camera No Ordinary Joe

    NASA Technical Reports Server (NTRS)

    2003-01-01

    Although dubbed 'Little Joe' for its small-format characteristics, a new wavefront sensor camera has proved that it is far from coming up short when paired with high-speed, low-noise applications. SciMeasure Analytical Systems, Inc., a provider of cameras and imaging accessories for use in biomedical research and industrial inspection and quality control, is the eye behind Little Joe's shutter, manufacturing and selling the modular, multi-purpose camera worldwide to advance fields such as astronomy, neurobiology, and cardiology.

  14. Detonation phenomena observed with a CCD camera

    NASA Astrophysics Data System (ADS)

    Held, Manfred

    1995-05-01

    With an appropriate test set up, the Hadland Photonics Ballistic Range Camera (SVR), designed primarily for exterior and terminal ballistics, can also be used very well for studying initiation events and analyzing a variety of detonation phenomena. This paper explains in detail the test set up of one interesting detonic experiment, observed with the Ballistic Range Camera, and the analysis of the results. The ability of the camera to superimpose up to 16 exposures on a single image allowed particularly detailed examination of the detonation propagation, the detonation velocities, the corner turning distance and the nonreacting radial zones.

  15. Camera self-calibration based on circle

    NASA Astrophysics Data System (ADS)

    Zhang, Bei-Wei; Liang, Dong; Wu, Fu-Chao

    2001-09-01

    In this paper, we propose a flexible new technique to calibrate a camera easily. It is well suited for use without specialized knowledge of 3D geometry or computer vision. The technique only require the camera to observe a planar pattern shown at a few different orientations. Either the camera or the planar can be freely moved. And the motion need not be known. The whole procedure can be done automatically. Both computer simulation and real data have been used to test the proposed technique, and prove that the method has some effect.

  16. Task Panel Sensing with a Movable Camera

    NASA Astrophysics Data System (ADS)

    Wolfe, William J.; Mathis, Donald W.; Magee, Michael; Hoff, William A.

    1990-03-01

    This paper discusses the integration of model based computer vision with a robot planning system. The vision system deals with structured objects with several movable parts (the "Task Panel"). The robot planning system controls a T3-746 manipulator that has a gripper and a wrist mounted camera. There are two control functions: move the gripper into position for manipulating the panel fixtures (doors, latches, etc.), and move the camera into positions preferred by the vision system. This paper emphasizes the issues related to repositioning the camera for improved viewpoints.

  17. Close-range photogrammetry with video cameras

    NASA Technical Reports Server (NTRS)

    Burner, A. W.; Snow, W. L.; Goad, W. K.

    1985-01-01

    Examples of photogrammetric measurements made with video cameras uncorrected for electronic and optical lens distortions are presented. The measurement and correction of electronic distortions of video cameras using both bilinear and polynomial interpolation are discussed. Examples showing the relative stability of electronic distortions over long periods of time are presented. Having corrected for electronic distortion, the data are further corrected for lens distortion using the plumb line method. Examples of close-range photogrammetric data taken with video cameras corrected for both electronic and optical lens distortion are presented.

  18. Close-Range Photogrammetry with Video Cameras

    NASA Technical Reports Server (NTRS)

    Burner, A. W.; Snow, W. L.; Goad, W. K.

    1983-01-01

    Examples of photogrammetric measurements made with video cameras uncorrected for electronic and optical lens distortions are presented. The measurement and correction of electronic distortions of video cameras using both bilinear and polynomial interpolation are discussed. Examples showing the relative stability of electronic distortions over long periods of time are presented. Having corrected for electronic distortion, the data are further corrected for lens distortion using the plumb line method. Examples of close-range photogrammetric data taken with video cameras corrected for both electronic and optical lens distortion are presented.

  19. Assembly and testing of the ESI camera

    NASA Astrophysics Data System (ADS)

    Sheinis, Andrew I.; Sutin, Brian M.; Epps, Harland W.; Schier, J. A.; Hilyard, David F.; Lewis, Jeffrey

    1999-09-01

    The Echellette Spectrograph and Imager (ESI), currently being delivered for use at the Cassegrain focus of the Keck II telescope employs an all-spherical, 308 mm focal length f/1.07 Epps camera. The camera consists of 10 lens elements in 5 groups: an oil-coupled doublet; a singlet, an oil- coupled triplet; a grease-coupled triplet; and a field flattener, which also serves as the vacuum-dewar window. A sensitivity analysis suggested that mechanical manufacturing tolerances of order +/- 25 microns were appropriate. In this paper we discuss the sensitivity analysis, the assembly and the testing of this camera.

  20. The role of narrow-band imaging in the management of non-muscle-invasive bladder cancer.

    PubMed

    Naselli, Angelo; Hurle, Rodolfo; Puppo, Paolo

    2012-12-01

    Narrow-band imaging is a young optical enhancement technology for endoscopy. It is a filter to the standard white light which increases the contrast between underlying vasculature and epithelial strata of the mucosa, improving the detection of bladder cancer with particular regard to high grade, flat lesions. Narrow band imaging is absolutely safe, may be used any time during a procedure, either during office cystosopy or transurethral resection, and implies a minimal burden for the healthcare provider given the absence of a learning curve and the limited cost of the camera and light source. The ameliorated detection translates into an improved management of the disease and a lower recurrence risk in prospective randomized studies, suggesting the inclusion of the technology in daily clinical practice. PMID:23253218

  1. Clementine High Resolution Camera Mosaicking Project

    NASA Technical Reports Server (NTRS)

    1998-01-01

    This report constitutes the final report for NASA Contract NASW-5054. This project processed Clementine I high resolution images of the Moon, mosaicked these images together, and created a 22-disk set of compact disk read-only memory (CD-ROM) volumes. The mosaics were produced through semi-automated registration and calibration of the high resolution (HiRes) camera's data against the geometrically and photometrically controlled Ultraviolet/Visible (UV/Vis) Basemap Mosaic produced by the US Geological Survey (USGS). The HiRes mosaics were compiled from non-uniformity corrected, 750 nanometer ("D") filter high resolution nadir-looking observations. The images were spatially warped using the sinusoidal equal-area projection at a scale of 20 m/pixel for sub-polar mosaics (below 80 deg. latitude) and using the stereographic projection at a scale of 30 m/pixel for polar mosaics. Only images with emission angles less than approximately 50 were used. Images from non-mapping cross-track slews, which tended to have large SPICE errors, were generally omitted. The locations of the resulting image population were found to be offset from the UV/Vis basemap by up to 13 km (0.4 deg.). Geometric control was taken from the 100 m/pixel global and 150 m/pixel polar USGS Clementine Basemap Mosaics compiled from the 750 nm Ultraviolet/Visible Clementine imaging system. Radiometric calibration was achieved by removing the image nonuniformity dominated by the HiRes system's light intensifier. Also provided are offset and scale factors, achieved by a fit of the HiRes data to the corresponding photometrically calibrated UV/Vis basemap, that approximately transform the 8-bit HiRes data to photometric units. The sub-polar mosaics are divided into tiles that cover approximately 1.75 deg. of latitude and span the longitude range of the mosaicked frames. Images from a given orbit are map projected using the orbit's nominal central latitude. 
Polar mosaics are tiled into squares 2250 pixels on a side, which spans approximately 2.2 deg. Two mosaics are provided for each pole: one corresponding to data acquired while periapsis was in the south, the other while periapsis was in the north. The CD-ROMs also contain ancillary data files that support the HiRes mosaic. These files include browse images with UV/Vis context stored in a Joint Photographic Experts Group (JPEG) format, index files ('imgindx.tab' and 'srcindx.tab') that tabulate the contents of the CD, and documentation files.

  2. Overview of SIM wide angle astrometric system calibration strategies

    NASA Technical Reports Server (NTRS)

    Sievers, L. A.; Milman, M. H.; Shaklan, S. B.; Korechoff, R. P.; Catanzarite, J.; Basdogan, I.; Papalexandris, M. V.; Schwartz, R.

    2002-01-01

    This paper summarizes two very different strategies envisioned for calibrating the systematic field dependent biases present in the Space Interferometry Mission (SIM) instrument. The Internal Calibration strategy is based on pre-launch measurements combined with a set of on orbit measurements generated by a source internal to the instrument. The External Calibration strategy uses stars as an external source for generating the calibration function. Both approaches demand a significant amount of innovation given that SIM's calibration strategy requires a post-calibration error of 100picometers over a 15 degree field of regard while the uncalibrated instrument introduces 10's-100's of nanometers of error. The calibration strategies are discussed in the context of the Wide Angle Astrometric mode of the instrument, although variations on the Internal Calibration Strategy may be used for doing Narrow Angle Astrometry.

  3. Studies of the dynamic contact angle

    NASA Astrophysics Data System (ADS)

    Siekmann, Julius; Zimmermann, Elisabeth

    The factors affecting the contact or boundary angle (theta) between a gas-liquid interface and a solid vertical wall are investigated experimentally, with a focus on the dynamic case where a solid object is immersed in a liquid. The apparatus employs a high-precision five-phase step motor to immerse a glass plate or circular cylinder into ethylene glycol or glycerine at 20 C and then retract it, an He-Ne-laser/Xe-flashlamp illumination system, and an optical bench equipped with a microscope and camera. The results are presented in a graph, and the relationships between theta and the six parameters gravity, fluid density and dynamic viscosity, surface tension, characteristic length, and solid-body velocity are explored. Regression analysis gives the dependence equation theta = 175 x (Ca exp -0.81)(Fr exp 0.5)(M exp 0.23), where Ca is the capillarity (Weber number divided by Reynolds number), Fr is the Froude number, and M is the Morton number.

  4. Versatility of the CFR (Constrained Fourier Reconstruction) algorithm for limited angle reconstruction

    SciTech Connect

    Fujieda, I.; Heiskanen, K.; Perez-Mendez, V.

    1989-08-01

    The Constrained Fourier Reconstruction (CFR) algorithm and the Iterative Reconstruction-Reprojection (IRR) algorithm are evaluated based on their accuracy for three types of limited angle reconstruction problems. The CFR algorithm performs better for problems such as Xray CT imaging of a nuclear reactor core with one large data gap due to structural blocking of the source and detector pair. For gated heart imaging by Xray CT, radioisotope distribution imaging by PET or SPECT, using a polygonal array of gamma cameras with insensitive gaps between camera boundaries, the IRR algorithm has a slight advantage over the CFR algorithm but the difference is not significant. 3 refs., 5 figs.

  5. Hysteresis during contact angles measurement.

    PubMed

    Diaz, M Elena; Fuentes, Javier; Cerro, Ramon L; Savage, Michael D

    2010-03-15

    A theory, based on the presence of an adsorbed film in the vicinity of the triple contact line, provides a molecular interpretation of intrinsic hysteresis during the measurement of static contact angles. Static contact angles are measured by placing a sessile drop on top of a flat solid surface. If the solid surface has not been previously in contact with a vapor phase saturated with the molecules of the liquid phase, the solid surface is free of adsorbed liquid molecules. In the absence of an adsorbed film, molecular forces configure an advancing contact angle larger than the static contact angle. After some time, due to an evaporation/adsorption process, the interface of the drop coexists with an adsorbed film of liquid molecules as part of the equilibrium configuration, denoted as the static contact angle. This equilibrium configuration is metastable because the droplet has a larger vapor pressure than the surrounding flat film. As the drop evaporates, the vapor/liquid interface contracts and the apparent contact line moves towards the center of the drop. During this process, the film left behind is thicker than the adsorbed film and molecular attraction results in a receding contact angle, smaller than the equilibrium contact angle. PMID:20060981

  6. Multiple reflectors based autocollimator for three-dimensional angle measurement

    NASA Astrophysics Data System (ADS)

    Su, Ang; Liu, Haibo; Yu, Qifeng

    2015-03-01

    This paper designs a multiple reflectors based autocollimator, and proposes a direct linear solution for three-dimensional (3D) angle measurement with the observation vectors of the reflected lights from the reflectors. In the measuring apparatus, the multiple reflectors is fixed with the object to be measured and the reflected lights are received by a CCD camera, then the light spots in the image are extracted to obtain the vectors of the reflected lights in space. Any rotation of the object will induce a change in the observation vectors of the reflected lights, which is used to solve the rotation matrix of the object by finding a linear solution of Wahba problem with the quaternion method, and then the 3D angle is obtained by decomposing the rotation matrix. This measuring apparatus can be implemented easily as the light path is simple, and the computation of 3D angle with observation vectors is efficient as there is no need to iterate. The proposed 3D angle measurement method is verified by a set of simulation experiments.

  7. The Narrow-Line Region of Narrow-Line Seyfert 1 Galaxies

    NASA Astrophysics Data System (ADS)

    Rodríguez-Ardila, A.; Binette, Luc; Pastoriza, Miriani G.; Donzelli, Carlos J.

    2000-08-01

    This work studies the optical emission-line properties and physical conditions of the narrow-line region (NLR) of seven narrow-line Seyfert 1 galaxies (NLS1's) for which high signal-to-noise ratio spectroscopic observations were available. The resolution is 340 km s⁻¹ (at Hα) over the wavelength interval 3700-9500 Å, enabling us to separate the broad and narrow components of the permitted emission lines. Our results show that the flux carried by the narrow component of Hβ is, on average, 50% of the total line flux. As a result, the [O III] λ5007/Hβ ratio emitted in the NLR varies from 1 to 5, instead of the universally adopted value of 10. This has strong implications for the required spectral energy distribution that ionizes the NLR gas. Photoionization models that consider a NLR composed of a combination of matter-bounded and ionization-bounded clouds are successful at explaining the low [O III] λ5007/Hβ ratio and the weakness of low-ionization lines of NLS1's. Variation of the relative proportion of these two types of clouds nicely reproduces the dispersion of narrow-line ratios found among the NLS1 sample. Assuming similar physical model parameters for both NLS1's and the normal Seyfert 1 galaxy NGC 5548, we show that the observed differences of emission-line ratios between these two groups of galaxies can be explained, to a first approximation, in terms of the shape of the input ionizing continuum. Narrow emission-line ratios of NLS1's are better reproduced by a steep power-law continuum in the EUV-soft X-ray region, with spectral index α ~ -2. Flatter spectral indices (α ~ -1.5) match the observed line ratios of NGC 5548 but are unable to provide a good match to the NLS1 ratios. This result is consistent with ROSAT observations of NLS1's, which show that these objects are characterized by steeper power-law indices than those of Seyfert 1 galaxies with strong broad optical lines. Based on observations made at CASLEO.
Complejo Astronómico El Leoncito (CASLEO) is operated under agreement between the Consejo Nacional de Investigaciones Científicas y Técnicas de la República Argentina and the National Universities of La Plata, Córdoba and San Juan.

  8. Projective minimal analysis of camera geometry

    E-print Network

    Romano, Raquel Andrea

    2002-01-01

    This thesis addresses the general problem of how to find globally consistent and accurate estimates of multiple-view camera geometry from uncalibrated imagery of an extended scene. After decades of study, the classic problem ...

  9. Activity based matching in distributed camera networks.

    PubMed

    Ermis, Erhan Baki; Clarot, Pierre; Jodoin, Pierre-Marc; Saligrama, Venkatesh

    2010-10-01

    In this paper, we consider the problem of finding correspondences between distributed cameras that have partially overlapping fields of view. When multiple cameras with adaptable orientations and zooms are deployed, as in many wide-area surveillance applications, identifying correspondences between different activities becomes a fundamental issue. We propose a correspondence method based upon activity features that, unlike photometric features, have certain geometry-independence properties. The proposed method is robust to pose, illumination and geometric effects, and is unsupervised (it does not require any calibration objects). In addition, these features are amenable to low-communication-bandwidth and distributed network applications. We present quantitative and qualitative results with synthetic and real-life examples, and compare the proposed method with a scale-invariant feature transform (SIFT) based method. We show that our method significantly outperforms the SIFT method when cameras have significantly different orientations. We then describe extensions of our method in a number of directions including topology reconstruction, camera calibration, and distributed anomaly detection. PMID:20550993
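    As a rough illustration of why activity features can establish correspondence without photometric calibration, the sketch below matches regions between two cameras by the normalized temporal correlation of their binary activity signals. This is a simplified stand-in for the paper's activity features; the function name and data layout are assumptions.

```python
import numpy as np

def match_regions(acts_a, acts_b):
    """Best-matching region in camera B for each region in camera A.

    acts_a: (Na, T) and acts_b: (Nb, T) arrays of per-region activity
    signals over T frames (e.g. 1 = motion detected). Assumes each
    region's signal is non-constant. Matching is by normalized temporal
    correlation, which is insensitive to viewpoint and photometry.
    """
    A = acts_a - acts_a.mean(axis=1, keepdims=True)
    B = acts_b - acts_b.mean(axis=1, keepdims=True)
    A /= np.linalg.norm(A, axis=1, keepdims=True)
    B /= np.linalg.norm(B, axis=1, keepdims=True)
    return np.argmax(A @ B.T, axis=1)   # index of best match per A-region
```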

  10. Action selection for single-camera SLAM.

    PubMed

    Vidal-Calleja, Teresa A; Sanfeliu, Alberto; Andrade-Cetto, Juan

    2010-12-01

    A method for evaluating, at video rate, the quality of actions for a single camera while mapping unknown indoor environments is presented. The strategy maximizes mutual information between measurements and states to help the camera avoid making the ill-conditioned measurements that arise from the lack of depth information in monocular vision systems. Our system prompts a user with the appropriate motion commands during 6-DOF visual simultaneous localization and mapping with a handheld camera. Additionally, the system has been ported to a mobile robotic platform, thus closing the control-estimation loop. To show the viability of the approach, simulations and experiments are presented for the unconstrained motion of a handheld camera and for the motion of a mobile robot with nonholonomic constraints. When combined with a path planner, the technique safely drives the robot to a marked goal while, at the same time, producing an optimal estimated map. PMID:20350845
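    The mutual-information criterion can be sketched in an EKF-style setting: for each candidate action, predict the covariance of the measurement it would yield and score it by the information gain I = ½ log(det(H P Hᵀ + R) / det(R)). The code below is an illustrative sketch, not the paper's implementation; the action set and matrices are hypothetical.

```python
import numpy as np

def best_action(P, actions, R_noise):
    """Pick the action whose predicted measurement is most informative.

    P: (n, n) state covariance; actions: dict mapping action name to the
    measurement Jacobian H that action would produce; R_noise: measurement
    noise covariance. Scores each action by the mutual information between
    state and predicted measurement for jointly Gaussian variables.
    """
    def info_gain(H):
        S = H @ P @ H.T + R_noise            # innovation covariance
        return 0.5 * (np.linalg.slogdet(S)[1]
                      - np.linalg.slogdet(R_noise)[1])
    return max(actions, key=lambda a: info_gain(actions[a]))
```

    An action that observes a poorly known state dimension (large variance in P) yields a larger determinant ratio and is therefore preferred.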

  11. Selecting the Right Camera for Your Desktop.

    ERIC Educational Resources Information Center

    Rhodes, John

    1997-01-01

    Provides an overview of camera options and selection criteria for desktop videoconferencing. Key factors in image quality are discussed, including lighting, resolution, and signal-to-noise ratio; and steps to improve image quality are suggested. (LRW)

  12. Simple LCD Transmitter Camera Receiver Data Link

    E-print Network

    Katabi, Dina

    2009-06-15

    We demonstrate a freespace optical system using a consumer camera and projector in indoor environments using available devices for visual computing. Through design, prototype and experimentation with this commodity hardware, ...

  13. System selects framing rate for spectrograph camera

    NASA Technical Reports Server (NTRS)

    1965-01-01

    A circuit reflects zero-order light from the incoming radiation of a spectrograph monitor to a photomultiplier, providing an error signal that controls the rate at which film is advanced and driven through the camera.

  14. Time-lapse camera for microscopy

    NASA Technical Reports Server (NTRS)

    Cook, J. E.

    1972-01-01

    Compact, lightweight camera which advances film frames without use of conventional sprockets and slip clutches obtains time lapse photomicrographs of human cell growth in a zero-G environment over a period of about a month.

  15. Candid Camera: Catch 'Em in Action.

    ERIC Educational Resources Information Center

    Raschke, Donna; And Others

    1985-01-01

    A concealed video camera can record learning disabled students' behavior and provide a nonjudgmental way for them to see how they appear to others. Such an approach can include a positive emphasis on redirecting energy as well. (CL)

  16. Daytime Aspect Camera for Balloon Altitudes

    NASA Technical Reports Server (NTRS)

    Dietz, Kurt L.; Ramsey, Brian D.; Alexander, Cheryl D.; Apple, Jeff A.; Ghosh, Kajal K.; Swift, Wesley R.

    2002-01-01

    We have designed, built, and flight-tested a new star camera for daytime guiding of pointed balloon-borne experiments at altitudes around 40 km. The camera and lens are commercially available, off-the-shelf components, but require a custom-built baffle to reduce stray light, especially near the sunlit limb of the balloon. This new camera, which operates in the 600- to 1000-nm region of the spectrum, successfully provides daytime aspect information of approx. 10 arcsec resolution for two distinct star fields near the galactic plane. The detected scattered-light backgrounds show good agreement with the Air Force MODTRAN models used to design the camera, but the daytime stellar magnitude limit was lower than expected due to longitudinal chromatic aberration in the lens. Replacing the commercial lens with a custom-built lens should allow the system to track stars in any arbitrary area of the sky during the daytime.

  17. Display-camera calibration from eye reflections

    Microsoft Academic Search

    Christian Nitschke; Atsushi Nakazawa; Haruo Takemura

    2009-01-01

    We present a novel technique for calibrating display-camera systems from reflections in the user's eyes. Display-camera systems enable a range of vision applications that need controlled illumination, including 3D object reconstruction, facial modeling and human computer interaction. One important issue, though, is the geometric calibration of the display, which requires additional hardware and tedious user interaction. The proposed approach eliminates

  18. Thermal cameras in school laboratory activities

    NASA Astrophysics Data System (ADS)

    Haglund, Jesper; Jeppsson, Fredrik; Hedberg, David; Schönborn, Konrad J.

    2015-07-01

    Thermal cameras offer real-time visual access to otherwise invisible thermal phenomena, which are conceptually demanding for learners during traditional teaching. We present three studies of students conducting laboratory activities that employ thermal cameras to teach challenging thermal concepts in grades 4, 7 and 10–12. Visualization of heat-related phenomena in combination with predict-observe-explain experiments offers students and teachers a pedagogically powerful means for unveiling abstract yet fundamental physics concepts.

  19. Gaze tracking using one fixed camera

    Microsoft Academic Search

    Wen Gang

    2002-01-01

    In this paper, a noncontact corneal-pupil reflection scheme using only one fixed camera to track the eye gaze is presented. A small manual focus lens is used in a camera without a pan-and-tilt base. A connected component labeling algorithm is designed to detect the pupils. The gradient information is utilized to find the precise pupil center. The effect of pupil

  20. Si/CdTe semiconductor Compton camera

    Microsoft Academic Search

    Shin Watanabe; K. Nakazawa; T. Takashima; T. Tanaka; T. Mitani; K. Oonuki; T. Takahashi; Hiroyasu Tajima; Yasushi Fukazawa; Masaharu Nomachi; Shin Kubo; Mitsunobu Onishi; Yoshikatsu Kuroda

    2004-01-01

    We are developing a Compton camera based on Si and CdTe semiconductor imaging devices with high energy resolution. In this paper, results from the most recent prototype are reported. The Compton camera consists of six stacked double-sided Si strip detectors and CdTe pixel detectors, which are read out with low noise analog ASICs, VA32TAs. We obtained Compton reconstructed images and

  1. A stereoscopic lens for digital cinema cameras

    NASA Astrophysics Data System (ADS)

    Lipton, Lenny; Rupkalvis, John

    2015-03-01

    Live-action stereoscopic feature films are, for the most part, produced using a costly post-production process to convert planar cinematography into stereo-pair images and are only occasionally shot stereoscopically using bulky dual-cameras that are adaptations of the Ramsdell rig. The stereoscopic lens design described here might very well encourage more live-action image capture because it uses standard digital cinema cameras and workflow to save time and money.

  2. Crystal streak camera for infrared light pulse

    Microsoft Academic Search

    Osamu Matsumoto; Yasushi Ohbayashi

    1992-01-01

    A streak camera is used to capture fast light pulses. However, it is limited to the visible and near-infrared regions of the optical spectrum. We investigated deflection of a light beam by use of an electro-optic deflector. The crystal streak camera is based on the direct deflection of light by an electro-optic crystal to an image sensor for recording. The

  3. Heart imaging by cadmium telluride gamma camera

    Microsoft Academic Search

    Ch. Scheiber; B. Eclancher; J. Chambron; V. Prat; A. Kazandjan; A. Jahnke; R. Matz; S. Thomas; S. Warren; M. Hage-Hali; R. Regal; P. Siffert; M. Karman

    1999-01-01

    Cadmium telluride semiconductor detectors (CdTe) operating at room temperature are attractive for medical imaging because of their good energy resolution, which provides excellent spatial and contrast resolution. The compactness of the detection system allows the building of small, lightweight camera heads that can be used for bedside imaging. A mobile pixellated gamma camera based on 2304 CdTe (pixel size: 3×3 mm, field

  4. X-ray sensitive video camera

    Microsoft Academic Search

    Randy Luhta; John A. Rowlands

    1993-01-01

    By converting the absorbed X-ray image directly to an electrical video signal, the x-ray sensitive video camera offers improved resolution and reduced veiling glare over a conventional x-ray image intensifier for medical fluoroscopy. Unfortunately, currently available x-ray sensitive video cameras are limited to a 1' field of view and poor quantum efficiency. We are developing an x-ray sensitive vidicon for

  5. Performance analysis for gait in camera networks

    Microsoft Academic Search

    Michela Goffredo; Imed Bouchrika; John N. Carter; Mark S. Nixon

    2008-01-01

    This paper deploys gait analysis for subject identification in multi-camera surveillance scenarios. We present a new method for viewpoint-independent markerless gait analysis that does not require camera calibration and works with a wide range of walking directions. These properties make the proposed method particularly suitable for gait identification in real surveillance scenarios where people and their behaviour need to

  6. Camera Calibration and the Search for Infinity

    Microsoft Academic Search

    Richard I. Hartley; Lourdes De Agapito; Ian D. Reid; Eric Hayman

    1999-01-01

    This paper considers the problem of self-calibration of a camera from an image sequence in the case where the camera's internal parameters (most notably focal length) may change. The problem of camera self-calibration from a sequence of images has proven to be a difficult one in practice, due to the need ultimately to resort to non-linear methods, which

  7. Detection of composite events spanning multiple camera views with wireless embedded smart cameras

    Microsoft Academic Search

    Youlu Wang; Senem Velipasalar; Mauricio Casares

    2009-01-01

    With the introduction of battery-powered and embedded smart cameras, it has become viable to install many spatially-distributed cameras interconnected by wireless links. However, there are many problems that need to be solved to build scalable, battery-powered wireless smart-camera networks (Wi-SCaNs). These problems include the limited processing power, memory, energy and bandwidth. Limited resources necessitate light-weight algorithms to be implemented and

  8. Super-high-sensitive camera tube for HDTV hand-held cameras

    Microsoft Academic Search

    Masakazu Nanba; Yoshiro Takiguchi; Toshio Yamagishi; Misao Kubota; Saburo Okazaki; Tsutomu Kato; Kenkichi Tanioka; Tadaaki Hirai; Yukio Takasaki

    1995-01-01

    We have developed an HDTV camera tube that combines high sensitivity with high resolution and is compact enough for hand-held cameras. This new camera tube employs an 8-µm-thick HARP (High-gain Avalanche Rushing amorphous Photoconductor) target. Unlike other photoconductors, this target is unique in that its sensitivity can be increased to very high levels to cope with darker illumination.

  9. Low light performance of digital cameras

    NASA Astrophysics Data System (ADS)

    Hultgren, Bror; Hertel, Dirk

    2009-01-01

    Photospace data previously measured on large image sets have shown that a high percentage of camera phone pictures are taken under low-light conditions. Corresponding image quality measurements linked the lowest quality to these conditions, and subjective analysis of image quality failure modes identified image blur as the most important contributor to image quality degradation. Camera phones without flash have to manage a trade-off when adjusting shutter time to low-light conditions. The shutter time has to be long enough to avoid extreme underexposure, but short enough that hand-held picture taking is still possible without excessive motion blur. There is still a lack of quantitative data on motion blur. Camera phones often do not record basic operating parameters such as shutter speed in their image metadata, and when recorded, the data are often inaccurate. We introduce a device and process for tracking camera motion and measuring its Point Spread Function (PSF). Vision-based metrics are introduced to assess the impact of camera motion on image quality so that the low-light performance of different cameras can be compared. Statistical distributions of user variability will be discussed.
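    A back-of-envelope check of the shutter-time trade-off: for a camera rotating at a given angular rate during the exposure, the blur extent in pixels follows from the focal length and pixel pitch. The function name and all parameter values below are hypothetical, for illustration only.

```python
import math

def blur_pixels(ang_rate_deg_s, exposure_s, focal_mm, pixel_um):
    # Image-plane blur caused by camera rotation during the exposure
    # (small-angle geometry: blur = tan(angle swept) * focal length).
    blur_angle = math.radians(ang_rate_deg_s * exposure_s)
    blur_mm = math.tan(blur_angle) * focal_mm
    return blur_mm * 1000.0 / pixel_um   # mm -> um, then / pixel pitch

# e.g. 3 deg/s hand shake, 1/30 s exposure, 4 mm lens, 2 um pixels
# gives a blur of roughly 3.5 pixels.
```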

  10. Development of the TopSat camera

    NASA Astrophysics Data System (ADS)

    Greenway, Paul; Tosh, Ian; Morris, Nigel

    2004-06-01

    The TopSat camera is a low cost remote sensing imager capable of producing 2.5 metre resolution panchromatic imagery, funded by the British National Space Centre's Mosaic programme. An engineering model development programme verified optical alignment techniques and crucially, demonstrated structural stability through vibration tests. As a result of this, the flight model camera has been assembled at the Space Science & Technology Department of CCLRC's Rutherford Appleton Laboratory in the UK, in preparation for launch in 2005. The camera has been designed to be compact and lightweight so that it may be flown on a low cost mini-satellite (~120kg launch mass). To achieve this, the camera utilises an off-axis three mirror anastigmatic (TMA) system, which has the advantages of excellent image quality over a wide field of view, combined with a compactness that makes its overall dimensions smaller than its focal length. Keeping the costs to a minimum has been a major design driver in the development of this camera. The camera is part of the TopSat mission, which is a collaboration between four UK organisations; RAL (Rutherford Appleton Laboratory), SSTL (Surrey Satellite Technology Ltd.), QinetiQ and Infoterra. Its objective is to demonstrate provision of rapid response high-resolution imagery to fixed and mobile ground stations using a low cost mini-satellite. This paper describes the opto-mechanical design, assembly and alignment techniques implemented and reports on the test results obtained to date.

  11. Broadband "infinite-speed" magic-angle spinning NMR spectroscopy.

    PubMed

    Hu, Yan-Yan; Levin, E M; Schmidt-Rohr, Klaus

    2009-06-24

    High-resolution magic-angle spinning NMR of high-Z spin-1/2 nuclei such as (125)Te, (207)Pb, (119)Sn, (113)Cd, and (195)Pt is often hampered by large (>1000 ppm) chemical-shift anisotropies, which result in strong spinning sidebands that can obscure the centerbands of interest. In various tellurides with applications as thermoelectrics and as phase-change materials for data storage, even 22-kHz magic-angle spinning cannot resolve the center- and sidebands broadened by chemical-shift dispersion, which precludes peak identification or quantification. For sideband suppression over the necessary wide spectral range (up to 200 kHz), radio frequency pulse sequences with few, short pulses are required. We have identified Gan's two-dimensional magic-angle-turning (MAT) experiment with five 90° pulses as a promising broadband technique for obtaining spectra without sidebands. We have adapted it to broad spectra and fast magic-angle spinning by accounting for long pulses (comparable to the dwell time in t(1)) and short rotation periods. Spectral distortions are small and residual sidebands negligible even for spectra with signals covering a range of 1.5 γB1, due to a favorable disposition of the narrow ranges containing the signals of interest in the spectral plane. The method is demonstrated on various technologically interesting tellurides with spectra spanning up to 170 kHz, at 22 kHz MAS. PMID:19489580

  12. Solar Angles and Tracking Systems

    NSDL National Science Digital Library

    Integrated Teaching and Learning Program,

    Students learn about the daily and annual cycles of solar angles used in power calculations to maximize photovoltaic power generation. They gain an overview of solar tracking systems that improve PV panel efficiency by following the sun through the sky.
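    The daily and annual solar angles used in such power calculations follow from two standard relations: Cooper's approximation for the solar declination and the elevation equation combining latitude, declination, and hour angle. A minimal sketch (function name and interface assumed):

```python
import math

def solar_elevation(lat_deg, day_of_year, solar_hour):
    """Solar elevation angle in degrees.

    lat_deg: latitude; day_of_year: 1..365; solar_hour: hours from
    solar noon (negative = morning). Uses Cooper's declination formula
    delta = 23.45 sin(360/365 * (284 + n)) and
    sin(el) = sin(lat) sin(delta) + cos(lat) cos(delta) cos(h).
    """
    decl = math.radians(23.45) * math.sin(
        math.radians(360.0 / 365.0 * (284 + day_of_year)))
    hour_angle = math.radians(15.0 * solar_hour)   # 15 degrees per hour
    lat = math.radians(lat_deg)
    sin_el = (math.sin(lat) * math.sin(decl)
              + math.cos(lat) * math.cos(decl) * math.cos(hour_angle))
    return math.degrees(math.asin(sin_el))
```

    A single-axis or dual-axis tracker is effectively steering the panel normal toward this computed direction as the hour angle sweeps 15° per hour.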

  13. Angles, scales and parametric renormalization

    E-print Network

    Francis Brown; Dirk Kreimer

    2011-12-06

    We decompose renormalized Feynman rules according to the scale and angle dependence of amplitudes. We use parametric representations such that the resulting amplitudes can be studied in algebraic geometry.

  14. Compact Right-Angle Connector

    NASA Technical Reports Server (NTRS)

    Barajas, Salvador L.; Pierson, Vonde E.

    1989-01-01

    New right-angle connector between hose and "quick-disconnect" coupler smaller and simpler than its predecessor. Employs fewer parts and therefore cheaper and less likely to leak. Connector consists of only two major parts.

  15. Two Comments on Bond Angles

    NASA Astrophysics Data System (ADS)

    Glaister, P.

    1997-09-01

    Tetrahedral Bond Angle from Elementary Trigonometry. The alternative approach of using the scalar (or dot) product of vectors enables the determination of the bond angle in a tetrahedral molecule in a simple way. There is, of course, an even more straightforward derivation suitable for students who are unfamiliar with vectors, or products thereof, but who do know some elementary trigonometry. The starting point is the figure showing triangle OAB. The point O is the center of a cube, and A and B are at opposite corners of a face of that cube in which fits a regular tetrahedron. The required bond angle is α = AÔB, and using Pythagoras' theorem, AB = 2√2 is the diagonal of a face of the cube. Hence, from the right-angled triangle OEB, tan(α/2) = √2, and therefore α = 2 tan⁻¹(√2) ≈ 109° 28' (see Fig. 1).
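    Both derivations in the note can be checked numerically: the trigonometric route gives α = 2 tan⁻¹√2, and the dot-product route, taking bond vectors along cube diagonals, gives cos α = −1/3. The two agree:

```python
import math

# Trigonometric route from the note: alpha = 2 * arctan(sqrt(2))
alpha = 2 * math.degrees(math.atan(math.sqrt(2)))

# Dot-product route: bond vectors from the cube center toward two
# vertices of the inscribed tetrahedron, e.g. (1,1,1) and (1,-1,-1).
v1, v2 = (1, 1, 1), (1, -1, -1)
cos_a = sum(a * b for a, b in zip(v1, v2)) / 3.0   # |v1| = |v2| = sqrt(3)
alpha_dot = math.degrees(math.acos(cos_a))

print(round(alpha, 2))   # about 109.47 degrees, i.e. 109° 28'
```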

  16. Virtual mirror rendering with stationary RGB-D cameras and stored 3-D background.

    PubMed

    Shen, Ju; Su, Po-Chang; Cheung, Sen-Ching Samson; Zhao, Jian

    2013-09-01

    Mirrors are indispensable objects in our lives. The capability of simulating a mirror on a computer display, augmented with virtual scenes and objects, opens the door to many interesting and useful applications from fashion design to medical interventions. Realistic simulation of a mirror is challenging as it requires accurate viewpoint tracking and rendering, wide-angle viewing of the environment, as well as real-time performance to provide immediate visual feedback. In this paper, we propose a virtual mirror rendering system using a network of commodity structured-light RGB-D cameras. The depth information provided by the RGB-D cameras can be used to track the viewpoint and render the scene from different perspectives. Missing and erroneous depth measurements are common problems with structured-light cameras. A novel depth denoising and completion algorithm is proposed in which the noise removal and interpolation procedures are guided by the foreground/background label at each pixel. The foreground/background label is estimated using a probabilistic graphical model that considers color, depth, background modeling, depth noise modeling, and spatial constraints. The wide viewing angle of the mirror system is realized by combining the dynamic scene, captured by the static camera network, with a 3-D background model created off-line using a color-depth sequence captured by a movable RGB-D camera. To ensure a real-time response, a scalable client-and-server architecture is used, with the 3-D point cloud processing, the viewpoint estimation, and the mirror image rendering all done on the client side. The mirror image and the viewpoint estimate are then sent to the server for final mirror view synthesis and viewpoint refinement. Experimental results are presented to show the accuracy and effectiveness of each component and the entire system. PMID:23782808

  17. Angles of multivariable root loci

    NASA Technical Reports Server (NTRS)

    Thompson, P. M.; Stein, G.; Laub, A. J.

    1982-01-01

    A generalized eigenvalue problem is demonstrated to be useful for computing the multivariable root locus, particularly for obtaining the arrival angles at finite transmission zeros. The multivariable root loci are found for a linear, time-invariant output feedback problem, which is then used to compute a closed-loop eigenstructure. The method of computing angles on the root locus is demonstrated, and the method is extended to a multivariable optimal root locus.

  18. Location accuracy evaluation of lightning location systems using natural lightning flashes recorded by a network of high-speed cameras

    NASA Astrophysics Data System (ADS)

    Alves, J.; Saraiva, A. C. V.; Campos, L. Z. D. S.; Pinto, O., Jr.; Antunes, L.

    2014-12-01

    This work presents a method for evaluating the location accuracy of all Lightning Location Systems (LLS) in operation in southeastern Brazil, using natural cloud-to-ground (CG) lightning flashes. This is done through a network of multiple high-speed cameras (the RAMMER network) installed in the Paraíba Valley region - SP - Brazil. The RAMMER network (Automated Multi-camera Network for Monitoring and Study of Lightning) is composed of four high-speed cameras operating at 2,500 frames per second. Three stationary black-and-white (B&W) cameras were situated in the cities of São José dos Campos and Caçapava. A fourth color camera was mobile (installed in a car), but operated in a fixed location during the observation period, within the city of São José dos Campos. The average distance among cameras was 13 kilometers. Each RAMMER sensor position was determined so that the network can observe the same lightning flash from different angles, and all recorded videos were GPS (Global Positioning System) time stamped, allowing comparisons of events between the cameras and the LLS. Each RAMMER sensor is basically composed of a computer, a Phantom high-speed camera version 9.1 and a GPS unit. The lightning cases analyzed in the present work were observed by at least two cameras; their positions were visually triangulated and the results compared with the BrasilDAT network during the summer seasons of 2011/2012 and 2012/2013. The visual triangulation method is presented in detail. The calibration procedure showed an accuracy of 9 meters between the accurate GPS position of the triangulated object and the result from the visual triangulation method. Lightning return stroke positions, estimated with the visual triangulation method, were compared with LLS locations. Differences between solutions were not greater than 1.8 km.
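    The visual triangulation idea can be illustrated in its simplest planar form: two cameras at known positions each report a bearing to the flash, and intersecting the two rays gives the position. This is a hedged two-dimensional sketch, not the RAMMER procedure itself; the function and coordinates are illustrative.

```python
import numpy as np

def triangulate_2d(p1, az1, p2, az2):
    """Intersect two ground-plane bearing rays.

    p1, p2: camera positions (x, y); az1, az2: bearings in radians
    measured from the +x axis. Returns the intersection point.
    Assumes the bearings are not parallel.
    """
    d1 = np.array([np.cos(az1), np.sin(az1)])   # unit ray directions
    d2 = np.array([np.cos(az2), np.sin(az2)])
    # Solve p1 + t0*d1 = p2 + t1*d2 for the ray parameters (t0, t1).
    A = np.column_stack([d1, -d2])
    t = np.linalg.solve(A, np.asarray(p2, float) - np.asarray(p1, float))
    return np.asarray(p1, float) + t[0] * d1
```

    With more than two cameras, each pair yields a solution and the spread of the intersections gives a direct sense of the triangulation accuracy, analogous to the 9-meter calibration figure quoted above.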

  19. Tunable pulsed narrow bandwidth light source

    DOEpatents

    Powers, Peter E. (Dayton, OH); Kulp, Thomas J. (Livermore, CA)

    2002-01-01

    A tunable pulsed narrow bandwidth light source and a method of operating a light source are provided. The light source includes a pump laser, first and second non-linear optical crystals, a tunable filter, and light pulse directing optics. The method includes the steps of operating the pump laser to generate a pulsed pump beam characterized by a nanosecond pulse duration and arranging the light pulse directing optics so as to (i) split the pulsed pump beam into primary and secondary pump beams; (ii) direct the primary pump beam through an input face of the first non-linear optical crystal such that a primary output beam exits from an output face of the first non-linear optical crystal; (iii) direct the primary output beam through the tunable filter to generate a sculpted seed beam; and (iv) direct the sculpted seed beam and the secondary pump beam through an input face of the second non-linear optical crystal such that a secondary output beam characterized by at least one spectral bandwidth on the order of about 0.1 cm⁻¹ and below exits from an output face of the second non-linear optical crystal.

  20. Flame Acceleration and DDT in Narrow Tubes

    NASA Astrophysics Data System (ADS)

    Gamezo, Vadim N.; Oran, Elaine S.

    2004-11-01

    A laminar flame propagating towards the open end of a narrow channel filled with a gaseous combustible mixture can accelerate or oscillate, depending on the wall temperature and the channel width. The accelerating flame is able to produce a high-speed flow that has the potential to provide significant thrust, and this can be used in micropropulsion devices. Depending on the energetics of the reactive system and the length of the channel, the compression waves may converge to a strong shock, and eventually trigger a detonation that develops in the shock-compressed material near the wall. We study these phenomena using multidimensional reactive Navier-Stokes numerical simulations, and show that for adiabatic walls the maximum flame acceleration occurs when the channel is about 5 times larger than the reaction zone of a laminar flame. The flame and the unreacted material ahead of it can accelerate to velocities close to the sound speed without creating strong shocks. This combustion regime is of particular interest for micropropulsion because it allows efficient use of fuel and a gradual development of thrust.