Science.gov

Sample records for camera narrow angle

  1. Reconditioning of Cassini Narrow-Angle Camera

    NASA Technical Reports Server (NTRS)

    2002-01-01

    These five images of single stars, taken at different times with the narrow-angle camera on NASA's Cassini spacecraft, show the effects of haze collecting on the camera's optics, then successful removal of the haze by warming treatments.

    The image on the left was taken on May 25, 2001, before the haze problem occurred. It shows a star named HD339457.

    The second image from left, taken May 30, 2001, shows the effect of haze that collected on the optics when the camera cooled back down after a routine-maintenance heating to 30 degrees Celsius (86 degrees Fahrenheit). The star is Maia, one of the Pleiades.

    The third image was taken on October 26, 2001, after a weeklong decontamination treatment at minus 7 C (19 F). The star is Spica.

    The fourth image was taken of Spica January 30, 2002, after a weeklong decontamination treatment at 4 C (39 F).

    The final image, also of Spica, was taken July 9, 2002, following three additional decontamination treatments at 4 C (39 F) for two months, one month, then another month.

    Cassini, on its way toward arrival at Saturn in 2004, is a cooperative project of NASA, the European Space Agency and the Italian Space Agency. The Jet Propulsion Laboratory, a division of the California Institute of Technology in Pasadena, manages the Cassini mission for NASA's Office of Space Science, Washington, D.C.

  2. Flight Calibration of the LROC Narrow Angle Camera

    NASA Astrophysics Data System (ADS)

    Humm, D. C.; Tschimmel, M.; Brylow, S. M.; Mahanti, P.; Tran, T. N.; Braden, S. E.; Wiseman, S.; Danton, J.; Eliason, E. M.; Robinson, M. S.

    2015-09-01

    Characterization and calibration are vital for instrument commanding and image interpretation in remote sensing. The Lunar Reconnaissance Orbiter Camera Narrow Angle Camera (LROC NAC) takes 500 Mpixel greyscale images of lunar scenes at 0.5 meters/pixel. It uses two nominally identical line scan cameras for a larger crosstrack field of view. Stray light, spatial crosstalk, and nonlinearity were characterized using flight images of the Earth and the lunar limb. These are important for imaging shadowed craters, studying ~1 meter size objects, and photometry, respectively. Background, nonlinearity, and flatfield corrections have been implemented in the calibration pipeline. An eight-column pattern in the background is corrected. The detector is linear for DN = 600-2000, but a signal-dependent additive correction is required and applied for DN < 600. A predictive model of detector temperature and dark level was developed to command the dark level offset. This avoids images with a cutoff at DN = 0 and minimizes quantization error in companding. Absolute radiometric calibration is derived from comparison of NAC images with ground-based images taken with the Robotic Lunar Observatory (ROLO) at much lower spatial resolution but with the same photometric angles.
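    The correction order described above (background subtraction, a signal-dependent additive term for low DN, then flatfield division) can be sketched as follows. This is a minimal illustration, not the LROC pipeline: the function name, array shapes, and the 0.01 coefficient of the low-DN additive term are placeholder assumptions.

```python
import numpy as np

def calibrate_nac(raw, dark, flat, lin_knee=600.0):
    """Illustrative NAC-style calibration: background, nonlinearity, flatfield.

    raw, dark, flat: 2-D arrays (DN). lin_knee: DN below which the
    signal-dependent additive correction is applied. The coefficient
    used here is a placeholder, not a real LROC value.
    """
    # 1. Background (dark level) subtraction, including any column pattern.
    sig = raw.astype(float) - dark

    # 2. Signal-dependent additive correction for low DN (the detector is
    #    linear above ~600 DN; below that a small additive term is needed).
    low = sig < lin_knee
    corr = np.zeros_like(sig)
    corr[low] = 0.01 * (lin_knee - sig[low])   # placeholder functional form
    sig = sig + corr

    # 3. Flatfield division (pixel-to-pixel response).
    return sig / flat

raw = np.full((4, 4), 700.0)
raw[0, 0] = 400.0                  # one low-signal pixel
dark = np.full((4, 4), 100.0)
flat = np.ones((4, 4))
cal = calibrate_nac(raw, dark, flat)
```

    The key design point mirrored from the abstract is that the nonlinearity fix is additive and applied only below the linear range, rather than a global gain curve.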

  3. Two Years of Digital Terrain Model Production Using the Lunar Reconnaissance Orbiter Narrow Angle Camera

    NASA Astrophysics Data System (ADS)

    Burns, K.; Robinson, M. S.; Speyerer, E.; LROC Science Team

    2011-12-01

    One of the primary objectives of the Lunar Reconnaissance Orbiter Camera (LROC) is to gather stereo observations with the Narrow Angle Camera (NAC). These stereo observations are used to generate digital terrain models (DTMs). The NAC has a pixel scale of 0.5 to 2.0 meters but was not designed for stereo observations and thus requires the spacecraft to roll off-nadir to acquire these images. Slews interfere with the data collection of the other instruments, so opportunities are currently limited to four per day. Arizona State University has produced DTMs from 95 stereo pairs for 11 Constellation Project (CxP) sites (Aristarchus, Copernicus crater, Gruithuisen domes, Hortensius domes, Ina D-caldera, Lichtenberg crater, Mare Ingenii, Marius hills, Reiner Gamma, South Pole-Aitken Rim, Sulpicius Gallus) as well as 30 other regions of scientific interest (including: Bhabha crater, highest and lowest elevation points, Highland Ponds, Kugler Anuchin, Linne Crater, Planck Crater, Slipher crater, Sears Crater, Mandel'shtam Crater, Virtanen Graben, Compton/Belkovich, Rumker Domes, King Crater, Luna 16/20/23/24 landing sites, Ranger 6 landing site, Wiener F Crater, Apollo 11/14/15/17, fresh craters, impact melt flows, Larmor Q crater, Mare Tranquillitatis pit, Hansteen Alpha, Moore F Crater, and Lassell Massif). To generate DTMs, the USGS ISIS software and SOCET SET® from BAE Systems are used. To increase the absolute accuracy of the DTMs, data obtained from the Lunar Orbiter Laser Altimeter (LOLA) are used to coregister the NAC images and define the geodetic reference frame. NAC DTMs have been used in examination of several sites, e.g. Compton-Belkovich, Marius Hills and Ina D-caldera [1-3]. LROC will continue to acquire high-resolution stereo images throughout the science phase of the mission and any extended mission opportunities, thus providing a vital dataset for scientific research as well as future human and robotic exploration. [1] B.L. Jolliff (2011) Nature Geoscience, in press. [2] Lawrence et al. (2011) LPSC XLII, Abst. 2228. [3] Garry et al. (2011) LPSC XLII, Abst. 2605.

  4. Narrow Angle movie

    NASA Technical Reports Server (NTRS)

    1999-01-01

    This brief three-frame movie of the Moon was made from three Cassini narrow-angle images as the spacecraft passed by the Moon on the way to its closest approach to Earth on August 17, 1999. The purpose of this particular set of images was to calibrate the spectral response of the narrow-angle camera and to test its 'on-chip summing mode' data compression technique in flight. From left to right, they show the Moon in the green, blue and ultraviolet regions of the spectrum in 40, 60 and 80 millisecond exposures, respectively. All three images have been scaled so that the brightness of Crisium basin, the dark circular region in the upper right, is the same in each image. The spatial scale in the blue and ultraviolet images is 1.4 miles per pixel (2.3 kilometers). The original scale in the green image (which was captured in the usual manner and then reduced in size by 2x2 pixel summing within the camera system) was 2.8 miles per pixel (4.6 kilometers). It has been enlarged for display to the same scale as the other two. The imaging data were processed and released by the Cassini Imaging Central Laboratory for Operations (CICLOPS) at the University of Arizona's Lunar and Planetary Laboratory, Tucson, AZ.

    Photo Credit: NASA/JPL/Cassini Imaging Team/University of Arizona

    Cassini, launched in 1997, is a joint mission of NASA, the European Space Agency and the Italian Space Agency. The mission is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Space Science, Washington, D.C. JPL is a division of the California Institute of Technology, Pasadena, CA.

  5. NFC - Narrow Field Camera

    NASA Astrophysics Data System (ADS)

    Koukal, J.; Srba, J.; Gorková, S.

    2015-01-01

    We introduce a low-cost CCTV video system for faint meteor monitoring and describe the first results from 5 months of two-station operation. Our system, called NFC (Narrow Field Camera), has a meteor limiting magnitude around +6.5 mag and allows research on the trajectories of less massive meteoroids within individual parent meteor showers and the sporadic background. At present 4 stations (2 pairs with coordinated fields of view) of the NFC system are operated in the frame of CEMeNt (Central European Meteor Network). The heart of each NFC station is a sensitive CCTV camera (Watec 902 H2) and a fast cinematographic lens (Meopta Meostigmat 1/50 - 52.5 mm; 50 mm focal length and fixed aperture f/1.0). In this paper we present the first results based on 1595 individual meteors, 368 of which were recorded from two stations simultaneously. This data set allows the first empirical verification of the theoretical assumptions about the NFC system's capabilities (stellar and meteor magnitude limit, meteor apparent brightness distribution, and accuracy of single-station measurements) and the first low-mass meteoroid trajectory calculations. Our experimental data clearly show the capability of the proposed system to register low-mass meteors, and that calculations based on NFC data lead to a significant refinement of the orbital elements of low-mass meteoroids.

  6. High-resolution topomapping of candidate MER landing sites with Mars Orbiter Camera narrow-angle images

    USGS Publications Warehouse

    Kirk, R.L.; Howington-Kraus, E.; Redding, B.; Galuszka, D.; Hare, T.M.; Archinal, B.A.; Soderblom, L.A.; Barrett, J.M.

    2003-01-01

    We analyzed narrow-angle Mars Orbiter Camera (MOC-NA) images to produce high-resolution digital elevation models (DEMs) in order to provide topographic and slope information needed to assess the safety of candidate landing sites for the Mars Exploration Rovers (MER) and to assess the accuracy of our results by a variety of tests. The mapping techniques developed also support geoscientific studies and can be used with all present and planned Mars-orbiting scanner cameras. Photogrammetric analysis of MOC stereopairs yields DEMs with 3-pixel (typically 10 m) horizontal resolution, vertical precision consistent with ~0.22 pixel matching errors (typically a few meters), and slope errors of 1-3°. These DEMs are controlled to the Mars Orbiter Laser Altimeter (MOLA) global data set and consistent with it at the limits of resolution. Photoclinometry yields DEMs with single-pixel (typically ~3 m) horizontal resolution and submeter vertical precision. Where the surface albedo is uniform, the dominant error is 10-20% relative uncertainty in the amplitude of topography and slopes after "calibrating" photoclinometry against a stereo DEM to account for the influence of atmospheric haze. We mapped portions of seven candidate MER sites and the Mars Pathfinder site. Safety of the final four sites (Elysium, Gusev, Isidis, and Meridiani) was assessed by mission engineers by simulating landings on our DEMs of "hazard units" mapped in the sites, with results weighted by the probability of landing on those units; summary slope statistics show that most hazard units are smooth, with only small areas of etched terrain in Gusev crater posing a slope hazard.
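    The quoted vertical precision follows from a common stereo rule of thumb: height precision is roughly the image matching error (in pixels) times the ground sample distance, divided by the parallax-to-height ratio of the stereo pair. A sketch, with an assumed parallax/height ratio of 1.0 (the actual MOC-NA geometry varies from pair to pair):

```python
def expected_vertical_precision(gsd_m, match_err_px, parallax_height_ratio):
    """Rule-of-thumb stereo height precision: matching error (pixels)
    times ground sample distance, divided by the parallax/height ratio."""
    return match_err_px * gsd_m / parallax_height_ratio

# Illustrative MOC-NA-like numbers: 3 m/pixel, ~0.22 px matching error,
# parallax/height ratio near 1.0 -> sub-meter expected precision.
ev = expected_vertical_precision(3.0, 0.22, 1.0)
```

    Stronger convergence (larger parallax/height ratio) improves the height precision for the same matching accuracy, which is why off-nadir stereo geometry matters.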

  7. Photogrammetric measurement of 3D freeform millimetre-sized objects with micro features: an experimental validation of the close-range camera calibration model for narrow angles of view

    NASA Astrophysics Data System (ADS)

    Percoco, Gianluca; Sánchez Salmerón, Antonio J.

    2015-09-01

    The measurement of millimetre and micro-scale features is performed by high-cost systems based on technologies with narrow working ranges to accurately control the position of the sensors. Photogrammetry would lower the costs of 3D inspection of micro-features and would also be applicable to the inspection of non-removable micro parts of large objects. Unfortunately, the behaviour of photogrammetry when applied to micro-features is not well known. In this paper, the authors address these issues towards the application of digital close-range photogrammetry (DCRP) to the micro-scale, taking into account that the literature contains research papers stating that an angle of view (AOV) around 10° is the lower limit for the application of the traditional pinhole close-range calibration model (CRCM), which is the basis of DCRP. First, a general calibration procedure is introduced, with the aid of an open-source software library, to calibrate narrow-AOV cameras with the CRCM. Subsequently the procedure is validated using a reflex camera with a 60 mm macro lens, equipped with extension tubes (20 and 32 mm) achieving magnification of up to approximately 2 times, to verify the literature findings with experimental photogrammetric 3D measurements of millimetre-sized objects with micro-features. The limitation of the laser printing technology used to produce the bi-dimensional pattern on common paper has been overcome using an accurate pattern manufactured with a photolithographic process. The results of the experimental activity prove that the CRCM is valid for AOVs down to 3.4° and that DCRP results are comparable with the results of existing and more expensive commercial techniques.
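    For intuition on how macro magnification narrows the angle of view, the thin-lens pinhole relation can be sketched as below. The sensor width, the lens focal length, and the (1 + m) focal-extension approximation are illustrative assumptions, not the paper's exact setup.

```python
import math

def angle_of_view_deg(sensor_mm, focal_mm, magnification=0.0):
    """Angle of view across one sensor dimension for a pinhole camera.
    At close focus the effective lens-to-sensor distance grows roughly
    by (1 + m) for magnification m (thin-lens approximation)."""
    eff = focal_mm * (1.0 + magnification)
    return 2.0 * math.degrees(math.atan(sensor_mm / (2.0 * eff)))

# Illustrative: a 60 mm macro lens over a ~23.6 mm wide sensor at 2x
# magnification yields a narrow AOV of only a few degrees.
aov_macro = angle_of_view_deg(23.6, 60.0, magnification=2.0)
aov_infinity = angle_of_view_deg(23.6, 60.0)
```

    The macro configuration shrinks the AOV well below the ~10° limit discussed in the literature, which is exactly the regime the paper tests.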

  8. Narrow-angle astrometry with PRIMA

    NASA Astrophysics Data System (ADS)

    Sahlmann, J.; Ségransan, D.; Mérand, A.; Zimmerman, N.; Abuter, R.; Chazelas, B.; Delplancke, F.; Henning, T.; Kaminski, A.; Köhler, R.; Launhardt, R.; Mohler, M.; Pepe, F.; Queloz, D.; Quirrenbach, A.; Reffert, S.; Schmid, C.; Schuhler, N.; Schulze-Hartung, T.

    2012-07-01

    The Extrasolar Planet Search with PRIMA project (ESPRI) aims at characterising and detecting extrasolar planets by measuring the host star's reflex motion using the narrow-angle astrometry capability of the PRIMA facility at the Very Large Telescope Interferometer. A first functional demonstration of the astrometric mode was achieved in early 2011. This marked the start of the astrometric commissioning phase with the purpose of characterising the instrument's performance, which ultimately has to be sufficient for exoplanet detection. We show results obtained from the observation of bright visual binary stars, which serve as test objects to determine the instrument's astrometric precision, its accuracy, and the plate scale. Finally, we report on the current status of the ESPRI project, in view of starting its scientific programme.

  9. Narrow-angle astrometry with PRIMA

    E-print Network

    Sahlmann, J; Mérand, A; Zimmerman, N; Abuter, R; Chazelas, B; Delplancke, F; Henning, T; Kaminski, A; Köhler, R; Launhardt, R; Mohler, M; Pepe, F; Queloz, D; Quirrenbach, A; Reffert, S; Schmid, C; Schuhler, N; Schulze-Hartung, T

    2012-01-01

    The Extrasolar Planet Search with PRIMA project (ESPRI) aims at characterising and detecting extrasolar planets by measuring the host star's reflex motion using the narrow-angle astrometry capability of the PRIMA facility at the Very Large Telescope Interferometer. A first functional demonstration of the astrometric mode was achieved in early 2011. This marked the start of the astrometric commissioning phase with the purpose of characterising the instrument's performance, which ultimately has to be sufficient for exoplanet detection. We show results obtained from the observation of bright visual binary stars, which serve as test objects to determine the instrument's astrometric precision, its accuracy, and the plate scale. Finally, we report on the current status of the ESPRI project, in view of starting its scientific programme.

  10. Peripapillary Schisis in Glaucoma Patients With Narrow Angles and Increased Intraocular Pressure

    E-print Network

    Srinivasan, Vivek J.

    Peripapillary Schisis in Glaucoma Patients With Narrow Angles and Increased Intraocular Pressure … cases of peripapillary retinal schisis in patients with glaucoma, without evidence of optic nerve pits … patient was followed over time. RESULTS: The first patient, diagnosed with narrow angle glaucoma …

  11. Inflight Calibration of the Lunar Reconnaissance Orbiter Camera Wide Angle Camera

    NASA Astrophysics Data System (ADS)

    Mahanti, P.; Humm, D. C.; Robinson, M. S.; Boyd, A. K.; Stelling, R.; Sato, H.; Denevi, B. W.; Braden, S. E.; Bowman-Cisneros, E.; Brylow, S. M.; Tschimmel, M.

    2015-09-01

    The Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) has acquired more than 250,000 images of the illuminated lunar surface and over 190,000 observations of space and the non-illuminated Moon since 1 January 2010. These images, along with images from the Narrow Angle Camera (NAC) and other Lunar Reconnaissance Orbiter instrument datasets, are enabling new discoveries about the morphology, composition, and geologic/geochemical evolution of the Moon. Characterizing the inflight WAC system performance is crucial to scientific and exploration results. Pre-launch calibration of the WAC provided a baseline characterization that was critical for early targeting and analysis. Here we present an analysis of WAC performance from the inflight data. In the course of our analysis we compare and contrast with the pre-launch performance wherever possible and quantify the uncertainty related to various components of the calibration process. We document the absolute and relative radiometric calibration, point spread function, and scattered light sources, and provide estimates of the sources of uncertainty for spectral reflectance measurements of the Moon across a range of imaging conditions.

  12. Camera Angle Affects Dominance in Video-Mediated Communication

    E-print Network

    Olson, Judith S.

    … in Video-Mediated Communication (VMC), these are distorted in various ways. Monitors and camera zooms make people look close or far; monitors and camera angles can be high or low, making people look tall or short; volume can … taller or shorter than they are. A person looking up all the time (with the remote person apparently looking down) …

  13. Techniques of coagulation laser prostatectomy for narrow divergence angle fibers.

    PubMed

    Milam, D F

    1996-01-01

    Although laser prostatectomy has become an accepted technique for the treatment of obstructive prostatism, considerable debate remains about which laser prostatectomy method to use in various treatment situations. This article discusses the different methods of noncontact side-firing coagulation laser prostatectomy using narrow divergence angle fibers (< 30 degrees). Static treatment strategies that have been successfully employed with widely divergent beams are not appropriate for fibers producing narrow divergence angle beams. Narrow divergence angle beams produce a small diameter spot on the prostatic urethra and far larger power density. Additionally, neodymium/yttrium aluminum garnet light scatters relatively poorly within prostatic tissue. Most light continues along the path of through transmission until ultimate tissue absorption and conversion into heat. The width and total volume of the coagulated lesion is therefore limited when using a narrow divergence angle fiber to produce static lesions. Probe movement is essential. Rapid (> 2 mm/s) probe movement produces only superficial coagulation. An initial dwell period of approximately 3 s is also important to maximize coagulated tissue volume. Scanning strategies where the fiber is moved through the prostatic urethra in longitudinal and radial directions are discussed and compared. Radial and longitudinal scanning methods produce similar coagulation defects. Treatment using a rocking motion within a limited volume of tissue may increase coagulation depth. No technique is ideal for all clinical situations. Vaporization prostatectomy or contact laser transurethral incision of the prostate is appropriate for primary treatment of glands < 30 g or as adjunctive therapy to facilitate early catheter removal. Alternative treatment methods are compared to noncontact coagulation prostatectomy. PMID:9118400

  14. SCDU (Spectral Calibration Development Unit) Testbed Narrow Angle Astrometric Performance

    NASA Technical Reports Server (NTRS)

    Wang, Xu; Goullioud, Renaud; Nemati, Bijan; Shao, Michael; Wehmeier, Udo J.; Weilert, Mark A.; Werne, Thomas A.; Wu, Janet P.; Zhai, Chengxing

    2010-01-01

    The most stringent astrometric performance requirements on NASA's SIM-Lite (Space Interferometer Mission) will come from the so-called Narrow-Angle (NA) observing scenario, aimed at finding Earth-like exoplanets, where the interferometer chops between the target star and several nearby reference stars multiple times over the course of a single visit. Previously, an NA error of about 20 pm with various shifts was reported. Since then, an investigation has been under way to understand the mechanisms that give rise to these shifts. In this paper we report our findings, the adopted mitigation strategies, and the resulting testbed performance.

  15. WIDE-ANGLE, NARROW-ANGLE, AND IMAGING BASELINES OF OPTICAL LONG-BASELINE INTERFEROMETERS

    SciTech Connect

    Woillez, J.; Lacour, S. E-mail: sylvestre.lacour@obspm.fr

    2013-02-10

    For optical interferometers, the baseline is typically defined as the vector joining two perfectly identical telescopes. However, when the telescopes are naturally different or when the requirements on the baseline vector challenge the telescope perfection, the baseline definition depends on how the interferometer is used. This is where the notions of wide-angle, narrow-angle, and imaging baselines come into play. This article explores this variety of baselines, with the purpose of presenting a coherent set of definitions, describing how they relate to each other, and suggesting baseline metrology requirements. Ultimately, this work aims at supporting upcoming long-baseline optical interferometers with narrow-angle astrometry and phase-referenced imaging capabilities at the microarcsecond level.

  16. Associations between Narrow Angle and Adult Anthropometry: The Liwan Eye Study

    PubMed Central

    Jiang, Yuzhen; He, Mingguang; Friedman, David S.; Khawaja, Anthony P.; Lee, Pak Sang; Nolan, Winifred P.; Yin, Qiuxia; Foster, Paul J.

    2015-01-01

    Purpose: To assess the associations between narrow angle and adult anthropometry. Methods: Chinese adults aged 50 years and older were recruited from a population-based survey in the Liwan District of Guangzhou, China. Narrow angle was defined as the posterior trabecular meshwork not being visible under static gonioscopy in at least three quadrants (i.e. a circumference of at least 270°). Logistic regression models were used to examine the associations between narrow angle and anthropometric measures (height, weight and body mass index, BMI). Results: Among the 912 participants, lower weight, shorter height, and lower BMI were significantly associated with narrower angle width (tests for trend: mean angle width in degrees vs weight p<0.001; vs height p<0.001; vs BMI p=0.012). In univariate analyses, shorter height, lower weight and lower BMI were all significantly associated with greater odds of narrow angle. The crude association between height and narrow angle was largely attributable to stronger associations with age and sex. Lower BMI and weight remained significantly associated with narrow angle after adjustment for height, age, sex, axial ocular biometric measures and education. In analyses stratified by sex, the association between BMI and narrow angle was only observed in women. Conclusion: Lower BMI and weight were associated with significantly greater odds of narrow angle after adjusting for age, education, axial ocular biometric measures and height. The odds of narrow angle increased 7% per 1-unit decrease in BMI. This association was most evident in women. PMID:24707840
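    The reported effect size ("7% per 1 unit decrease in BMI") corresponds to an odds ratio of about 1.07, i.e. a logistic regression coefficient of ln(1.07) per unit. A minimal sketch of how such a coefficient translates into odds; this is illustrative arithmetic only, since the paper's adjusted model includes several covariates.

```python
import math

# Logistic regression reports a coefficient (a log odds ratio) per unit
# of the predictor; "7% per 1 unit decrease in BMI" maps to an odds
# ratio of about 1.07 per 1-unit decrease.
beta_per_unit_decrease = math.log(1.07)

def odds_multiplier(bmi_decrease_units):
    """Multiplicative change in the odds of narrow angle for a given
    BMI decrease (illustrative; ignores the model's other covariates)."""
    return math.exp(beta_per_unit_decrease * bmi_decrease_units)

or_1 = odds_multiplier(1.0)   # ~1.07 for a 1-unit decrease
or_5 = odds_multiplier(5.0)   # compounds multiplicatively, ~1.07**5
```

    Note that odds ratios compound multiplicatively: a 5-unit BMI decrease corresponds to 1.07⁵, not 5 × 7%.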

  17. Improved wide-angle, fisheye and omnidirectional camera calibration

    NASA Astrophysics Data System (ADS)

    Urban, Steffen; Leitloff, Jens; Hinz, Stefan

    2015-10-01

    In this paper an improved method for calibrating wide-angle, fisheye and omnidirectional imaging systems is presented. We extend the calibration procedure proposed by Scaramuzza et al. by replacing the residual function and jointly refining all parameters. In doing so, we achieve a more stable, robust and accurate calibration (by up to a factor of 7) and can reduce the number of necessary calibration steps from five to three. After introducing the camera model and highlighting the differences from the current calibration procedure, we perform a comprehensive performance evaluation using several data sets and show the impact of the proposed calibration procedure on the calibration results.

  18. 13. 22"x34" original vellum, Variable-Angle Launcher, 'SIDEVIEW CAMERA CAR TRACK ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    13. 22"x34" original vellum, Variable-Angle Launcher, 'SIDEVIEW CAMERA CAR TRACK DETAILS' drawn at 1/4"=1'-0" (BUORD Sketch # 208078, PAPW 908). - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  19. 12. 22"x34" original vellum, Variable-Angle Launcher, 'SIDE VIEW CAMERA TRACK ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    12. 22"x34" original vellum, Variable-Angle Launcher, 'SIDE VIEW CAMERA TRACK H-20 BRIDGE MODIFICATIONS' drawn at 3/16"=1'-0" and 1/2"=1'-0". (BUORD Sketch # 208784, PAPW 907). - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  20. 10. 22"x34" original blueprint, Variable-Angle Launcher, 'SIDE VIEW CAMERA CAR-STEEL ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    10. 22"x34" original blueprint, Variable-Angle Launcher, 'SIDE VIEW CAMERA CAR-STEEL FRAME AND AXLES' drawn at 1/2"=1'-0". (BUORD Sketch # 209124). - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  1. Performing fish counts with a wide-angle camera, a promising approach reducing divers' limitations

    E-print Network

    Borges, Rita

    Performing fish counts with a wide-angle camera, a promising approach reducing divers' limitations. Keywords: Fish surveys; Underwater video; Underwater visual census; Wide-angle camera. Visual standardised methods for census of reef fishes have long been used in fisheries management and biological surveys …

  2. Improved iris localization by using wide and narrow field of view cameras for iris recognition

    NASA Astrophysics Data System (ADS)

    Kim, Yeong Gon; Shin, Kwang Yong; Park, Kang Ryoung

    2013-10-01

    Biometrics is a method of identifying individuals by their physiological or behavioral characteristics. Among other biometric identifiers, iris recognition has been widely used for various applications that require a high level of security. When a conventional iris recognition camera is used, the size and position of the iris region in a captured image vary according to the X, Y positions of a user's eye and the Z distance between the user and the camera. Therefore, the search area of the iris detection algorithm is increased, which can inevitably decrease both the detection speed and accuracy. To solve these problems, we propose a new method of iris localization that uses wide field of view (WFOV) and narrow field of view (NFOV) cameras. Our study is novel compared to previous studies in the following four ways. First, the device used in our research acquires three images, one each of the face and both irises, using one WFOV and two NFOV cameras simultaneously. The relation between the WFOV and NFOV cameras is determined by a simple geometric transformation without complex calibration. Second, the Z distance (between a user's eye and the iris camera) is estimated based on the iris size in the WFOV image and anthropometric data on the size of the human iris. Third, the accuracy of the geometric transformation between the WFOV and NFOV cameras is enhanced by using multiple transformation matrices according to the Z distance. Fourth, the search region for iris localization in the NFOV image is significantly reduced based on the detected iris region in the WFOV image and the transformation matrix corresponding to the estimated Z distance. Experimental results showed that the performance of the proposed iris localization method is better than that of conventional methods in terms of accuracy and processing time.
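    The second point, estimating Z from the iris size in the WFOV image, is essentially the pinhole range equation Z = f·D/d. A minimal sketch; the focal length in pixels, the measured iris width, and the ~11.7 mm mean human iris diameter used here are illustrative assumptions, not the paper's calibrated values.

```python
def estimate_z_distance_mm(iris_diameter_px, focal_px, iris_diameter_mm=11.7):
    """Pinhole range estimate: Z = f * D / d, where D is the assumed
    physical iris diameter (anthropometric mean ~11.7 mm) and d is its
    measured size in the WFOV image, both f and d in pixels."""
    return focal_px * iris_diameter_mm / iris_diameter_px

# e.g. an iris imaged 40 px wide by a WFOV camera with f = 1000 px
z_mm = estimate_z_distance_mm(40.0, 1000.0)
```

    The estimated Z can then index into a table of precomputed WFOV-to-NFOV transformation matrices, which is the role it plays in the third and fourth points of the abstract.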

  3. The design and fabrication of a wide angle 905nm narrow band filter

    NASA Astrophysics Data System (ADS)

    Shi, Baohua; Li, Zaijin; Li, Hongyu; Qu, Yi

    2014-12-01

    All-dielectric thin-film narrow band filters are widely used in laser systems owing to their excellent optical capability, manufacturability and environmental adaptability. Because 905 nm infrared semiconductor laser systems have a large divergence angle, we designed a 905 nm narrow band filter for a wide entrance light cone angle. The center wavelength shift caused by the entrance light cone angle seriously degrades a filter's spectral selectivity. To reduce these impacts, a dedicated dielectric thin-film narrowband filter is designed. Changes of the transmission characteristics under oblique incidence of a Gaussian beam with uneven illumination are analyzed, and the quantitative relationship between the angle of incidence and the central wavelength shift is solved. A ±30° incidence 905 nm narrowband filter was fabricated. Between 880 nm and 950 nm, the average transmittance is above 90%, and in the cut-off band the average transmittance is below 1%.
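    The angle-induced blue shift of a thin-film filter's center wavelength follows the standard relation λ(θ) = λ₀·√(1 − (sin θ / n_eff)²). A sketch, with an assumed effective index of 2.0; the real filter's effective index depends on its layer design.

```python
import math

def center_wavelength_nm(lam0_nm, theta_deg, n_eff):
    """Standard thin-film blue shift of a narrow band filter's center
    wavelength with angle of incidence:
    lam(theta) = lam0 * sqrt(1 - (sin(theta)/n_eff)^2).
    n_eff is the filter's effective index (the 2.0 below is illustrative)."""
    s = math.sin(math.radians(theta_deg)) / n_eff
    return lam0_nm * math.sqrt(1.0 - s * s)

# A 905 nm filter with an assumed effective index of 2.0 blue-shifts by
# roughly 29 nm at 30 degrees incidence, so the passband must be wide
# enough (here 880-950 nm) to cover the full +/-30 degree cone.
shift_nm = 905.0 - center_wavelength_nm(905.0, 30.0, 2.0)
```

    A higher effective index makes the filter more angle-tolerant, which is the basic design lever for a wide-angle narrowband coating.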

  4. On-Orbit Geometric Calibration of the Lunar Reconnaissance Orbiter Wide Angle Camera

    NASA Astrophysics Data System (ADS)

    Speyerer, E. J.; Wagner, R.; Robinson, M. S.

    2013-12-01

    Lunar Reconnaissance Orbiter (LRO) is equipped with a single Wide Angle Camera (WAC) [1] designed to collect monochromatic and multispectral observations of the lunar surface. Cartographically accurate image mosaics and stereo image based terrain models require that the position of each pixel in a given image be tied to a corresponding point on the lunar surface with a high degree of accuracy and precision. The Lunar Reconnaissance Orbiter Camera (LROC) team initially characterized the WAC geometry prior to launch at the Malin Space Science Systems calibration facility. After lunar orbit insertion, the LROC team recognized spatially varying geometric offsets between color bands. These misregistrations made analysis of the color data problematic and showed that refinements to the pre-launch geometric analysis were necessary. The geometric parameters that define the WAC optical system were characterized from statistics gathered from co-registering over 84,000 image pairs. For each pair, we registered all five visible WAC bands to a precisely rectified Narrow Angle Camera (NAC) image (accuracy <15 m) [2] to compute key geometric parameters. In total, we registered 2,896 monochrome and 1,079 color WAC observations to nearly 34,000 NAC observations and collected over 13.7 million data points across the visible portion of the WAC CCD. Using the collected statistics, we refined the relative pointing (yaw, pitch and roll), effective focal length, principal point coordinates, and radial distortion coefficients. This large dataset also revealed spatial offsets between bands after orthorectification due to chromatic aberrations in the optical system. As white light enters the optical system, the light bends by different amounts as a function of wavelength, causing a single incident ray to disperse in a spectral spread of color [3,4]. This lateral chromatic aberration effect, also known as 'chromatic difference in magnification' [5], introduces variation into the effective focal length for each WAC band. Secondly, tangential distortions caused by minor decentering in the optical system altered the derived exterior orientation parameters for each 14-line WAC band. We computed the geometric parameter sets separately for each band to characterize the lateral chromatic aberrations and the decentering components in the WAC optical system. This approach removed the need for additional tangential terms in the distortion model, reducing the number of computations during image orthorectification and therefore expediting the orthorectification process. We undertook a similar process to refine the geometry for the UV bands (321 and 360 nm), except that we registered each UV band to orthorectified visible bands of the same WAC observation (the visible bands have resolutions 4 times greater than the UV). The resulting 7-band camera model with refined geometric parameters enables map projection with sub-pixel accuracy. References: [1] Robinson et al. (2010) Space Sci. Rev. 150, 81-124. [2] Wagner et al. (2013) Lunar Sci. Forum. [3] Mahajan, V.N. (1998) Optical Imaging and Aberrations. [4] Fiete, R.D. (2013) Manual of Photogrammetry, pp. 359-450. [5] Brown, D.C. (1966) Photogrammetric Eng. 32, 444-462.
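    A minimal sketch of the kind of per-band model described: a Brown-style radial distortion correction combined with a band-dependent effective focal length standing in for lateral chromatic aberration. All coefficients and the pixel geometry here are illustrative placeholders, not the derived WAC parameters.

```python
def undistort_radial(x, y, k1, k2):
    """Brown-style radial correction in normalized image coordinates:
    x_u = x * (1 + k1*r^2 + k2*r^4). Per-band coefficients would be fit
    from registration statistics; the values used below are illustrative."""
    r2 = x * x + y * y
    f = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * f, y * f

def pixel_to_ray(u, v, cx, cy, focal_band):
    """Giving each band its own effective focal length (focal_band)
    captures 'chromatic difference in magnification' without extra
    tangential terms (illustrative model, placeholder coefficients)."""
    x = (u - cx) / focal_band
    y = (v - cy) / focal_band
    return undistort_radial(x, y, k1=-0.01, k2=0.001)

xu, yu = pixel_to_ray(600.0, 400.0, 512.0, 384.0, focal_band=700.0)
```

    Folding the chromatic effect into per-band focal lengths, rather than adding tangential terms, is what keeps the orthorectification arithmetic cheap, as the abstract notes.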

  5. Sheath effects on current collection by particle detectors with narrow acceptance angles

    NASA Technical Reports Server (NTRS)

    Singh, N.; Baugher, C. R.

    1981-01-01

    Restriction of the aperture acceptance angle of an ion or electron trap on an attracting spacecraft significantly alters the volt-ampere characteristics of the instrument in a low Mach number plasma. It is shown that when the angular acceptance of the aperture is restricted, the current to the collector tends to be independent of the Debye length. Expressions for the RPA characteristics are derived for both a thin sheath and a thick sheath, and it is shown that as the aperture is narrowed the curves tend toward equivalence.

  6. Large zenith angle observations with the high-resolution GRANITE III camera

    E-print Network

    Petry, D

    2001-01-01

    The GRANITE III camera of the Whipple Cherenkov Telescope at the Fred Lawrence Whipple Observatory on Mount Hopkins, Arizona (2300 m a.s.l.) has the highest angular resolution of all cameras used on this telescope so far. The central region of the camera has 379 pixels with an individual angular diameter of 0.12 degrees. This makes the instrument especially suitable for observations of gamma-induced air-showers at large zenith angles since the increase in average distance to the shower maximum leads to smaller shower images in the focal plane of the telescope. We examine the performance of the telescope for observations of gamma-induced air-showers at zenith angles up to 63 degrees based on observations of Mkn 421 and using Monte Carlo Simulations. An improvement to the standard data analysis is suggested.

  7. Large zenith angle observations with the high-resolution GRANITE III camera

    E-print Network

    D. Petry; the VERITAS Collaboration

    2001-08-06

    The GRANITE III camera of the Whipple Cherenkov Telescope at the Fred Lawrence Whipple Observatory on Mount Hopkins, Arizona (2300 m a.s.l.) has the highest angular resolution of all cameras used on this telescope so far. The central region of the camera has 379 pixels with an individual angular diameter of 0.12 degrees. This makes the instrument especially suitable for observations of gamma-induced air-showers at large zenith angles since the increase in average distance to the shower maximum leads to smaller shower images in the focal plane of the telescope. We examine the performance of the telescope for observations of gamma-induced air-showers at zenith angles up to 63 degrees based on observations of Mkn 421 and using Monte Carlo Simulations. An improvement to the standard data analysis is suggested.

  8. Narrow Angle Wide Spectral Range Radiometer Design FEANICS/REEFS Radiometer Design Report

    NASA Technical Reports Server (NTRS)

    Camperchioli, William

    2005-01-01

    A critical measurement for the Radiative Enhancement Effects on Flame Spread (REEFS) microgravity combustion experiment is the net radiative flux emitted from the gases and from the solid fuel bed. These quantities are measured using a set of narrow angle, wide spectral range radiometers. The radiometers are required to have an angular field of view of 1.2 degrees and measure over the spectral range of 0.6 to 30 microns, which presents a challenging design effort. This report details the design of this radiometer system including field of view, radiometer response, radiometric calculations, temperature effects, error sources, baffling and amplifiers. This report presents some radiometer specific data but does not present any REEFS experiment data.

  9. Topview stereo: combining vehicle-mounted wide-angle cameras to a distance sensor array

    NASA Astrophysics Data System (ADS)

    Houben, Sebastian

    2015-03-01

    The variety of vehicle-mounted sensors needed to fulfill a growing number of driver assistance tasks has become a substantial factor in automobile manufacturing cost. We present a stereo distance method exploiting the overlapping fields of view of a multi-camera fisheye surround view system, such as those used for near-range vehicle surveillance tasks, e.g. in parking maneuvers. Hence, we aim at creating a new input signal from sensors that are already installed. Particular properties of wide-angle cameras (e.g. strongly varying resolution across the image) demand an adaptation of the image processing pipeline to several problems that do not arise in classical stereo vision performed with cameras carefully designed for this purpose. We introduce the algorithms for rectification, correspondence analysis, and regularization of the disparity image, discuss the causes of the caveats shown and how to avoid them, and present first results on a prototype topview setup.

  10. The first demonstration of the concept of "narrow-FOV Si/CdTe semiconductor Compton camera"

    E-print Network

    Ichinohe, Yuto; Watanabe, Shin; Edahiro, Ikumi; Hayashi, Katsuhiro; Kawano, Takafumi; Ohno, Masanori; Ohta, Masayuki; Takeda, Shin'ichiro; Fukazawa, Yasushi; Katsuragawa, Miho; Nakazawa, Kazuhiro; Odaka, Hirokazu; Tajima, Hiroyasu; Takahashi, Hiromitsu; Takahashi, Tadayuki; Yuasa, Takayuki

    2015-01-01

    The Soft Gamma-ray Detector (SGD), to be deployed onboard the ASTRO-H satellite, has been developed to provide the highest sensitivity observations of celestial sources in the energy band of 60-600 keV by employing a detector concept which uses a Compton camera whose field-of-view is restricted by a BGO shield to a few degrees (narrow-FOV Compton camera). In this concept, the background from outside the FOV can be heavily suppressed by constraining the incident direction of the gamma ray reconstructed by the Compton camera to be consistent with the narrow FOV. We, for the first time, demonstrate the validity of the concept using background data taken during the thermal vacuum test and the low-temperature environment test of the flight model of SGD on ground. We show that the measured background level is suppressed to less than 10% by combining the event rejection using the anti-coincidence trigger of the active BGO shield and by using Compton event reconstruction techniques. More than 75% of the signal...

  11. The first demonstration of the concept of "narrow-FOV Si/CdTe semiconductor Compton camera"

    NASA Astrophysics Data System (ADS)

    Ichinohe, Yuto; Uchida, Yuusuke; Watanabe, Shin; Edahiro, Ikumi; Hayashi, Katsuhiro; Kawano, Takafumi; Ohno, Masanori; Ohta, Masayuki; Takeda, Shin'ichiro; Fukazawa, Yasushi; Katsuragawa, Miho; Nakazawa, Kazuhiro; Odaka, Hirokazu; Tajima, Hiroyasu; Takahashi, Hiromitsu; Takahashi, Tadayuki; Yuasa, Takayuki

    2016-01-01

    The Soft Gamma-ray Detector (SGD), to be deployed on board the ASTRO-H satellite, has been developed to provide the highest sensitivity observations of celestial sources in the energy band of 60-600 keV by employing a detector concept which uses a Compton camera whose field-of-view is restricted by a BGO shield to a few degrees (narrow-FOV Compton camera). In this concept, the background from outside the FOV can be heavily suppressed by constraining the incident direction of the gamma ray reconstructed by the Compton camera to be consistent with the narrow FOV. We, for the first time, demonstrate the validity of the concept using background data taken during the thermal vacuum test and the low-temperature environment test of the flight model of SGD on ground. We show that the measured background level is suppressed to less than 10% by combining the event rejection using the anti-coincidence trigger of the active BGO shield and by using Compton event reconstruction techniques. More than 75% of the signals from the field-of-view are retained against the background rejection, which clearly demonstrates the improvement in signal-to-noise ratio. The estimated effective area of 22.8 cm2 meets the mission requirement even though not all of the operational parameters of the instrument have been fully optimized yet.
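The event reconstruction outlined above hinges on Compton kinematics: the scattering angle follows from the energy deposits in the scatterer and absorber layers, and an event is kept only if the resulting Compton cone is consistent with a source inside the narrow FOV. A minimal sketch under simplified geometry (the FOV and tolerance values are illustrative assumptions, and a real reconstruction works with the full 3-D cone geometry):

```python
import math

ME_C2 = 511.0  # electron rest energy in keV

def compton_angle(e_scatter, e_absorb):
    """Compton scattering angle (radians) from the energy deposited in
    the scatterer (e.g. Si) and absorber (e.g. CdTe), via the Compton
    formula cos(theta) = 1 - m_e c^2 (1/E2 - 1/(E1 + E2))."""
    e_total = e_scatter + e_absorb
    cos_theta = 1.0 - ME_C2 * (1.0 / e_absorb - 1.0 / e_total)
    if not -1.0 <= cos_theta <= 1.0:
        return None  # kinematically inconsistent event: reject
    return math.acos(cos_theta)

def in_fov(e_scatter, e_absorb, axis_angle, half_fov_deg=2.0, tol_deg=5.0):
    """Keep an event only if the reconstructed Compton cone can contain a
    source inside the narrow FOV. axis_angle is the angle (radians)
    between the scatter direction and the telescope boresight; the FOV
    half-angle and tolerance are illustrative, not the SGD values."""
    theta = compton_angle(e_scatter, e_absorb)
    if theta is None:
        return False
    # The source lies on a cone of half-angle theta about the scatter
    # axis; require that cone to pass within tolerance of the FOV.
    return abs(math.degrees(theta - axis_angle)) <= half_fov_deg + tol_deg
```

Events whose cone cannot intersect the shield-defined FOV are discarded, which is the mechanism by which the out-of-FOV background is suppressed.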

  12. Development of soft x-ray large solid angle camera onboard WF-MAXI

    NASA Astrophysics Data System (ADS)

    Kimura, Masashi; Tomida, Hiroshi; Ueno, Shiro; Kawai, Nobuyuki; Yatsu, Yoichi; Arimoto, Makoto; Mihara, Tatehiro; Serino, Motoko; Tsunemi, Hiroshi; Yoshida, Atsumasa; Sakamoto, Takanori; Kohmura, Takayoshi; Negoro, Hitoshi

    2014-07-01

    Wide-Field MAXI (WF-MAXI) is planned to be installed on the Japanese Experiment Module "Kibo" Exposed Facility of the International Space Station (ISS). WF-MAXI consists of two types of cameras, the Soft X-ray Large Solid Angle Camera (SLC) and the Hard X-ray Monitor (HXM). HXM comprises multi-channel arrays of CsI scintillators coupled with avalanche photodiodes (APDs), covering the energy range of 20-200 keV. SLC is an array of CCDs, an evolved version of MAXI/SSC. Instead of the slit and collimator of SSC, SLC is equipped with a coded mask, extending its field of view to 20% of the sky at any given time and its location determination accuracy to a few arcminutes. In order to achieve a larger effective area, the number of CCD chips and the size of each chip will be larger than those of SSC. We are planning to use 59 x 31 mm2 CCD chips provided by Hamamatsu Photonics. Each camera will be equipped with 16 CCDs, and a total of 4 cameras will be installed in WF-MAXI. Since SLC utilizes X-ray CCDs, it must include an active cooling system for the CCDs. Instead of using Peltier coolers, we use mechanical coolers of the type also employed on Astro-H; in this way we can cool the CCDs down to -100°C. The ISS orbits the Earth in 90 minutes; therefore a point source moves 4 arcminutes per second. In order to achieve the location determination accuracy, we need fast readout from the CCDs. The pulse heights are stacked into a single row along the vertical direction: charge is transferred continuously, so the spatial information along the vertical direction is lost and replaced with precise arrival time information. Currently we are building an experimental model of the camera body, including the CCDs and their electronics. In this paper, we show the development status of SLC.
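The readout constraint above follows from simple arithmetic: a source drifting across the sky at 4 arcminutes per second must be read out fast enough that its motion stays below the required location accuracy. A minimal sketch of that budget (the accuracy values passed in are illustrative, not WF-MAXI requirements):

```python
# The abstract notes that in the ISS orbit a point source drifts across
# the sky at roughly 4 arcmin per second.
SOURCE_DRIFT_ARCMIN_PER_S = 4.0

def max_integration_time(accuracy_arcmin):
    """Longest effective readout/integration time (seconds) that keeps
    the positional smear from orbital motion below the required
    location determination accuracy (arcminutes)."""
    return accuracy_arcmin / SOURCE_DRIFT_ARCMIN_PER_S
```

For a few-arcminute accuracy goal this implies sub-second effective time resolution, which is why SLC trades the vertical spatial axis of the CCD for precise arrival-time information.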

  13. A New Approach to Micro-arcsecond Astrometry with SIM Allowing Early Mission Narrow Angle Measurements of Compelling Astronomical Targets

    NASA Technical Reports Server (NTRS)

    Shaklan, Stuart; Pan, Xiaopei

    2004-01-01

    The Space Interferometry Mission (SIM) is capable of detecting and measuring the mass of terrestrial planets around stars other than our own. It can measure the mass of black holes and the visual orbits of radio and x-ray binary sources. SIM makes possible a new level of understanding of complex astrophysical processes. SIM achieves its high precision in the so-called narrow-angle regime. This is defined by a 1 degree diameter field in which the position of a target star is measured with respect to a set of reference stars. The observation is performed in two parts: first, SIM observes a grid of stars that spans the full sky. After a few years, repeated observations of the grid allow one to determine the orientation of the interferometer baseline. Second, throughout the mission, SIM periodically observes in the narrow-angle mode. Every narrow-angle observation is linked to the grid to determine the precise attitude and length of the baseline. The narrow angle process demands patience. It is not until five years after launch that SIM achieves its ultimate accuracy of 1 microarcsecond. The accuracy is degraded by a factor of approx. 2 at mid-mission. Our work proposes a technique for narrow angle astrometry that does not rely on the measurement of grid stars. This technique, called Gridless Narrow Angle Astrometry (GNAA) can obtain microarcsecond accuracy and can detect extra-solar planets and other exciting objects with a few days of observation. It can be applied as early as during the first six months of in-orbit calibration (IOC). The motivations for doing this are strong. First, and obviously, it is an insurance policy against a catastrophic mid-mission failure. Second, at the start of the mission, with several space-based interferometers in the planning or implementation phase, NASA will be eager to capture the public's imagination with interferometric science. 
Third, early results and a technique that can duplicate those results throughout the mission will give the analysts important experience in the proper use and calibration of SIM.

  14. Limitations of the narrow-angle convergent pair. [of Viking Orbiter photographs for triangulation and topographic mapping

    NASA Technical Reports Server (NTRS)

    Arthur, D. W. G.

    1977-01-01

    Spatial triangulations and topographies of the Martian surface derived from Viking Orbiter pictures depend on the use of symmetric narrow-angle convergent pairs. The overlap in each pair is close to 100 percent and the ground principal points virtually coincide. The analysis of this paper reveals a high degree of indeterminacy in such pairs and at least in part explains the rather disappointing precision of the associated spatial triangulations.

  15. Empirical Photometric Normalization for the Seven Band UV-VIS Lunar Reconnaissance Orbiter Wide Angle Camera

    NASA Astrophysics Data System (ADS)

    Boyd, A. K.; Robinson, M. S.; Nuno, R. G.; Sato, H.

    2014-12-01

    We present results on a near-global (80°S to 80°N) seven color Wide Angle Camera (WAC) photometric normalization and color analysis. Over 100,000 WAC color observations were calibrated to reflectance (radiance factor, I/F), and photometric angles (i, e, g), latitude, and longitude were calculated and stored for each WAC pixel. Photometric angles were calculated using the WAC GLD100 [1], and a six-dimensional data set (3 spatial and 3 photometric) was reduced to three by photometrically normalizing the I/F with a global wavelength-dependent, 3rd-order multivariate polynomial. The multispectral mosaic was normalized to a standard viewing geometry (incidence angle=30°, emission angle=0°, phase angle=30°). The WAC has a 60° cross-track field-of-view in color mode, which allows the acquisition of a near-global data set each month; however, the phase angle can change by as much as 60° across each image. These large changes in viewing geometry present challenges to the required photometric normalization. In the ratio of the 321 nm and 689 nm wavelengths, the Moon has a standard deviation less than 3% in the highlands and 7% globally; thus, to allow confident identification of true color differences, the photometric normalization must be precise. Pyroclastic deposits in Marius Hills, Sinus Aestuum, and Mare Serenitatis are among the least reflective materials, with 643 nm normalized reflectance values less than 0.036. Low-reflectance deposits are generally concentrated close to the equator on the nearside, whereas high-reflectance materials are dispersed globally. The highest reflectance materials occur at Giordano Bruno and Virtanen craters and are attributed to exposure of immature materials. Immature ejecta has a shallower spectral slope compared to the mean highlands spectra (321 nm to 689 nm), and UV weathering characteristics can be seen when comparing Copernican ejecta of different ages [2]. 
Copernican ejecta is found to have 643 nm reflectance values greater than 0.36 in some areas; the range of reflectance on the Moon is 10x from the least to the most reflective. The new empirical normalized reflectance presented here correlates with an independent Hapke-model-based normalization [3] with an R-squared value of 0.985. References: [1] Scholten et al. LPSC XVII (2011) [2] Denevi et al. JGR Planets (2014) [3] Sato et al. JGR Planets (2014)
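The wavelength-dependent 3rd-order multivariate polynomial normalization described above can be sketched as a least-squares fit over the photometric angles, followed by scaling each observation to the standard geometry (i=30°, e=0°, g=30°). The full cubic monomial basis and the direct ratio normalization are assumptions for illustration; the team's actual term set and fitting procedure are not given in the abstract:

```python
import itertools
import numpy as np

def cubic_basis(i, e, g):
    """All monomials i^a * e^b * g^c with a + b + c <= 3 (20 terms),
    evaluated on arrays of photometric angles in degrees."""
    cols = []
    for a, b, c in itertools.product(range(4), repeat=3):
        if a + b + c <= 3:
            cols.append(i**a * e**b * g**c)
    return np.column_stack(cols)

def fit_normalization(i, e, g, iof):
    """Least-squares fit of I/F as a cubic polynomial of (i, e, g),
    done per wavelength band in the real pipeline."""
    coeffs, *_ = np.linalg.lstsq(cubic_basis(i, e, g), iof, rcond=None)
    return coeffs

def normalize(iof, i, e, g, coeffs, i0=30.0, e0=0.0, g0=30.0):
    """Scale each observation to the standard geometry (i0, e0, g0) by
    the ratio of the model prediction at the standard geometry to the
    prediction at the observed geometry."""
    model = cubic_basis(i, e, g) @ coeffs
    std = cubic_basis(np.array([i0]), np.array([e0]), np.array([g0])) @ coeffs
    return iof * std[0] / model
```

By construction, an observation already taken at the standard geometry is returned unchanged, and observations at other geometries are corrected toward it.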

  16. A triple axis double crystal multiple reflection camera for ultra small angle X-ray scattering

    NASA Astrophysics Data System (ADS)

    Lambard, Jacques; Lesieur, Pierre; Zemb, Thomas

    1992-06-01

    Extending the domain of small-angle X-ray scattering requires multiple-reflection crystals to collimate the beam. A double crystal, triple axis X-ray camera using multiple-reflection channel-cut crystals is described. Procedures for measuring the desmeared scattering cross-section on an absolute scale are described, as well as measurements from several typical samples: fibrils of collagen, 0.3 µm diameter silica spheres, 0.16 µm diameter interacting latex spheres, porous lignite coal, liquid crystals in a surfactant-water system, and a colloidal crystal of 0.32 µm diameter silica spheres.

  17. Small-angle approximation to the transfer of narrow laser beams in anisotropic scattering media

    NASA Technical Reports Server (NTRS)

    Box, M. A.; Deepak, A.

    1981-01-01

    The broadening of, and the detected signal power from, a laser beam traversing an anisotropic scattering medium were examined using the small-angle approximation to the radiative transfer equation, in which photons suffering large-angle deflections are neglected. To obtain tractable answers, simple Gaussian and non-Gaussian functions are assumed for the scattering phase function. Two other approximate approaches employed in the field to further simplify the small-angle approximation solutions are described, and the results obtained by one of them are compared with those obtained using the small-angle approximation. An exact method for obtaining the contribution of each higher order of scattering to the radiance field is examined, but no results are presented.
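In the small-angle regime with a Gaussian phase function, beam broadening reduces to a random walk in deflection angle, which gives a simple scaling of the RMS beam radius with path length. The sketch below is an illustrative Fermi-Eyges-style estimate under those assumptions, not the paper's full solution:

```python
import math

def beam_rms_radius(z, scatterings_per_m, theta0_rms):
    """Small-angle, Gaussian-phase-function estimate of the RMS lateral
    radius of a narrow beam after path length z (m) in a scattering
    medium. Each scattering event adds an independent Gaussian
    deflection of RMS width theta0_rms (rad); large-angle deflections
    are neglected, as in the small-angle approximation above."""
    n_scat = scatterings_per_m * z                 # mean number of scatterings
    theta_rms = theta0_rms * math.sqrt(n_scat)     # accumulated RMS angle
    # Fermi-Eyges result for uniform scattering: <x^2> = <theta^2> z^2 / 3
    return theta_rms * z / math.sqrt(3.0)
```

The key qualitative feature, growth of the beam radius faster than linearly in path length, carries over to the more careful treatments compared in the paper.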

  18. A New Lunar Atlas: Mapping the Moon with the Wide Angle Camera

    NASA Astrophysics Data System (ADS)

    Speyerer, E.; Robinson, M. S.; Boyd, A.; Sato, H.

    2012-12-01

    The Lunar Reconnaissance Orbiter (LRO) spacecraft launched in June 2009 and began systematically mapping the lunar surface, providing a priceless dataset for the planetary science community and future mission planners. From 20 September 2009 to 11 December 2011, the spacecraft was in a nominal 50 km polar orbit, except for two one-month-long periods when a series of spacecraft maneuvers enabled low altitude flyovers (as low as 22 km) of key exploration and scientifically interesting targets. One of the instruments, the Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) [1], captured nearly continuous synoptic views of the illuminated lunar surface. The WAC is a 7-band (321, 360, 415, 566, 604, 643, 689 nm) push-frame imager with a field of view of 60° in color mode and 90° in monochrome mode. This broad field of view enables the WAC to reimage nearly 50% (at the equator, where the orbit tracks are spaced furthest apart) of the terrain it imaged in the previous orbit. The visible bands of map-projected WAC images have a pixel scale of 100 m, while the UV bands have a pixel scale of 400 m due to 4x4 pixel on-chip binning that increases signal-to-noise. The nearly circular polar orbit and short (two hour) orbital periods enable seamless mosaics of broad areas of the surface with uniform lighting and resolution. In March of 2011, the LROC team released the first version of the global monochrome (643 nm) morphologic map [2], which comprised 15,000 WAC images collected over three periods. With the over 130,000 WAC images collected while the spacecraft was in the 50 km orbit, a new set of mosaics is being produced by the LROC team and will be released to the Planetary Data System. These new maps include an updated morphologic map with an improved set of images (limiting illumination variations and gores due to off-nadir observations by other instruments) and a new photometric correction derived from the LROC WAC dataset. 
In addition, a higher-sun (lower incidence angle) mosaic will be released; this map has minimal shadows and highlights albedo differences. Finally, seamless regional WAC mosaics acquired under multiple lighting geometries (sunlight coming from the east, overhead, and west) will be produced for key areas of interest. These new maps use the latest terrain model (LROC WAC GLD100) [3], updated spacecraft ephemeris provided by the LOLA team [4], and an improved WAC distortion model [5] to provide accurate placement of each WAC pixel on the lunar surface. References: [1] Robinson et al. (2010) Space Sci. Rev. [2] Speyerer et al. (2011) LPSC, #2387. [3] Scholten et al. (2012) JGR. [4] Mazarico et al. (2012) J. of Geodesy [5] Speyerer et al. (2012) ISPRS Congress.

  19. Enantiopure narrow bite-angle P-OP ligands: synthesis and catalytic performance in asymmetric hydroformylations and hydrogenations.

    PubMed

    Fernández-Pérez, Héctor; Benet-Buchholz, Jordi; Vidal-Ferran, Anton

    2014-11-17

    Herein is reported the preparation of a set of narrow bite-angle P-OP ligands whose backbone contains a stereogenic carbon atom. The synthesis was based on a Corey-Bakshi-Shibata (CBS)-catalyzed asymmetric reduction of phosphomides. The structure of the resulting 1,1-P-OP ligands, which was selectively tuned through adequate combination of the configuration of the stereogenic carbon atom, its substituent, and the phosphite fragment, proved crucial for providing a rigid environment around the metal center, as evidenced by X-ray crystallography. These new ligands displayed very good catalytic properties in the Rh-mediated enantioselective hydrogenation and hydroformylation of challenging and model substrates (up to 99% ee). Whereas for asymmetric hydrogenation the optimal P-OP ligand depended on the substrate, for hydroformylation a single ligand was the highest-performing one for almost all studied substrates: it contains an R-configured stereogenic carbon atom between the two phosphorus ligating groups, and an S-configured 3,3'-diphenyl-substituted biaryl unit. PMID:25335770

  20. Lunar Reconnaissance Orbiter Camera (LROC) instrument overview

    USGS Publications Warehouse

    Robinson, M.S.; Brylow, S.M.; Tschimmel, M.; Humm, D.; Lawrence, S.J.; Thomas, P.C.; Denevi, B.W.; Bowman-Cisneros, E.; Zerr, J.; Ravine, M.A.; Caplinger, M.A.; Ghaemi, F.T.; Schaffner, J.A.; Malin, M.C.; Mahanti, P.; Bartels, A.; Anderson, J.; Tran, T.N.; Eliason, E.M.; McEwen, A.S.; Turtle, E.; Jolliff, B.L.; Hiesinger, H.

    2010-01-01

    The Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) and Narrow Angle Cameras (NACs) are on the NASA Lunar Reconnaissance Orbiter (LRO). The WAC is a 7-color push-frame camera (100 and 400 m/pixel visible and UV, respectively), while the two NACs are monochrome narrow-angle linescan imagers (0.5 m/pixel). The primary mission of LRO is to obtain measurements of the Moon that will enable future human lunar exploration. The overarching goals of the LROC investigation include landing site identification and certification, mapping of permanently shadowed and sunlit polar regions, meter-scale mapping of polar regions, global multispectral imaging, a global morphology base map, characterization of regolith properties, and determination of current impact hazards.

  1. Fovea-stereographic: a projection function for ultra-wide-angle cameras

    NASA Astrophysics Data System (ADS)

    Samy, Ahmed Mahmoud; Gao, Zhishan

    2015-04-01

    A new ultra-wide-angle projection function called fovea-stereographic is described and characterized by the relative relationship between the radial distortion level and the object field-of-view (FOV) angle, creating a high-resolution wide foveal image and adequate peripheral information to be processed within a limited computational time. The paper also provides the design results of an innovative fast fovea-stereographic fisheye lens system with a 170 deg FOV that shows a high-resolution central foveal image spanning more than 58.8% (100 deg) of the field and at least 15% more peripheral information than any other projection function. Our lens distortion curve, in addition to its modulation transfer function, produces a high-resolution projection for real-time tracking and image transmission applications.
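For context, fisheye projection functions map the field angle θ to an image-plane radius r; the stereographic mapping is r = 2f tan(θ/2) and the equidistant mapping is r = fθ. The `r_fovea` blend below is a purely illustrative stand-in for the paper's fovea-stereographic function (whose exact form is not given in the abstract): near-stereographic inside an assumed foveal angle, with reduced radial gain beyond it:

```python
import math

def r_stereographic(f, theta):
    """Stereographic fisheye mapping: r = 2 f tan(theta / 2)."""
    return 2.0 * f * math.tan(theta / 2.0)

def r_equidistant(f, theta):
    """Equidistant (f-theta) fisheye mapping: r = f * theta."""
    return f * theta

def r_fovea(f, theta, theta_fovea=math.radians(50)):
    """Illustrative fovea-style blend (an assumption, not the paper's
    lens): stereographic resolution inside the foveal angle, then a
    compressed periphery with reduced, constant radial gain."""
    if theta <= theta_fovea:
        return r_stereographic(f, theta)
    slope = 0.5 * f  # assumed peripheral radial gain
    return r_stereographic(f, theta_fovea) + slope * (theta - theta_fovea)
```

The design trade is visible directly in the curve: a steep r(θ) near the axis buys foveal resolution, while the shallower peripheral slope keeps a 170 deg field on a finite sensor.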

  2. Matching the Best Viewing Angle in Depth Cameras for Biomass Estimation Based on Poplar Seedling Geometry

    PubMed Central

    Andújar, Dionisio; Fernández-Quintanilla, César; Dorado, José

    2015-01-01

    In energy crops for biomass production a proper plant structure is important to optimize wood yields. A precise crop characterization in early stages may contribute to the choice of proper cropping techniques. This study assesses the potential of the Microsoft Kinect for Windows v.1 sensor to determine the best viewing angle of the sensor to estimate plant biomass based on poplar seedling geometry. Kinect Fusion algorithms were used to generate a 3D point cloud from the depth video stream. The sensor was mounted in different positions facing the tree in order to obtain depth (RGB-D) images from different angles. Individuals of two different ages, i.e., one month and one year old, were scanned. Four different viewing angles were compared: top view (0°), 45° downwards view, front view (90°) and ground upwards view (-45°). The ground-truth used to validate the sensor readings consisted of a destructive sampling in which the height, leaf area and biomass (dry weight basis) were measured in each individual plant. The depth image models agreed well with the 45°, 90° and -45° measurements in one-year poplar trees. Good correlations (0.88 to 0.92) between dry biomass and the area measured with the Kinect were found. In addition, plant height was accurately estimated with an error of a few centimeters. The comparison between different viewing angles revealed that top views showed poorer results because the top leaves occluded the rest of the tree. However, the other views led to good results. Conversely, small poplars showed better correlations with actual parameters from the top view (0°). Therefore, although the Microsoft Kinect for Windows v.1 sensor provides good opportunities for biomass estimation, the viewing angle must be chosen taking into account the developmental stage of the crop and the desired parameters. 
The results of this study indicate that Kinect is a promising tool for a rapid canopy characterization, i.e., for estimating crop biomass production, with several important advantages: low cost, low power needs and a high frame rate (frames per second) when dynamic measurements are required. PMID:26053748

  3. On an assessment of surface roughness estimates from lunar laser altimetry pulse-widths for the Moon from LOLA using LROC narrow-angle stereo DTMs.

    NASA Astrophysics Data System (ADS)

    Muller, Jan-Peter; Poole, William

    2013-04-01

    Neumann et al. [1] proposed that laser altimetry pulse-widths could be employed to derive "within-footprint" surface roughness, as opposed to surface roughness estimated between laser altimetry pierce-points, such as the example for Mars [2] and more recently from the 4-pointed star-shaped LOLA (Lunar Reconnaissance Orbiter Laser Altimeter) onboard NASA-LRO [3]. Since 2009, LOLA has been collecting extensive global laser altimetry data with a 5 m footprint and ~25 m between the 5 points in a star shape. In order to assess how accurately surface roughness (defined as simple RMS after slope correction) derived from LROC matches surface roughness derived from LOLA footprints, publicly released LROC-NA (LRO Camera Narrow Angle) 1 m Digital Terrain Models (DTMs) were employed to measure the surface roughness directly within each 5 m footprint. A set of 20 LROC-NA DTMs were examined. Initially, the match-up between the LOLA and LROC-NA orthorectified images (ORIs) is assessed visually to ensure that the co-registration is better than the LOLA footprint resolution. For each LOLA footprint, the pulse-width geolocation is then retrieved and used to "cookie-cut" the surface roughness and slopes derived from the LROC-NA DTMs. The investigation, which includes data from a variety of different landforms, shows little, if any, correlation between surface roughness estimated from DTMs and LOLA pulse-widths at sub-footprint scale. In fact, perceptible correlation between LOLA and LROC DTMs appears only at baselines of 40-60 m for surface roughness and 20 m for slopes. [1] Neumann et al. Mars Orbiter Laser Altimeter pulse width measurements and footprint-scale roughness. Geophysical Research Letters (2003) vol. 30 (11), paper 1561. DOI: 10.1029/2003GL017048 [2] Kreslavsky and Head. Kilometer-scale roughness of Mars: results from MOLA data analysis. J Geophys Res (2000) vol. 105 (E11) pp. 26695-26711. [3] Rosenburg et al. 
Global surface slopes and roughness of the Moon from the Lunar Orbiter Laser Altimeter. Journal of Geophysical Research (2011) vol. 116, paper E02001. DOI: 10.1029/2010JE003716 [4] Chin et al. Lunar Reconnaissance Orbiter Overview: The Instrument Suite and Mission. Space Science Reviews (2007) vol. 129 (4) pp. 391-419
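The "simple RMS after slope correction" roughness used in the comparison above can be sketched as a plane fit over each cookie-cut DTM patch followed by the RMS of the residuals (a straightforward reading of the definition; the authors' exact detrending details are not given in the abstract):

```python
import numpy as np

def footprint_roughness(z):
    """RMS surface roughness of a DTM patch after removing the best-fit
    plane (slope correction). z is a 2-D array of elevations; pixel
    spacing is taken as the unit of horizontal distance."""
    ny, nx = z.shape
    x, y = np.meshgrid(np.arange(nx), np.arange(ny))
    # Least-squares plane z ~ a*x + b*y + c over the patch
    A = np.column_stack([x.ravel(), y.ravel(), np.ones(z.size)])
    coeffs, *_ = np.linalg.lstsq(A, z.ravel(), rcond=None)
    residual = z.ravel() - A @ coeffs
    return float(np.sqrt(np.mean(residual**2)))
```

Applied to the 1 m DTM pixels falling inside each 5 m LOLA footprint, this yields the within-footprint roughness values that were compared against the pulse-width estimates.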

  4. Study of Flow and Heat Transfer Characteristics of non-periodical attack angle in Narrow Rectangular Channel with Longitudinal Vortex generators

    NASA Astrophysics Data System (ADS)

    Wang, L.; Huang, J.

    2010-03-01

    The heat transfer enhancement of Longitudinal Vortices (LV) is a technology with good efficiency and low flow resistance. LVs are produced by Longitudinal Vortex Generators (LVGs) mounted on the heated surface. With a relatively long influence distance and simple structure, LVGs can be used in narrow channels with flat surfaces. The dimensions of the narrow rectangular channel are 600 mm (length) × 40 mm (width) × 3 mm (gap width); single rectangular-block LVGs are laid out on one heated plate. The dimensions of the LVGs are as follows: height 1.8 mm, width 2.2 mm, length 14 mm, transverse distance 4 mm, and longitudinal distance 150 mm. The attack angle of the LVGs is very important for extending this technology to narrow rectangular channels with water as the medium. In a previous study, the attack angle of periodically mounted LVGs was discussed and the optimal value was 44°. In this paper, the attack angles of the first and second LVGs are changed while the others are kept at 44°. A study of the flow and heat transfer characteristics of non-periodic attack angles is completed. The results show that changing the attack angle of the first and second LVGs is advantageous for heat transfer enhancement with water as the medium. This conclusion should be extended to vapor-liquid two-phase working media. The results of this calculation method are compared with experimental results from a thermal infrared imager and a phase Doppler particle analyzer, and they are reasonable. FLUENT 6.2 is used to simulate this problem, and three velocity components of the water flow have been used to define the residual intensity ratio of the LV.

  5. 2. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH CAMERA STATION ABOVE LOOKING WEST TAKEN FROM RESERVOIR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  6. 6. VAL CAMERA CAR, DETAIL OF COMMUNICATION EQUIPMENT INSIDE CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    6. VAL CAMERA CAR, DETAIL OF COMMUNICATION EQUIPMENT INSIDE CAMERA CAR WITH CAMERA MOUNT IN FOREGROUND. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  7. 7. VAL CAMERA CAR, DETAIL OF 'FLARE' OR TRAJECTORY CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. VAL CAMERA CAR, DETAIL OF 'FLARE' OR TRAJECTORY CAMERA INSIDE CAMERA CAR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  8. The rate and causes of lunar space weathering: Insights from Lunar Reconnaissance Orbiter Wide Angle Camera ultraviolet observations

    NASA Astrophysics Data System (ADS)

    Denevi, B. W.; Robinson, M. S.; Sato, H.; Hapke, B. W.; McEwen, A. S.; Hawke, B. R.

    2011-12-01

    Lunar Reconnaissance Orbiter Wide Angle Camera global ultraviolet and visible imaging provides a unique opportunity to examine the rate and causes of space weathering on the Moon. Silicates typically have a strong decrease in reflectance toward UV wavelengths (<~450 nm) due to strong bands at 250 nm and in the far UV. Metallic iron is relatively spectrally neutral, and laboratory spectra suggest that its addition to mature soils in the form of submicroscopic iron (also known as nanophase iron) flattens silicate spectra, significantly reducing spectral slope in the ultraviolet. Reflectance at ultraviolet wavelengths may be especially sensitive to the surface coatings that form due to exposure to space weathering because scattering from the surfaces of grains contributes a larger fraction to the reflectance spectrum at short wavelengths. We find that the UV slope (as measured by the 320/415 nm ratio) is a more sensitive measure of maturity than indexes based on visible and near-infrared wavelengths. Only the youngest features (less than ~100 Ma) retain a UV slope that is distinct from mature soils of the same composition. No craters >20 km have UV slopes that approach those observed in laboratory spectra of fresh lunar materials (powdered lunar rocks). While the 320/415 nm ratio increases by ~18% from powdered rocks to mature soils in laboratory samples, Giordano Bruno, the freshest large crater, only shows a 3% difference between fresh and mature materials. At the resolution of our UV data (400 m/pixel), we observe some small (<5 km) craters that show a ~14% difference in 320/415 nm ratio from their mature surroundings. UV observations show that Reiner Gamma has had significantly lower levels of space weathering than any of the Copernican craters we examined, and was the only region we found with a UV slope that approached laboratory values for fresh powdered rock samples. 
This is consistent with the hypothesis that its high albedo is due to magnetic shielding from solar wind sputtering effects. Furthermore, the observation that all Copernican craters we examined show some degree of space weathering, together with the extreme immaturity of Reiner Gamma materials, shows that space weathering of the surface and the resultant modification of UV spectra proceed at a fast rate and are dominated by solar wind sputtering. Comparisons of the UV trends on other airless bodies (i.e., asteroids and Mercury) may prove fruitful for understanding the relative rates and causes of space weathering across the inner solar system.
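The maturity measure in this abstract is just a band ratio and a percent change between fresh and mature material. A minimal sketch of that arithmetic; the reflectance values below are invented for illustration, not LROC data, though the percent change is tuned to match the ~18% laboratory figure quoted above.

```python
def uv_ratio(r_320, r_415):
    """The 320/415 nm reflectance ratio used as a UV spectral-slope measure."""
    return r_320 / r_415

def percent_change(fresh_ratio, mature_ratio):
    """Percent increase of the ratio from fresh to mature material."""
    return 100.0 * (mature_ratio - fresh_ratio) / fresh_ratio

fresh = uv_ratio(0.060, 0.100)    # fresh powdered rock: steep UV drop-off (assumed values)
mature = uv_ratio(0.071, 0.100)   # mature soil: flattened UV slope (assumed values)
print(round(percent_change(fresh, mature), 1))  # ~18%, as in the lab comparison
```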

  9. MUSIC - Multifunctional stereo imaging camera system for wide angle and high resolution stereo and color observations on the Mars-94 mission

    NASA Astrophysics Data System (ADS)

    Oertel, D.; Jahn, H.; Sandau, R.; Walter, I.; Driescher, H.

    1990-10-01

    Objectives of the multifunctional stereo imaging camera (MUSIC) system to be deployed on the Soviet Mars-94 mission are outlined. A high-resolution stereo camera (HRSC) and wide-angle opto-electronic stereo scanner (WAOSS) are combined in terms of hardware, software, technology aspects, and solutions. Both HRSC and WAOSS are pushbroom instruments containing a single optical system and focal planes with several parallel CCD line sensors. Emphasis is placed on the MUSIC system's stereo capability, its design, mass memory, and data compression. A 1-Gbit memory is divided into two parts: 80 percent for HRSC and 20 percent for WAOSS, while the selected on-line compression strategy is based on macropixel coding and real-time transform coding.

  10. Reliability of sagittal plane hip, knee, and ankle joint angles from a single frame of video data using the GAITRite camera system.

    PubMed

    Ross, Sandy A; Rice, Clinton; Von Behren, Kristyn; Meyer, April; Alexander, Rachel; Murfin, Scott

    2015-01-01

    The purpose of this study was to establish intra-rater, intra-session, and inter-rater reliability of sagittal plane hip, knee, and ankle angles, with and without reflective markers, using the GAITRite walkway and a single video camera, between student physical therapists and an experienced physical therapist. This study included thirty-two healthy participants age 20-59, stratified by age and gender. Participants performed three successful walks with and without markers applied to anatomical landmarks. GAITRite software was used to digitize sagittal hip, knee, and ankle angles at two phases of gait: (1) initial contact; and (2) mid-stance. Intra-rater reliability was more consistent for the experienced physical therapist, regardless of joint or phase of gait. Intra-session reliability was variable: the experienced physical therapist showed moderate to high reliability (intra-class correlation coefficient (ICC) = 0.50-0.89) and the student physical therapist showed very poor to high reliability (ICC = 0.07-0.85). Inter-rater reliability was highest during mid-stance at the knee with markers (ICC = 0.86) and lowest during mid-stance at the hip without markers (ICC = 0.25). Reliability of a single camera system, especially at the knee joint, shows promise. Depending on the specific type of reliability, error can be attributed to the testers (e.g. lack of digitization practice and marker placement), participants (e.g. loose fitting clothing) and camera systems (e.g. frame rate and resolution). However, until the camera technology can be upgraded to a higher frame rate and resolution, and the software can be linked to the GAITRite walkway, the clinical utility for pre/post measures is limited. PMID:25230893
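For readers unfamiliar with the ICC values quoted above, a minimal sketch of one common definition, ICC(2,1) (two-way random effects, absolute agreement, single rater), computed from two-way ANOVA mean squares. The abstract does not state which ICC form was used, and the ratings matrix below is a toy example.

```python
import numpy as np

def icc_2_1(x):
    """ICC(2,1) for x: an (n subjects, k raters) array of ratings."""
    x = np.asarray(x, dtype=float)
    n, k = x.shape
    grand = x.mean()
    ms_rows = k * np.sum((x.mean(axis=1) - grand) ** 2) / (n - 1)   # between subjects
    ms_cols = n * np.sum((x.mean(axis=0) - grand) ** 2) / (k - 1)   # between raters
    sse = np.sum((x - x.mean(axis=1, keepdims=True)
                    - x.mean(axis=0, keepdims=True) + grand) ** 2)  # residual
    ms_err = sse / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

# toy joint-angle ratings (degrees), 6 subjects x 2 raters, mild disagreement
ratings = [[10, 11], [12, 13], [8, 8], [15, 14], [9, 10], [13, 12]]
print(round(icc_2_1(ratings), 2))  # 0.93: high agreement for this toy data
```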

  11. The measurement of in vivo joint angles during a squat using a single camera markerless motion capture system as compared to a marker based system.

    PubMed

    Schmitz, Anne; Ye, Mao; Boggess, Grant; Shapiro, Robert; Yang, Ruigang; Noehren, Brian

    2015-02-01

    Markerless motion capture may have the potential to make motion capture technology widely clinically practical. However, the ability of a single markerless camera system to quantify clinically relevant lower extremity joint angles has not been studied in vivo. Therefore, the goal of this study was to compare in vivo joint angles calculated using a marker-based motion capture system and a Microsoft Kinect during a squat. Fifteen individuals participated in the study: 8 male, 7 female, height 1.702±0.089 m, mass 67.9±10.4 kg, age 24±4 years, BMI 23.4±2.2 kg/m². Marker trajectories and Kinect depth map data of the leg were collected while each subject performed a slow squat motion. Custom code was used to export virtual marker trajectories for the Kinect data. Each set of marker trajectories was utilized to calculate Cardan knee and hip angles. The patterns of motion were similar between systems, with average absolute differences of <5 deg. Peak joint angles showed high between-trial reliability, with ICC>0.9 for both systems. The peak angles calculated by the marker-based and Kinect systems were largely correlated (r>0.55). These results suggest the Kinect data can be post-processed in a way that makes it a feasible markerless motion capture system for use in the clinic. PMID:25708833
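The Cardan-angle step mentioned above can be sketched as an angle extraction from a 3x3 rotation matrix. This assumes an X-Y'-Z'' rotation sequence, a common convention for clinical joint angles; the paper does not specify its sequence, and the matrix here is a toy example.

```python
import math

def cardan_xyz(r):
    """Cardan angles (degrees) from a 3x3 rotation matrix R = Rx(a)*Ry(b)*Rz(c),
    given as a list of rows. Returns (a, b, c)."""
    beta = math.asin(r[0][2])                  # rotation about the floating Y axis
    alpha = math.atan2(-r[1][2], r[2][2])      # rotation about X (e.g. flexion/extension)
    gamma = math.atan2(-r[0][1], r[0][0])      # rotation about Z (e.g. axial rotation)
    return tuple(math.degrees(v) for v in (alpha, beta, gamma))

# pure 20-degree rotation about X, e.g. knee flexion only
c, s = math.cos(math.radians(20)), math.sin(math.radians(20))
flexion = cardan_xyz([[1, 0, 0], [0, c, -s], [0, s, c]])
print(tuple(round(v, 1) for v in flexion))  # (20.0, 0.0, 0.0)
```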

  12. 3. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH THE VAL TO THE RIGHT, LOOKING NORTHEAST. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  13. 7. VAL CAMERA STATION, INTERIOR VIEW OF CAMERA MOUNT, COMMUNICATION ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. VAL CAMERA STATION, INTERIOR VIEW OF CAMERA MOUNT, COMMUNICATION EQUIPMENT AND STORAGE CABINET. - Variable Angle Launcher Complex, Camera Stations, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  14. 1. VARIABLE-ANGLE LAUNCHER CAMERA CAR, VIEW OF CAMERA CAR AND ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    1. VARIABLE-ANGLE LAUNCHER CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH CAMERA STATION ABOVE LOOKING NORTH TAKEN FROM RESERVOIR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  15. Wide Angle Movie

    NASA Technical Reports Server (NTRS)

    1999-01-01

    This brief movie illustrates the passage of the Moon through the Saturn-bound Cassini spacecraft's wide-angle camera field of view as the spacecraft passed by the Moon on the way to its closest approach with Earth on August 17, 1999. From beginning to end of the sequence, 25 wide-angle images (with a spatial image scale of about 14 miles (about 23 kilometers) per pixel) were taken over the course of 7 and 1/2 minutes through a series of narrow and broadband spectral filters and polarizers, ranging from the violet to the near-infrared regions of the spectrum, to calibrate the spectral response of the wide-angle camera. The exposure times range from 5 milliseconds to 1.5 seconds. Two of the exposures were smeared and have been discarded and replaced with nearby images to make a smooth movie sequence. All images were scaled so that the brightness of Crisium basin, the dark circular region in the upper right, is approximately the same in every image. The imaging data were processed and released by the Cassini Imaging Central Laboratory for Operations (CICLOPS) at the University of Arizona's Lunar and Planetary Laboratory, Tucson, AZ.

    Photo Credit: NASA/JPL/Cassini Imaging Team/University of Arizona

    Cassini, launched in 1997, is a joint mission of NASA, the European Space Agency and Italian Space Agency. The mission is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Space Science, Washington DC. JPL is a division of the California Institute of Technology, Pasadena, CA.

  16. Determining iconometric parameters of imaging devices using a wide-angle collimator. [calibration of satellite-borne television and photographic cameras

    NASA Technical Reports Server (NTRS)

    Ziman, Y. L.

    1974-01-01

    The problem of determining the iconometric parameters of the imaging device can be solved if the camera being calibrated is used to obtain the image of a group of reference points, the directions to which are known. In order to specify the imaging device coordinate system, it is sufficient in principle to obtain on the picture the images of three reference points which do not lie on a single straight line. Many more such points are required in order to determine the distortion corrections, and they must be distributed uniformly over the entire field of view of the camera being calibrated. Experimental studies were made using this technique to calibrate photographic and phototelevision systems. Evaluation of the results of these experiments permits recommending collimators for calibrating television and phototelevision imaging systems, and also short-focus small-format photographic cameras.

  17. Critical Heat Flux in Inclined Rectangular Narrow Gaps

    SciTech Connect

    Jeong J. Kim; Yong H. Kim; Seong J. Kim; Sang W. Noh; Kune Y. Suh; Joy L. Rempe; Fan-Bill Cheung; Sang B. Kim

    2004-06-01

    In light of the TMI-2 accident, in which the reactor vessel lower head survived the attack by molten core material, the in-vessel retention strategy was suggested to benefit from cooling the debris through a gap between the lower head and the core material. The GAMMA 1D (Gap Apparatus Mitigating Melt Attack One Dimensional) tests were conducted to investigate the critical heat flux (CHF) in narrow gaps with varying surface orientations. The CHF in an inclined gap, especially in the case of a downward-facing narrow gap, is dictated by bubble behavior because the departing bubbles are squeezed. The orientation angle affects the bubble layer and the escape of bubbles from the narrow gap. The test parameters include gap sizes of 1, 2, 5 and 10 mm with an open periphery, and orientation angles ranging from fully downward-facing (180°) to vertical (90°). The 15 × 35 mm copper test section was electrically heated by a thin film resistor on the back. The heater assembly was installed at the tip of a rotating arm in a heated water pool at atmospheric pressure. The bubble behavior was photographed with a high-speed camera through the Pyrex glass spacer. It was observed that the CHF decreased as the surface inclination angle increased and as the gap size decreased in most cases. However, opposing results were obtained at certain surface orientations and gap sizes. Transition angles, at which the CHF changed with a rapid slope, were also detected, consistent with the existing literature. A semi-empirical CHF correlation was developed for inclined narrow rectangular channels through dimensional analysis. The correlation provides best-estimate CHF values for realistically assessing the thermal margin to failure of the lower head during a severe accident involving relocation of the core material.
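The final step described above, fitting a semi-empirical correlation over gap size and orientation, can be sketched as a least-squares fit of a power law on log-transformed data. The functional form, exponents, and data points below are synthetic illustrations, not the paper's correlation.

```python
import numpy as np

def fit_power_law(gap_mm, angle_deg, chf_kw_m2):
    """Fit q'' = C * gap^a * angle^b by linear least squares in log space.
    Returns (C, a, b)."""
    A = np.column_stack([np.ones(len(gap_mm)),
                         np.log(gap_mm), np.log(angle_deg)])
    coef, *_ = np.linalg.lstsq(A, np.log(chf_kw_m2), rcond=None)
    return float(np.exp(coef[0])), float(coef[1]), float(coef[2])

# synthetic data generated from q'' = 500 * gap^0.5 * angle^-0.3 (illustrative)
gap = np.array([1.0, 2.0, 5.0, 10.0, 2.0, 5.0])
ang = np.array([90.0, 120.0, 150.0, 180.0, 90.0, 120.0])
q = 500.0 * gap**0.5 * ang**-0.3
print([round(v, 2) for v in fit_power_law(gap, ang, q)])  # recovers [500.0, 0.5, -0.3]
```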

  18. 8. VAL CAMERA CAR, CLOSEUP VIEW OF 'FLARE' OR TRAJECTORY ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    8. VAL CAMERA CAR, CLOSE-UP VIEW OF 'FLARE' OR TRAJECTORY CAMERA ON SLIDING MOUNT. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  19. Imaging Narrow Angle The Voyager Spacecraft

    E-print Network

    [Fragmented record: labels from a Voyager spacecraft diagram (Calibration Target and Radiator, Bus Housing Electronics, Low-Energy Charged Particle Detector, Photopolarimeter) and text fragments about the structure of the space carved out of the interstellar medium by the Sun, the distribution of magnetic fields, and the testing and assembly of the detectors that went into the LETs.]

  20. Ceres Photometry and Albedo from Dawn Framing Camera Images

    NASA Astrophysics Data System (ADS)

    Schröder, S. E.; Mottola, S.; Keller, H. U.; Li, J.-Y.; Matz, K.-D.; Otto, K.; Roatsch, T.; Stephan, K.; Raymond, C. A.; Russell, C. T.

    2015-10-01

    The Dawn spacecraft is in orbit around dwarf planet Ceres. The onboard Framing Camera (FC) [1] is mapping the surface through a clear filter and 7 narrow-band filters at various observational geometries. Generally, Ceres' appearance in these images is affected by shadows and shading, effects which become stronger for larger solar phase angles, obscuring the intrinsic reflective properties of the surface. By means of photometric modeling we attempt to remove these effects and reconstruct the surface albedo over the full visible wavelength range. Knowledge of the albedo distribution will contribute to our understanding of the physical nature and composition of the surface.
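The photometric correction described above can be sketched with a simple disk function: divide the observed radiance factor by a model of incidence/emission shading to estimate albedo. Lommel-Seeliger is used here as a common simple choice; the Dawn team's actual photometric model may differ, and the numbers are illustrative.

```python
import math

def lommel_seeliger(incidence_deg, emission_deg):
    """Lommel-Seeliger disk function D = 2*mu0/(mu0 + mu), normalized to 1 at i = e = 0."""
    mu0 = math.cos(math.radians(incidence_deg))
    mu = math.cos(math.radians(emission_deg))
    return 2.0 * mu0 / (mu0 + mu)

def corrected_reflectance(observed_i_over_f, incidence_deg, emission_deg):
    """Divide out the modeled shading to approximate intrinsic reflectance."""
    return observed_i_over_f / lommel_seeliger(incidence_deg, emission_deg)

print(round(lommel_seeliger(0.0, 0.0), 3))                 # 1.0 at normal geometry
print(round(corrected_reflectance(0.045, 60.0, 30.0), 3))  # assumed observation
```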

  1. Camera Optics.

    ERIC Educational Resources Information Center

    Ruiz, Michael J.

    1982-01-01

    The camera presents an excellent way to illustrate principles of geometrical optics. Basic camera optics of the single-lens reflex camera are discussed, including interchangeable lenses and accessories available to most owners. Several experiments are described and results compared with theoretical predictions or manufacturer specifications.

  2. Publications Angling, Angling Records,

    E-print Network

    Publications Angling, Angling Records, and Game Fish Conservation The 1982 edition of "World Record Game Fishes," published by the International Game Fish Association, 3000 East Las Olas Boulevard, Fort Lauderdale, FL 33316, continues to grow as an important reference work for the serious angler

  3. Dynamics of an oscillating bubble in a narrow gap

    E-print Network

    Azam, Fahad Ibn

    The complex dynamics of a single bubble of a few millimeters in size oscillating inside a narrow fluid-filled gap between two parallel plates is studied using high-speed videography. Two synchronized high-speed cameras ...

  4. 5. VAL CAMERA CAR, DETAIL OF HOIST AT SIDE OF ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    5. VAL CAMERA CAR, DETAIL OF HOIST AT SIDE OF BRIDGE AND ENGINE CAR ON TRACKS, LOOKING NORTHEAST. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  5. Electronic still camera

    NASA Astrophysics Data System (ADS)

    Holland, S. Douglas

    1992-09-01

    A handheld, programmable, digital camera is disclosed that supports a variety of sensors and has program control over the system components to provide versatility. The camera uses a high performance design which produces near film quality images from an electronic system. The optical system of the camera incorporates a conventional camera body that was slightly modified, thus permitting the use of conventional camera accessories, such as telephoto lenses, wide-angle lenses, auto-focusing circuitry, auto-exposure circuitry, flash units, and the like. An image sensor, such as a charge coupled device ('CCD') collects the photons that pass through the camera aperture when the shutter is opened, and produces an analog electrical signal indicative of the image. The analog image signal is read out of the CCD and is processed by preamplifier circuitry, a correlated double sampler, and a sample and hold circuit before it is converted to a digital signal. The analog-to-digital converter has an accuracy of eight bits to ensure accuracy during the conversion. Two types of data ports are included for two different data transfer needs. One data port comprises a general purpose industrial standard port and the other a high speed/high performance application specific port. The system uses removable hard disks as its permanent storage media. The hard disk receives the digital image signal from the memory buffer and correlates the image signal with other sensed parameters, such as longitudinal or other information. When the storage capacity of the hard disk has been filled, the disk can be replaced with a new disk.

  6. Electronic Still Camera

    NASA Technical Reports Server (NTRS)

    Holland, S. Douglas (inventor)

    1992-01-01

    A handheld, programmable, digital camera is disclosed that supports a variety of sensors and has program control over the system components to provide versatility. The camera uses a high performance design which produces near film quality images from an electronic system. The optical system of the camera incorporates a conventional camera body that was slightly modified, thus permitting the use of conventional camera accessories, such as telephoto lenses, wide-angle lenses, auto-focusing circuitry, auto-exposure circuitry, flash units, and the like. An image sensor, such as a charge coupled device ('CCD') collects the photons that pass through the camera aperture when the shutter is opened, and produces an analog electrical signal indicative of the image. The analog image signal is read out of the CCD and is processed by preamplifier circuitry, a correlated double sampler, and a sample and hold circuit before it is converted to a digital signal. The analog-to-digital converter has an accuracy of eight bits to ensure accuracy during the conversion. Two types of data ports are included for two different data transfer needs. One data port comprises a general purpose industrial standard port and the other a high speed/high performance application specific port. The system uses removable hard disks as its permanent storage media. The hard disk receives the digital image signal from the memory buffer and correlates the image signal with other sensed parameters, such as longitudinal or other information. When the storage capacity of the hard disk has been filled, the disk can be replaced with a new disk.
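The last step of the signal chain in the patent abstract above is an 8-bit analog-to-digital conversion. A minimal sketch of that quantization step; the full-scale reference voltage and sample values are illustrative assumptions, not figures from the patent.

```python
def adc_8bit(voltage, v_ref=1.0):
    """Quantize 0..v_ref volts to an 8-bit code (0..255), clipping at the rails."""
    code = int(voltage / v_ref * 256)
    return max(0, min(255, code))

print(adc_8bit(0.5))    # mid-scale sample -> 128
print(adc_8bit(1.2))    # over-range sample clips to 255
```

An 8-bit converter gives 256 levels, i.e. about 0.4% of full scale per step, which is the resolution bound behind the "accuracy of eight bits" claim.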

  7. CCD Camera

    DOEpatents

    Roth, R.R.

    1983-08-02

    A CCD camera capable of observing a moving object which has varying intensities of radiation emanating therefrom and which may move at varying speeds is shown wherein there is substantially no overlapping of successive images and wherein the exposure times and scan times may be varied independently of each other. 7 figs.

  8. CCD Camera

    DOEpatents

    Roth, Roger R. (Minnetonka, MN)

    1983-01-01

    A CCD camera capable of observing a moving object which has varying intensities of radiation emanating therefrom and which may move at varying speeds is shown wherein there is substantially no overlapping of successive images and wherein the exposure times and scan times may be varied independently of each other.

  9. Cassini Camera Contamination Anomaly: Experiences and Lessons Learned

    NASA Technical Reports Server (NTRS)

    Haemmerle, Vance R.; Gerhard, James H.

    2006-01-01

    We discuss the contamination 'Haze' anomaly for the Cassini Narrow Angle Camera (NAC), one of two optical telescopes that comprise the Imaging Science Subsystem (ISS). Cassini is a Saturn Orbiter with a 4-year nominal mission. The incident occurred in 2001, five months after Jupiter encounter during the Cruise phase and ironically at the resumption of planned maintenance decontamination cycles. The degraded optical performance was first identified by the Instrument Operations Team with the first ISS Saturn imaging six weeks later. A distinct haze of varying size from image to image marred the images of Saturn. A photometric star calibration of the Pleiades, 4 days after the incident, showed stars with halos. Analysis showed that while the halo's intensity was only 1 - 2% of the intensity of the central peak of a star, the halo contained 30 - 70% of its integrated flux. This condition would impact science return. In a review of our experiences, we examine the contamination control plan, discuss the analysis of the limited data available and describe the one-year campaign to remove the haze from the camera. After several long conservative heating activities and interim analysis of their results, the contamination problem as measured by the camera's point spread function was essentially back to pre-anomaly size and at a point where there would be more risk to continue. We stress the importance of the flexibility of operations and instrument design, the need to do early in-flight instrument calibration and continual monitoring of instrument performance.

  10. Calibration of the Lunar Reconnaissance Orbiter Camera

    NASA Astrophysics Data System (ADS)

    Tschimmel, M.; Robinson, M. S.; Humm, D. C.; Denevi, B. W.; Lawrence, S. J.; Brylow, S.; Ravine, M.; Ghaemi, T.

    2008-12-01

    The Lunar Reconnaissance Orbiter Camera (LROC) onboard the NASA Lunar Reconnaissance Orbiter (LRO) spacecraft consists of three cameras: the Wide-Angle Camera (WAC) and two identical Narrow Angle Cameras (NAC-L, NAC-R). The WAC is a push-frame imager with 5 visible wavelength filters (415 to 680 nm) at a spatial resolution of 100 m/pixel and 2 UV filters (315 and 360 nm) with a resolution of 400 m/pixel. In addition to multicolor imaging, the WAC can operate in monochrome mode to provide a global large-incidence-angle basemap and a time-lapse movie of the illumination conditions at both poles. The WAC has a highly linear response, a read noise of 72 e- and a full well capacity of 47,200 e-. The signal-to-noise ratio in each band is 140 in the worst case. There are no out-of-band leaks and the spectral response of each filter is well characterized. Each NAC is a monochrome pushbroom scanner, providing images with a resolution of 50 cm/pixel from a 50-km orbit. A single NAC image has a swath width of 2.5 km and a length of up to 26 km. The NACs are mounted to acquire side-by-side imaging for a combined swath width of 5 km. The NAC is designed to fully characterize future human and robotic landing sites in terms of topography and hazard risks. The North and South poles will be mapped on a 1-meter scale poleward of 85.5° latitude. Stereo coverage can be provided by pointing the NACs off-nadir. The NACs are also highly linear. Read noise is 71 e- for NAC-L and 74 e- for NAC-R and the full well capacity is 248,500 e- for NAC-L and 262,500 e- for NAC-R. The focal lengths are 699.6 mm for NAC-L and 701.6 mm for NAC-R; the system MTF is 28% for NAC-L and 26% for NAC-R. The signal-to-noise ratio is at least 46 (terminator scene) and can be higher than 200 (high sun scene). Both NACs exhibit a straylight feature, which is caused by out-of-field sources and is of a magnitude of 1-3%. However, as this feature is well understood, it can be greatly reduced during ground processing. All three cameras were calibrated in the laboratory under ambient conditions. Future thermal vacuum tests will characterize critical behaviors across the full range of lunar operating temperatures. In-flight tests will check for changes in response after launch and provide key data for meeting the requirements of 1% relative and 10% absolute radiometric calibration.
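The SNR figures quoted above follow from a standard shot-noise-plus-read-noise estimate, SNR = S / sqrt(S + RN²) for a signal of S electrons and read noise RN electrons (dark current neglected). The read noise of 71 e- is from the abstract; the signal levels below are assumed for illustration.

```python
import math

def snr(signal_e, read_noise_e):
    """Shot-noise + read-noise SNR for a signal in electrons."""
    return signal_e / math.sqrt(signal_e + read_noise_e ** 2)

print(round(snr(4500, 71), 1))    # dim terminator-like signal (assumed S), near the quoted 46
print(round(snr(60000, 71), 1))   # bright scene: SNR approaches sqrt(S)
```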

  11. Pinhole Cameras: For Science, Art, and Fun!

    ERIC Educational Resources Information Center

    Button, Clare

    2007-01-01

    A pinhole camera is a camera without a lens. A tiny hole replaces the lens, and light is allowed to come in for a short amount of time by means of a hand-operated shutter. The pinhole allows only a very narrow beam of light to enter, which reduces confusion due to scattered light on the film. This results in an image that is focused, reversed, and…

  12. Omnidirectional Underwater Camera Design and Calibration

    PubMed Central

    Bosch, Josep; Gracias, Nuno; Ridao, Pere; Ribas, David

    2015-01-01

    This paper presents the development of an underwater omnidirectional multi-camera system (OMS) based on a commercially available six-camera system, originally designed for land applications. A full calibration method is presented for the estimation of both the intrinsic and extrinsic parameters, which is able to cope with wide-angle lenses and non-overlapping cameras simultaneously. This method is valid for any OMS in both land and water applications. For underwater use, a customized housing is required, which often leads to strong image distortion due to refraction among the different media. This phenomenon makes the basic pinhole camera model invalid for underwater cameras, especially when using wide-angle lenses, and requires the explicit modeling of the individual optical rays. To address this problem, a ray tracing approach has been adopted to create a field-of-view (FOV) simulator for underwater cameras. The simulator allows for the testing of different housing geometries and optics for the cameras to ensure a complete hemisphere coverage in underwater operation. This paper describes the design and testing of a compact custom housing for a commercial off-the-shelf OMS camera (Ladybug 3) and presents the first results of its use. A proposed three-stage calibration process allows for the estimation of all of the relevant camera parameters. Experimental results are presented, which illustrate the performance of the calibration method and validate the approach. PMID:25774707

  13. Omnidirectional underwater camera design and calibration.

    PubMed

    Bosch, Josep; Gracias, Nuno; Ridao, Pere; Ribas, David

    2015-01-01

    This paper presents the development of an underwater omnidirectional multi-camera system (OMS) based on a commercially available six-camera system, originally designed for land applications. A full calibration method is presented for the estimation of both the intrinsic and extrinsic parameters, which is able to cope with wide-angle lenses and non-overlapping cameras simultaneously. This method is valid for any OMS in both land and water applications. For underwater use, a customized housing is required, which often leads to strong image distortion due to refraction among the different media. This phenomenon makes the basic pinhole camera model invalid for underwater cameras, especially when using wide-angle lenses, and requires the explicit modeling of the individual optical rays. To address this problem, a ray tracing approach has been adopted to create a field-of-view (FOV) simulator for underwater cameras. The simulator allows for the testing of different housing geometries and optics for the cameras to ensure a complete hemisphere coverage in underwater operation. This paper describes the design and testing of a compact custom housing for a commercial off-the-shelf OMS camera (Ladybug 3) and presents the first results of its use. A proposed three-stage calibration process allows for the estimation of all of the relevant camera parameters. Experimental results are presented, which illustrate the performance of the calibration method and validate the approach. PMID:25774707
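The core of the ray-tracing argument above is Snell's law at the housing port: refraction at the air/glass/water interfaces bends every ray, shrinking the in-water field of view and breaking the pinhole model. A minimal sketch for a flat port; the refractive indices are typical values, not the paper's.

```python
import math

def refract_angle(theta_deg, n1, n2):
    """Refracted angle (degrees) for incidence theta_deg going from medium n1 into n2."""
    s = n1 / n2 * math.sin(math.radians(theta_deg))
    if abs(s) > 1.0:
        raise ValueError("total internal reflection")
    return math.degrees(math.asin(s))

def through_flat_port(theta_air_deg, n_glass=1.49, n_water=1.33):
    """Trace a ray through a flat air -> housing glass -> water port.
    For parallel interfaces only the first and last indices matter, but the
    two-step form mirrors an actual per-interface trace."""
    theta_glass = refract_angle(theta_air_deg, 1.0, n_glass)
    return refract_angle(theta_glass, n_glass, n_water)

print(round(through_flat_port(45.0), 1))  # a 45-degree ray bends to about 32 degrees in water
```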

  14. Laboratory Calibration and Characterization of Video Cameras

    NASA Technical Reports Server (NTRS)

    Burner, A. W.; Snow, W. L.; Shortis, M. R.; Goad, W. K.

    1989-01-01

    Some techniques for laboratory calibration and characterization of video cameras used with frame grabber boards are presented. A laser-illuminated displaced reticle technique (with camera lens removed) is used to determine the camera/grabber effective horizontal and vertical pixel spacing as well as the angle of non-perpendicularity of the axes. The principal point of autocollimation and point of symmetry are found by illuminating the camera with an unexpanded laser beam, either aligned with the sensor or lens. Lens distortion and the principal distance are determined from images of a calibration plate suitably aligned with the camera. Calibration and characterization results for several video cameras are presented. Differences between these laboratory techniques and test range and plumb line calibration are noted.

  15. Laboratory calibration and characterization of video cameras

    NASA Technical Reports Server (NTRS)

    Burner, A. W.; Snow, W. L.; Shortis, M. R.; Goad, W. K.

    1990-01-01

    Some techniques for laboratory calibration and characterization of video cameras used with frame grabber boards are presented. A laser-illuminated displaced reticle technique (with camera lens removed) is used to determine the camera/grabber effective horizontal and vertical pixel spacing as well as the angle of nonperpendicularity of the axes. The principal point of autocollimation and point of symmetry are found by illuminating the camera with an unexpanded laser beam, either aligned with the sensor or lens. Lens distortion and the principal distance are determined from images of a calibration plate suitably aligned with the camera. Calibration and characterization results for several video cameras are presented. Differences between these laboratory techniques and test range and plumb line calibration are noted.
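The lens-distortion determination mentioned in both reports above is typically a polynomial radial model fitted from calibration-plate images. A minimal sketch of applying such a model; the coefficients below are illustrative, not values from these reports.

```python
def undistort_point(x, y, k1, k2):
    """Apply a polynomial radial correction r' = r*(1 + k1*r^2 + k2*r^4)
    to normalized image coordinates (x, y)."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

xu, yu = undistort_point(0.30, 0.40, k1=-0.05, k2=0.002)  # mild barrel distortion (assumed)
print(round(xu, 4), round(yu, 4))
```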

  16. Readout electronics of physics of accelerating universe camera

    NASA Astrophysics Data System (ADS)

    de Vicente, Juan; Castilla, Javier; Jiménez, Jorge; Cardiel-Sas, L.; Illa, José M.

    2014-08-01

The Physics of the Accelerating Universe Camera (PAUCam) is a new camera for dark energy studies that will be installed on the William Herschel Telescope. The main characteristic of the camera is its capacity for high-precision photometric-redshift measurement. The camera is composed of eighteen Hamamatsu Photonics CCDs providing a wide field of view covering a diameter of one degree. Unlike the five broad optical filters common to other similar surveys, PAUCam has forty narrow-band optical filters, which will provide higher resolution in photometric redshifts. In this paper a general description of the electronics of the camera and its status is presented.

  17. Automatic commanding of the Mars Observer Camera

    NASA Technical Reports Server (NTRS)

    Caplinger, Michael

    1994-01-01

Mars Observer, launched in September 1992, was intended to be a 'survey-type' mission that acquired global coverage of Mars from a low, circular, near-polar orbit during an entire Martian year. As such, most of its instruments had fixed data rates, wide fields of view, and relatively low resolution, with fairly limited requirements for commanding. An exception is the Mars Observer Camera, or MOC. The MOC consists of a two-color Wide Angle (WA) system that can acquire both global images at low resolution (7.5 km/pixel) and regional images at commandable resolutions up to 250 m/pixel. Complementing the WA is the Narrow Angle (NA) system, which can acquire images at 8 resolutions from 12 m/pixel to 1.5 m/pixel, with a maximum crosstrack dimension of 3 km. The MOC also provides various forms of data compression (both lossless and lossy), and is designed to work at data rates from 700 bits per second (bps) to over 80 kbps. Because of this flexibility, developing MOC command sequences is much more difficult than the routine mode-changing that characterizes other instrument operations. Although the MOC cannot be pointed (the spacecraft is fixed nadir-pointing and has no scan platform), the timing, downlink stream allocation, compression type and parameters, and image dimensions of each image must be commanded from the ground, subject to the constraints inherent in the MOC and the spacecraft. To minimize the need for a large operations staff, the entire command generation process has been automated within the MOC Ground Data System. Following the loss of the Mars Observer spacecraft in August 1993, NASA intends to launch a new spacecraft, Mars Global Surveyor (MGS), in late 1996. This spacecraft will carry the MOC flight spare (MOC 2). The MOC 2 operations plan will be largely identical to that developed for MOC, and all of the algorithms described here are applicable to it.

  18. Characterization of previously unidentified lunar pyroclastic deposits using Lunar Reconnaissance Orbiter Camera (LROC) data

    USGS Publications Warehouse

Gustafson, J. Olaf; Bell, James F.; Gaddis, Lisa R.; Hawke, B. Ray; Giguere, Thomas A.

    2012-01-01

    We used a Lunar Reconnaissance Orbiter Camera (LROC) global monochrome Wide-angle Camera (WAC) mosaic to conduct a survey of the Moon to search for previously unidentified pyroclastic deposits. Promising locations were examined in detail using LROC multispectral WAC mosaics, high-resolution LROC Narrow Angle Camera (NAC) images, and Clementine multispectral (ultraviolet-visible or UVVIS) data. Out of 47 potential deposits chosen for closer examination, 12 were selected as probable newly identified pyroclastic deposits. Potential pyroclastic deposits were generally found in settings similar to previously identified deposits, including areas within or near mare deposits adjacent to highlands, within floor-fractured craters, and along fissures in mare deposits. However, a significant new finding is the discovery of localized pyroclastic deposits within floor-fractured craters Anderson E and F on the lunar farside, isolated from other known similar deposits. Our search confirms that most major regional and localized low-albedo pyroclastic deposits have been identified on the Moon down to ~100 m/pix resolution, and that additional newly identified deposits are likely to be either isolated small deposits or additional portions of discontinuous, patchy deposits.

  19. System Synchronizes Recordings from Separated Video Cameras

    NASA Technical Reports Server (NTRS)

    Nail, William; Nail, William L.; Nail, Jasper M.; Le, Doung T.

    2009-01-01

    A system of electronic hardware and software for synchronizing recordings from multiple, physically separated video cameras is being developed, primarily for use in multiple-look-angle video production. The system, the time code used in the system, and the underlying method of synchronization upon which the design of the system is based are denoted generally by the term "Geo-TimeCode(TradeMark)." The system is embodied mostly in compact, lightweight, portable units (see figure) denoted video time-code units (VTUs) - one VTU for each video camera. The system is scalable in that any number of camera recordings can be synchronized. The estimated retail price per unit would be about $350 (in 2006 dollars). The need for this or another synchronization system external to video cameras arises because most video cameras do not include internal means for maintaining synchronization with other video cameras. Unlike prior video-camera-synchronization systems, this system does not depend on continuous cable or radio links between cameras (however, it does depend on occasional cable links lasting a few seconds). Also, whereas the time codes used in prior video-camera-synchronization systems typically repeat after 24 hours, the time code used in this system does not repeat for slightly more than 136 years; hence, this system is much better suited for long-term deployment of multiple cameras.
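The quoted repeat period of slightly more than 136 years is consistent with a free-running 32-bit seconds counter (an assumption about the Geo-TimeCode internals, not stated in the record):

```python
SECONDS_PER_JULIAN_YEAR = 365.25 * 24 * 3600  # 31,557,600 s

def rollover_years(bits, tick_seconds=1.0):
    """Years before a free-running counter of the given width wraps."""
    return (2 ** bits) * tick_seconds / SECONDS_PER_JULIAN_YEAR
```

`rollover_years(32)` evaluates to about 136.1 years, matching the "slightly more than 136 years" figure, whereas the 24-hour time codes used by prior synchronization systems repeat daily.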

  20. Pre-hibernation performances of the OSIRIS cameras onboard the Rosetta spacecraft

    NASA Astrophysics Data System (ADS)

    Magrin, S.; La Forgia, F.; Da Deppo, V.; Lazzarin, M.; Bertini, I.; Ferri, F.; Pajola, M.; Barbieri, M.; Naletto, G.; Barbieri, C.; Tubiana, C.; Küppers, M.; Fornasier, S.; Jorda, L.; Sierks, H.

    2015-02-01

Context. The ESA cometary mission Rosetta was launched in 2004. In the past years and until the spacecraft hibernation in June 2011, the two cameras of the OSIRIS imaging system (Narrow Angle and Wide Angle Camera, NAC and WAC) observed many different sources. On 20 January 2014 the spacecraft successfully exited hibernation to start observing the primary scientific target of the mission, comet 67P/Churyumov-Gerasimenko. Aims: A study of the past performance of the cameras is now mandatory to determine whether the system has remained stable over time and to derive, if necessary, additional analysis methods for the future precise calibration of the cometary data. Methods: The instrumental responses and filter passbands were used to estimate the efficiency of the system. A comparison with acquired images of specific calibration stars was made, and a refined photometric calibration was computed, both for the absolute flux and for the reflectivity of small bodies of the solar system. Results: We found the instrumental performance to be stable within ±1.5% from 2007 to 2010, with no evidence of an aging effect on the optics or detectors. The efficiency of the instrumentation is as expected in the visible range, but lower than expected in the UV and IR ranges. A photometric calibration implementation is discussed for the two cameras. Conclusions: The calibration derived from the pre-hibernation phases of the mission will be checked as soon as possible after the awakening of OSIRIS and will be continuously monitored until the end of the mission in December 2015. A list of additional calibration sources to be observed during the forthcoming phases of the mission has been compiled to ensure better coverage across the wavelength range of the cameras and to study possible dust contamination of the optics.

  1. Multispectral calibration to enhance the metrology performance of C-mount camera systems

    NASA Astrophysics Data System (ADS)

    Robson, S.; MacDonald, L.; Kyle, S. A.; Shortis, M. R.

    2014-06-01

Low cost monochrome camera systems based on CMOS sensors and C-mount lenses have been successfully applied to a wide variety of metrology tasks. For high accuracy work such cameras are typically equipped with ring lights to image retro-reflective targets as high contrast image features. Whilst algorithms for target image measurement and lens modelling are highly advanced, including separate RGB channel lens distortion correction, target image circularity compensation and a wide variety of detection and centroiding approaches, less effort has been directed towards optimising physical target image quality by considering optical performance in narrow wavelength bands. This paper describes an initial investigation to assess the effect of wavelength on camera calibration parameters for two different camera bodies and the same 'C-mount' wide-angle lens. Results demonstrate the expected strong influence on principal distance, radial and tangential distortion, and also highlight possible trends in principal point, orthogonality and affinity parameters which are close to the parameter estimation noise level from the strong convergent self-calibrating image networks.

  2. Angle performance on optima MDxt

    SciTech Connect

    David, Jonathan; Kamenitsa, Dennis

    2012-11-06

Angle control on medium current implanters is important due to the high angle-sensitivity of typical medium current implants, such as halo implants. On the Optima MDxt, beam-to-wafer angles are controlled in both the horizontal and vertical directions. In the horizontal direction, the beam angle is measured through six narrow slits, and any angle adjustment is made by electrostatically steering the beam, while cross-wafer beam parallelism is adjusted by changing the focus of the electrostatic parallelizing lens (P-lens). In the vertical direction, the beam angle is measured through a high aspect ratio mask, and any angle adjustment is made by slightly tilting the wafer platen prior to implant. A variety of tests were run to measure the accuracy and repeatability of Optima MDxt's angle control. SIMS profiles of a high energy, channeling sensitive condition show both the cross-wafer angle uniformity and the small-angle resolution of the system. Angle repeatability was quantified by running a channeling sensitive implant as a regular monitor over a seven month period and measuring the sheet resistance-to-angle sensitivity. Even though crystal cut error was not controlled for in this case, when attributing all Rs variation to angle changes, the overall angle repeatability was measured as 0.16° (1σ). A separate angle repeatability test involved running a series of V-curve tests over a four month period using low crystal cut wafers selected from the same boule. The results of this test showed the angle repeatability to be <0.1° (1σ).

  3. Two-Camera Acquisition and Tracking of a Flying Target

    NASA Technical Reports Server (NTRS)

    Biswas, Abhijit; Assad, Christopher; Kovalik, Joseph M.; Pain, Bedabrata; Wrigley, Chris J.; Twiss, Peter

    2008-01-01

A method and apparatus have been developed to solve the problem of automated acquisition and tracking, from a location on the ground, of a luminous moving target in the sky. The method involves the use of two electronic cameras: (1) a stationary camera having a wide field of view, positioned and oriented to image the entire sky; and (2) a camera that has a much narrower field of view (a few degrees wide) and is mounted on a two-axis gimbal. The wide-field-of-view stationary camera is used to initially identify the target against the background sky. So that the approximate position of the target can be determined, pixel locations on the image-detector plane in the stationary camera are calibrated with respect to azimuth and elevation. The approximate target position is used to initially aim the gimballed narrow-field-of-view camera in the approximate direction of the target. Next, the narrow-field-of-view camera locks onto the target image, and thereafter the gimbals are actuated as needed to maintain lock and thereby track the target with precision greater than that attainable by use of the stationary camera.
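The handoff step can be sketched as follows, assuming a hypothetical equidistant fisheye mapping for the wide-field camera; a real system would use the measured per-pixel azimuth/elevation calibration the record describes:

```python
import math

def pixel_to_azel(px, py, width=640, height=480, fov_deg=180.0):
    """Rough azimuth/elevation from an all-sky image pixel, assuming an
    equidistant fisheye projection with zenith at the image centre and
    north toward the top of the frame (all assumptions, for illustration).
    """
    cx, cy = width / 2.0, height / 2.0
    dx, dy = px - cx, py - cy
    r = math.hypot(dx, dy)
    r_max = min(cx, cy)                     # image-circle radius in pixels
    elevation = 90.0 - (r / r_max) * (fov_deg / 2.0)
    azimuth = math.degrees(math.atan2(dx, -dy)) % 360.0
    return azimuth, elevation
```

The resulting (azimuth, elevation) pair is what would be handed to the gimbal controller as the initial aim point for the narrow-field-of-view camera before it locks onto the target itself.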

  4. Search for narrow angle anisotropies with the MICRO experiment.

    NASA Astrophysics Data System (ADS)

    Battistoni, G.; Bloise, C.; Grillo, A. F.; Marini, A.; Ronga, F.; Valente, V.

The authors present results from MICRO, a muon telescope with good angular resolution, which has collected more than 31×10⁶ cosmic muons. Upper limits are given for the flux coming from point sources and for the periodic component from Cyg X-3.

  5. Determining Camera Gain in Room Temperature Cameras

    SciTech Connect

    Joshua Cogliati

    2010-12-01

James R. Janesick provides a method for determining the amplification of a CCD or CMOS camera when only the raw images are available. However, the equation that is provided ignores the contribution of dark current. For CCD or CMOS cameras that are cooled well below room temperature this is not a problem, but the technique needs adjustment for use with room-temperature cameras. This article describes the adjustment made to the equation and a test of the method.
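A sketch of the mean-variance (photon transfer) gain estimate with dark-frame terms subtracted, in the spirit of the adjustment described; the article's exact correction may differ:

```python
import numpy as np

def camera_gain(flat_a, flat_b, dark_a, dark_b):
    """Estimate gain (electrons per DN) from two matched flat-field
    frames and two matched dark frames. Differencing paired frames
    removes fixed-pattern noise; subtracting the dark means and the
    dark difference variance removes the dark-current contribution
    that matters for room-temperature sensors.
    """
    signal = (flat_a.mean() + flat_b.mean()
              - dark_a.mean() - dark_b.mean())          # DN of photo-signal
    noise_var = np.var(flat_a - flat_b) - np.var(dark_a - dark_b)
    return signal / noise_var
```

For shot-noise-limited data the ratio of signal to variance is the gain, since Poisson statistics make the variance in electrons equal to the mean in electrons.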

  6. Interconnected network of cameras

    NASA Astrophysics Data System (ADS)

    Hosseini Kamal, Mahdad; Afshari, Hossein; Leblebici, Yusuf; Schmid, Alexandre; Vandergheynst, Pierre

    2013-02-01

Real-time development of multi-camera systems is a great challenge. Synchronization and the large data rates of the cameras add to the complexity of these systems, and the complexity grows further as the number of cameras increases. The customary implementation is centralized: all raw streams from the cameras are first stored and then processed for the target application. An alternative approach is to build these systems from smart cameras rather than ordinary cameras with limited or no processing capability. Smart cameras with intra- and inter-camera processing capability, programmable at both the software and hardware levels, offer the right platform for distributed, parallel real-time application development in multi-camera systems. Inter-camera processing requires interconnecting the smart cameras in a network arrangement. A novel hardware emulation platform is introduced to demonstrate the concept of an interconnected network of cameras, a methodology for constructing and analyzing the interconnection network is demonstrated, and a sample application is developed and demonstrated.

  7. The nucleus of comet 67P through the eyes of the OSIRIS cameras

    NASA Astrophysics Data System (ADS)

    Guettler, Carsten; Sierks, Holger; Barbieri, Cesare; Lamy, Philippe; Rodrigo, Rafael; Koschny, Detlef; Rickman, Hans; OSIRIS Team; Capaccioni, Fabrizio; Filacchione, Gianrico; Ciarniello, Mauro; Erard, Stephane; Rinaldi, Giovanna; Tosi, Federico

    2015-11-01

The Rosetta spacecraft has been studying comet 67P/Churyumov-Gerasimenko at close range since August 2014. Onboard the spacecraft, the two scientific cameras, the OSIRIS narrow- and wide-angle cameras, are observing the cometary nucleus, its activity, and the dust and gas environment. This overview paper will cover OSIRIS science from the early arrival and mapping phase, the PHILAE landing, and the escort phase including the two close fly-bys. Along with a first characterization of global physical parameters of the nucleus, the OSIRIS cameras also provided the data to reconstruct a 3D shape model of the comet and a division into morphologic sub-units. From observations of near-surface activity, jet-like features can be projected onto the surface and active sources can be correlated with surface features like cliffs, pits, or flat planes. The increase of activity during and after perihelion in August 2015 showed several outbursts, which were seen as strong, collimated jets originating from the southern hemisphere. A comparison of results between different Rosetta instruments will give further insight into the physics of the comet's nucleus and its coma. The OSIRIS and VIRTIS instruments are particularly well suited to support and complement each other. With an overlap in spectral range, one instrument can provide the best spatial resolution while the other is strong in spectral resolution. A summary of collaborative efforts will be given.

  8. Prediction of Viking lander camera image quality

    NASA Technical Reports Server (NTRS)

    Huck, F. O.; Burcher, E. E.; Jobson, D. J.; Wall, S. D.

    1976-01-01

    Formulations are presented that permit prediction of image quality as a function of camera performance, surface radiance properties, and lighting and viewing geometry. Predictions made for a wide range of surface radiance properties reveal that image quality depends strongly on proper camera dynamic range command and on favorable lighting and viewing geometry. Proper camera dynamic range commands depend mostly on the surface albedo that will be encountered. Favorable lighting and viewing geometries depend mostly on lander orientation with respect to the diurnal sun path over the landing site, and tend to be independent of surface albedo and illumination scattering function. Side lighting with low sun elevation angles (10 to 30 deg) is generally favorable for imaging spatial details and slopes, whereas high sun elevation angles are favorable for measuring spectral reflectances.

  9. 9. COMPLETED ROLLING CAMERA CAR ON RAILROAD TRACK AND BRIDGE ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

9. COMPLETED ROLLING CAMERA CAR ON RAILROAD TRACK AND BRIDGE LOOKING WEST, APRIL 26, 1948. (ORIGINAL PHOTOGRAPH IN POSSESSION OF DAVE WILLIS, SAN DIEGO, CALIFORNIA.) - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  10. Vacuum Camera Cooler

    NASA Technical Reports Server (NTRS)

    Laugen, Geoffrey A.

    2011-01-01

    Acquiring cheap, moving video was impossible in a vacuum environment, due to camera overheating. This overheating is brought on by the lack of cooling media in vacuum. A water-jacketed camera cooler enclosure machined and assembled from copper plate and tube has been developed. The camera cooler (see figure) is cup-shaped and cooled by circulating water or nitrogen gas through copper tubing. The camera, a store-bought "spy type," is not designed to work in a vacuum. With some modifications the unit can be thermally connected when mounted in the cup portion of the camera cooler. The thermal conductivity is provided by copper tape between parts of the camera and the cooled enclosure. During initial testing of the demonstration unit, the camera cooler kept the CPU (central processing unit) of this video camera at operating temperature. This development allowed video recording of an in-progress test, within a vacuum environment.

  11. Making Ceramic Cameras

    ERIC Educational Resources Information Center

    Squibb, Matt

    2009-01-01

    This article describes how to make a clay camera. This idea of creating functional cameras from clay allows students to experience ceramics, photography, and painting all in one unit. (Contains 1 resource and 3 online resources.)

  12. Rethinking color cameras

    E-print Network

    Chakrabarti, Ayan

    Digital color cameras make sub-sampled measurements of color at alternating pixel locations, and then “demosaick” these measurements to create full color images by up-sampling. This allows traditional cameras with restricted ...

  13. The MMT all-sky camera

    NASA Astrophysics Data System (ADS)

    Pickering, T. E.

    2006-06-01

The MMT all-sky camera is a low-cost, wide-angle camera system that takes images of the sky every 10 seconds, day and night. It is based on an Adirondack Video Astronomy StellaCam II video camera and utilizes an auto-iris fish-eye lens to allow safe operation under all lighting conditions, even direct sunlight. This, combined with the anti-blooming characteristics of the StellaCam's detector, allows useful images to be obtained during sunny days as well as brightly moonlit nights. Under dark skies the system can detect stars as faint as 6th magnitude as well as very thin cirrus and low surface brightness zodiacal features such as gegenschein. The total hardware cost of the system was less than $3500 including computer and framegrabber card, a fraction of the cost of comparable systems utilizing traditional CCD cameras.

  14. Classroom multispectral imaging using inexpensive digital cameras.

    NASA Astrophysics Data System (ADS)

    Fortes, A. D.

    2007-12-01

The proliferation of increasingly cheap digital cameras in recent years means that it has become easier to exploit the broad wavelength sensitivity of their CCDs (360-1100 nm) for classroom-based teaching. With the right tools, it is possible to open children's eyes to the invisible world of UVA and near-IR radiation either side of our narrow visual band. The camera-filter combinations I describe can be used to explore the world of animal vision, looking for invisible markings on flowers, or in bird plumage, for example. In combination with a basic spectroscope (such as the Project-STAR handheld plastic spectrometer, $25), it is possible to investigate the range of human vision and camera sensitivity, and to explore the atomic and molecular absorption lines from the solar and terrestrial atmospheres. My principal use of the cameras has been to teach multispectral imaging of the kind used to determine remotely the composition of planetary surfaces. A range of camera options, from $50 circuit-board mounted CCDs up to $900 semi-pro infrared camera kits (including mobile phones along the way), and various UV-vis-IR filter options will be presented. Examples of multispectral images taken with these systems are used to illustrate the range of classroom topics that can be covered. Particular attention is given to learning about spectral reflectance curves and comparing images from Earth and Mars taken using the same filter combination that is used on the Mars Rovers.

  15. Evaluating intensified camera systems

    SciTech Connect

    S. A. Baker

    2000-07-01

This paper describes image evaluation techniques used to standardize camera system characterizations. Key areas of performance include resolution, noise, and sensitivity. This team has developed a set of analysis tools, in the form of image processing software used to evaluate camera calibration data, to aid an experimenter in measuring a set of camera performance metrics. These performance metrics identify capabilities and limitations of the camera system, while establishing a means for comparing camera systems. Analysis software is used to evaluate digital camera images recorded with charge-coupled device (CCD) cameras. Several types of intensified camera systems are used in the high-speed imaging field. Electro-optical components are used to provide precise shuttering or optical gain for a camera system. These components, including microchannel-plate or proximity-focused diode image intensifiers, electrostatic image tubes, and electron-bombarded CCDs, affect system performance. It is important to quantify camera system performance in order to qualify a system as meeting experimental requirements. The camera evaluation tool is designed to provide side-by-side camera comparison and system modeling information.

  16. Omnifocus video camera

    NASA Astrophysics Data System (ADS)

    Iizuka, Keigo

    2011-04-01

The omnifocus video camera takes videos in which objects at different distances are all in focus in a single video display. The omnifocus video camera consists of an array of color video cameras combined with a unique distance mapping camera called the Divcam. The color video cameras are all aimed at the same scene, but each is focused at a different distance. The Divcam provides real-time distance information for every pixel in the scene. A pixel selection utility uses the distance information to select individual pixels from the multiple video outputs focused at different distances, in order to generate the final single video display that is everywhere in focus. This paper presents the principle of operation, design considerations, detailed construction, and overall performance of the omnifocus video camera. The major emphasis of the paper is the proof of concept, but the prototype has been developed enough to demonstrate the superiority of this video camera over a conventional video camera. The resolution of the prototype is high, capturing even fine details such as fingerprints in the image. Just as the movie camera was a significant advance over the still camera, the omnifocus video camera represents a significant advance over all-focus cameras for still images.
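The pixel-selection step can be sketched with a per-pixel depth map standing in for the Divcam output (a simplified model; the array shapes and names are assumptions, not the paper's implementation):

```python
import numpy as np

def omnifocus_composite(frames, focus_distances, depth_map):
    """For each pixel, pick the value from the frame whose focus
    distance is closest to the measured distance at that pixel.

    frames          : (N, H, W) stack, one frame per focus setting
    focus_distances : (N,) distance each frame is focused at
    depth_map       : (H, W) per-pixel distance (Divcam stand-in)
    """
    frames = np.asarray(frames, dtype=float)
    dists = np.asarray(focus_distances, dtype=float)
    depth = np.asarray(depth_map, dtype=float)
    # Index of the best-focused frame for every pixel.
    best = np.abs(depth[None, :, :] - dists[:, None, None]).argmin(axis=0)
    rows, cols = np.indices(depth.shape)
    return frames[best, rows, cols]
```

Each output pixel is thus drawn from whichever camera in the array happened to be focused nearest to that pixel's true distance, which is what makes the composite everywhere in focus.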

  17. Digital Pinhole Camera

    ERIC Educational Resources Information Center

    Lancor, Rachael; Lancor, Brian

    2014-01-01

    In this article we describe how the classic pinhole camera demonstration can be adapted for use with digital cameras. Students can easily explore the effects of the size of the pinhole and its distance from the sensor on exposure time, magnification, and image quality. Instructions for constructing a digital pinhole camera and our method for…
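The quantities students can explore follow from textbook pinhole relations; this helper is illustrative (the 1.9 constant is Lord Rayleigh's optimum-pinhole factor), not the article's own procedure:

```python
import math

def pinhole_properties(focal_len_mm, pinhole_diam_mm, wavelength_nm=550):
    """Basic pinhole-camera quantities for a given geometry."""
    # Effective f-number: pinhole-to-sensor distance over aperture diameter.
    f_number = focal_len_mm / pinhole_diam_mm
    # Rayleigh's optimum pinhole diameter (mm) for this distance.
    optimal_diam = 1.9 * math.sqrt(focal_len_mm * wavelength_nm * 1e-6)
    # Exposure time relative to an f/8 lens scales with (N/8)^2.
    exposure_factor = (f_number / 8.0) ** 2
    return f_number, optimal_diam, exposure_factor
```

For example, a 0.25 mm pinhole 50 mm from the sensor is an f/200 system needing roughly 625 times the exposure of an f/8 lens, which is why the digital sensor's long-exposure feedback makes the demonstration so direct.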

  18. 800 x 800 charge-coupled device /CCD/ camera for the Galileo Jupiter Orbiter mission

    NASA Technical Reports Server (NTRS)

    Clary, M. C.; Klaasen, K. P.; Snyder, L. M.; Wang, P. K.

    1979-01-01

    During January 1982 the NASA space transportation system will launch a Galileo spacecraft composed of an orbiting bus and an atmospheric entry probe to arrive at the planet Jupiter in July 1985. A prime element of the orbiter's scientific instrument payload will be a new generation slow-scan planetary imaging system based on a newly developed 800 x 800 charge-coupled device (CCD) image sensor. Following Jupiter orbit insertion, the single, narrow-angle, CCD camera, designated the Solid State Imaging (SSI) Subsystem, will operate for 20 months as the orbiter makes repeated encounters with Jupiter and its Galilean Satellites. During this period the SSI will acquire 40,000 images of Jupiter's atmosphere and the surfaces of the Galilean Satellites. This paper describes the SSI, its operational modes, and science objectives.

  19. Automatic camera tracking for remote manipulators

    SciTech Connect

    Stoughton, R.S.; Martin, H.L.; Bentz, R.R.

    1984-04-01

The problem of automatic camera tracking of mobile objects is addressed with specific reference to remote manipulators and using either fixed or mobile cameras. The technique uses a kinematic approach employing 4 x 4 coordinate transformation matrices to solve for the needed camera PAN and TILT angles. No vision feedback systems are used, as the required input data are obtained entirely from position sensors from the manipulator and the camera-positioning system. All hardware requirements are generally satisfied by currently available remote manipulator systems with a supervisory computer. The system discussed here implements linear plus on/off (bang-bang) closed-loop control with a ±2° deadband. The deadband area is desirable to avoid operator seasickness caused by continuous camera movement. Programming considerations for camera control, including operator interface options, are discussed. The example problem presented is based on an actual implementation using a PDP 11/34 computer, a TeleOperator Systems SM-229 manipulator, and an Oak Ridge National Laboratory (ORNL) camera-positioning system. 3 references, 6 figures, 2 tables.
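The kinematic step can be sketched: transform the tracked point into the camera-base frame with the 4 x 4 matrices, then read off PAN and TILT, and hold the camera still inside the deadband. This is a minimal illustration under assumed frame conventions, not the ORNL implementation:

```python
import math
import numpy as np

def pan_tilt(T_world_cam, target_world):
    """Pan/tilt angles (degrees) that aim a camera at a world-frame
    target. T_world_cam is the 4x4 homogeneous transform of the camera
    base; the target is re-expressed in that frame and the two angles
    are read off directly (x forward, y left, z up assumed).
    """
    p = np.array([*target_world, 1.0])
    x, y, z, _ = np.linalg.inv(T_world_cam) @ p
    pan = math.degrees(math.atan2(y, x))
    tilt = math.degrees(math.atan2(z, math.hypot(x, y)))
    return pan, tilt

def within_deadband(error_deg, deadband=2.0):
    """Bang-bang rule: no camera motion while the pointing error stays
    inside the ±2° deadband, avoiding continuous camera movement."""
    return abs(error_deg) <= deadband
```

Because all inputs come from position sensors, this loop needs no vision feedback: the supervisory computer recomputes the target's camera-frame coordinates each cycle and commands motion only when an angle error leaves the deadband.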

  20. Automatic camera tracking for remote manipulators

    SciTech Connect

    Stoughton, R.S.; Martin, H.L.; Bentz, R.R.

    1984-07-01

The problem of automatic camera tracking of mobile objects is addressed with specific reference to remote manipulators and using either fixed or mobile cameras. The technique uses a kinematic approach employing 4 x 4 coordinate transformation matrices to solve for the needed camera PAN and TILT angles. No vision feedback systems are used, as the required input data are obtained entirely from position sensors from the manipulator and the camera-positioning system. All hardware requirements are generally satisfied by currently available remote manipulator systems with a supervisory computer. The system discussed here implements linear plus on/off (bang-bang) closed-loop control with a ±2° deadband. The deadband area is desirable to avoid operator seasickness caused by continuous camera movement. Programming considerations for camera control, including operator interface options, are discussed. The example problem presented is based on an actual implementation using a PDP 11/34 computer, a TeleOperator Systems SM-229 manipulator, and an Oak Ridge National Laboratory (ORNL) camera-positioning system. 3 references, 6 figures, 2 tables.

  1. Ultra-fast framing camera tube

    DOEpatents

    Kalibjian, Ralph (1051 Batavia Ave., Livermore, CA 94550)

    1981-01-01

    An electronic framing camera tube features focal plane image dissection and synchronized restoration of the dissected electron line images to form two-dimensional framed images. Ultra-fast framing is performed by first streaking a two-dimensional electron image across a narrow slit, thereby dissecting the two-dimensional electron image into sequential electron line images. The dissected electron line images are then restored into a framed image by a restorer deflector operated synchronously with the dissector deflector. The number of framed images on the tube's viewing screen is equal to the number of dissecting slits in the tube. The distinguishing features of this ultra-fast framing camera tube are the focal plane dissecting slits, and the synchronously-operated restorer deflector which restores the dissected electron line images into a two-dimensional framed image. The framing camera tube can produce image frames having high spatial resolution of optical events in the sub-100 picosecond range.

  2. Tower Camera Handbook

    SciTech Connect

    Moudry, D

    2005-01-01

The tower camera in Barrow provides hourly images of the ground surrounding the tower. These images may be used to determine fractional snow cover as winter arrives, for comparison with the albedo that can be calculated from downward-looking radiometers, and to give some indication of present weather. Similarly, during springtime, the camera images show the changes in the ground albedo as the snow melts. The tower images are saved at hourly intervals. In addition, two other cameras, the skydeck camera in Barrow and the piling camera in Atqasuk, show the current conditions at those sites.

  3. Traffic camera system development

    NASA Astrophysics Data System (ADS)

    Hori, Toshi

    1997-04-01

    The intelligent transportation system has generated a strong need for the development of intelligent camera systems to meet the requirements of sophisticated applications, such as electronic toll collection (ETC), traffic violation detection and automatic parking lot control. In order to achieve the highest levels of accuracy in detection, these cameras must have high speed electronic shutters, high resolution, high frame rate, and communication capabilities. A progressive scan interline transfer CCD camera, with its high speed electronic shutter and resolution capabilities, provides the basic functions to meet the requirements of a traffic camera system. Unlike most industrial video imaging applications, traffic cameras must deal with harsh environmental conditions and an extremely wide range of light. Optical character recognition is a critical function of a modern traffic camera system, with detection and accuracy heavily dependent on the camera function. In order to operate under demanding conditions, communication and functional optimization is implemented to control cameras from a roadside computer. The camera operates with a shutter speed faster than 1/2000 sec. to capture highway traffic both day and night. Consequently camera gain, pedestal level, shutter speed and gamma functions are controlled by a look-up table containing various parameters based on environmental conditions, particularly lighting. Lighting conditions are studied carefully, to focus only on the critical license plate surface. A unique light sensor permits accurate reading under a variety of conditions, such as a sunny day, evening, twilight, storms, etc. These camera systems are being deployed successfully in major ETC projects throughout the world.
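The look-up-table control described above can be sketched as follows; all rows and values are illustrative placeholders, not the product's actual table:

```python
# Hypothetical table mapping a light-sensor reading (lux) to camera
# settings, in the spirit of the roadside-computer control described:
# each row is (max_lux, shutter_s, gain_db, pedestal, gamma).
EXPOSURE_TABLE = [
    (50,           1 / 2000,  18.0, 10, 0.45),  # night
    (1000,         1 / 4000,   9.0,  5, 0.45),  # twilight / storm
    (20000,        1 / 8000,   0.0,  0, 0.60),  # overcast day
    (float("inf"), 1 / 16000,  0.0,  0, 0.80),  # full sun
]

def settings_for(lux):
    """Pick the first row whose light ceiling covers the reading."""
    for max_lux, shutter, gain, pedestal, gamma in EXPOSURE_TABLE:
        if lux <= max_lux:
            return {"shutter": shutter, "gain": gain,
                    "pedestal": pedestal, "gamma": gamma}
```

Keying every parameter off a single calibrated light reading is what lets the roadside computer hold the license-plate surface within the sensor's usable range across the extreme day/night variation the abstract describes.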

  4. Automated Camera Calibration

    NASA Technical Reports Server (NTRS)

    Chen, Siqi; Cheng, Yang; Willson, Reg

    2006-01-01

    Automated Camera Calibration (ACAL) is a computer program that automates the generation of calibration data for camera models used in machine vision systems. Machine vision camera models describe the mapping between points in three-dimensional (3D) space in front of the camera and the corresponding points in two-dimensional (2D) space in the camera's image. Calibrating a camera model requires a set of calibration data containing known 3D-to-2D point correspondences for the given camera system. Generating calibration data typically involves taking images of a calibration target where the 3D locations of the target's fiducial marks are known, and then measuring the 2D locations of the fiducial marks in the images. ACAL automates the analysis of calibration target images and greatly speeds the overall calibration process.
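    The 3D-to-2D mapping that the calibration data must sample can be illustrated with an ideal pinhole projection. This is a minimal sketch only; practical machine-vision camera models, including those ACAL serves, typically add lens distortion terms on top of this:

```python
import numpy as np

def project_points(points_3d, K, R, t):
    """Project 3D world points (N,3) to pixel coordinates (N,2) with an
    ideal pinhole model: x = K [R|t] X (no lens distortion)."""
    X = np.asarray(points_3d, dtype=float)
    cam = X @ R.T + t            # world frame -> camera frame
    uvw = cam @ K.T              # apply intrinsic matrix
    return uvw[:, :2] / uvw[:, 2:3]  # perspective divide

# Example: camera at origin looking down +Z, 800 px focal, center (320, 240).
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)
t = np.zeros(3)
pix = project_points([[0.0, 0.0, 2.0], [0.1, 0.0, 2.0]], K, R, t)
# A fiducial on the optical axis lands at the principal point (320, 240).
```

    A calibration pairs many such known 3D fiducial locations with their measured 2D image positions and solves for K, R, t (and distortion).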

  5. Intrinsic camera calibration equipped with Scheimpflug optical device

    NASA Astrophysics Data System (ADS)

    Fasogbon, Peter; Duvieubourg, Luc; Lacaze, Pierre-Antoine; Macaire, Ludovic

    2015-04-01

    We present the problem of setting up an intrinsic camera calibration under the Scheimpflug condition for an industrial application. We aim to calibrate the Scheimpflug camera using a roughly hand-positioned calibration pattern with a bundle adjustment technique. The assumptions used by classical calibration methodologies are no longer valid for cameras under the Scheimpflug condition. Therefore, we slightly modify the pinhole model to estimate the Scheimpflug angles. The results are tested on real data sets captured from cameras limited by various industrial constraints, and in the presence of large distortions.

  6. Ultraviolet Spectroscopy of Narrow Coronal Mass Ejections

    NASA Astrophysics Data System (ADS)

    Dobrzycka, D.; Raymond, J. C.; Biesecker, D. A.; Li, J.; Ciaravella, A.

    2003-05-01

    We present Ultraviolet Coronagraph Spectrometer (UVCS) observations of five narrow coronal mass ejections (CMEs) that were among 15 narrow CMEs originally selected by Gilbert and coworkers. Two events (1999 March 27, April 15) were ``structured,'' i.e., in white-light data they exhibited well-defined interior features, and three (1999 May 9, May 21, June 3) were ``unstructured,'' i.e., appeared featureless. In UVCS data the events were seen as 4°-13° wide enhancements of the strongest coronal lines, H I Lyα and O VI λλ1032, 1037. We derived electron densities for several of the events from the Large Angle and Spectrometric Coronagraph Experiment (LASCO) C2 white-light observations. They are comparable to or smaller than densities inferred for other CMEs. We modeled the observable properties of examples of the structured (1999 April 15) and unstructured (1999 May 9) narrow CMEs at different heights in the corona between 1.5 and 2 solar radii. The derived electron temperatures, densities, and outflow speeds are similar for those two types of ejections. They were compared with properties of polar coronal jets and other CMEs. We discuss different scenarios of narrow CME formation as either a jet formed by reconnection onto open field lines or a CME ejected by expansion of closed field structures. Overall, we conclude that the existing observations do not definitively place the narrow CMEs into the jet or the CME picture, but the acceleration of the 1999 April 15 event resembles acceleration seen in many CMEs, rather than the constant speeds or deceleration observed in jets.

  7. Microchannel plate streak camera

    DOEpatents

    Wang, C.L.

    1984-09-28

    An improved streak camera in which a microchannel plate electron multiplier is used in place of or in combination with the photocathode used in prior streak cameras. The improved streak camera is far more sensitive to photons (uv to gamma-rays) than the conventional x-ray streak camera which uses a photocathode. The improved streak camera offers gamma-ray detection with high temporal resolution. It also offers low-energy x-ray detection without attenuation inside the cathode. Using the microchannel plate in the improved camera has resulted in a time resolution of about 150 ps, and has provided a sensitivity sufficient for 1000 keV x-rays.

  8. Microchannel plate streak camera

    DOEpatents

    Wang, Ching L. (Livermore, CA)

    1989-01-01

    An improved streak camera in which a microchannel plate electron multiplier is used in place of or in combination with the photocathode used in prior streak cameras. The improved streak camera is far more sensitive to photons (UV to gamma-rays) than the conventional x-ray streak camera which uses a photocathode. The improved streak camera offers gamma-ray detection with high temporal resolution. It also offers low-energy x-ray detection without attenuation inside the cathode. Using the microchannel plate in the improved camera has resulted in a time resolution of about 150 ps, and has provided a sensitivity sufficient for 1000 keV x-rays.

  9. Microchannel plate streak camera

    DOEpatents

    Wang, C.L.

    1989-03-21

    An improved streak camera in which a microchannel plate electron multiplier is used in place of or in combination with the photocathode used in prior streak cameras is disclosed. The improved streak camera is far more sensitive to photons (UV to gamma-rays) than the conventional x-ray streak camera which uses a photocathode. The improved streak camera offers gamma-ray detection with high temporal resolution. It also offers low-energy x-ray detection without attenuation inside the cathode. Using the microchannel plate in the improved camera has resulted in a time resolution of about 150 ps, and has provided a sensitivity sufficient for 1,000 keV x-rays. 3 figs.

  10. Improved Tracking of Targets by Cameras on a Mars Rover

    NASA Technical Reports Server (NTRS)

    Kim, Won; Ansar, Adnan; Steele, Robert

    2007-01-01

    A paper describes a method devised to increase the robustness and accuracy of tracking of targets by means of three stereoscopic pairs of video cameras on a Mars-rover-type exploratory robotic vehicle. Two of the camera pairs are mounted on a mast that can be adjusted in pan and tilt; the third camera pair is mounted on the main vehicle body. Elements of the method include a mast calibration, a camera-pointing algorithm, and a purely geometric technique for handing off tracking between different camera pairs at critical distances as the rover approaches a target of interest. The mast calibration is an extension of camera calibration in which the camera images of calibration targets at known positions are collected at various pan and tilt angles. In the camera-pointing algorithm, pan and tilt angles are computed by a closed-form, non-iterative solution of the inverse kinematics of the mast combined with mathematical models of the cameras. The purely geometric camera-handoff technique involves the use of stereoscopic views of a target of interest in conjunction with the mast calibration.
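    For an idealized pan-tilt head with no kinematic offsets, the closed-form pointing solution reduces to two arctangents. This sketch uses an assumed axis convention and ignores the mast's real link offsets, so it illustrates the idea rather than the rover's actual inverse kinematics:

```python
import math

def pan_tilt_to_target(target_xyz):
    """Closed-form pan/tilt (radians) pointing a camera at a target given
    in the head's frame. Assumed convention: x forward, y left, z up;
    pan rotates about z, positive tilt looks upward."""
    x, y, z = target_xyz
    pan = math.atan2(y, x)                      # azimuth to target
    tilt = math.atan2(z, math.hypot(x, y))      # elevation to target
    return pan, tilt

# A target 5 m ahead and 5 m up needs 45 degrees of tilt and no pan.
pan, tilt = pan_tilt_to_target((5.0, 0.0, 5.0))
```

    A real mast solution additionally accounts for the lever arms between the pan axis, tilt axis, and camera optical centers, which is what the mast calibration estimates.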

  11. Analytical multicollimator camera calibration

    USGS Publications Warehouse

    Tayman, W.P.

    1978-01-01

    Calibration with the U.S. Geological Survey multicollimator determines the calibrated focal length, the point of symmetry, the radial distortion referred to the point of symmetry, and the asymmetric characteristics of the camera lens. For this project, two cameras were calibrated, a Zeiss RMK A 15/23 and a Wild RC 8. Four test exposures were made with each camera. Results are tabulated for each exposure and averaged for each set. Copies of the standard USGS calibration reports are included. © 1978.

  12. New developments to improve SO2 cameras

    NASA Astrophysics Data System (ADS)

    Luebcke, P.; Bobrowski, N.; Hoermann, C.; Kern, C.; Klein, A.; Kuhn, J.; Vogel, L.; Platt, U.

    2012-12-01

    The SO2 camera is a remote sensing instrument that measures the two-dimensional distribution of SO2 column densities in volcanic plumes using scattered solar radiation as a light source. From these data SO2 fluxes can be derived. The high time resolution, of the order of 1 Hz, allows SO2 flux measurements to be correlated with other traditional volcanological measurement techniques, e.g., seismology. In recent years the use of SO2 cameras has increased; however, there is still potential to improve the instrumentation. First of all, the influence of aerosols and ash in the volcanic plume can lead to large errors in the calculated SO2 flux if not accounted for. We present two different concepts to deal with the influence of ash and aerosols. The first approach uses a co-axial DOAS system added to a two-filter SO2 camera. The camera uses Filter A (peak transmission centred around 315 nm) to measure the optical density of SO2 and Filter B (centred around 330 nm) to correct for the influence of ash and aerosol. The DOAS system simultaneously performs spectroscopic measurements in a small area of the camera's field of view and provides additional information to correct for these effects. Comparing the optical densities for the two filters with the SO2 column density from the DOAS allows not only a much more precise calibration, but also conclusions to be drawn about the influence of ash and aerosol scattering. Measurement examples from Popocatépetl, Mexico in 2011 are shown and interpreted. Another approach combines the SO2 camera measurement principle with the extremely narrow and periodic transmission of a Fabry-Pérot interferometer. The narrow transmission window makes it possible to select individual SO2 absorption bands (or series of bands) as a substitute for Filter A, so measurements are more selective for SO2. Instead of Filter B, as in classical SO2 cameras, the correction for aerosol can be performed by shifting the transmission window of the Fabry-Pérot interferometer towards the minima of the SO2 absorption cross section. Correcting ash and aerosol influences with this technique can decrease the deviation from the true column by more than 60%, since the wavelength difference between the two measurement channels is much smaller than in classical SO2 cameras. While implementing this approach in a 2D camera poses many challenges, it offers the possibility of building a relatively simple and robust scanning instrument for volcanic SO2 distributions. A second problem of the SO2 camera technique is the relatively high price, which prevents its use in many volcano observatories in developing countries. Most SO2 cameras use CCDs that were originally designed for astronomical purposes. The large pixel size and low noise of these detectors compensate for the low intensity of solar radiation in the UV and the low quantum efficiency of the detector in this spectral range. However, the detectors used cost several thousand US dollars. We present results from test measurements using a consumer DSLR camera as the detector of an SO2 camera. Since the camera is not sensitive in the UV, the incoming radiation is first imaged onto a screen that is covered with a suitable fluorescent dye converting the UV radiation to visible light.
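    The two-filter evaluation described above boils down to an apparent-absorbance computation: the 315 nm channel senses SO2 plus aerosol, the 330 nm channel mainly aerosol, and their difference approximates the SO2-only optical density. The sketch below illustrates that principle with synthetic pixel values, not data from the paper:

```python
import numpy as np

def apparent_absorbance(plume_a, bg_a, plume_b, bg_b):
    """Aerosol-corrected apparent absorbance for a two-filter SO2 camera.
    tau_A = -ln(I_plume_A / I_bg_A) responds to SO2 + aerosol (~315 nm);
    tau_B = -ln(I_plume_B / I_bg_B) responds mainly to aerosol (~330 nm);
    their difference approximates the SO2-only optical density."""
    tau_a = -np.log(np.asarray(plume_a, float) / np.asarray(bg_a, float))
    tau_b = -np.log(np.asarray(plume_b, float) / np.asarray(bg_b, float))
    return tau_a - tau_b

# Synthetic pixel: 20% dimming in Filter A, 5% aerosol-only dimming in Filter B.
aa = apparent_absorbance(0.8, 1.0, 0.95, 1.0)
```

    Converting such apparent absorbances to SO2 column densities is then done via calibration cells or, as in the first approach above, the co-located DOAS retrieval.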

  13. Narrow band 3 × 3 Mueller polarimetric endoscopy

    PubMed Central

    Qi, Ji; Ye, Menglong; Singh, Mohan; Clancy, Neil T.; Elson, Daniel S.

    2013-01-01

    Mueller matrix polarimetric imaging has shown potential in tissue diagnosis but is challenging to implement endoscopically. In this work, a narrow band 3 × 3 Mueller matrix polarimetric endoscope was designed by rotating the endoscope to generate 0°, 45° and 90° linearly polarized illumination and positioning in front of the camera a rotating filter wheel containing three polarizers to permit polarization state analysis of the backscattered light. The system was validated with a rotating linear polarizer and a diffuse reflection target. Initial measurements of 3 × 3 Mueller matrices on a rat are demonstrated, followed by matrix decomposition into the depolarization and retardance matrices for further analysis. Our work shows the feasibility of implementing polarimetric imaging in a rigid endoscope conveniently and economically in order to reveal diagnostic information. PMID:24298405
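    With three linear illumination states and three analyzer states, the 3 × 3 Mueller matrix follows from nine intensity measurements via a standard polarimetric data-reduction relation. This is a generic sketch assuming ideal 0°/45°/90° linear states, not the paper's actual calibration:

```python
import numpy as np

# Polarizer angles used for both illumination (generator) and analysis.
ANGLES = np.deg2rad([0.0, 45.0, 90.0])

def stokes_linear(theta):
    """First three Stokes components of ideal linearly polarized light."""
    return np.array([1.0, np.cos(2 * theta), np.sin(2 * theta)])

G = np.column_stack([stokes_linear(t) for t in ANGLES])    # generator states as columns
A = np.vstack([0.5 * stokes_linear(t) for t in ANGLES])    # analyzer states as rows

def mueller_from_intensities(I):
    """Recover the 3x3 (linear) Mueller matrix M from the 3x3 intensity
    matrix I, where I[i, j] is measured with analyzer i and generator j:
    I = A @ M @ G  =>  M = A^-1 @ I @ G^-1."""
    return np.linalg.solve(A, I) @ np.linalg.inv(G)

# Round trip: simulate intensities for a known M, then recover it.
M_true = np.diag([1.0, 0.6, 0.6])   # a simple partial depolarizer
I = A @ M_true @ G
M_rec = mueller_from_intensities(I)
```

    Real systems replace the ideal A and G matrices with calibrated ones measured against known targets, as the validation with a rotating polarizer and diffuse reflector does here.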

  14. Ringfield lithographic camera

    DOEpatents

    Sweatt, William C. (Albuquerque, NM)

    1998-01-01

    A projection lithography camera is presented with a wide ringfield optimized so as to make efficient use of extreme ultraviolet radiation from a large area radiation source (e.g., D_source ≈ 0.5 mm). The camera comprises four aspheric mirrors optically arranged on a common axis of symmetry with an increased etendue for the camera system. The camera includes an aperture stop that is accessible through a plurality of partial aperture stops to synthesize the theoretical aperture stop. Radiation from a mask is focused to form a reduced image on a wafer, relative to the mask, by reflection from the four aspheric mirrors.

  15. LSST Camera Optics Design

    SciTech Connect

    Riot, V J; Olivier, S; Bauman, B; Pratuch, S; Seppala, L; Gilmore, D; Ku, J; Nordby, M; Foss, M; Antilogus, P; Morgado, N

    2012-05-24

    The Large Synoptic Survey Telescope (LSST) uses a novel three-mirror telescope design feeding a camera system that includes a set of broad-band filters and three refractive corrector lenses to produce a flat field at the focal plane with a wide field of view. The optical design of the camera lenses and filters is integrated with the optical design of the telescope mirrors to optimize performance. We discuss the rationale for the LSST camera optics design, describe the methodology for fabricating, coating, mounting and testing the lenses and filters, and present the results of detailed analyses demonstrating that the camera optics will meet their performance goals.

  16. Six-year operation of the Venus Monitoring Camera (Venus Express): spatial and temporal variations of the properties of particles in upper clouds of Venus from the phase dependence of the near-IR brightness

    NASA Astrophysics Data System (ADS)

    Shalygina, O. S.; Petrova, E. V.; Markiewicz, W. J.

    2015-10-01

    Since May 2006, the Venus Monitoring Camera (VMC) [1] has been imaging Venus in four narrow spectral channels centered at the wavelengths of 0.365 μm (UV), 0.513 μm (VIS), 0.965 μm (NIR1), and 1.010 μm (NIR2). It has taken around 300,000 images in the four channels, covering almost all latitudes and including the night and day sides. We analyze the whole set of VMC data processed to October 2012, i.e. the data from orbits 60-2352, obtained in the phase angle range

  17. Magellan Instant Camera testbed

    E-print Network

    McEwen, Heather K. (Heather Kristine), 1982-

    2004-01-01

    The Magellan Instant Camera (MagIC) is an optical CCD camera that was built at MIT and is currently used at Las Campanas Observatory (LCO) in La Serena, Chile. It is designed to be both simple and efficient with minimal ...

  18. Seasonal and vertical changes in leaf angle distribution for selected deciduous broadleaf tree species common to Europe

    NASA Astrophysics Data System (ADS)

    Raabe, Kairi; Pisek, Jan; Sonnentag, Oliver; Annuk, Kalju

    2014-05-01

    Leaf inclination angle distribution is a key parameter in determining the transmission and reflection of radiation by vegetation canopies. It has previously been observed that leaf inclination angle may change gradually from more vertical in the upper canopy and in high-light habitats to more horizontal in the lower canopy and in low-light habitats [1]. Despite its importance, relatively few measurements of actual leaf angle distributions have been reported for different tree species. An even smaller number of studies has dealt with possible seasonal changes in leaf angle distribution [2]. In this study the variation of leaf inclination angle distributions was examined both temporally throughout the growing season and vertically at different heights within trees. We report leaf inclination angle distributions for five deciduous broadleaf species found commonly in several parts of Europe: grey alder (Alnus incana), silver birch (Betula pendula Roth), chestnut (Castanea), Norway maple (Acer platanoides), and aspen (Populus tremula). The angles were measured using the leveled camera method [3], with data collected at several separate heights and four times during the period May-September 2013. The results generally indicate the greatest change in leaf inclination angles in spring, with the changes usually most pronounced at the top of the canopy. It should also be noted, however, that whereas the temporal variation proved rather consistent across species, the vertical variation differed more between species. The leveled camera method was additionally tested for sensitivity to different users. Ten people were asked to measure the leaf angles of four different species. The results indicate the method is quite robust, providing coinciding distributions irrespective of the user and level of previous experience with the method. However, certain caution must be exercised when measuring long narrow leaves.
References
[1] G.G. McMillen and J.H. McClendon, "Leaf angle: an adaptive feature of sun and shade leaves," Botanical Gazette, vol. 140, pp. 437-442, 1979.
[2] J. Pisek, O. Sonnentag, A.D. Richardson, and M. Mõttus, "Is the spherical leaf inclination angle distribution a valid assumption for temperate and boreal broadleaf tree species?" Agricultural and Forest Meteorology, vol. 169, pp. 186-194, 2013.
[3] Y. Ryu, O. Sonnentag, T. Nilson, R. Vargas, H. Kobayashi, R. Wenk, and D. Baldocchi, "How to quantify tree leaf area index in a heterogeneous savanna ecosystem: a multi-instrument and multimodel approach," Agricultural and Forest Meteorology, vol. 150, pp. 63-76, 2010.

  19. Dry imaging cameras

    PubMed Central

    Indrajit, IK; Alam, Aftab; Sahni, Hirdesh; Bhatia, Mukul; Sahu, Samaresh

    2011-01-01

    Dry imaging cameras are important hard copy devices in radiology. Using a dry imaging camera, multiformat images of digital modalities in radiology are created from a sealed unit of unexposed films. The functioning of a modern dry camera involves a blend of concurrent processes drawing on diverse fields such as computing, mechanics, thermodynamics, optics, electricity and radiography. Broadly, hard copy devices are classified as laser-based and non-laser-based technology. Compared with the working knowledge and technical awareness of other modalities in radiology, the understanding of a dry imaging camera is often superficial and neglected. To fill this void, this article outlines the key features of a modern dry camera and the important issues that impact radiology workflow. PMID:21799589

  20. Streak camera time calibration procedures

    NASA Technical Reports Server (NTRS)

    Long, J.; Jackson, I.

    1978-01-01

    Time calibration procedures for streak cameras utilizing a modulated laser beam are described. The time calibration determines a writing rate accuracy of 0.15% with a rotating mirror camera and 0.3% with an image converter camera.

  1. What are the benefits of having multiple camera angles?

    Atmospheric Science Data Center

    2014-12-08

    ... can be interpreted (with appropriate models) to document the properties of the target, just as the more familiar spectral differences are ... or minimize the effect of sun glint over the ocean and other water surfaces, thereby enabling observations even when traditional sensors are ...

  2. Ringfield lithographic camera

    DOEpatents

    Sweatt, W.C.

    1998-09-08

    A projection lithography camera is presented with a wide ringfield optimized so as to make efficient use of extreme ultraviolet radiation from a large area radiation source (e.g., D_source ≈ 0.5 mm). The camera comprises four aspheric mirrors optically arranged on a common axis of symmetry. The camera includes an aperture stop that is accessible through a plurality of partial aperture stops to synthesize the theoretical aperture stop. Radiation from a mask is focused to form a reduced image on a wafer, relative to the mask, by reflection from the four aspheric mirrors. 11 figs.

  3. Kitt Peak speckle camera

    NASA Technical Reports Server (NTRS)

    Breckinridge, J. B.; Mcalister, H. A.; Robinson, W. G.

    1979-01-01

    The speckle camera in regular use at Kitt Peak National Observatory since 1974 is described in detail. The design of the atmospheric dispersion compensation prisms, the use of film as a recording medium, the accuracy of double star measurements, and the next generation speckle camera are discussed. Photographs of double star speckle patterns with separations from 1.4 sec of arc to 4.7 sec of arc are shown to illustrate the quality of image formation with this camera, the effects of seeing on the patterns, and the isoplanatic patch of the atmosphere.

  4. Calibration of Action Cameras for Photogrammetric Purposes

    PubMed Central

    Balletti, Caterina; Guerra, Francesco; Tsioukas, Vassilios; Vernier, Paolo

    2014-01-01

    The use of action cameras for photogrammetry purposes is not widespread due to the fact that until recently the images provided by the sensors, using either still or video capture mode, were not big enough to perform and provide the appropriate analysis with the necessary photogrammetric accuracy. However, several manufacturers have recently produced and released new lightweight devices which are: (a) easy to handle, (b) capable of performing under extreme conditions and, more importantly, (c) able to provide both still images and video sequences of high resolution. In order to be able to use the sensor of action cameras we must apply a careful and reliable self-calibration prior to any photogrammetric procedure, a relatively difficult scenario because of the short focal length of the camera and its wide-angle lens, which is used to obtain the maximum possible resolution of images. Special software, using functions of the OpenCV library, has been created to perform both the calibration and the production of undistorted scenes for each of the still and video image capturing modes of a novel action camera, the GoPro Hero 3 camera, which can provide still images up to 12 Mp and video up to 8 Mp resolution. PMID:25237898

  5. Calibration of action cameras for photogrammetric purposes.

    PubMed

    Balletti, Caterina; Guerra, Francesco; Tsioukas, Vassilios; Vernier, Paolo

    2014-01-01

    The use of action cameras for photogrammetry purposes is not widespread due to the fact that until recently the images provided by the sensors, using either still or video capture mode, were not big enough to perform and provide the appropriate analysis with the necessary photogrammetric accuracy. However, several manufacturers have recently produced and released new lightweight devices which are: (a) easy to handle, (b) capable of performing under extreme conditions and, more importantly, (c) able to provide both still images and video sequences of high resolution. In order to be able to use the sensor of action cameras we must apply a careful and reliable self-calibration prior to any photogrammetric procedure, a relatively difficult scenario because of the short focal length of the camera and its wide-angle lens, which is used to obtain the maximum possible resolution of images. Special software, using functions of the OpenCV library, has been created to perform both the calibration and the production of undistorted scenes for each of the still and video image capturing modes of a novel action camera, the GoPro Hero 3 camera, which can provide still images up to 12 Mp and video up to 8 Mp resolution. PMID:25237898
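    Producing undistorted scenes requires inverting the lens distortion model that self-calibration estimates. Below is a minimal numpy sketch of a two-coefficient radial (Brown-Conrady) model of the kind OpenCV's calibration routines fit; the coefficient values are illustrative, not values from the paper:

```python
import numpy as np

def distort_normalized(xy, k1, k2):
    """Apply two-coefficient Brown-Conrady radial distortion to
    normalized image coordinates: xy_d = xy * (1 + k1*r^2 + k2*r^4)."""
    xy = np.asarray(xy, float)
    r2 = np.sum(xy**2, axis=-1, keepdims=True)
    return xy * (1.0 + k1 * r2 + k2 * r2**2)

def undistort_normalized(xy_d, k1, k2, iterations=20):
    """Invert the radial model by fixed-point iteration (no closed form)."""
    xy_d = np.asarray(xy_d, float)
    xy = xy_d.copy()
    for _ in range(iterations):
        r2 = np.sum(xy**2, axis=-1, keepdims=True)
        xy = xy_d / (1.0 + k1 * r2 + k2 * r2**2)
    return xy

# Round trip with barrel distortion (k1 < 0), typical of wide-angle lenses.
pt = np.array([[0.4, -0.3]])
pt_d = distort_normalized(pt, k1=-0.25, k2=0.05)
pt_u = undistort_normalized(pt_d, k1=-0.25, k2=0.05)
```

    Fisheye-like action cameras often need more coefficients (or a dedicated fisheye model), which is part of why their self-calibration is called a "relatively difficult scenario" above.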

  6. Ultraviolet Spectroscopy of Narrow CMEs

    NASA Astrophysics Data System (ADS)

    Dobrzycka, D.; Raymond, J. C.; Biesecker, D. A.; Li, J.; Ciaravella, A.

    2002-12-01

    Coronal mass ejections (CMEs) are commonly described as new, discrete, bright features appearing in the field of view of a white light coronagraph and moving outward over a period of minutes to hours. Apparent angular widths of the CMEs cover a wide range, from a few degrees to 360°. The very narrow structures (narrower than ~15-20°) form only a small subset of all the observed CMEs and are usually referred to as rays, spikes, fans, etc. Recently, Gilbert et al. (2001, ApJ, 550, 1093) reported LASCO white light observations of 15 selected narrow CMEs. We extended the study and analyzed ultraviolet spectroscopy of narrow ejections, including several events listed by Gilbert et al. The data were obtained by the Ultraviolet Coronagraph Spectrometer (UVCS/SOHO). We present a comparison of narrow and large CMEs and discuss the relation of the narrow CMEs to coronal jets and/or other narrow transient events. This work is supported by NASA under Grant NAG5-11420 to the Smithsonian Astrophysical Observatory, by the Italian Space Agency and by PRODEX (Swiss contribution).

  7. Miniature TV Camera

    NASA Technical Reports Server (NTRS)

    2004-01-01

    Originally devised to observe Saturn stage separation during Apollo flights, Marshall Space Flight Center's Miniature Television Camera, measuring only 4 x 3 x 1 1/2 inches, quickly made its way to the commercial telecommunications market.

  8. Calibration Procedures in Mid Format Camera Setups

    NASA Astrophysics Data System (ADS)

    Pivnicka, F.; Kemper, G.; Geissler, S.

    2012-07-01

    A growing number of mid-format cameras are used for aerial surveying projects. To achieve a reliable and geometrically precise result in the photogrammetric workflow, awareness of the sensitive parts of the setup is important. The use of direct referencing systems (GPS/IMU), the mounting on a stabilizing camera platform and the specific characteristics of mid-format cameras make a professional setup with various calibration and misalignment operations necessary. An important part is a proper camera calibration. Aerial images over a well-designed test field with 3D structures and/or different flight altitudes enable the determination of calibration values in the Bingo software, and it will be demonstrated how such a calibration can be performed. The direct referencing device must be mounted to the camera in a solid and reliable way. Besides the mechanical work, especially in mounting the IMU beside the camera, two lever arms have to be measured with millimetre accuracy: the lever arm from the GPS antenna to the IMU's calibrated centre, and the lever arm from the IMU centre to the camera projection centre. The measurement with a total station is not a difficult task, but the definition of the correct centres and the need for rotation matrices can cause serious accuracy problems. The benefit of small and medium format cameras is that smaller aircraft can also be used; for these, a gyro-based stabilized platform is recommended. As a consequence, the IMU must be mounted beside the camera on the stabilizer. The advantage is that the IMU can be used to control the platform; the drawback is that the IMU-to-GPS-antenna lever arm is then floating. In effect, an additional data stream, the movements of the stabilizer, must be processed to correct the floating lever-arm distances.
If post-processing of the GPS/IMU data, taking the floating lever arms into account, delivers the expected result, the lever arms between IMU and camera can be applied. However, there remains a misalignment (boresight angle) that must be evaluated photogrammetrically using advanced tools, e.g. in Bingo. Once all these parameters have been determined, the system is capable of flying projects without, or with only a few, ground control points. But what effect does directly applying the achieved direct orientation values have on the photogrammetric process, compared with an aerotriangulation based on proper tie-point matching? The paper aims to show the steps to be done by potential users and gives a quality estimation of the importance and influence of the various calibration and adjustment steps.
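    Applying a measured lever arm is a standard rigid-body correction: rotate the body-frame offset by the platform attitude and add it to the GPS position. The sketch below assumes a Z-Y-X yaw-pitch-roll convention and uses made-up numbers; it illustrates the principle, not the Bingo workflow:

```python
import numpy as np

def rotation_from_rpy(roll, pitch, yaw):
    """Body-to-navigation rotation from roll, pitch, yaw (radians), Z-Y-X order."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return Rz @ Ry @ Rx

def camera_center(gps_antenna_pos, attitude_rpy, lever_arm_body):
    """Shift the GPS antenna position to the camera projection centre by
    rotating the body-frame lever arm into the navigation frame."""
    R = rotation_from_rpy(*attitude_rpy)
    return np.asarray(gps_antenna_pos, float) + R @ np.asarray(lever_arm_body, float)

# Level attitude: the 1.5 m downward lever arm passes through unchanged.
c = camera_center([100.0, 200.0, 50.0], (0.0, 0.0, 0.0), [0.0, 0.0, -1.5])
```

    A millimetre error in the lever arm or a wrong rotation convention propagates directly into the projection-centre position, which is why the abstract stresses the definition of the correct centres and the rotation matrices.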

  9. Spacecraft camera image registration

    NASA Technical Reports Server (NTRS)

    Kamel, Ahmed A. (Inventor); Graul, Donald W. (Inventor); Chan, Fred N. T. (Inventor); Gamble, Donald W. (Inventor)

    1987-01-01

    A system for achieving spacecraft camera (1, 2) image registration comprises a portion external to the spacecraft and an image motion compensation system (IMCS) portion onboard the spacecraft. Within the IMCS, a computer (38) calculates an image registration compensation signal (60) which is sent to the scan control loops (84, 88, 94, 98) of the onboard cameras (1, 2). At the location external to the spacecraft, the long-term orbital and attitude perturbations on the spacecraft are modeled. Coefficients (K, A) from this model are periodically sent to the onboard computer (38) by means of a command unit (39). The coefficients (K, A) take into account observations of stars and landmarks made by the spacecraft cameras (1, 2) themselves. The computer (38) takes as inputs the updated coefficients (K, A) plus synchronization information indicating the mirror position (AZ, EL) of each of the spacecraft cameras (1, 2), operating mode, and starting and stopping status of the scan lines generated by these cameras (1, 2), and generates in response thereto the image registration compensation signal (60). The sources of periodic thermal errors on the spacecraft are discussed. The system is checked by calculating measurement residuals, the difference between the landmark and star locations predicted at the external location and the landmark and star locations as measured by the spacecraft cameras (1, 2).

  10. Sensing driver awareness by combining fisheye camera and Kinect

    NASA Astrophysics Data System (ADS)

    Wuhe, Z.; Lei, Z.; Ning, D.

    2014-11-01

    In this paper, we propose a Driver's Awareness Catching System to sense the driver's awareness. The system consists of a fisheye camera and a Kinect. The Kinect, mounted inside the vehicle, is used to recognize and locate the driver's face in 3D. The fisheye camera, mounted outside the vehicle, is used to monitor the road. The relative pose between the two cameras is calibrated via a state-of-the-art method for calibrating cameras with non-overlapping fields of view. The camera system works as follows: First, the Kinect SDK released by Microsoft is used to track the driver's face and capture the eye location together with the sight direction. Second, the eye location and sight direction are transformed into the coordinate system of the fisheye camera. Third, the corresponding field of view is extracted from the fisheye image. As there is a small displacement between the driver's eyes and the optical center of the fisheye camera, this leads to a view-angle deviation. Finally, we performed a systematic analysis of the error distribution by numerical simulation and proved the feasibility of our camera system. We also built the camera system and achieved the desired effect in real-world experiments.
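    The second step above is a rigid-body change of coordinates using the calibrated relative pose. A minimal sketch (the extrinsics and the point here are placeholders, not the calibrated values from the paper):

```python
import numpy as np

def to_fisheye_frame(p_kinect, R_kf, t_kf):
    """Map a 3D point from the Kinect frame into the fisheye camera frame
    using the extrinsics (R_kf, t_kf) from the non-overlapping calibration:
    p_fisheye = R_kf @ p_kinect + t_kf."""
    return R_kf @ np.asarray(p_kinect, float) + np.asarray(t_kf, float)

# Placeholder extrinsics: identity rotation, pure translation between sensors.
p = to_fisheye_frame([0.0, 0.1, 0.6], np.eye(3), [0.0, -0.5, 1.2])
```

    The gaze direction transforms the same way, but with the rotation only, since directions are translation-invariant.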

  11. What convention is used for the illumination and view angles?

    Atmospheric Science Data Center

    2014-12-08

    ... Azimuth angles are measured clockwise from the direction of travel to local north. For both the Sun and cameras, azimuth describes the ... to the equator, because of its morning equator crossing time. Additionally, the difference in view and solar azimuth angle will be near ...

  12. Study on the diagnostic system of scoliosis by using infrared camera.

    PubMed

    Jeong, Jin-Hyoung; Park, Eun-Jeong; Cho, Chang-Ok; Kim, Yoon-Jeong; Lee, Sang-Sik

    2015-08-17

    In this study, to avoid the radiation exposure involved in diagnosing scoliosis, a system that can diagnose scoliosis using an infrared camera and optical markers was developed. In the developed system, the infrared camera detects the optical markers attached along the spinal curvature and measures the angle between two optical markers. For the angle measurement, we used the Cobb angle method employed in the diagnosis of spinal scoliosis. We developed software to diagnose spinal scoliosis from the infrared camera images and output the results to the screen. The software, implemented in LabVIEW, is composed of a camera output unit, an angle measurement unit, and a Cobb angle measurement unit. In the future, the system is expected to be applied to the diagnosis of other orthopedic disorders, such as kyphosis and hallux valgus. PMID:26405878
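    The Cobb angle between two marker lines reduces to the acute angle between two 2D directions. A minimal sketch with hypothetical marker coordinates, not the authors' LabVIEW code:

```python
import math

def cobb_angle(line1, line2):
    """Cobb angle in degrees between two lines, each given as two (x, y)
    endpoints, e.g. through marker pairs at the upper and lower ends of
    the spinal curve. Returns the acute angle between the lines."""
    (x1, y1), (x2, y2) = line1
    (x3, y3), (x4, y4) = line2
    a1 = math.atan2(y2 - y1, x2 - x1)
    a2 = math.atan2(y4 - y3, x4 - x3)
    d = abs(a1 - a2) % math.pi          # lines, not rays: fold to [0, pi)
    return math.degrees(min(d, math.pi - d))

# Marker lines tilted 10 degrees either side of horizontal -> 20 degrees.
angle = cobb_angle(((0.0, 0.0), (1.0, math.tan(math.radians(10)))),
                   ((0.0, 0.0), (1.0, -math.tan(math.radians(10)))))
```

    In radiographic practice the two lines follow the endplates of the most tilted vertebrae; here they stand in for the system's optical marker pairs.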

  13. Image dissector camera system study

    NASA Technical Reports Server (NTRS)

    Howell, L.

    1984-01-01

    Various aspects of a rendezvous and docking system using an image dissector detector, as compared to a GaAs detector, are discussed. An investigation into a gimbaled scanning system is also covered, and the measured video response curves from the image dissector camera are presented. Rendezvous will occur at ranges greater than 100 meters. The maximum range considered was 1000 meters. During docking, the range, range rate, angle, and angle rate to each reflector on the satellite must be measured. Docking range will be from 3 to 100 meters. The system consists of a CW laser diode transmitter and an image dissector receiver. The transmitter beam is amplitude modulated with three sine-wave tones for ranging. The beam is coaxially combined with the receiver beam. Mechanical deflection of the transmitter beam, plus or minus 10 degrees in both X and Y, can be accomplished before or after it is combined with the receiver beam. The receiver will have a field of view (FOV) of 20 degrees and an instantaneous field of view (IFOV) of two milliradians (mrad) and will be electronically scanned in the image dissector. The increase in performance obtained from the GaAs photocathode is not needed to meet the present performance requirements.
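    Tone ranging of the kind described recovers range from the round-trip phase shift of each amplitude-modulation tone; multiple tones of different frequencies resolve the phase ambiguity. The abstract does not state the tone frequencies, so the value below is illustrative. A sketch of the basic relations:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def range_from_phase(phase_rad, tone_hz):
    """One-way range from the measured round-trip phase of a CW ranging
    tone; valid only within the tone's unambiguous interval."""
    return C * phase_rad / (4.0 * math.pi * tone_hz)

def unambiguous_interval(tone_hz):
    """Ranges repeat every c / (2 * f_tone); coarser tones disambiguate."""
    return C / (2.0 * tone_hz)

# An illustrative ~150 kHz fine tone just covers the 1000 m maximum range:
print(round(unambiguous_interval(150e3)))  # 999
```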

  14. Autonomous pedestrian localization technique using CMOS camera sensors

    NASA Astrophysics Data System (ADS)

    Chun, Chanwoo

    2014-09-01

    We present a pedestrian localization technique that does not need infrastructure. The proposed angle-only measurement method needs specially manufactured shoes. Each shoe has two CMOS cameras and two markers, such as LEDs, attached on the inward side. The line-of-sight (LOS) angles towards the two markers on the forward shoe are measured using the two cameras on the rear shoe. Our simulation results show that a pedestrian walking in a shopping mall wearing this device can be accurately guided to the front of a destination store located 100 m away, if the floor plan of the mall is available.
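    The angle-only measurement reduces, per step, to intersecting two bearing rays from the two rear-shoe cameras, whose baseline is known. A minimal 2-D sketch; the geometry, baseline, and marker position are illustrative values, not from the paper:

```python
import math

def intersect_bearings(baseline, ang_l, ang_r):
    """Locate a marker from two line-of-sight angles (radians, measured
    from the x-axis).  The cameras sit at (0, 0) and (baseline, 0)."""
    tl, tr = math.tan(ang_l), math.tan(ang_r)
    x = baseline * tr / (tr - tl)   # intersect y = x*tl with y = (x-b)*tr
    return x, x * tl

# Marker at (0.05, 0.30) m seen from two cameras 0.10 m apart:
al = math.atan2(0.30, 0.05 - 0.0)
ar = math.atan2(0.30, 0.05 - 0.10)
x, y = intersect_bearings(0.10, al, ar)
print(round(x, 3), round(y, 3))  # 0.05 0.3
```

    Chaining such per-step marker fixes gives the relative displacement of one shoe with respect to the other, which is what the dead-reckoning guidance integrates.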

  15. Uncooled radiometric camera performance

    NASA Astrophysics Data System (ADS)

    Meyer, Bill; Hoelter, T.

    1998-07-01

    Thermal imaging equipment utilizing microbolometer detectors operating at room temperature has found widespread acceptance in both military and commercial applications. Uncooled camera products are becoming effective solutions for applications currently using traditional, photonic infrared sensors. The reduced power consumption and decreased mechanical complexity offered by uncooled cameras have enabled highly reliable, low-cost, hand-held instruments. Initially these instruments displayed only relative temperature differences, which limited their usefulness in applications such as thermography. Radiometrically calibrated microbolometer instruments are now available. The ExplorIR thermography camera leverages the technology developed for Raytheon Systems Company's first production microbolometer imaging camera, the Sentinel. The ExplorIR camera has a demonstrated temperature measurement accuracy of 4 degrees Celsius or 4% of the measured value (whichever is greater) over scene temperature ranges of minus 20 degrees Celsius to 300 degrees Celsius (minus 20 degrees Celsius to 900 degrees Celsius for extended-range models) and camera environmental temperatures of minus 10 degrees Celsius to 40 degrees Celsius. Direct temperature measurement with high-resolution video imaging creates some unique challenges when using uncooled detectors. A temperature-controlled, field-of-view-limiting aperture (cold shield) is not typically included in the small-volume dewars used for uncooled detector packages. The lack of a field-of-view shield allows a significant amount of extraneous radiation from the dewar walls and lens body to affect the sensor operation. In addition, the transmission of the germanium lens elements is a function of ambient temperature. The ExplorIR camera design compensates for these environmental effects while maintaining the accuracy and dynamic range required by today's predictive maintenance and condition monitoring markets.

  16. The Dark Energy Camera

    E-print Network

    Flaugher, B; Honscheid, K; Abbott, T M C; Alvarez, O; Angstadt, R; Annis, J T; Antonik, M; Ballester, O; Beaufore, L; Bernstein, G M; Bernstein, R A; Bigelow, B; Bonati, M; Boprie, D; Brooks, D; Buckley-Geer, E J; Campa, J; Cardiel-Sas, L; Castander, F J; Castilla, J; Cease, H; Cela-Ruiz, J M; Chappa, S; Chi, E; Cooper, C; da Costa, L N; Dede, E; Derylo, G; DePoy, D L; de Vicente, J; Doel, P; Drlica-Wagner, A; Eiting, J; Elliott, A E; Emes, J; Estrada, J; Neto, A Fausti; Finley, D A; Flores, R; Frieman, J; Gerdes, D; Gladders, M D; Gregory, B; Gutierrez, G R; Hao, J; Holland, S E; Holm, S; Huffman, D; Jackson, C; James, D J; Jonas, M; Karcher, A; Karliner, I; Kent, S; Kessler, R; Kozlovsky, M; Kron, R G; Kubik, D; Kuehn, K; Kuhlmann, S; Kuk, K; Lahav, O; Lathrop, A; Lee, J; Levi, M E; Lewis, P; Li, T S; Mandrichenko, I; Marshall, J L; Martinez, G; Merritt, K W; Miquel, R; Munoz, F; Neilsen, E H; Nichol, R C; Nord, B; Ogando, R; Olsen, J; Palio, N; Patton, K; Peoples, J; Plazas, A A; Rauch, J; Reil, K; Rheault, J -P; Roe, N A; Rogers, H; Roodman, A; Sanchez, E; Scarpine, V; Schindler, R H; Schmidt, R; Schmitt, R; Schubnell, M; Schultz, K; Schurter, P; Scott, L; Serrano, S; Shaw, T M; Smith, R C; Soares-Santos, M; Stefanik, A; Stuermer, W; Suchyta, E; Sypniewski, A; Tarle, G; Thaler, J; Tighe, R; Tran, C; Tucker, D; Walker, A R; Wang, G; Watson, M; Weaverdyck, C; Wester, W; Woods, R; Yanny, B

    2015-01-01

    The Dark Energy Camera is a new imager with a 2.2-degree diameter field of view mounted at the prime focus of the Victor M. Blanco 4-meter telescope on Cerro Tololo near La Serena, Chile. The camera was designed and constructed by the Dark Energy Survey Collaboration, and meets or exceeds the stringent requirements designed for the wide-field and supernova surveys for which the collaboration uses it. The camera consists of a five element optical corrector, seven filters, a shutter with a 60 cm aperture, and a CCD focal plane of 250 micron thick fully-depleted CCDs cooled inside a vacuum Dewar. The 570 Mpixel focal plane comprises 62 2kx4k CCDs for imaging and 12 2kx2k CCDs for guiding and focus. The CCDs have 15 microns x15 microns pixels with a plate scale of 0.263 arc sec per pixel. A hexapod system provides state-of-the-art focus and alignment capability. The camera is read out in 20 seconds with 6-9 electrons readout noise. This paper provides a technical description of the camera's engineering, construct...

  17. Satellite camera image navigation

    NASA Technical Reports Server (NTRS)

    Kamel, Ahmed A. (Inventor); Graul, Donald W. (Inventor); Savides, John (Inventor); Hanson, Charles W. (Inventor)

    1987-01-01

    Pixels within a satellite camera (1, 2) image are precisely located in terms of latitude and longitude on a celestial body, such as the earth, being imaged. A computer (60) on the earth generates models (40, 50) of the satellite's orbit and attitude, respectively. The orbit model (40) is generated from measurements of stars and landmarks taken by the camera (1, 2), and by range data. The orbit model (40) is an expression of the satellite's latitude and longitude at the subsatellite point, and of the altitude of the satellite, as a function of time, using as coefficients (K) the six Keplerian elements at epoch. The attitude model (50) is based upon star measurements taken by each camera (1, 2). The attitude model (50) is a set of expressions for the deviations in a set of mutually orthogonal reference optical axes (x, y, z) as a function of time, for each camera (1, 2). Measured data is fit into the models (40, 50) using a walking least squares fit algorithm. A transformation computer (66) transforms pixel coordinates as telemetered by the camera (1, 2) into earth latitude and longitude coordinates, using the orbit and attitude models (40, 50).

  18. The Dark Energy Camera

    NASA Astrophysics Data System (ADS)

    Flaugher, B.; Diehl, H. T.; Honscheid, K.; Abbott, T. M. C.; Alvarez, O.; Angstadt, R.; Annis, J. T.; Antonik, M.; Ballester, O.; Beaufore, L.; Bernstein, G. M.; Bernstein, R. A.; Bigelow, B.; Bonati, M.; Boprie, D.; Brooks, D.; Buckley-Geer, E. J.; Campa, J.; Cardiel-Sas, L.; Castander, F. J.; Castilla, J.; Cease, H.; Cela-Ruiz, J. M.; Chappa, S.; Chi, E.; Cooper, C.; da Costa, L. N.; Dede, E.; Derylo, G.; DePoy, D. L.; de Vicente, J.; Doel, P.; Drlica-Wagner, A.; Eiting, J.; Elliott, A. E.; Emes, J.; Estrada, J.; Fausti Neto, A.; Finley, D. A.; Flores, R.; Frieman, J.; Gerdes, D.; Gladders, M. D.; Gregory, B.; Gutierrez, G. R.; Hao, J.; Holland, S. E.; Holm, S.; Huffman, D.; Jackson, C.; James, D. J.; Jonas, M.; Karcher, A.; Karliner, I.; Kent, S.; Kessler, R.; Kozlovsky, M.; Kron, R. G.; Kubik, D.; Kuehn, K.; Kuhlmann, S.; Kuk, K.; Lahav, O.; Lathrop, A.; Lee, J.; Levi, M. E.; Lewis, P.; Li, T. S.; Mandrichenko, I.; Marshall, J. L.; Martinez, G.; Merritt, K. W.; Miquel, R.; Muñoz, F.; Neilsen, E. H.; Nichol, R. C.; Nord, B.; Ogando, R.; Olsen, J.; Palaio, N.; Patton, K.; Peoples, J.; Plazas, A. A.; Rauch, J.; Reil, K.; Rheault, J.-P.; Roe, N. A.; Rogers, H.; Roodman, A.; Sanchez, E.; Scarpine, V.; Schindler, R. H.; Schmidt, R.; Schmitt, R.; Schubnell, M.; Schultz, K.; Schurter, P.; Scott, L.; Serrano, S.; Shaw, T. M.; Smith, R. C.; Soares-Santos, M.; Stefanik, A.; Stuermer, W.; Suchyta, E.; Sypniewski, A.; Tarle, G.; Thaler, J.; Tighe, R.; Tran, C.; Tucker, D.; Walker, A. R.; Wang, G.; Watson, M.; Weaverdyck, C.; Wester, W.; Woods, R.; Yanny, B.; The DES Collaboration

    2015-11-01

    The Dark Energy Camera is a new imager with a 2.°2 diameter field of view mounted at the prime focus of the Victor M. Blanco 4 m telescope on Cerro Tololo near La Serena, Chile. The camera was designed and constructed by the Dark Energy Survey Collaboration and meets or exceeds the stringent requirements designed for the wide-field and supernova surveys for which the collaboration uses it. The camera consists of a five-element optical corrector, seven filters, a shutter with a 60 cm aperture, and a charge-coupled device (CCD) focal plane of 250 μm thick fully depleted CCDs cooled inside a vacuum Dewar. The 570 megapixel focal plane comprises 62 2k × 4k CCDs for imaging and 12 2k × 2k CCDs for guiding and focus. The CCDs have 15 μm × 15 μm pixels with a plate scale of 0.″263 pixel⁻¹. A hexapod system provides state-of-the-art focus and alignment capability. The camera is read out in 20 s with 6–9 electron readout noise. This paper provides a technical description of the camera's engineering, construction, installation, and current status.
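    As a consistency check on the quoted numbers, the small-angle plate-scale relation ties the 15 μm pixels and the 0.263 arcsec per pixel scale to an effective focal length, which at the Blanco's 4 m aperture corresponds to roughly f/2.9 at prime focus:

```python
ARCSEC_PER_RAD = 206_264.806

pixel_size_m = 15e-6    # 15 micron pixels (from the abstract)
plate_scale = 0.263     # arcsec per pixel (from the abstract)

# Small-angle relation: scale ["/px] = 206265 * pixel_size / focal_length
focal_length_m = ARCSEC_PER_RAD * pixel_size_m / plate_scale
print(round(focal_length_m, 2))        # 11.76 (metres)
print(round(focal_length_m / 4.0, 2))  # 2.94, i.e. roughly f/2.9
```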

  19. Narrow dip inside a natural linewidth absorption profile in a system of two atoms

    NASA Astrophysics Data System (ADS)

    Makarov, A. A.

    2015-11-01

    The absorption spectrum of a system of two closely spaced identical atoms displays, under certain preparation, a dip that can be much narrower than the natural linewidth. This preparation includes (i) application of a strong magnetic field at an angle θ very close to the magic angle θ0 = arccos(1/√3) ≈ 54.7° with respect to the direction from one atom to the other, and (ii) in-plane illumination by laser light in the form of a nonresonant standing wave polarized at the same angle θ. Both qualitative and quantitative arguments for the narrow-dip effect are presented.
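    The magic angle quoted here is the zero of the dipolar angular factor 3 cos²θ − 1; a quick numerical check of the ≈54.7° value:

```python
import math

# The magic angle is the zero of the dipolar factor 3*cos^2(theta) - 1.
theta0_deg = math.degrees(math.acos(1.0 / math.sqrt(3.0)))
print(round(theta0_deg, 4))  # 54.7356
```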

  20. Selective-imaging camera

    NASA Astrophysics Data System (ADS)

    Szu, Harold; Hsu, Charles; Landa, Joseph; Cha, Jae H.; Krapels, Keith A.

    2015-05-01

    How can we design cameras that image selectively in Full Electro-Magnetic (FEM) spectra? Without selective imaging, ordinary tourist cameras cannot, for example, see through fire, smoke, or other obscurants that create a Visually Degraded Environment (VDE). This paper addresses a possible new design of selective-imaging cameras at the firmware level. The design is consistent with the physics of irreversible thermodynamics and Boltzmann's molecular entropy. It enables imaging in the appropriate FEM spectra for sensing through the VDE, and display in color spectra for the Human Visual System (HVS). We sense within the spectra the largest entropy values of obscurants such as fire and smoke. Then we apply a smart firmware implementation of Blind Source Separation (BSS) to separate all entropy sources associated with specific Kelvin temperatures. Finally, we recompose the scene using specific RGB colors constrained by the HVS, by up/down-shifting Planck spectra at each pixel and time.

  1. Solid state television camera

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The design, fabrication, and tests of a solid state television camera using a new charge-coupled imaging device are reported. An RCA charge-coupled device arranged in a 512 by 320 format and directly compatible with EIA format standards was the sensor selected. This is a three-phase, sealed surface-channel array that has 163,840 sensor elements, which employs a vertical frame transfer system for image readout. Included are test results of the complete camera system, circuit description and changes to such circuits as a result of integration and test, maintenance and operation section, recommendations to improve the camera system, and a complete set of electrical and mechanical drawing sketches.

  2. The Martian Atmosphere as seen by the OSIRIS camera

    NASA Astrophysics Data System (ADS)

    Moissl, R.; Pajola, M.; Määttänen, A.; Küppers, M.

    2013-09-01

    Despite the long time that has passed since the observations, only a few studies based on data from the wide-angle (WAC) and narrow-angle (NAC) camera systems of OSIRIS have been published to date. In this paper we present results on the observations of the Martian limb acquired by the OSIRIS [1] instrument on board the ESA mission Rosetta during its swing-by maneuver around February 25, 2007, on the way to comet 67P/Churyumov-Gerasimenko, during the onset of the very active dust storm season of Mars year 28 (at Ls ~190). Although OSIRIS captured the planet only during a relatively short interval of several hours, the global view and spectral coverage obtained, from the UV (245 nm) over the full visible range to the near IR (1000 nm), allow a valuable overview of the state of the Martian atmosphere. Image acquisition started on February 24 around 18:00 UTC at a distance of about 260,000 km and continued until 04:51 UTC on February 25 at a distance of 105,000 km; closest approach to the planet occurred at 01:54 UTC on February 25 at a distance of 250 km. All images have been manually co-registered with the help of SPICE data, and vertical profiles have been extracted over the limb in intervals of ~0.5 degrees (see Figures 1 and 2). In this work we focus on our findings about the vertical structure of the atmosphere over the Martian limb and report the observed altitudes and optical densities of dust and (partially detached) clouds, putting the findings in context with data from other satellites in orbit around Mars at the same time (e.g., Mars Express). Based on previous datasets (MGS/TES, MOd/THEMIS, MRO/MCS; see, e.g., [2], [3] and [4]) we can expect to observe the waning of the south polar hood and the development of the northern one. Some remains of the aphelion cloud belt might still be visible near the equator. Detached layers have recently been observed at this season by MEx/SPICAM [5] and MRO/MCS [6].

  3. Artificial human vision camera

    NASA Astrophysics Data System (ADS)

    Goudou, J.-F.; Maggio, S.; Fagno, M.

    2014-10-01

    In this paper we present a real-time vision system modeling the human visual system. Our purpose is to draw on the biomechanics of human vision to improve robotic capabilities for tasks such as object detection and tracking. This work first describes the biomechanical differences between human vision and classic cameras, and the retinal processing stage that takes place in the eye before the optic nerve. The second part describes our implementation of these principles in a 3-camera optical, mechanical, and software model of the human eyes and an associated bio-inspired attention model.

  4. Early Experience & Multisensory Perceptual Narrowing

    PubMed Central

    Lewkowicz, David J.

    2014-01-01

    Perceptual narrowing is a reflection of early experience and contributes in key ways to perceptual and cognitive development. In general, findings have shown that unisensory perceptual sensitivity in early infancy is broadly tuned such that young infants respond to, and discriminate, native as well as non-native sensory inputs, whereas older infants only respond to native inputs. Recently, my colleagues and I discovered that perceptual narrowing occurs at the multisensory processing level as well. The present article reviews this new evidence and puts it in the larger context of multisensory perceptual development and the role that perceptual experience plays in it. Together, the evidence on unisensory and multisensory narrowing shows that early experience shapes the emergence of perceptual specialization and expertise. PMID:24435505

  5. Communities, Cameras, and Conservation

    ERIC Educational Resources Information Center

    Patterson, Barbara

    2012-01-01

    Communities, Cameras, and Conservation (CCC) is the most exciting and valuable program the author has seen in her 30 years of teaching field science courses. In this citizen science project, students and community volunteers collect data on mountain lions ("Puma concolor") at four natural areas and public parks along the Front Range of Colorado.…

  6. The LSST Camera Overview

    SciTech Connect

    Gilmore, Kirk; Kahn, Steven A.; Nordby, Martin; Burke, David; O'Connor, Paul; Oliver, John; Radeka, Veljko; Schalk, Terry; Schindler, Rafe; /SLAC

    2007-01-10

    The LSST camera is a wide-field optical (0.35–1 μm) imager designed to provide a 3.5 degree FOV with better than 0.2 arcsecond sampling. The detector format will be a circular mosaic providing approximately 3.2 gigapixels per image. The camera includes a filter mechanism and shuttering capability. It is positioned in the middle of the telescope, where cross-sectional area is constrained by optical vignetting and heat dissipation must be controlled to limit thermal gradients in the optical beam. The fast f/1.2 beam will require tight tolerances on the focal plane mechanical assembly. The focal plane array operates at a temperature of approximately -100 C to achieve the desired detector performance. The focal plane array is contained within an evacuated cryostat, which incorporates detector front-end electronics and thermal control. The cryostat lens serves as an entrance window and vacuum seal for the cryostat. Similarly, the camera body lens serves as an entrance window and gas seal for the camera housing, which is filled with a suitable gas to provide the operating environment for the shutter and filter change mechanisms. The filter carousel can accommodate 5 filters, each 75 cm in diameter, for rapid exchange without external intervention.

  7. Make a Pinhole Camera

    ERIC Educational Resources Information Center

    Fisher, Diane K.; Novati, Alexander

    2009-01-01

    On Earth, using ordinary visible light, one can create a single image of light recorded over time. Of course a movie or video is light recorded over time, but it is a series of instantaneous snapshots, rather than light and time both recorded on the same medium. A pinhole camera, which is simple to make out of ordinary materials and using ordinary…

  8. Behind the Camera.

    ERIC Educational Resources Information Center

    Kuhns, William; Giardino, Thomas F.

    Intended for the beginning filmmaker, this book presents basic information on major aspects of shooting a film. It covers characteristics of various cameras, films, lenses, and lighting equipment and tells how to use them. The importance of a shooting script is stressed. The mechanics of sound systems, editing, and titles, animations, and special…

  9. Photogrammetric camera calibration

    USGS Publications Warehouse

    Tayman, W.P.; Ziemann, H.

    1984-01-01

    Section 2 (Calibration) of the document "Recommended Procedures for Calibrating Photogrammetric Cameras and Related Optical Tests" from the International Archives of Photogrammetry, Vol. XIII, Part 4, is reviewed in the light of recent practical work, and suggestions for changes are made. These suggestions are intended as a basis for further discussion. © 1984.

  10. Face Fixing Cameras

    E-print Network

    Boyd, David

    2011-06-22

    ... make your blog profiles or facebook pictures look like the best version of yourself--or someone else. Another instance of magical Japanese technology. Only unlike Snow White's stepmother's magic mirror, this camera won't just tell you you're the fairest...

  11. Image Sensors Enhance Camera Technologies

    NASA Technical Reports Server (NTRS)

    2010-01-01

    In the 1990s, a Jet Propulsion Laboratory team led by Eric Fossum researched ways of improving complementary metal-oxide semiconductor (CMOS) image sensors in order to miniaturize cameras on spacecraft while maintaining scientific image quality. Fossum's team founded a company to commercialize the resulting CMOS active pixel sensor. Now called the Aptina Imaging Corporation, based in San Jose, California, the company has shipped over 1 billion sensors for use in applications such as digital cameras, camera phones, Web cameras, and automotive cameras. Today, one of every three cell phone cameras on the planet features Aptina's sensor technology.

  12. Pulse stretcher for narrow pulses

    NASA Technical Reports Server (NTRS)

    Lindsey, R. S., Jr. (inventor)

    1974-01-01

    A pulse stretcher for narrow pulses is presented. The stretcher is composed of an analog section for processing each arriving analog pulse and a digital section with logic for providing command signals to the gates and switches in the analog section.

  13. ROTATING LINE CAMERAS: MODEL AND CALIBRATION Shou Kang Wei

    E-print Network

    a line of pixel sensors, which can be rotated on a full circle, describing a cylindrical surface this way ... graphics communities (with many popular applications already outside of these communities). Geometric information is rather limited ... For an ideal pinhole camera we have a viewing angle of 2 arctan(s/2f)
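    The pinhole relation in this snippet gives the full viewing angle as 2 arctan(s/2f), with s the sensor extent and f the focal distance; a quick sketch with illustrative values (a 36 mm sensor behind a 50 mm focal length, not figures from the paper):

```python
import math

def pinhole_fov_deg(sensor_size_m, focal_length_m):
    """Full viewing angle of an ideal pinhole camera:
    gamma = 2 * arctan(s / (2 * f))."""
    return math.degrees(2.0 * math.atan(sensor_size_m / (2.0 * focal_length_m)))

# Illustrative: a 36 mm sensor at 50 mm focal distance.
print(round(pinhole_fov_deg(0.036, 0.050), 1))  # 39.6
```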

  14. 15. ELEVATED CAMERA STAND, SHOWING LINE OF CAMERA STANDS PARALLEL ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    15. ELEVATED CAMERA STAND, SHOWING LINE OF CAMERA STANDS PARALLEL TO SLED TRACK. Looking west southwest down Camera Road. - Edwards Air Force Base, South Base Sled Track, Edwards Air Force Base, North of Avenue B, between 100th & 140th Streets East, Lancaster, Los Angeles County, CA

  15. Gaze Directed Camera Control for Face Image Acquisition Eric Sommerlade, Ben Benfold and Ian Reid

    E-print Network

    Oxford, University of

    Gaze Directed Camera Control for Face Image Acquisition Eric Sommerlade, Ben Benfold and Ian Reid ... optimises the capturing of such images by using coarse gaze estimates from a static camera. By considering ... are in turn a function of the gaze angle. We validate the approach using a combination of simulated situations

  16. Multispectral Photometry of the Moon and Absolute Calibration of the Clementine UV/Vis Camera

    NASA Astrophysics Data System (ADS)

    Hillier, John K.; Buratti, Bonnie J.; Hill, Kathryn

    1999-10-01

    We present a multispectral photometric study of the Moon between solar phase angles of 0° and 85°. Using Clementine images obtained between 0.4 and 1.0 μm, we produce a comprehensive study of the lunar surface containing the following results: (1) empirical photometric functions for the spectral range and viewing and illumination geometries mentioned, (2) photometric modeling that derives the physical properties of the upper regolith and includes a detailed study of the causes of the lunar opposition surge, and (3) an absolute calibration of the Clementine UV/Vis camera. The calibration procedure given on the Clementine calibration web site produces reflectances relative to a halon standard, and these further appear significantly higher than those seen in ground-based observations. By comparing Clementine observations with prior ground-based observations of 15 sites on the Moon, we have determined a good absolute calibration of the Clementine UV/Vis camera. A correction factor of 0.532 has been determined to convert the web site (www.planetary.brown.edu/clementine/calibration.html) reflectances to absolute values. From the calibrated data, we calculate empirical phase functions useful for performing photometric corrections to observations of the Moon between solar phase angles of 0° and 85° and in the spectral range 0.4 to 1.0 μm. Finally, the calibrated data are used to fit a version of Hapke's photometric model modified to incorporate a new formulation, developed in this paper, of the lunar opposition surge which includes coherent backscatter. Recent studies of the lunar opposition effect have yielded contradictory results as to the mechanism responsible: shadow hiding, coherent backscatter, or both. We find that most of the surge can be explained by shadow hiding with a halfwidth of ~8°. 
    However, for the brightest regions (the highlands at 0.75–1.0 μm) a small additional narrow component (halfwidth of <2°) of total amplitude ~1/6 to ~1/4 that of the shadow-hiding surge is observed, which may be attributed to coherent backscatter. Interestingly, no evidence for the narrow component is seen in the maria or in the highlands at 0.415 μm. A natural explanation for this is that these regions are too dark to exhibit enough multiple scattering for the effects of coherent backscatter to be seen. Finally, because the Moon is the only celestial body for which we have "ground truth" measurements, our results provide an important test for the robustness of photometric models of remote sensing observations.
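    Applying the derived factor of 0.532 to the web-site (halon-relative) reflectances is a one-line conversion to absolute reflectance; a trivial sketch, with an illustrative input value:

```python
CLEMENTINE_CORRECTION = 0.532  # factor derived in the paper

def web_to_absolute(r_web):
    """Convert Clementine calibration-web-site reflectance (relative to a
    halon standard) to absolute reflectance."""
    return CLEMENTINE_CORRECTION * r_web

print(web_to_absolute(0.25))  # 0.133
```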

  17. Dual cameras acquisition and display system of retina-like sensor camera and rectangular sensor camera

    NASA Astrophysics Data System (ADS)

    Cao, Nan; Cao, Fengmei; Lin, Yabin; Bai, Tingzhu; Song, Shengyu

    2015-04-01

    For a new kind of retina-like sensor camera and a traditional rectangular sensor camera, a dual-camera acquisition and display system needs to be built. We introduce the principle and development of the retina-like sensor. Image coordinate transformation and sub-pixel interpolation must be implemented for the retina-like sensor's special pixel distribution. The hardware platform is composed of the retina-like sensor camera, the rectangular sensor camera, an image grabber, and a PC. Combining the MIL and OpenCV libraries, the software was written in VC++ in VS 2010. Experimental results show that the system realizes acquisition and display for both cameras.

  18. Accidental Pinhole and Pinspeck Cameras

    E-print Network

    Torralba, Antonio

    We identify and study two types of “accidental” images that can be formed in scenes. The first is an accidental pinhole camera image. The second class of accidental images are “inverse” pinhole camera images, formed by ...

  19. Streak camera receiver definition study

    NASA Technical Reports Server (NTRS)

    Johnson, C. B.; Hunkler, L. T., Sr.; Letzring, S. A.; Jaanimagi, P.

    1990-01-01

    Detailed streak camera definition studies were made as a first step toward full flight qualification of a dual channel picosecond resolution streak camera receiver for the Geoscience Laser Altimeter and Ranging System (GLRS). The streak camera receiver requirements are discussed as they pertain specifically to the GLRS system, and estimates of the characteristics of the streak camera are given, based upon existing and near-term technological capabilities. Important problem areas are highlighted, and possible corresponding solutions are discussed.

  20. Automated Camera Array Fine Calibration

    NASA Technical Reports Server (NTRS)

    Clouse, Daniel; Padgett, Curtis; Ansar, Adnan; Cheng, Yang

    2008-01-01

    Using aerial imagery, the JPL FineCalibration (JPL FineCal) software automatically tunes a set of existing CAHVOR camera models for an array of cameras. The software finds matching features in the overlap region between images from adjacent cameras, and uses these features to refine the camera models. It is not necessary to take special imagery of a known target and no surveying is required. JPL FineCal was developed for use with an aerial, persistent surveillance platform.

  1. Visual Feedback Stabilization of Balancing Tasks with Camera Misalignment

    NASA Astrophysics Data System (ADS)

    Hirata, Kentaro; Mizuno, Takashi

    In this paper, we consider visual feedback stabilization which tolerates small camera misalignment. Specifically, a balancing task with a cart-pendulum system using camera image is examined. Such a task is known to rely heavily on the detection of the vertical direction and the angle measurement error due to the camera misalignment could be fatal for stabilization. From a mathematical model of the measurement error, the effect of the misalignment is naturally represented by affine perturbation to the coefficient matrix of the output equation. Motivated by this fact, a special type of robust dynamic output feedback stabilization against polytopic uncertainty is investigated. By solving the related BMI, one can design a controller which tolerates the camera misalignment to some extent. The result is verified via experiments.

  2. Camera Trajectory fromWide Baseline Images

    NASA Astrophysics Data System (ADS)

    Havlena, M.; Torii, A.; Pajdla, T.

    2008-09-01

    Camera trajectory estimation, which is closely related to structure-from-motion computation, is one of the fundamental tasks in computer vision. Reliable camera trajectory estimation plays an important role in 3D reconstruction, self-localization, and object recognition. There are essential issues for reliable camera trajectory estimation, for instance, the choice of the camera and its geometric projection model, camera calibration, image feature detection and description, and robust 3D structure computation. Most approaches rely on classical perspective cameras because of the simplicity of their projection models and the ease of their calibration. However, classical perspective cameras offer only a limited field of view, and thus occlusions and sharp camera turns may cause consecutive frames to look completely different when the baseline becomes longer. This makes image feature matching very difficult (or impossible), and camera trajectory estimation fails under such conditions. These problems can be avoided if omnidirectional cameras, e.g. a fish-eye lens convertor, are used. The hardware which we are using in practice is a combination of a Nikon FC-E9 mounted via a mechanical adaptor onto a Kyocera Finecam M410R digital camera. The Nikon FC-E9 is a megapixel omnidirectional add-on convertor with a 180° view angle which provides images of photographic quality. The Kyocera Finecam M410R delivers 2272×1704 images at 3 frames per second. The resulting combination yields a circular view of diameter 1600 pixels in the image. Since consecutive frames of the omnidirectional camera often share a common region in 3D space, image feature matching is often feasible. On the other hand, the calibration of these cameras is non-trivial and is crucial for the accuracy of the resulting 3D reconstruction. 
We calibrate omnidirectional cameras off-line using a state-of-the-art technique and Mičušík's two-parameter model, which links the radius r of an image point to the angle θ of its corresponding ray w.r.t. the optical axis as θ = ar/(1 + br²). After a successful calibration, we know the correspondence of image points to 3D optical rays in the coordinate system of the camera. The following steps aim at finding the transformation between the camera and world coordinate systems, i.e. the pose of the camera in the 3D world, using 2D image matches. To compute 3D structure, we construct a set of tentative matches by detecting different affine covariant feature regions, including MSER, Harris Affine, and Hessian Affine, in the acquired images. These features are an alternative to the popular SIFT features and work comparably in our situation. Parameters of the detectors are chosen to limit the number of regions to 1-2 thousand per image. The detected regions are assigned local affine frames (LAF) and transformed into standard positions w.r.t. their LAFs. Discrete Cosine Descriptors are computed for each region in its standard position. Finally, mutual distances of all regions in one image and all regions in the other image are computed as the Euclidean distances of their descriptors, and tentative matches are constructed by selecting the mutually closest pairs. As opposed to methods using short baseline images, simpler image features which are not affine covariant cannot be used, because the viewpoint can change a lot between consecutive frames. Furthermore, feature matching has to be performed over the whole frame because no assumptions on the proximity of consecutive projections can be made for wide baseline images. This makes feature detection, description, and matching much more time-consuming than for short baseline images and limits the usage to low frame rate sequences when operating in real time.
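
The two-parameter model quoted above, θ = ar/(1 + br²), can be sketched in a few lines of code. The coefficients a, b and the distortion-center coordinates below are made-up placeholders standing in for real off-line calibration results:

```python
import numpy as np

# Sketch of the two-parameter radial model theta = a*r / (1 + b*r**2),
# mapping image radius r (pixels from an assumed distortion center) to the
# angle theta between the optical ray and the optical axis.  The coefficients
# are hypothetical; real values come from the off-line calibration.

a, b = 1.2e-3, 1.0e-7              # hypothetical calibration coefficients

def radius_to_angle(r):
    return a * r / (1.0 + b * r**2)

def pixel_to_ray(u, v, cu=800.0, cv=800.0):
    """Unit 3D ray (camera frame) for pixel (u, v); (cu, cv) is an assumed center."""
    du, dv = u - cu, v - cv
    r = np.hypot(du, dv)
    if r == 0.0:
        return np.array([0.0, 0.0, 1.0])
    theta = radius_to_angle(r)
    s = np.sin(theta) / r          # scale the image offset onto the unit sphere
    return np.array([du * s, dv * s, np.cos(theta)])

ray = pixel_to_ray(800.0, 0.0)     # a point on the rim of the circular view
```

After calibration, every pixel maps to such a ray, and pose estimation operates on ray correspondences rather than planar image coordinates.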
Robust 3D structure can be computed by RANSAC, which searches for the largest subset of the set of tentative matches that is, within a predefined threshold ε, consistent with an epipolar geometry. We use ordered sampling as suggested in to draw 5-tuples from the list of tentative matches ordered ascendingly
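
A minimal sketch of the RANSAC step above, with two simplifications relative to the abstract: the classical 8-point fundamental-matrix estimate stands in for the calibrated 5-point minimal solver, and plain uniform sampling stands in for the authors' ordered sampling:

```python
import numpy as np

rng = np.random.default_rng(0)

# RANSAC sketch: find the largest subset of tentative matches consistent with
# an epipolar geometry.  x1, x2 are Nx2 arrays of matched image coordinates.

def eight_point(x1, x2):
    """Fundamental matrix from >= 8 correspondences (rows of [x, y])."""
    A = np.column_stack([x2[:, 0] * x1[:, 0], x2[:, 0] * x1[:, 1], x2[:, 0],
                         x2[:, 1] * x1[:, 0], x2[:, 1] * x1[:, 1], x2[:, 1],
                         x1[:, 0], x1[:, 1], np.ones(len(x1))])
    F = np.linalg.svd(A)[2][-1].reshape(3, 3)
    U, s, Vt = np.linalg.svd(F)                      # enforce rank 2
    return U @ np.diag([s[0], s[1], 0.0]) @ Vt

def epipolar_error(F, x1, x2):
    h1 = np.column_stack([x1, np.ones(len(x1))])
    h2 = np.column_stack([x2, np.ones(len(x2))])
    return np.abs(np.sum(h2 * (h1 @ F.T), axis=1))   # algebraic error |x2' F x1|

def ransac_epipolar(x1, x2, thresh=1e-3, iters=200):
    """Boolean mask of the largest consensus set found."""
    best = np.zeros(len(x1), dtype=bool)
    for _ in range(iters):
        idx = rng.choice(len(x1), 8, replace=False)
        F = eight_point(x1[idx], x2[idx])
        inliers = epipolar_error(F, x1, x2) < thresh
        if inliers.sum() > best.sum():
            best = inliers
    return best
```

The consensus set returned here is what seeds the subsequent 3D structure computation; mismatched tentative pairs fall outside the threshold and are discarded.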

  3. Combustion pinhole camera system

    DOEpatents

    Witte, Arvel B. (Rolling Hills, CA)

    1984-02-21

    A pinhole camera system utilizing a sealed optical-purge assembly which provides optical access into a coal combustor or other energy conversion reactors. The camera system basically consists of a focused-purge pinhole optical port assembly, a conventional TV vidicon receiver, and an external, variable-density light filter which is coupled electronically to the vidicon automatic gain control (agc). The key component of this system is the focused-purge pinhole optical port assembly, which utilizes a purging inert gas to keep debris from entering the port and a lens arrangement which transfers the pinhole to the outside of the port assembly. One additional feature of the port assembly is that it is not flush with the interior of the combustor.

  4. Combustion pinhole camera system

    DOEpatents

    Witte, A.B.

    1984-02-21

    A pinhole camera system is described utilizing a sealed optical-purge assembly which provides optical access into a coal combustor or other energy conversion reactors. The camera system basically consists of a focused-purge pinhole optical port assembly, a conventional TV vidicon receiver, and an external, variable-density light filter which is coupled electronically to the vidicon automatic gain control (agc). The key component of this system is the focused-purge pinhole optical port assembly, which utilizes a purging inert gas to keep debris from entering the port and a lens arrangement which transfers the pinhole to the outside of the port assembly. One additional feature of the port assembly is that it is not flush with the interior of the combustor. 2 figs.

  5. Multi-Camera Saliency.

    PubMed

    Luo, Yan; Jiang, Ming; Wong, Yongkang; Zhao, Qi

    2015-10-01

    A significant body of literature on saliency modeling predicts where humans look in a single image or video. Beyond the scientific goal of understanding how information is fused from multiple visual sources to identify regions of interest in a holistic manner, there are tremendous engineering applications of multi-camera saliency due to the widespread deployment of cameras. This paper proposes a principled framework to smoothly integrate visual information from multiple views into a global scene map, and to employ a saliency algorithm incorporating high-level features to identify the most important regions by fusing visual information. The proposed method has the following key distinguishing features compared with its counterparts: (1) the proposed saliency detection is global (salient regions from one local view may not be important in a global context), (2) it does not require special camera deployment or overlapping fields of view, and (3) the key saliency algorithm is effective in highlighting interesting object regions even though no single object detector is used. Experiments on several data sets confirm the effectiveness of the proposed principled framework. PMID:26340257

  6. Gamma ray camera

    DOEpatents

    Perez-Mendez, Victor (Berkeley, CA)

    1997-01-01

    A gamma ray camera for detecting rays emanating from a radiation source such as an isotope. The gamma ray camera includes a sensor array formed of a visible light crystal for converting incident gamma rays to a plurality of corresponding visible light photons, and a photosensor array responsive to the visible light photons in order to form an electronic image of the radiation therefrom. The photosensor array is adapted to record an integrated amount of charge proportional to the incident gamma rays closest to it, and includes a transparent metallic layer and a photodiode consisting of a p-i-n structure formed on one side of the transparent metallic layer and comprising an upper p-type layer, an intermediate layer, and a lower n-type layer. In the preferred mode, the scintillator crystal is composed essentially of a cesium iodide (CsI) crystal, preferably doped with a predetermined amount of impurity, and the upper p-type and intermediate layers and said n-type layer are essentially composed of hydrogenated amorphous silicon (a-Si:H). The gamma ray camera further includes a collimator interposed between the radiation source and the sensor array, and a readout circuit formed on one side of the photosensor array.

  7. Hemispherical Laue camera

    DOEpatents

    Li, James C. M. (Pittsford, NY); Chu, Sungnee G. (Rochester, NY)

    1980-01-01

    A hemispherical Laue camera comprises a crystal sample mount for positioning a sample to be analyzed at the center of the sphere of a hemispherical, X-radiation-sensitive film cassette, a collimator, a stationary or rotating sample mount, and a set of standard spherical projection spheres. X-radiation generated from an external source is directed through the collimator to impinge onto the single crystal sample on the stationary mount. The diffracted beam is recorded on the hemispherical X-radiation-sensitive film mounted inside the hemispherical film cassette in either transmission or back-reflection geometry. The distances travelled by X-radiation diffracted from the crystal to the hemispherical film are the same for all crystal planes which satisfy Bragg's Law. The recorded diffraction spots or Laue spots on the film thereby preserve both the symmetry information of the crystal structure and the relative intensities which are directly related to the relative structure factors of the crystal orientations. The diffraction pattern on the exposed film is compared with the known diffraction pattern on one of the standard spherical projection spheres for a specific crystal structure to determine the orientation of the crystal sample. By replacing the stationary sample support with a rotating sample mount, the hemispherical Laue camera can be used for crystal structure determination in a manner previously provided in conventional Debye-Scherrer cameras.

  8. Orbiter Camera Payload System

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Components for an orbiting camera payload system (OCPS) include the large format camera (LFC), a gas supply assembly, and ground test, handling, and calibration hardware. The LFC, a high resolution large format photogrammetric camera for use in the cargo bay of the space transport system, is also adaptable to use on an RB-57 aircraft or on a free flyer satellite. Carrying 4000 feet of film, the LFC is usable over the visible to near IR, at V/h rates of from 11 to 41 milliradians per second, overlap of 10, 60, 70 or 80 percent and exposure times of from 4 to 32 milliseconds. With a 12 inch focal length it produces a 9 by 18 inch format (long dimension in line of flight) with full format low contrast resolution of 88 lines per millimeter (AWAR), full format distortion of less than 14 microns and a complement of 45 Reseau marks and 12 fiducial marks. Weight of the OCPS as supplied, fully loaded is 944 pounds and power dissipation is 273 watts average when in operation, 95 watts in standby. The LFC contains an internal exposure sensor, or will respond to external command. It is able to photograph starfields for inflight calibration upon command.
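
A back-of-the-envelope check of the numbers in the record above: the image of the ground moves across the film at (V/h) × f, so the worst-case smear during one exposure is (V/h)_max × f × t_max. The conclusion that such smear would need forward-motion compensation is our inference, not a statement from the record:

```python
# Image-motion arithmetic from the LFC figures quoted above: 12-inch focal
# length, V/h up to 41 mrad/s, exposures up to 32 ms.

FOCAL_LENGTH_MM = 12 * 25.4                      # 12-inch focal length in mm
vh_max = 41e-3                                   # rad/s, top of stated V/h range
t_max = 32e-3                                    # s, longest stated exposure

image_rate_mm_s = vh_max * FOCAL_LENGTH_MM       # ~12.5 mm/s at the film plane
worst_smear_um = image_rate_mm_s * t_max * 1000  # ~400 micrometres of smear
```

At roughly 400 μm, the uncompensated smear would dwarf the stated 14 μm distortion budget, which is why the V/h rate matters to the camera design.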

  9. Gamma ray camera

    DOEpatents

    Perez-Mendez, V.

    1997-01-21

    A gamma ray camera is disclosed for detecting rays emanating from a radiation source such as an isotope. The gamma ray camera includes a sensor array formed of a visible light crystal for converting incident gamma rays to a plurality of corresponding visible light photons, and a photosensor array responsive to the visible light photons in order to form an electronic image of the radiation therefrom. The photosensor array is adapted to record an integrated amount of charge proportional to the incident gamma rays closest to it, and includes a transparent metallic layer and a photodiode consisting of a p-i-n structure formed on one side of the transparent metallic layer and comprising an upper p-type layer, an intermediate layer, and a lower n-type layer. In the preferred mode, the scintillator crystal is composed essentially of a cesium iodide (CsI) crystal, preferably doped with a predetermined amount of impurity, and the upper p-type and intermediate layers and said n-type layer are essentially composed of hydrogenated amorphous silicon (a-Si:H). The gamma ray camera further includes a collimator interposed between the radiation source and the sensor array, and a readout circuit formed on one side of the photosensor array. 6 figs.

  10. LRO Camera Imaging of Potential Landing Sites in the South Pole-Aitken Basin

    NASA Astrophysics Data System (ADS)

    Jolliff, B. L.; Wiseman, S. M.; Gibson, K. E.; Lauber, C.; Robinson, M.; Gaddis, L. R.; Scholten, F.; Oberst, J.; LROC Science; Operations Team

    2010-12-01

    We show results of WAC (Wide Angle Camera) and NAC (Narrow Angle Camera) imaging of candidate landing sites within the South Pole-Aitken (SPA) basin of the Moon obtained by the Lunar Reconnaissance Orbiter during the first full year of operation. These images enable a greatly improved delineation of geologic units, determination of unit thicknesses and stratigraphy, and detailed surface characterization that has not been possible with previous data. WAC imaging encompasses the entire SPA basin, located within an area ranging from ~ 130-250 degrees east longitude and ~15 degrees south latitude to the South Pole, at different incidence angles, with the specific range of incidence dependent on latitude. The WAC images show morphology and surface detail at better than 100 m per pixel, with spatial coverage and quality unmatched by previous data sets. NAC images reveal details at the sub-meter pixel scale that enable new ways to evaluate the origins and stratigraphy of deposits. Key among new results is the capability to discern extents of ancient volcanic deposits that are covered by later crater ejecta (cryptomare) [see Petro et al., this conference] using new, complementary color data from Kaguya and Chandrayaan-1. Digital topographic models derived from WAC and NAC geometric stereo coverage show broad intercrater-plains areas where slopes are acceptably low for high-probability safe landing [see Archinal et al., this conference]. NAC images allow mapping and measurement of small, fresh craters that excavated boulders and thus provide information on surface roughness and depth to bedrock beneath regolith and plains deposits. We use these data to estimate deposit thickness in areas of interest for landing and potential sample collection to better understand the possible provenance of samples. 
Also, small regions marked by fresh impact craters and their associated boulder fields are readily identified by their bright ejecta patterns and marked as lander keep-out zones. We will show examples of LROC data including those for Constellation sites on the SPA rim and interior, a site between Bose and Alder Craters, sites east of Bhabha Crater, and sites on and near the “Mafic Mound” [see Pieters et al., this conference]. Together the LROC data and complementary products provide essential information for ensuring identification of safe landing and sampling sites within SPA basin that has never before been available for a planetary mission.

  11. Phoenix Robotic Arm Camera

    NASA Astrophysics Data System (ADS)

    Keller, H. U.; Goetz, W.; Hartwig, H.; Hviid, S. F.; Kramm, R.; Markiewicz, W. J.; Reynolds, R.; Shinohara, C.; Smith, P.; Tanner, R.; Woida, P.; Woida, R.; Bos, B. J.; Lemmon, M. T.

    2008-10-01

    The Phoenix Robotic Arm Camera (RAC) is a variable-focus color camera mounted to the Robotic Arm (RA) of the Phoenix Mars Lander. It is designed to acquire both close-up images of the Martian surface and microscopic images (down to a scale of 23 μm/pixel) of material collected in the RA scoop. The mounting position at the end of the Robotic Arm allows the RAC to be actively positioned for imaging of targets not easily seen by the Stereo Surface Imager (SSI), such as excavated trench walls and targets under the Lander structure. Color information is acquired by illuminating the target with red, green, and blue light-emitting diodes. Digital terrain models (DTM) can be generated from RAC images acquired from different viewpoints. This can provide high-resolution stereo information about fine details of the trench walls. The large stereo baseline possible with the arm can also provide a far-field DTM. The primary science objectives of the RAC are the search for subsurface soil/ice layering at the landing site and the characterization of scoop samples prior to delivery to other instruments on board Phoenix. The RAC shall also provide low-resolution panoramas in support of SSI activities and acquire images of the Lander deck for instrument and Lander checkout. The camera design was inherited from the unsuccessful Mars Polar Lander mission (1999) and further developed for the (canceled) Mars Surveyor 2001 Lander (MSL01). Extensive testing and partial recalibration qualified the MSL01 RAC flight model for integration into the Phoenix science payload.

  12. DEVICE CONTROLLER, CAMERA CONTROL

    SciTech Connect

    1998-07-20

    This is a C++ application that is the server for the camera control system. Devserv drives serial devices, such as cameras and videoswitchers used in a videoconference, upon request from a client such as the ccint program. Devserv listens on UDP ports for clients to make network connections. After a client connects and sends a request to control a device (such as to pan, tilt, or zoom a camera, or do picture-in-picture with a videoswitcher), devserv formats the request into an RS-232 message appropriate for the device and sends this message over the serial port to which the device is connected. Devserv then reads the reply from the device from the serial port and then formats and sends via multicast a status message. In addition, devserv periodically multicasts status or description messages so that all clients connected to the multicast channel know what devices are supported, their ranges of motion, and the current position. The software design employs a class hierarchy such that an abstract base class for devices can be subclassed into classes for various device categories (e.g. sonyevid30, canonvcc4, panasonicwjmx50, etc.), which are further subclassed into classes for specific devices. The devices currently supported are the Sony EVI-D30, Canon VCC1, Canon VCC3, and Canon VCC4 cameras and the Panasonic WJ-MX50 videoswitcher. However, developers can extend the class hierarchy to support other devices.
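
    The class layout described above (devserv itself is C++) can be sketched as follows. The byte sequences are invented placeholders, not the real Sony/Canon/Panasonic serial protocols:

    ```python
    # Sketch of the described design: an abstract device base class whose
    # subclasses translate client requests into device-specific RS-232 messages.
    # Command formats below are hypothetical illustrations only.

    from abc import ABC, abstractmethod

    class SerialDevice(ABC):
        """Base class for devices the server drives over a serial port."""

        @abstractmethod
        def format_command(self, action: str, *args) -> bytes:
            """Translate a client request into an RS-232 message."""

    class PanTiltZoomCamera(SerialDevice):
        pass  # shared camera behaviour (e.g. range-of-motion checks) would live here

    class SonyEviD30(PanTiltZoomCamera):
        def format_command(self, action, *args):
            if action == "zoom":
                (level,) = args
                return bytes([0x81, 0x01, 0x04, 0x47, level & 0x0F])  # placeholder
            raise ValueError(f"unsupported action: {action}")

    class PanasonicWJMX50(SerialDevice):
        def format_command(self, action, *args):
            if action == "pip":
                return b"PIP:ON\r"                                    # placeholder
            raise ValueError(f"unsupported action: {action}")

    msg = SonyEviD30().format_command("zoom", 7)
    ```

    New hardware is supported by adding another subclass that knows its own message format, leaving the network-facing server loop untouched.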

  13. Object recognition through turbulence with a modified plenoptic camera

    NASA Astrophysics Data System (ADS)

    Wu, Chensheng; Ko, Jonathan; Davis, Christopher

    2015-03-01

    Atmospheric turbulence adds accumulated distortion to images obtained by cameras and surveillance systems. When the turbulence grows stronger or when the object is further away from the observer, increasing the recording device resolution helps little to improve the quality of the image. Many sophisticated methods to correct the distorted images have been invented, such as using a known feature on or near the target object to perform a deconvolution process, or use of adaptive optics. However, most of these methods depend heavily on the object's location, and optical ray propagation through the turbulence is not directly considered. Alternatively, selecting a lucky image over many frames provides a feasible solution, but at the cost of time. In our work, we propose an innovative approach to improving image quality through turbulence by making use of a modified plenoptic camera. This type of camera adds a micro-lens array to a traditional high-resolution camera to form a semi-camera array that records duplicate copies of the object as well as "superimposed" turbulence at slightly different angles. By performing several steps of image reconstruction, turbulence effects will be suppressed to reveal more details of the object independently (without finding references near the object). Meanwhile, the redundant information obtained by the plenoptic camera raises the possibility of performing lucky image algorithmic analysis with fewer frames, which is more efficient. In our work, the details of our modified plenoptic cameras and image processing algorithms will be introduced. The proposed method can be applied to coherently illuminated objects as well as incoherently illuminated objects. Our results show that the turbulence effect can be effectively suppressed by the plenoptic camera in the hardware layer and a reconstructed "lucky image" can help the viewer identify the object even when a "lucky image" by ordinary cameras is not achievable.
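
    The "lucky image" selection mentioned above can be sketched minimally: score each frame with a sharpness metric and keep the best. The variance-of-Laplacian metric below is a common generic choice, not necessarily the one the authors use, and the plenoptic reconstruction itself is a separate, more involved step:

    ```python
    import numpy as np

    # Lucky-imaging sketch: a frame less blurred by turbulence has stronger
    # high-frequency content, so its Laplacian response has higher variance.

    def sharpness(frame):
        """Variance of a 4-neighbour Laplacian approximation (wrap-around edges)."""
        lap = (-4 * frame
               + np.roll(frame, 1, 0) + np.roll(frame, -1, 0)
               + np.roll(frame, 1, 1) + np.roll(frame, -1, 1))
        return lap.var()

    def lucky_image(frames):
        """Return the sharpest frame from a sequence."""
        return max(frames, key=sharpness)
    ```

    The plenoptic camera's advantage, per the abstract, is that each exposure already contains many sub-aperture views, so fewer captured frames are needed before one of them scores well.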

  14. Neutron Imaging Camera

    NASA Technical Reports Server (NTRS)

    Hunter, Stanley; deNolfo, G. A.; Barbier, L. M.; Link, J. T.; Son, S.; Floyd, S. R.; Guardala, N.; Skopec, M.; Stark, B.

    2008-01-01

    The Neutron Imaging Camera (NIC) is based on the Three-dimensional Track Imager (3-DTI) technology developed at GSFC for gamma-ray astrophysics applications. The 3-DTI, a large volume time-projection chamber, provides accurate, approximately 0.4 mm resolution, 3-D tracking of charged particles. The incident direction of fast neutrons, En > 0.5 MeV, is reconstructed from the momenta and energies of the proton and triton fragments resulting from ³He(n,p)³H interactions in the 3-DTI volume. The performance of the NIC from laboratory and accelerator tests is presented.

  15. Mars Science Laboratory Engineering Cameras

    NASA Technical Reports Server (NTRS)

    Maki, Justin N.; Thiessen, David L.; Pourangi, Ali M.; Kobzeff, Peter A.; Lee, Steven W.; Dingizian, Arsham; Schwochert, Mark A.

    2012-01-01

    NASA's Mars Science Laboratory (MSL) Rover, which launched to Mars in 2011, is equipped with a set of 12 engineering cameras. These cameras are build-to-print copies of the Mars Exploration Rover (MER) cameras, which were sent to Mars in 2003. The engineering cameras weigh less than 300 grams each and use less than 3 W of power. Images returned from the engineering cameras are used to navigate the rover on the Martian surface, deploy the rover robotic arm, and ingest samples into the rover sample processing system. The navigation cameras (Navcams) are mounted to a pan/tilt mast and have a 45-degree square field of view (FOV) with a pixel scale of 0.82 mrad/pixel. The hazard avoidance cameras (Hazcams) are body-mounted to the rover chassis in the front and rear of the vehicle and have a 124-degree square FOV with a pixel scale of 2.1 mrad/pixel. All of the cameras utilize a frame-transfer CCD (charge-coupled device) with a 1024x1024 imaging region and red/near IR bandpass filters centered at 650 nm. The MSL engineering cameras are grouped into two sets of six: one set of cameras is connected to rover computer A and the other set is connected to rover computer B. The MSL rover carries 8 Hazcams and 4 Navcams.
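
    A quick consistency check of the Navcam numbers in the record: a 45-degree FOV spread over a 1024-pixel detector gives roughly the stated pixel scale. (The naive division below ignores lens distortion, which is one reason the stated 0.82 mrad/pixel differs slightly.)

    ```python
    import math

    # FOV-to-pixel-scale arithmetic for the Navcam figures quoted above.
    fov_rad = math.radians(45)                 # 45-degree square field of view
    pixel_scale_mrad = fov_rad / 1024 * 1000   # ~0.77 mrad/pixel across 1024 pixels
    ```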

  16. Subsurface "radar" camera

    NASA Technical Reports Server (NTRS)

    Jain, A.

    1977-01-01

    Long-wavelength, multiple-frequency radar is used for imaging and determining the depth of subsurface stratified layers. Very-low-frequency radar signals pinpoint below-ground strata via direct imagery techniques. Variation of frequency and scanning angle adjusts image depth and width.

  17. Narrow-Band Thermal Radiation Based on Microcavity Resonant Effect

    NASA Astrophysics Data System (ADS)

    Huang, Jin-Guo; Xuan, Yi-Min; Li, Qiang

    2014-09-01

    The microcavity resonant effect is used to realize narrow-band thermal radiation. Periodic circular aperture arrays with square lattice are patterned on Si substrates by using standard photolithographic techniques and reactive ion etching techniques. Ag films are deposited on the surface of Si substrates with aperture arrays to improve the infrared reflectance. On the basis of the micromachining process, an Ag/Si structured surface exhibiting narrow-band radiation and directivity insensitivity is presented. The emittance spectra exhibit several selective emittance bands attributed to the microcavity resonance effect. The dependence of emittance spectra on sizes and direction is also experimentally examined. The results indicate that the emittance peak of the Ag/Si structured surface can be modulated by tailoring the structural sizes. Moreover, the emittance peak is independent of the radiant angle, which is very important for designing high-performance thermal emitters.

  18. Stereoscopic camera design

    NASA Astrophysics Data System (ADS)

    Montgomery, David J.; Jones, Christopher K.; Stewart, James N.; Smith, Alan

    2002-05-01

    It is clear from the literature that the majority of work in stereoscopic imaging is directed towards the development of modern stereoscopic displays. As costs come down, wider public interest in this technology is expected to increase. This new technology would require new methods of image formation. Advances in stereo computer graphics will of course lead to the creation of new stereo computer games, graphics in films etc. However, the consumer would also like to see real-world stereoscopic images, pictures of family, holiday snaps etc. Such scenery would have wide ranges of depth to accommodate and would need also to cope with moving objects, such as cars, and in particular other people. Thus, the consumer acceptance of auto/stereoscopic displays and 3D in general would be greatly enhanced by the existence of a quality stereoscopic camera. This paper will cover an analysis of existing stereoscopic camera designs and show that they can be categorized into four different types, with inherent advantages and disadvantages. A recommendation is then made with regard to 3D consumer still and video photography. The paper will go on to discuss this recommendation and describe its advantages and how it can be realized in practice.

  19. MEMS digital camera

    NASA Astrophysics Data System (ADS)

    Gutierrez, R. C.; Tang, T. K.; Calvet, R.; Fossum, E. R.

    2007-02-01

    MEMS technology uses photolithography and etching of silicon wafers to enable mechanical structures with less than 1 μm tolerance, important for the miniaturization of imaging systems. In this paper, we present the first silicon MEMS digital auto-focus camera for use in cell phones with a focus range of 10 cm to infinity. At the heart of the new silicon MEMS digital camera, a simple and low-cost electromagnetic actuator impels a silicon MEMS motion control stage on which a lens is mounted. The silicon stage ensures precise alignment of the lens with respect to the imager, and enables precision motion of the lens over a range of 300 μm with < 5 μm hysteresis and < 2 μm repeatability. Settling time is < 15 ms for a 200 μm step, and < 5 ms for a 20 μm step, enabling AF within 0.36 sec at 30 fps. The precise motion allows COTS optics to maintain MTF > 0.8 at 20 cy/mm up to 80% field over the full range of motion. Accelerated lifetime testing has shown that the alignment and precision of motion are maintained after 8,000 g shocks, thermal cycling from -40 °C to 85 °C, and operation over 20 million cycles.

  20. Digital synchroballistic schlieren camera for high-speed photography of bullets and rocket sleds

    NASA Astrophysics Data System (ADS)

    Buckner, Benjamin D.; L'Esperance, Drew

    2013-08-01

    A high-speed digital streak camera designed for simultaneous high-resolution color photography and focusing schlieren imaging is described. The camera uses a computer-controlled galvanometer scanner to achieve synchroballistic imaging through a narrow slit. Full color 20 megapixel images of a rocket sled moving at 480 m/s and of projectiles fired at around 400 m/s were captured, with high-resolution schlieren imaging in the latter cases, using conventional photographic flash illumination. The streak camera can achieve a line rate for streak imaging of up to 2.4 million lines/s.
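
    In synchroballistic photography the scanner sweeps the image across the slit at the speed the subject's image moves, freezing the subject. A toy calculation with made-up geometry (the record gives neither focal length nor standoff distance) shows the required line rate sits comfortably within the stated 2.4 million lines/s capability:

    ```python
    # Synchroballistic matching sketch: image speed = v * f / L for a subject
    # at distance L photographed with focal length f.  f, L, and the pixel
    # pitch are hypothetical; only the 480 m/s sled speed comes from the record.

    v = 480.0        # m/s, rocket-sled speed from the abstract
    f = 0.3          # m, hypothetical focal length
    L = 100.0        # m, hypothetical standoff distance

    image_speed = v * f / L                  # m/s at the focal plane
    line_pitch = 6e-6                        # m, hypothetical 6-micron pixel pitch
    line_rate = image_speed / line_pitch     # lines/s needed to keep pace
    ```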

  1. Transmission electron microscope CCD camera

    DOEpatents

    Downing, Kenneth H. (Lafayette, CA)

    1999-01-01

    In order to improve the performance of a CCD camera on a high voltage electron microscope, an electron decelerator is inserted between the microscope column and the CCD. This arrangement optimizes the interaction of the electron beam with the scintillator of the CCD camera while retaining optimization of the microscope optics and of the interaction of the beam with the specimen. Changing the electron beam energy between the specimen and camera allows both to be optimized.

  2. A view of the ET camera on STS-112

    NASA Technical Reports Server (NTRS)

    2002-01-01

    KENNEDY SPACE CENTER, FLA. - A closeup view of the camera mounted on the external tank of Space Shuttle Atlantis. The color video camera mounted to the top of Atlantis' external tank will provide a view of the front and belly of the orbiter and a portion of the solid rocket boosters (SRBs) and external tank during the launch of Atlantis on mission STS-112. It will offer the STS-112 team an opportunity to monitor the shuttle's performance from a new angle. The camera will be turned on fifteen minutes prior to launch and will show the orbiter and solid rocket boosters on the launch pad. The video will be downlinked from the external tank during flight to several NASA data-receiving sites and then relayed to the live television broadcast. The camera is expected to operate for about 15 minutes following liftoff. At liftoff, viewers will see the shuttle clearing the launch tower and, at two minutes after liftoff, see the right SRB separate from the external tank. When the external tank separates from Atlantis about eight minutes into the flight, the camera is expected to continue its live feed for about six more minutes although NASA may be unable to pick up the camera's signal because the tank may have moved out of range.

  3. A view of the ET camera on STS-112

    NASA Technical Reports Server (NTRS)

    2002-01-01

    KENNEDY SPACE CENTER, FLA. - A view of the camera mounted on the external tank of Space Shuttle Atlantis. The color video camera mounted to the top of Atlantis' external tank will provide a view of the front and belly of the orbiter and a portion of the solid rocket boosters (SRBs) and external tank during the launch of Atlantis on mission STS-112. It will offer the STS-112 team an opportunity to monitor the shuttle's performance from a new angle. The camera will be turned on fifteen minutes prior to launch and will show the orbiter and solid rocket boosters on the launch pad. The video will be downlinked from the external tank during flight to several NASA data-receiving sites and then relayed to the live television broadcast. The camera is expected to operate for about 15 minutes following liftoff. At liftoff, viewers will see the shuttle clearing the launch tower and, at two minutes after liftoff, see the right SRB separate from the external tank. When the external tank separates from Atlantis about eight minutes into the flight, the camera is expected to continue its live feed for about six more minutes although NASA may be unable to pick up the camera's signal because the tank may have moved out of range.

  4. LSST Camera Electronics

    NASA Astrophysics Data System (ADS)

    Newcomer, F. Mitchell; Bailey, S.; Britton, C. L.; Felt, N.; Geary, J.; Hashimi, K.; Lebbolo, H.; Ning, Z.; O'Connor, P.; Oliver, J.; Radeka, V.; Sefri, R.; Tocut, V.; Van Berg, R.

    2009-01-01

    The 3.2 Gpixel LSST camera will be read out by means of 189 highly segmented 4K x 4K CCDs. A total of 3024 video channels will be processed by a modular, in-cryostat electronics package based on two custom multichannel analog ASICs now in development. Performance goals of 5-electron noise, 0.01% electronic crosstalk, and 80 mW power dissipation per channel are targeted. The focal plane is organized as a set of 12K x 12K sub-mosaics ("rafts") with front end electronics housed in an enclosure falling within the footprint of the CCDs making up the raft. The assembly of CCDs, baseplate, electronics boards, and cooling components constitutes a self-contained and testable 144 Mpix imager ("raft tower"), and 21 identical raft towers make up the LSST science focal plane. Electronic, mechanical, and thermal prototypes are now undergoing testing and results will be presented at the meeting.

  5. Neutron Imaging Camera

    NASA Technical Reports Server (NTRS)

    Hunter, Stanley D.; DeNolfo, Georgia; Floyd, Sam; Krizmanic, John; Link, Jason; Son, Seunghee; Guardala, Noel; Skopec, Marlene; Stark, Robert

    2008-01-01

    We describe the Neutron Imaging Camera (NIC) being developed for DTRA applications by NASA/GSFC and NSWC/Carderock. The NIC is based on the Three-dimensional Track Imager (3-DTI) technology developed at GSFC for gamma-ray astrophysics applications. The 3-DTI, a large volume time-projection chamber, provides accurate, approximately 0.4 mm resolution, 3-D tracking of charged particles. The incident direction of fast neutrons, E_N > 0.5 MeV, is reconstructed from the momenta and energies of the proton and triton fragments resulting from ³He(n,p)³H interactions in the 3-DTI volume. We present the angular and energy resolution performance of the NIC derived from accelerator tests.

  6. Camera Calibration for Uav Application Using Sensor of Mobile Camera

    NASA Astrophysics Data System (ADS)

    Takahashi, Y.; Chikatsu, H.

    2015-05-01

    Recently, 3D measurements using small unmanned aerial vehicles (UAVs) have increased in Japan, because small UAVs are easily available at low cost and analysis software can easily create 3D models. However, small UAVs have a problem: they have very short flight times and a small payload. In particular, as the payload of a small UAV increases, its flight time decreases. Therefore, it is advantageous to use lightweight sensors in small UAVs. A mobile camera is lightweight and has many sensors, such as an accelerometer, a magnetic field sensor, and a gyroscope; moreover, these sensors can be used simultaneously. Therefore, the authors think that the problems of small UAVs can be solved using the mobile camera. The authors executed camera calibration using a test target to evaluate the sensor values measured with a mobile camera. Consequently, the authors confirmed the same accuracy as normal camera calibration.

  7. 3. Elevation view of entire midsection using ultrawide angle lens. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. Elevation view of entire midsection using ultrawide angle lens. Note opened south doors and closed north doors. The following photo WA-203-C-4 is similar except the camera position was moved right to include the slope of the south end. - Puget Sound Naval Shipyard, Munitions Storage Bunker, Naval Ammunitions Depot, South of Campbell Trail, Bremerton, Kitsap County, WA

  8. Wide Angle View of Arsia Mons Volcano

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Arsia Mons (above) is one of the largest volcanoes known. This shield volcano is part of an aligned trio known as the Tharsis Montes--the others are Pavonis Mons and Ascraeus Mons. Arsia Mons is rivaled only by Olympus Mons in terms of its volume. The summit of Arsia Mons is more than 9 kilometers (5.6 miles) higher than the surrounding plains. The crater--or caldera--at the volcano summit is approximately 110 km (68 mi) across. This view of Arsia Mons was taken by the red and blue wide angle cameras of the Mars Global Surveyor Mars Orbiter Camera (MOC) system. Bright water ice clouds (the whitish/bluish wisps) hang above the volcano--a common sight every martian afternoon in this region. Arsia Mons is located at 120° west longitude and 9° south latitude. Illumination is from the left.

  9. An Educational PET Camera Model

    ERIC Educational Resources Information Center

    Johansson, K. E.; Nilsson, Ch.; Tegner, P. E.

    2006-01-01

    Positron emission tomography (PET) cameras are now in widespread use in hospitals. A model of a PET camera has been installed in Stockholm House of Science and is used to explain the principles of PET to school pupils as described here.

  10. Camera artifacts in IUE spectra

    NASA Technical Reports Server (NTRS)

    Bruegman, O. W.; Crenshaw, D. M.

    1994-01-01

    This study of emission-line mimicking features in the IUE cameras has produced an atlas of artifacts in high-dispersion images, with an accompanying table of prominent artifacts, a table of prominent artifacts in the raw images, and a median image of the sky background for each IUE camera.

  11. The "All Sky Camera Network"

    ERIC Educational Resources Information Center

    Caldwell, Andy

    2005-01-01

    In 2001, the "All Sky Camera Network" came to life as an outreach program to connect the Denver Museum of Nature and Science (DMNS) exhibit "Space Odyssey" with Colorado schools. The network is comprised of cameras placed strategically at schools throughout Colorado to capture fireballs--rare events that produce meteorites. Meteorites have great…

  12. CDM Equipment Center Cameras Quantity

    E-print Network

    Schaefer, Marcus

    CDM Equipment Center Cameras Quantity DSLRs Camera Lenses Canon 7D 18/28-135mm Kit Lens 28 Canon 6D Equipment Matthews RoadRags II Kit 24" x 36" Flag Kit 6 Matthews Baby Boa weight Bag 5lbs 5 Matthews Baby

  13. Mars Exploration Rover engineering cameras

    USGS Publications Warehouse

    Maki, J.N.; Bell, J.F., III; Herkenhoff, K.E.; Squyres, S.W.; Kiely, A.; Klimesh, M.; Schwochert, M.; Litwin, T.; Willson, R.; Johnson, Aaron H.; Maimone, M.; Baumgartner, E.; Collins, A.; Wadsworth, M.; Elliot, S.T.; Dingizian, A.; Brown, D.; Hagerott, E.C.; Scherr, L.; Deen, R.; Alexander, D.; Lorre, J.

    2003-01-01

    NASA's Mars Exploration Rover (MER) Mission will place a total of 20 cameras (10 per rover) onto the surface of Mars in early 2004. Fourteen of the 20 cameras are designated as engineering cameras and will support the operation of the vehicles on the Martian surface. Images returned from the engineering cameras will also be of significant importance to the scientific community for investigative studies of rock and soil morphology. The Navigation cameras (Navcams, two per rover) are a mast-mounted stereo pair each with a 45° square field of view (FOV) and an angular resolution of 0.82 milliradians per pixel (mrad/pixel). The Hazard Avoidance cameras (Hazcams, four per rover) are a body-mounted, front- and rear-facing set of stereo pairs, each with a 124° square FOV and an angular resolution of 2.1 mrad/pixel. The Descent camera (one per rover), mounted to the lander, has a 45° square FOV and will return images with spatial resolutions of ~4 m/pixel. All of the engineering cameras utilize broadband visible filters and 1024 x 1024 pixel detectors. Copyright 2003 by the American Geophysical Union.
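
    The quoted angular resolutions can be roughly cross-checked by spreading the FOV evenly over the 1024-pixel detector width (a naive estimate that ignores lens distortion, which is presumably why the Navcam's quoted 0.82 mrad/pixel differs slightly from it):

```python
import math

def ifov_mrad(fov_deg, pixels):
    """Naive per-pixel angular resolution: the FOV divided evenly
    across the detector width (ignores lens distortion)."""
    return math.radians(fov_deg) / pixels * 1000.0

print(round(ifov_mrad(124, 1024), 2))  # -> 2.11, close to the quoted 2.1 mrad/pixel
```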

  14. Digital Cameras for Student Use.

    ERIC Educational Resources Information Center

    Simpson, Carol

    1997-01-01

    Describes the features, equipment and operations of digital cameras and compares three different digital cameras for use in education. Price, technology requirements, features, transfer software, and accessories for the Kodak DC25, Olympus D-200L and Casio QV-100 are presented in a comparison table. (AEF)

  15. Star Identification Algorithm for Uncalibrated, Wide FOV Cameras

    NASA Astrophysics Data System (ADS)

    Ajdadi, Mohamad Javad; Ghafarzadeh, Mahdi; Taheri, Mojtaba; Mosadeq, Ehsan; Khakian Ghomi, Mahdi

    2015-06-01

    A novel method is proposed for star identification via uncalibrated cameras with wide fields of view (FOVs). In this approach some of the triangles created by the stars in the FOV are selected for pattern recognition. The triangles are selected considering the sensitivity of their interior angles to the calibration error. The algorithm is based on the intersection between sets of triangles that are found in the database for each selected triangle of the image. By this method, most of the image stars contribute to pattern recognition, and thereby it is very robust against noise and calibration error. The algorithm is performed on 150 night-sky images taken by an uncalibrated camera with a FOV of 114° ± 12°, achieving a success rate of 94% with no false positives. Based on the identification approach, an adaptive method is also developed for calibrating and obtaining the projection function of an uncalibrated camera.
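
    The interior angles that drive the triangle selection can be computed directly from three star centroids in the image plane (a minimal planar sketch, not the authors' implementation):

```python
import math

def interior_angles(p1, p2, p3):
    """Interior angles (radians) of the triangle formed by three
    image points, via dot products between edge vectors."""
    def angle_at(a, b, c):
        v1 = (b[0] - a[0], b[1] - a[1])   # edge a -> b
        v2 = (c[0] - a[0], c[1] - a[1])   # edge a -> c
        dot = v1[0] * v2[0] + v1[1] * v2[1]
        return math.acos(dot / (math.hypot(*v1) * math.hypot(*v2)))
    return (angle_at(p1, p2, p3), angle_at(p2, p3, p1), angle_at(p3, p1, p2))
```

    The three angles always sum to π, which makes a convenient sanity check when screening candidate triangles.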

  16. Narrow-band radiation wavelength measurement by processing digital photographs in RAW format

    SciTech Connect

    Kraiskii, A V; Mironova, T V; Sultanov, T T

    2012-12-31

    The technique of measuring the mean wavelength of narrow-band radiation in the 455 - 625-nm range using the image of the emitting surface is presented. The data from the camera array unprocessed by the built-in processor (RAW format) are used. The method is applied for determining the parameters of response of holographic sensors. Depending on the wavelength and brightness of the image fragment, the mean square deviation of the wavelength amounts to 0.3 - 3 nm. (experimental techniques)

  17. IMAX camera (12-IML-1)

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The IMAX camera system is used to record on-orbit activities of interest to the public. Because of the extremely high resolution of the IMAX camera, projector, and audio systems, the audience is afforded a motion picture experience unlike any other. IMAX and OMNIMAX motion picture systems were designed to create motion picture images of superior quality and audience impact. The IMAX camera is a 65 mm, single lens, reflex viewing design with a 15 perforation per frame horizontal pull across. The frame size is 2.06 x 2.77 inches. Film travels through the camera at a rate of 336 feet per minute when the camera is running at the standard 24 frames/sec.
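
    The quoted transport speed follows from the pull-across geometry, assuming the standard 65 mm film perforation pitch of about 0.187 inches (a figure not stated in the text):

```python
PERF_PITCH_IN = 0.187    # assumed 65 mm film perforation pitch, inches
PERFS_PER_FRAME = 15     # horizontal 15-perforation pull-across
FPS = 24                 # standard frame rate

inches_per_second = PERF_PITCH_IN * PERFS_PER_FRAME * FPS
feet_per_minute = inches_per_second * 60 / 12
print(round(feet_per_minute, 1))  # -> 336.6, matching the quoted 336 ft/min
```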

  18. Coherent infrared imaging camera (CIRIC)

    SciTech Connect

    Hutchinson, D.P.; Simpson, M.L.; Bennett, C.A.; Richards, R.K.; Emery, M.S.; Crutcher, R.I.; Sitter, D.N. Jr.; Wachter, E.A.; Huston, M.A.

    1995-07-01

    New developments in 2-D, wide-bandwidth HgCdTe (MCT) and GaAs quantum-well infrared photodetectors (QWIP) coupled with Monolithic Microwave Integrated Circuit (MMIC) technology are now making focal plane array coherent infrared (IR) cameras viable. Unlike conventional IR cameras which provide only thermal data about a scene or target, a coherent camera based on optical heterodyne interferometry will also provide spectral and range information. Each pixel of the camera, consisting of a single photo-sensitive heterodyne mixer followed by an intermediate frequency amplifier and illuminated by a separate local oscillator beam, constitutes a complete optical heterodyne receiver. Applications of coherent IR cameras are numerous and include target surveillance, range detection, chemical plume evolution, monitoring stack plume emissions, and wind shear detection.

  19. Accuracy in fixing ship's positions by camera survey of bearings

    NASA Astrophysics Data System (ADS)

    Naus, Krzysztof; Wąż, Mariusz

    2011-01-01

    The paper presents the results of research on the possibilities of fixing ship position coordinates based on the results of surveying bearings on navigational marks with the use of a CCD camera. Accuracy of the determination of ship position coordinates, expressed in terms of the mean error, was assumed to be the basic criterion of this estimation. The first part of the paper describes the method of determining the resolution and the mean error of an angle measurement taken with a camera, and also the method of determining the mean error of position coordinates when two or more bearings were measured. Three software applications were developed for producing navigational sea charts with accuracy areas mapped onto them. The second part contains the results of studying the accuracy of fixing ship position coordinates, carried out in the Gulf of Gdansk, with the use of bearings obtained with Rolleiflex and Sony cameras. The results are presented in the form of diagrams of the mean error of angle measurement, and also in the form of navigational charts with accuracy fields mapped onto them. In the final part, based on the results obtained, the applicability of CCD cameras to automating the process of coastal navigation is discussed.
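
    The underlying geometry can be sketched in the plane: each bearing on a mark with known coordinates defines a line of position, and two such lines intersect in the ship's position (a hypothetical minimal example, ignoring chart projection and measurement error):

```python
import math

def fix_from_two_bearings(mark1, brg1_deg, mark2, brg2_deg):
    """Ship position from bearings (degrees clockwise from north) taken
    on two marks with known (east, north) coordinates. Each bearing
    defines a line of position; the fix is their intersection."""
    rows, rhs = [], []
    for (mx, my), brg in ((mark1, brg1_deg), (mark2, brg2_deg)):
        b = math.radians(brg)
        # (mark - ship) must be parallel to (sin b, cos b), i.e.
        # cos(b)*x - sin(b)*y = cos(b)*mx - sin(b)*my
        rows.append((math.cos(b), -math.sin(b)))
        rhs.append(math.cos(b) * mx - math.sin(b) * my)
    (a11, a12), (a21, a22) = rows
    det = a11 * a22 - a12 * a21
    x = (rhs[0] * a22 - a12 * rhs[1]) / det
    y = (a11 * rhs[1] - rhs[0] * a21) / det
    return x, y
```

    With marks due north and due east of the ship the fix falls exactly on the true position; in practice the mean error grows as the two lines of position approach parallel, which is why the paper maps accuracy fields onto the charts.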

  20. Observation of Planetary Motion Using a Digital Camera

    ERIC Educational Resources Information Center

    Meyn, Jan-Peter

    2008-01-01

    A digital SLR camera with a standard lens (50 mm focal length, f/1.4) on a fixed tripod is used to obtain photographs of the sky which contain stars up to 8[superscript m] apparent magnitude. The angle of view is large enough to ensure visual identification of the photograph with a large sky region in a stellar map. The resolution is sufficient to…

  1. Critical Heat Flux In Inclined Rectangular Narrow Long Channel

    SciTech Connect

    J. L. Rempe; S. W. Noh; Y. H. Kim; K. Y. Suh; F.B.Cheung; S. B. Kim

    2005-05-01

    In the TMI-2 accident, the lower part of the reactor pressure vessel was overheated and then rather rapidly cooled down, as was later identified in a vessel investigation project. This pointed to the feasibility of gap cooling. For this reason, several investigations were performed to determine the critical heat flux (CHF) from the standpoint of in-vessel retention. The experiments were conducted to investigate the general boiling phenomena and the triggering mechanism for the CHF in a narrow gap, using a 5 x 105 mm² crevice-type heater assembly and demineralized water. The test parameters include a gap size of 5 mm and surface orientation angles from the downward-facing position (180°) to the vertical position (90°). The orientation angle affects the bubble layer and escape from the narrow gap. The CHF is less than that in a shorter channel, compared with the previous experiments having a heated length of 35 mm in the copper test section.

  2. 31. TACOMA NARROWS BRIDGE, LOOKING WEST ACROSS TOLL PLAZA, 29 ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    31. TACOMA NARROWS BRIDGE, LOOKING WEST ACROSS TOLL PLAZA, 29 AUGUST 1940. (ELDRIDGE, CLARK M. TACOMA NARROWS BRIDGE, TACOMA, WASHINGTON, FINAL REPORT ON DESIGN AND CONSTRUCTION, 1941) - Tacoma Narrows Bridge, Spanning Narrows at State Route 16, Tacoma, Pierce County, WA

  3. 30. TACOMA NARROWS BRIDGE, LOOKING EAST THROUGH TOLL LANES, 29 ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    30. TACOMA NARROWS BRIDGE, LOOKING EAST THROUGH TOLL LANES, 29 AUGUST 1940. (ELDRIDGE, CLARK H. TACOMA NARROWS BRIDGE, TACOMA, WASHINGTON, FINAL REPORT ON DESIGN AND CONSTRUCTION, 1941) - Tacoma Narrows Bridge, Spanning Narrows at State Route 16, Tacoma, Pierce County, WA

  4. 16 CFR 501.1 - Camera film.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ...Camera film. 501.1 Section 501.1 Commercial Practices FEDERAL TRADE...REQUIREMENTS AND PROHIBITIONS UNDER PART 500 § 501.1 Camera film. Camera film packaged...quantity statement requirements of part 500 of this chapter which specify...

  5. CSC418 / CSCD18 / CSC2504 Camera Models 6 Camera Models

    E-print Network

    Toronto, University of

    The word camera is Latin for "room"; camera obscura means "dark room." The earliest cameras were room-sized pinhole cameras, called camera obscuras: you would walk into the room and see an upside-down projection of the outside world on the far wall. These notes relate 18th-century camera obscuras to the camera model used by OpenGL.

  6. Reading Angles in Maps

    ERIC Educational Resources Information Center

    Izard, Véronique; O'Donnell, Evan; Spelke, Elizabeth S.

    2014-01-01

    Preschool children can navigate by simple geometric maps of the environment, but the nature of the geometric relations they use in map reading remains unclear. Here, children were tested specifically on their sensitivity to angle. Forty-eight children (age 47:15-53:30 months) were presented with fragments of geometric maps, in which angle sections…

  7. Focal Plane Metrology for the LSST Camera

    SciTech Connect

    Rasmussen, Andrew P.; Hale, Layton; Kim, Peter; Lee, Eric; Perl, Martin; Schindler, Rafe; Takacs, Peter; Thurston, Timothy

    2007-01-10

    Meeting the science goals for the Large Synoptic Survey Telescope (LSST) translates into a demanding set of imaging performance requirements for the optical system over a wide (3.5°) field of view. In turn, meeting those imaging requirements necessitates maintaining precise control of the focal plane surface (10 µm P-V) over the entire field of view (640 mm diameter) at the operating temperature (T ≈ -100 °C) and over the operational elevation angle range. We briefly describe the hierarchical design approach for the LSST Camera focal plane and the baseline design for assembling the flat focal plane at room temperature. Preliminary results of gravity load and thermal distortion calculations are provided, and early metrological verification of candidate materials under cold thermal conditions is presented. A detailed, generalized method for stitching together sparse metrology data originating from differential, non-contact metrological data acquisition spanning multiple (non-continuous) sensor surfaces making up the focal plane is described and demonstrated. Finally, we describe some in situ alignment verification alternatives, some of which may be integrated into the camera's focal plane.

  8. X-ray Pinhole Camera Measurements

    SciTech Connect

    Nelson, D. S.; Berninger, M. J.; Flores, P. A.; Good, D. E.; Henderson, D. J.; Hogge, K. W.; Huber, S. R.; Lutz, S. S.; Mitchell, S. E.; Howe, R. A.; Mitton, C. V.; Molina, I.; Bozman, D. R.; Cordova, S. R.; Mitchell, D. R.; Oliver, B. V.; Ormond, E. C.

    2013-07-01

    The development of the rod pinch diode [1] has led to high-resolution radiography for dynamic events such as explosive tests. Rod pinch diodes use a small diameter anode rod, which extends through the aperture of a cathode plate. Electrons borne off the aperture surface can self-insulate and pinch onto the tip of the rod, creating an intense, small x-ray source (Primary Pinch). This source has been utilized as the main diagnostic on numerous experiments that include high-value, single-shot events. In such applications there is an emphasis on machine reliability, x-ray reproducibility, and x-ray quality [2]. In tests with the baseline rod pinch diode, we have observed that an additional pinch (Secondary Pinch) occurs at the interface near the anode rod and the rod holder. This suggests that stray electrons exist that are not associated with the Primary Pinch. In this paper we present measurements on both pinches using an x-ray pinhole camera. The camera is placed downstream of the Primary Pinch at an angle of 60° with respect to the diode centerline. This diagnostic will be employed to diagnose x-ray reproducibility and quality. In addition, we will investigate the performance of hybrid diodes relating to the formation of the Primary and Secondary Pinches.

  9. Omnidirectional narrow bandpass filters based on one-dimensional superconductor-dielectric photonic crystal heterostructures

    NASA Astrophysics Data System (ADS)

    Barvestani, Jamal

    2015-01-01

    By using the transfer matrix method, narrow passbands of TE waves from one-dimensional superconductor-dielectric photonic crystal heterostructures are presented. Various superconductors within the two-fluid model are considered. Results show that by selecting proper widths for the superconductor and dielectric layers and proper materials, a single narrow passband in the visible region can be obtained. The behavior of these passbands versus the temperature of the superconductors, the external magnetic field, and the incident angle is considered. We have shown that it is possible to obtain omnidirectional passbands by examining the temperature, the dilation factor of the half part of a heterostructure, and the other parameters of the heterostructures. These tunable narrow passbands may be useful in designing narrow-band filters or multichannel filters.

  10. CCD Camera Observations

    NASA Astrophysics Data System (ADS)

    Buchheim, Bob; Argyle, R. W.

    One night late in 1918, astronomer William Milburn, observing the region of Cassiopeia from Reverend T.H.E.C. Espin's observatory in Tow Law (England), discovered a hitherto unrecorded double star (Wright 1993). He reported it to Rev. Espin, who measured the pair using his 24-in. reflector: the fainter star was 6.0 arcsec from the primary, at position angle 162.4° (i.e. the fainter star was south-by-southeast from the primary) (Espin 1919). Some time later, it was recognized that the astrograph of the Vatican Observatory had taken an image of the same star-field a dozen years earlier, in late 1906. At that earlier epoch, the fainter star had been separated from the brighter one by only 4.8 arcsec, at position angle 186.2° (i.e. almost due south). Were these stars a binary pair, or were they just two unrelated stars sailing past each other? Some additional measurements might have begun to answer this question. If the secondary star was following a curved path, that would be a clue of orbital motion; if it followed a straight-line path, that would be a clue that these are just two stars passing in the night. Unfortunately, nobody took the trouble to re-examine this pair for almost a century, until the 2MASS astrometric/photometric survey recorded it in late 1998. After almost another decade, this amateur astronomer took some CCD images of the field in 2007, and added another data point on the star's trajectory, as shown in Fig. 15.1.
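
    Measures like these, a separation ρ and a position angle θ counted from north through east, convert to east/north offsets of the secondary via Δeast = ρ sin θ and Δnorth = ρ cos θ; a short sketch:

```python
import math

def polar_to_offsets(rho_arcsec, pa_deg):
    """Convert a double-star measure (separation rho, position angle
    counted from north through east) into east/north offsets of the
    secondary relative to the primary, in arcseconds."""
    t = math.radians(pa_deg)
    return rho_arcsec * math.sin(t), rho_arcsec * math.cos(t)

# The two epochs quoted above; both offsets point mostly south.
east_1906, north_1906 = polar_to_offsets(4.8, 186.2)   # Vatican plate, almost due south
east_1918, north_1918 = polar_to_offsets(6.0, 162.4)   # Espin's measure, south-by-southeast
```

    Plotting such offsets across epochs is exactly how one distinguishes a curved (orbital) track from a straight-line passing encounter.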

  11. Person re-identification over camera networks using multi-task distance metric learning.

    PubMed

    Ma, Lianyang; Yang, Xiaokang; Tao, Dacheng

    2014-08-01

    Person reidentification in a camera network is a valuable yet challenging problem to solve. Existing methods learn a common Mahalanobis distance metric by using the data collected from different cameras and then exploit the learned metric for identifying people in the images. However, the cameras in a camera network have different settings and the recorded images are seriously affected by variability in illumination conditions, camera viewing angles, and background clutter. Using a common metric to conduct person reidentification tasks on different camera pairs overlooks the differences in camera settings; however, it is very time-consuming to label people manually in images from surveillance videos. For example, in most existing person reidentification data sets, only one image of a person is collected from each of only two cameras; therefore, directly learning a unique Mahalanobis distance metric for each camera pair is susceptible to over-fitting by using insufficiently labeled data. In this paper, we reformulate person reidentification in a camera network as a multitask distance metric learning problem. The proposed method designs multiple Mahalanobis distance metrics to cope with the complicated conditions that exist in typical camera networks. We address the fact that these Mahalanobis distance metrics are different but related, and are learned by adding joint regularization to alleviate over-fitting. Furthermore, by extending this formulation, we present a novel multitask maximally collapsing metric learning (MtMCML) model for person reidentification in a camera network. Experimental results demonstrate that formulating person reidentification over camera networks as a multitask distance metric learning problem can improve performance, and our proposed MtMCML works substantially better than other current state-of-the-art person reidentification methods. PMID:24956368
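
    At the core of such methods is the Mahalanobis distance d_M(x, y) = sqrt((x - y)^T M (x - y)) for a learned positive semidefinite matrix M; a minimal sketch with plain lists (the matrix below is illustrative, not a learned one):

```python
import math

def mahalanobis(x, y, M):
    """d_M(x, y) = sqrt((x - y)^T M (x - y)) for a positive
    semidefinite matrix M given as nested lists."""
    d = [a - b for a, b in zip(x, y)]
    Md = [sum(M[i][j] * d[j] for j in range(len(d))) for i in range(len(d))]
    return math.sqrt(sum(di * mdi for di, mdi in zip(d, Md)))
```

    With M the identity this reduces to the Euclidean distance; metric learning chooses M so that feature vectors of the same person are pulled together and those of different people are pushed apart, here one M per camera pair under joint regularization.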

  12. Vision Sensors and Cameras

    NASA Astrophysics Data System (ADS)

    Hoefflinger, Bernd

    Silicon charge-coupled-device (CCD) imagers have been and are a specialty market ruled by a few companies for decades. Based on CMOS technologies, active-pixel sensors (APS) began to appear in 1990 at the 1 µm technology node. These pixels allow random access, global shutters, and they are compatible with focal-plane imaging systems combining sensing and first-level image processing. The progress towards smaller features and towards ultra-low leakage currents has provided reduced dark currents and µm-size pixels. All chips offer Mega-pixel resolution, and many have very high sensitivities equivalent to ASA 12.800. As a result, HDTV video cameras will become a commodity. Because charge-integration sensors suffer from a limited dynamic range, significant processing effort is spent on multiple exposure and piece-wise analog-digital conversion to reach ranges >10,000:1. The fundamental alternative is log-converting pixels with an eye-like response. This offers a range of almost a million to 1, constant contrast sensitivity and constant colors, important features in professional, technical and medical applications. 3D retino-morphic stacking of sensing and processing on top of each other is being revisited with sub-100 nm CMOS circuits and with TSV technology. With sensor outputs directly on top of neurons, neural focal-plane processing will regain momentum, and new levels of intelligent vision will be achieved. The industry push towards thinned wafers and TSV enables backside-illuminated and other pixels with a 100% fill-factor. 3D vision, which relies on stereo or on time-of-flight, high-speed circuitry, will also benefit from scaled-down CMOS technologies both because of their size as well as their higher speed.

  13. A lexicon for Camera Obscura

    E-print Network

    Rosinsky, Robert David

    1984-01-01

    The camera obscura has allowed artists, scientists, and philosophers to view the world as a flat image. Two - dimensional renditions of visual reality seem to be more manageable and easier to grasp than reality itself. A ...

  14. An Inexpensive Digital Infrared Camera

    ERIC Educational Resources Information Center

    Mills, Allan

    2012-01-01

    Details are given for the conversion of an inexpensive webcam to a camera specifically sensitive to the near infrared (700-1000 nm). Some experiments and practical applications are suggested and illustrated. (Contains 9 figures.)

  15. Infants Experience Perceptual Narrowing for Nonprimate Faces

    ERIC Educational Resources Information Center

    Simpson, Elizabeth A.; Varga, Krisztina; Frick, Janet E.; Fragaszy, Dorothy

    2011-01-01

    Perceptual narrowing--a phenomenon in which perception is broad from birth, but narrows as a function of experience--has previously been tested with primate faces. In the first 6 months of life, infants can discriminate among individual human and monkey faces. Though the ability to discriminate monkey faces is lost after about 9 months, infants…

  16. Do narrow Σ-hypernuclear states exist?

    SciTech Connect

    Chrien, R.E.

    1995-12-31

    Reports of narrow states in Σ-hypernucleus production have appeared from time to time. The present experiment is a repeat of the first and seemingly most definitive such experiment, that on a target of ⁹Be, but with much better statistics. No narrow states were observed.

  17. Method of rotation angle measurement in machine vision based on calibration pattern with spot array

    SciTech Connect

    Li Weimin; Jin Jing; Li Xiaofeng; Li Bin

    2010-02-20

    We propose a method for rotation angle measurement with high precision in machine vision. An area-scan CCD camera, an imaging lens, and a calibration pattern with a spot array make up the measurement device. The calibration pattern with a spot array is installed on the rotating part, and the CCD camera is set at a certain distance from the rotating components. The coordinates of the spots on the calibration pattern are acquired from the vision image of the calibration pattern captured by the CCD camera. At the initial position of the calibration pattern, the camera is calibrated with the spot array, and a mathematical model of the distortion error of the CCD camera is built. With the equation of coordinate rotation measurement, the rotation angle of the spot array is detected. In a theoretical simulation, noise of different levels is added to the coordinates of the spot array. The experimental results show that the measurement device can measure the rotation angle precisely with a noncontact method. The standard deviation of the rotation angle measurement is smaller than 3 arc sec. The device can measure both microangles and large angles.
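
    A standard way to recover the rotation angle from matched spot coordinates is a Procrustes-style least-squares estimate (a sketch; the paper's exact formulation, including its camera distortion model, may differ):

```python
import math

def rotation_angle(before, after):
    """Least-squares rotation angle (radians) between two matched 2-D
    spot sets, after removing their centroids."""
    n = len(before)
    cbx = sum(p[0] for p in before) / n
    cby = sum(p[1] for p in before) / n
    cax = sum(p[0] for p in after) / n
    cay = sum(p[1] for p in after) / n
    s_cross = s_dot = 0.0
    for (x, y), (u, v) in zip(before, after):
        x, y = x - cbx, y - cby
        u, v = u - cax, v - cay
        s_cross += x * v - y * u   # sum of cross products
        s_dot += x * u + y * v     # sum of dot products
    return math.atan2(s_cross, s_dot)
```

    Because every spot contributes to the two sums, centroid noise averages out, which is how a spot array can reach arcsecond-level angle precision.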

  18. The future of consumer cameras

    NASA Astrophysics Data System (ADS)

    Battiato, Sebastiano; Moltisanti, Marco

    2015-03-01

    In the last two decades multimedia devices, and in particular imaging devices (camcorders, tablets, mobile phones, etc.), have diffused dramatically. Moreover, the increase in their computational performance, combined with higher storage capability, allows them to process large amounts of data. In this paper an overview of the current trends in the consumer camera market and technology will be given, providing also some details about the recent past (from the Digital Still Camera up to today) and forthcoming key issues.

  19. Solid State Television Camera (CID)

    NASA Technical Reports Server (NTRS)

    Steele, D. W.; Green, W. T.

    1976-01-01

    The design, development, and testing are described of a charge injection device (CID) camera using a 244 x 248 element array. A number of video signal processing functions are included which maximize the output video dynamic range while retaining the inherently good resolution response of the CID. Some of the unique features of the camera are: low-light-level performance, high S/N ratio, antiblooming, low geometric distortion, sequential scanning and AGC.

  20. Narrow band gap amorphous silicon semiconductors

    DOEpatents

    Madan, A.; Mahan, A.H.

    1985-01-10

    Disclosed is a narrow band gap amorphous silicon semiconductor comprising an alloy of amorphous silicon and a band gap narrowing element selected from the group consisting of Sn, Ge, and Pb, with an electron donor dopant selected from the group consisting of P, As, Sb, Bi and N. The process for producing the narrow band gap amorphous silicon semiconductor comprises the steps of forming an alloy comprising amorphous silicon and at least one of the aforesaid band gap narrowing elements in amount sufficient to narrow the band gap of the silicon semiconductor alloy below that of amorphous silicon, and also utilizing sufficient amounts of the aforesaid electron donor dopant to maintain the amorphous silicon alloy as an n-type semiconductor.

  1. Streak camera dynamic range optimization

    SciTech Connect

    Wiedwald, J.D.; Lerche, R.A.

    1987-09-01

    The LLNL optical streak camera is used by the Laser Fusion Program in a wide range of applications. Many of these applications require a large recorded dynamic range. Recent work has focused on maximizing the dynamic range of the streak camera recording system. For our streak cameras, image intensifier saturation limits the upper end of the dynamic range. We have developed procedures to set the image intensifier gain such that the system dynamic range is maximized. Specifically, the gain is set such that a single streak tube photoelectron is recorded with an exposure of about five times the recording system noise. This ensures detection of single photoelectrons, while not consuming intensifier or recording system dynamic range through excessive intensifier gain. The optimum intensifier gain has been determined for two types of film and for a lens-coupled CCD camera. We have determined that by recording the streak camera image with a CCD camera, the system is shot-noise limited up to the onset of image intensifier nonlinearity. When recording on film, the film determines the noise at high exposure levels. There is discussion of the effects of slit width and image intensifier saturation on dynamic range. 8 refs.

  2. Method for pan-tilt camera calibration using single control point.

    PubMed

    Li, Yunting; Zhang, Jun; Hu, Wenwen; Tian, Jinwen

    2015-01-01

    The pan-tilt (PT) camera is widely used in video surveillance systems due to its rotatable property and low cost. The rough output of a PT camera may not satisfy the demands of practical applications; hence an accurate calibration method for a PT camera is desired. However, high-precision camera calibration methods usually require sufficient control points, which are not guaranteed in some practical cases of a PT camera. In this paper, we present a novel method to calibrate the rotation angles of a PT camera online by using only one control point. This is achieved by assuming that the intrinsic parameters and position of the camera are known in advance. More specifically, we first build a nonlinear PT camera model with respect to two parameters, Pan and Tilt. We then convert the nonlinear model into a linear model in the sine and cosine of Tilt, where each element in the augmented coefficient matrix is a function of the single variable Pan. A closed-form solution for Pan and Tilt can then be derived by solving a quadratic equation in the tangent of Pan. Our method is noniterative and does not need feature matching; thus it is more time-efficient. We evaluate our calibration method on various synthetic and real data. The quantitative results demonstrate that the proposed method outperforms other state-of-the-art methods if the intrinsic parameters and position of the camera are known in advance. PMID:26366500
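
    The closed-form idea can be illustrated with a deliberately simplified model (not the paper's full nonlinear formulation): once the intrinsics and camera position are known, the world-frame direction from the camera to the single control point fixes Pan and Tilt directly. A hypothetical sketch with pan about the vertical axis and tilt as elevation:

```python
import math

def pan_tilt_from_point(cam_pos, point):
    """Pan/Tilt (degrees) aiming a camera at one control point, under a
    simplified ideal pan-tilt model: pan measured from +Y about the
    vertical axis, tilt as elevation above the horizontal plane.
    An illustrative model, not the paper's nonlinear one."""
    dx = point[0] - cam_pos[0]
    dy = point[1] - cam_pos[1]
    dz = point[2] - cam_pos[2]
    pan = math.degrees(math.atan2(dx, dy))
    tilt = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    return pan, tilt
```

    The paper's contribution is handling the general nonlinear case in closed form; this sketch only shows why one known point plus known intrinsics and position can suffice to pin down both angles.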

  3. Yearly comparisons of the martian polar caps: 1999-2003 Mars Orbiter Camera observations

    NASA Astrophysics Data System (ADS)

    Benson, Jennifer L.; James, Philip B.

    2005-04-01

    The Mars Global Surveyor Mars Orbiter Camera wide-angle cameras were used to obtain images of the north and south seasonal and residual polar caps between 1999 and 2003. Wide-angle red camera images were used in assembling mosaics of the north and south polar recessions, and regression rates were measured and compared. There are small variations in the north polar recession between 2000 and 2002, especially between Ls = 7° and Ls = 50°; however, there is no evidence for the plateau in the recession curves that has been observed in some prior years. The south polar recession changes very little from year to year, and the 2001 dust storm had little if any effect on the average cap recession that year. Albedo values of the geographic north pole were measured using wide-angle red and blue camera images, and the residual south polar cap configuration was compared between the three years observed by MOC. The albedo of the geographic north pole generally varies between 0.5 and 0.6 as measured from MOC wide-angle red camera images. There were only minor variations near the edges of the residual south polar cap between the three years examined.

  4. Angles, Time, and Proportion

    ERIC Educational Resources Information Center

    Pagni, David L.

    2005-01-01

    This article describes an investigation making connections between the time on an analog clock and the angle between the minute hand and the hour hand. It was posed by a middle school mathematics teacher. (Contains 8 tables and 6 figures.)

  5. Multi-Angle View of the Canary Islands

    NASA Technical Reports Server (NTRS)

    2000-01-01

    A multi-angle view of the Canary Islands in a dust storm, 29 February 2000. At left is a true-color image taken by the Multi-angle Imaging SpectroRadiometer (MISR) instrument on NASA's Terra satellite. This image was captured by the MISR camera looking at a 70.5-degree angle to the surface, ahead of the spacecraft. The middle image was taken by the MISR downward-looking (nadir) camera, and the right image is from the aftward 70.5-degree camera. The images are reproduced using the same radiometric scale, so variations in brightness, color, and contrast represent true variations in surface and atmospheric reflectance with angle. Windblown dust from the Sahara Desert is apparent in all three images, and is much brighter in the oblique views. This illustrates how MISR's oblique imaging capability makes the instrument a sensitive detector of dust and other particles in the atmosphere. Data for all channels are presented in a Space Oblique Mercator map projection to facilitate their co-registration. The images are about 400 km (250 miles) wide, with a spatial resolution of about 1.1 kilometers (1,200 yards). North is toward the top. MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is a division of the California Institute of Technology.

  6. 'Magic Angle Precession'

    SciTech Connect

    Binder, Bernd

    2008-01-21

    An advanced and exact geometric description of nonlinear precession dynamics modeling very accurately natural and artificial couplings showing Lorentz symmetry is derived. In the linear description it is usually ignored that the geometric phase of relativistic motion couples back to the orbital motion providing for a non-linear recursive precession dynamics. The high coupling strength in the nonlinear case is found to be a gravitomagnetic charge proportional to the precession angle and angular velocity generated by geometric phases, which are induced by high-speed relativistic rotations and are relevant to propulsion technologies but also to basic interactions. In the quantum range some magic precession angles indicating strong coupling in a phase-locked chaotic system are identified, emerging from a discrete time dynamical system known as the cosine map showing bifurcations at special precession angles relevant to heavy nuclei stability. The 'Magic Angle Precession' (MAP) dynamics can be simulated and visualized by cones rolling in or on each other, where the apex and precession angles are indexed by spin, charge or precession quantum numbers, and corresponding magic angles. The most extreme relativistic warping and twisting effect is given by the Dirac spinor half spin constellation with 'Hyperdiamond' MAP, which resembles quark confinement.

  7. Campus Security Camera Issued: April 2009

    E-print Network

    1.0 BACKGROUND AND PURPOSE Closed circuit television (CCTV) cameras are used in various locations on the Colorado School of Mines campus. CCTV cameras, also known as security cameras, are utilized with the network technology appropriate to the use of CCTV cameras. 2.0 POLICY The Colorado School of Mines may use

  8. Photometric Calibration of Consumer Video Cameras

    NASA Technical Reports Server (NTRS)

    Suggs, Robert; Swift, Wesley, Jr.

    2007-01-01

    Equipment and techniques have been developed to implement a method of photometric calibration of consumer video cameras for imaging of objects that are sufficiently narrow or sufficiently distant to be optically equivalent to point or line sources. Heretofore, it has been difficult to calibrate consumer video cameras, especially in cases of image saturation, because they exhibit nonlinear responses with dynamic ranges much smaller than those of scientific-grade video cameras. The present method not only takes this difficulty in stride but also makes it possible to extend effective dynamic ranges to several powers of ten beyond saturation levels. The method will likely be primarily useful in astronomical photometry. There are also potential commercial applications in medical and industrial imaging of point or line sources in the presence of saturation. This development was prompted by the need to measure brightnesses of debris in amateur video images of the breakup of the Space Shuttle Columbia. The purpose of these measurements is to use the brightness values to estimate relative masses of debris objects. In most of the images, the brightness of the main body of Columbia was found to exceed the dynamic ranges of the cameras. A similar problem arose a few years ago in the analysis of video images of Leonid meteors. The present method is a refined version of the calibration method developed to solve the Leonid calibration problem. In this method, one performs an end-to-end calibration of the entire imaging system, including not only the imaging optics and imaging photodetector array but also analog tape recording and playback equipment (if used) and any frame grabber or other analog-to-digital converter (if used).
To automatically incorporate the effects of nonlinearity and any other distortions into the calibration, the calibration images are processed in precisely the same manner as are the images of meteors, space-shuttle debris, or other objects that one seeks to analyze. The light source used to generate the calibration images is an artificial variable star comprising a Newtonian collimator illuminated by a light source modulated by a rotating variable neutral-density filter. This source acts as a point source, the brightness of which varies at a known rate. A video camera to be calibrated is aimed at this source. Fixed neutral-density filters are inserted in or removed from the light path as needed to make the video image of the source appear to fluctuate between dark and saturated bright. The resulting video-image data are analyzed by use of custom software that determines the integrated signal in each video frame and determines the system response curve (measured output signal versus input brightness). These determinations constitute the calibration, which is thereafter used in automatic, frame-by-frame processing of the data from the video images to be analyzed.
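The end-to-end calibration described above amounts to measuring a system response curve and then inverting it to recover brightness. A minimal sketch of that idea, using an assumed saturating response in place of a real camera (the function names and the response model are illustrative, not the actual system):

```python
import numpy as np

# Synthetic "camera": a saturating nonlinear response, standing in for
# the measured output-vs-brightness curve the calibration produces.
def camera_response(brightness, sat=255.0, k=0.01):
    return sat * (1.0 - np.exp(-k * brightness))

# Calibration step: record output at known input brightnesses
# (the artificial variable star with its known modulation rate).
cal_brightness = np.linspace(0.0, 1000.0, 200)
cal_output = camera_response(cal_brightness)

# Analysis step: invert the measured response curve by interpolation
# to recover input brightness from an observed output signal.
def invert_response(output):
    return np.interp(output, cal_output, cal_brightness)

observed = camera_response(np.array([50.0, 300.0, 700.0]))
recovered = invert_response(observed)
# recovered approximates [50, 300, 700] within interpolation error
```

Because the same inversion is applied frame by frame to the science images, any nonlinearity captured during calibration is automatically compensated in the analysis.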

  9. Video camera use at nuclear power plants

    SciTech Connect

    Estabrook, M.L.; Langan, M.O.; Owen, D.E. )

    1990-08-01

    A survey of US nuclear power plants was conducted to evaluate video camera use in plant operations and to determine the equipment used and the benefits realized. Basic closed circuit television (CCTV) camera systems are described and video camera operation principles are reviewed. Plant approaches for implementing video camera use are discussed, as are equipment selection issues such as setting task objectives, radiation effects on cameras, and the use of disposable cameras. Specific plant applications are presented and the video equipment used is described. The benefits of video camera use, mainly reduced radiation exposure and increased productivity, are discussed and quantified. 15 refs., 6 figs.

  10. Wide field camera observations of Baade's Window

    NASA Technical Reports Server (NTRS)

    Holtzman, Jon A.; Light, R. M.; Baum, William A.; Worthey, Guy; Faber, S. M.; Hunter, Deidre A.; O'Neil, Earl J., Jr.; Kreidl, Tobias J.; Groth, E. J.; Westphal, James A.

    1993-01-01

    We have observed a field in Baade's Window using the Wide Field Camera (WFC) of the Hubble Space Telescope (HST) and obtain V- and I-band photometry down to V approximately 22.5. These data go several magnitudes fainter than previously obtained from the ground. The location of the break in the luminosity function suggests that there are a significant number of intermediate age (less than 10 Gyr) stars in the Galactic bulge. This conclusion rests on the assumptions that the extinction towards our field is similar to that seen in other parts of Baade's Window, that the distance to the bulge is approximately 8 kpc, and that we can determine fairly accurate zero points for the HST photometry. Changes in any one of these assumptions could increase the inferred age, but a conspiracy of lower reddening, a shorter distance to the bulge, and/or photometric zero-point errors would be needed to imply a population entirely older than 10 Gyr. We infer an initial mass function slope for the main-sequence stars, and find that it is consistent with that measured in the solar neighborhood; unfortunately, the slope is poorly constrained because we sample only a narrow range of stellar mass and because of uncertainties in the observed luminosity function at the faint end.

  11. WIDE-FIELD ASTRONOMICAL MULTISCALE CAMERAS

    SciTech Connect

    Marks, Daniel L.; Brady, David J.

    2013-05-15

    In order to produce sufficiently low aberrations with a large aperture, telescopes have a limited field of view. Because of this narrow field, large areas of the sky at a given time are unobserved. We propose several telescopes based on monocentric reflective, catadioptric, and refractive objectives that may be scaled to wide fields of view and achieve 1.1 arcsecond resolution, which in most locations is the practical seeing limit of the atmosphere. The reflective and Schmidt catadioptric objectives have relatively simple configurations and enable large fields to be captured at the expense of the obscuration of the mirror by secondary optics, a defect that may be managed by image plane design. The refractive telescope design does not have an obscuration but the objective has substantial bulk. The refractive design is a 38 gigapixel camera which consists of a single monocentric objective and 4272 microcameras. Monocentric multiscale telescopes, with their wide fields of view, may observe phenomena that might otherwise be unnoticed, such as supernovae, glint from orbital space debris, and near-earth objects.

  12. The Clementine longwave infrared camera

    SciTech Connect

    Priest, R.E.; Lewis, I.T.; Sewall, N.R.; Park, H.S.; Shannon, M.J.; Ledebuhr, A.G.; Pleasance, L.D.; Massie, M.A.; Metschuleit, K.

    1995-04-01

    The Clementine mission provided the first ever complete, systematic surface mapping of the moon from the ultra-violet to the near-infrared regions. More than 1.7 million images of the moon, earth and space were returned from this mission. The longwave-infrared (LWIR) camera supplemented the UV/Visible and near-infrared mapping cameras providing limited strip coverage of the moon, giving insight to the thermal properties of the soils. This camera provided approximately 100 m spatial resolution at 400 km periselene, and a 7 km across-track swath. This 2.1 kg camera using a 128 x 128 Mercury-Cadmium-Telluride (MCT) FPA viewed thermal emission of the lunar surface and lunar horizon in the 8.0 to 9.5 µm wavelength region. A description of this light-weight, low power LWIR camera along with a summary of lessons learned is presented. Design goals and preliminary on-orbit performance estimates are addressed in terms of meeting the mission's primary objective for flight qualifying the sensors for future Department of Defense flights.

  13. The fly's eye camera system

    NASA Astrophysics Data System (ADS)

    Mészáros, L.; Pál, A.; Csépány, G.; Jaskó, A.; Vida, K.; Oláh, K.; Mezö, G.

    2014-12-01

    We introduce the Fly's Eye Camera System, an all-sky monitoring device intended to perform time domain astronomy. This camera system design will provide complementary data sets for other synoptic sky surveys such as LSST or Pan-STARRS. The effective field of view is obtained by 19 cameras arranged in a spherical mosaic form. These individual cameras of the device stand on a hexapod mount that is fully capable of achieving sidereal tracking for the subsequent exposures. This platform has many advantages. First of all it requires only one type of moving component and does not include unique parts. Hence this design not only eliminates problems implied by unique elements, but the redundancy of the hexapod allows smooth operations even if one or two of the legs are stuck. In addition, it can calibrate itself by observed stars independently from both the geographical location (including northern and southern hemispheres) and the polar alignment of the full mount. All mechanical elements and electronics are designed at our institute, Konkoly Observatory. Currently, our instrument is in its testing phase with an operating hexapod and a reduced number of cameras.

  14. Exploring the Moon at High-Resolution: First Results From the Lunar Reconnaissance Orbiter Camera (LROC)

    NASA Astrophysics Data System (ADS)

    Robinson, Mark; Hiesinger, Harald; McEwen, Alfred; Jolliff, Brad; Thomas, Peter C.; Turtle, Elizabeth; Eliason, Eric; Malin, Mike; Ravine, A.; Bowman-Cisneros, Ernest

    The Lunar Reconnaissance Orbiter (LRO) spacecraft was launched on an Atlas V 401 rocket from the Cape Canaveral Air Force Station Launch Complex 41 on June 18, 2009. After spending four days in Earth-Moon transit, the spacecraft entered a three month commissioning phase in an elliptical 30×200 km orbit. On September 15, 2009, LRO began its planned one-year nominal mapping mission in a quasi-circular 50 km orbit. A multi-year extended mission in a fixed 30×200 km orbit is optional. The Lunar Reconnaissance Orbiter Camera (LROC) consists of a Wide Angle Camera (WAC) and two Narrow Angle Cameras (NACs). The WAC is a 7-color push-frame camera, which images the Moon at 100 and 400 m/pixel in the visible and UV, respectively, while the two NACs are monochrome narrow-angle linescan imagers with 0.5 m/pixel spatial resolution. LROC was specifically designed to address two of the primary LRO mission requirements and six other key science objectives, including 1) assessment of meter- and smaller-scale features in order to select safe sites for potential lunar landings near polar resources and elsewhere on the Moon; 2) acquisition of multi-temporal synoptic 100 m/pixel images of the poles during every orbit to unambiguously identify regions of permanent shadow and permanent or near permanent illumination; 3) meter-scale mapping of regions with permanent or near-permanent illumination of polar massifs; 4) repeat observations of potential landing sites and other regions to derive high resolution topography; 5) global multispectral observations in seven wavelengths to characterize lunar resources, particularly ilmenite; 6) a global 100-m/pixel basemap with incidence angles (60°-80°) favorable for morphological interpretations; 7) sub-meter imaging of a variety of geologic units to characterize their physical properties, the variability of the regolith, and other key science questions; 8) meter-scale coverage overlapping with Apollo-era panoramic images (1-2 m/pixel) to document the number of small impacts since 1971-1972. LROC allows us to determine the recent impact rate of bolides in the size range of 0.5 to 10 meters, which is currently not well known. Determining the impact rate at these sizes enables engineering remediation measures for future surface operations and interplanetary travel. The WAC has imaged nearly the entire Moon in seven wavelengths. A preliminary global WAC stereo-based topographic model is in preparation [1] and global color processing is underway [2]. As the mission progresses repeat global coverage will be obtained as lighting conditions change, providing a robust photometric dataset. The NACs are revealing a wealth of morphologic features at the meter scale, providing the engineering and science constraints needed to support future lunar exploration. All of the Apollo landing sites have been imaged, as well as the majority of robotic landing and impact sites. Through the use of off-nadir slews a collection of stereo pairs is being acquired that enables 5-m scale topographic mapping [3-7]. Impact morphologies (terraces, impact melt, rays, etc.) are preserved in exquisite detail at all Copernican craters and are enabling new studies of impact mechanics and crater size-frequency distribution measurements [8-12]. Other topical studies including, for example, lunar pyroclastics, domes, and tectonics are underway [e.g., 10-17]. The first PDS data release of LROC data will be in March 2010, and will include all images from the commissioning phase and the first 3 months of the mapping phase. [1] Scholten et al. (2010) 41st LPSC, #2111; [2] Denevi et al. (2010a) 41st LPSC, #2263; [3] Beyer et al. (2010) 41st LPSC, #2678; [4] Archinal et al. (2010) 41st LPSC, #2609; [5] Mattson et al. (2010) 41st LPSC, #1871; [6] Tran et al. (2010) 41st LPSC, #2515; [7] Oberst et al. (2010) 41st LPSC, #2051; [8] Bray et al. (2010) 41st LPSC, #2371; [9] Denevi et al. (2010b) 41st LPSC, #2582; [10] Hiesinger et al. (2010a) 41st LPSC, #2278; [11] Hiesinger et al. (2010b) 41st LPSC, #2304; [12] van der Bogert et al. (2010) 41st LPSC, #2165;

  15. Measurement of camera image sensor depletion thickness with cosmic rays

    E-print Network

    J. Vandenbroucke; S. BenZvi; S. Bravo; K. Jensen; P. Karn; M. Meehan; J. Peacock; M. Plewa; T. Ruggles; M. Santander; D. Schultz; A. L. Simons; D. Tosi

    2015-10-30

    Camera image sensors can be used to detect ionizing radiation in addition to optical photons. In particular, cosmic-ray muons are detected as long, straight tracks passing through multiple pixels. The distribution of track lengths can be related to the thickness of the active (depleted) region of the camera image sensor through the known angular distribution of muons at sea level. We use a sample of cosmic-ray muon tracks recorded by the Distributed Electronic Cosmic-ray Observatory to measure the thickness of the depletion region of the camera image sensor in a commercial smart phone, the HTC Wildfire S. The track length distribution prefers a cosmic-ray muon angular distribution over an isotropic distribution. Allowing either distribution, we measure the depletion thickness to be between 13.9 μm and 27.7 μm. The same method can be applied to additional models of image sensor. Once measured, the thickness can be used to convert track length to incident polar angle on a per-event basis. Combined with a determination of the incident azimuthal angle directly from the track orientation in the sensor plane, this enables direction reconstruction of individual cosmic-ray events.
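The per-event conversion described in the last two sentences is simple trigonometry. A sketch, with assumed pixel pitch and depletion thickness (not the paper's measured values):

```python
import math

def muon_angles(x0, y0, x1, y1, pixel_pitch_um, depletion_um):
    """Reconstruct incident polar and azimuthal angles from a straight
    muon track recorded in a camera image sensor.

    (x0, y0) -> (x1, y1): track endpoints in pixel coordinates.
    The projected track length L and the depletion thickness t give the
    polar angle via tan(theta) = L / t; the azimuthal angle comes
    directly from the track orientation in the sensor plane.
    """
    dx = (x1 - x0) * pixel_pitch_um
    dy = (y1 - y0) * pixel_pitch_um
    length_um = math.hypot(dx, dy)               # projected track length
    theta = math.atan2(length_um, depletion_um)  # polar angle, radians
    phi = math.atan2(dy, dx)                     # azimuthal angle, radians
    return theta, phi

# Illustrative numbers only: 1.4 um pixels, 20 um depletion thickness.
theta, phi = muon_angles(0, 0, 30, 40, pixel_pitch_um=1.4, depletion_um=20.0)
# projected length = 50 px * 1.4 um = 70 um, so tan(theta) = 70/20
```

Together, theta and phi give the full incident direction of a single event, which is the reconstruction the paper's thickness measurement enables.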

  16. Narrow deeply bound $K^-$ atomic states

    E-print Network

    E. Friedman; A. Gal

    1999-05-30

    Using optical potentials fitted to a comprehensive set of strong interaction level shifts and widths in $K^-$ atoms, we predict that the $K^-$ atomic levels which are inaccessible in the atomic cascade process are generally narrow, spanning a range of widths about 50 - 1500 keV over the entire periodic table. The mechanism for this narrowing is different from the mechanism for narrowing of pionic atom levels. Examples of such `deeply bound' $K^-$ atomic states are given, showing that in many cases these states should be reasonably well resolved. Several reactions which could be used to form these `deeply bound' states are mentioned. Narrow deeply bound states are expected also in $\\bar{p}$ atoms.

  17. Race Gap in Life Expectancy Is Narrowing

    MedlinePLUS

    Difference is now ... Black Americans are catching up to whites in life expectancy -- largely due to declining rates of death ...

  18. Narrow Vertical Caves: Mapping Volcanic Fissure Geometries

    NASA Astrophysics Data System (ADS)

    Parcheta, C.; Nash, J.; Parness, A.; Mitchell, K. L.; Pavlov, C. A.

    2015-10-01

    Volcanic conduits are difficult to quantify, but their geometry fundamentally influences how eruptions occur. We robotically map old fissure conduits – elongated narrow cracks in the ground that transported magma to the surface during an eruption.

  19. Camera-on-a-Chip

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Jet Propulsion Laboratory's research on a second generation, solid-state image sensor technology has resulted in the Complementary Metal-Oxide Semiconductor (CMOS) Active Pixel Sensor, establishing an alternative to the Charge-Coupled Device (CCD). Photobit Corporation, the leading supplier of CMOS image sensors, has commercialized two products of their own based on this technology: the PB-100 and PB-300. These devices are cameras on a chip, combining all camera functions. CMOS "active-pixel" digital image sensors offer several advantages over CCDs, a technology used in video and still-camera applications for 30 years. The CMOS sensors draw less energy, they use the same manufacturing platform as most microprocessors and memory chips, and they allow on-chip programming of frame size, exposure, and other parameters.

  20. Perceptual Color Characterization of Cameras

    PubMed Central

    Vazquez-Corral, Javier; Connah, David; Bertalmío, Marcelo

    2014-01-01

    Color camera characterization, mapping outputs from the camera sensors to an independent color space, such as XYZ, is an important step in the camera processing pipeline. Until now, this procedure has been primarily solved by using a 3 × 3 matrix obtained via a least-squares optimization. In this paper, we propose to use the spherical sampling method, recently published by Finlayson et al., to perform a perceptual color characterization. In particular, we search for the 3 × 3 matrix that minimizes three different perceptual errors, one pixel based and two spatially based. For the pixel-based case, we minimize the CIE ΔE error, while for the spatial-based case, we minimize both the S-CIELAB error and the CID error measure. Our results demonstrate an improvement of approximately 3% for the ΔE error, 7% for the S-CIELAB error and 13% for the CID error measures. PMID:25490586
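The baseline least-squares characterization that the paper improves upon can be sketched in a few lines (the training data here are synthetic and the matrix is an arbitrary illustration):

```python
import numpy as np

# Baseline color characterization: find the 3x3 matrix M minimizing
# ||R @ M - X||^2 over training pairs of camera RGB rows (R) and
# reference XYZ rows (X). The paper's perceptual variant instead searches
# candidate matrices (via spherical sampling) to minimize perceptual
# error metrics such as CIE Delta-E, S-CIELAB, or CID.
rng = np.random.default_rng(0)
M_true = np.array([[0.6, 0.3, 0.1],
                   [0.2, 0.7, 0.1],
                   [0.0, 0.1, 0.9]])
rgb = rng.random((50, 3))    # synthetic camera responses
xyz = rgb @ M_true           # noiseless synthetic reference values

M_fit, *_ = np.linalg.lstsq(rgb, xyz, rcond=None)
# With noiseless linear data, lstsq recovers M_true to floating-point error.
```

Real sensor data are noisy and not exactly linear, which is why minimizing a perceptual error rather than the raw residual can yield visibly better characterizations.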

  1. Cameras for semiconductor process control

    NASA Technical Reports Server (NTRS)

    Porter, W. A.; Parker, D. L.

    1977-01-01

    The application of X-ray topography to semiconductor process control is described, considering the novel features of the high-speed camera and the difficulties associated with this technique. The most significant results on the effects of material defects on device performance are presented, including results obtained using wafers processed entirely within this institute. Defects were identified using the X-ray camera and correlations were made with probe data. Temperature-dependent effects of material defects are also included. Recent applications and improvements of X-ray topographs of silicon-on-sapphire and gallium arsenide are presented, with a description of a real-time TV system prototype and of the most recent vacuum chuck design. The promotion of the camera's use by various semiconductor manufacturers is also discussed.

  2. The GISMO-2 Bolometer Camera

    NASA Technical Reports Server (NTRS)

    Staguhn, Johannes G.; Benford, Dominic J.; Fixsen, Dale J.; Hilton, Gene; Irwin, Kent D.; Jhabvala, Christine A.; Kovacs, Attila; Leclercq, Samuel; Maher, Stephen F.; Miller, Timothy M.; Moseley, Samuel H.; Sharp, Elemer H.; Wollack, Edward J.

    2012-01-01

    We present the concept for the GISMO-2 bolometer camera, which we build for background-limited operation at the IRAM 30 m telescope on Pico Veleta, Spain. GISMO-2 will operate simultaneously in the 1 mm and 2 mm atmospheric windows. The 1 mm channel uses a 32 x 40 TES-based Backshort Under Grid (BUG) bolometer array, the 2 mm channel operates with a 16 x 16 BUG array. The camera utilizes almost the entire field of view provided by the telescope. The optical design of GISMO-2 was strongly influenced by our experience with the GISMO 2 mm bolometer camera, which is successfully operating at the 30 m telescope. GISMO is accessible to the astronomical community through the regular IRAM call for proposals.

  3. Payload topography camera of Chang'e-3

    NASA Astrophysics Data System (ADS)

    Yu, Guo-Bin; Liu, En-Hai; Zhao, Ru-Jin; Zhong, Jie; Zhou, Xiang-Dong; Zhou, Wu-Lin; Wang, Jin; Chen, Yuan-Pei; Hao, Yong-Jie

    2015-11-01

    Chang'e-3 was China's first soft-landing lunar probe that achieved a successful roving exploration on the Moon. A topography camera functioning as the lander's “eye” was one of the main scientific payloads installed on the lander. It was composed of a camera probe, an electronic component that performed image compression, and a cable assembly. Its exploration mission was to obtain optical images of the lunar topography in the landing zone for investigation and research. It also observed rover movement on the lunar surface and finished taking pictures of the lander and rover. After starting up successfully, the topography camera obtained static images and video of rover movement from different directions, 360° panoramic pictures of the lunar surface around the lander from multiple angles, and numerous pictures of the Earth. All images of the rover, lunar surface, and the Earth were clear, and those of the Chinese national flag were recorded in true color. This paper describes the exploration mission, system design, working principle, quality assessment of image compression, and color correction of the topography camera. Finally, test results from the lunar surface are provided to serve as a reference for scientific data processing and application.

  4. Characterization of a PET Camera Optimized for ProstateImaging

    SciTech Connect

    Huber, Jennifer S.; Choong, Woon-Seng; Moses, William W.; Qi,Jinyi; Hu, Jicun; Wang, G.C.; Wilson, David; Oh, Sang; Huesman, RonaldH.; Derenzo, Stephen E.

    2005-11-11

    We present the characterization of a positron emission tomograph for prostate imaging that centers a patient between a pair of external curved detector banks (ellipse: 45 cm minor, 70 cm major axis). The distance between detector banks adjusts to allow patient access and to position the detectors as closely as possible for maximum sensitivity with patients of various sizes. Each bank is composed of two axial rows of 20 HR+ block detectors for a total of 80 detectors in the camera. The individual detectors are angled in the transaxial plane to point towards the prostate to reduce resolution degradation in that region. The detectors are read out by modified HRRT data acquisition electronics. Compared to a standard whole-body PET camera, our dedicated prostate camera has the same sensitivity and resolution, less background (less randoms and lower scatter fraction) and a lower cost. We have completed construction of the camera. Characterization data and reconstructed images of several phantoms are shown. Sensitivity of a point source in the center is 946 cps/µCi. Spatial resolution is 4 mm FWHM in the central region.

  5. In-flight calibration of the Dawn Framing Camera

    NASA Astrophysics Data System (ADS)

    Schröder, S. E.; Maue, T.; Gutiérrez Marqués, P.; Mottola, S.; Aye, K. M.; Sierks, H.; Keller, H. U.; Nathues, A.

    2013-11-01

    We present a method for calibrating images acquired by the Dawn Framing Camera (FC) that is based on the results of an in-flight calibration campaign performed during the cruise from Earth to Vesta. We describe this campaign and the data analysis in full. Both the primary camera FC2 and the backup camera FC1 are radiometrically and geometrically calibrated through observations of standard stars, star fields, and Solar System objects. The calibration in each spectral filter is accurate to within a few percent for point sources. Geometric distortion, small by design, is characterized with high accuracy. Dark current, monitored on a regular basis, is very low at flight operational temperatures. Out-of-field stray light was characterized using the Sun as a stray light source. In-field stray light is confirmed in narrow-band filter images of Vesta. Its magnitude and distribution are scene-dependent, and expected to contribute significantly to images of extended objects. Description of a method for in-field stray light correction is deferred to a follow-up paper, as is a discussion of the closely related topic of flat-fielding.

  6. Contact Angle Measurements Using a Simplified Experimental Setup

    ERIC Educational Resources Information Center

    Lamour, Guillaume; Hamraoui, Ahmed; Buvailo, Andrii; Xing, Yangjun; Keuleyan, Sean; Prakash, Vivek; Eftekhari-Bafrooei, Ali; Borguet, Eric

    2010-01-01

    A basic and affordable experimental apparatus is described that measures the static contact angle of a liquid drop in contact with a solid. The image of the drop is made with a simple digital camera by taking a picture that is magnified by an optical lens. The profile of the drop is then processed with ImageJ free software. The ImageJ contact…

  7. An Iterative Angle Trisection

    ERIC Educational Resources Information Center

    Muench, Donald L.

    2007-01-01

    The problem of angle trisection continues to fascinate people even though it has long been known that it can't be done with straightedge and compass alone. However, for practical purposes, a good iterative procedure can get you as close as you want. In this note, we present such a procedure. Using only straightedge and compass, our procedure…
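One classical iterative scheme of this kind, which may differ in detail from the procedure in this note, uses only repeated bisection: since 1/4 + 1/16 + 1/64 + ... = 1/3, accumulating successive quarter-angles converges to one third of the original angle. A numerical sketch:

```python
# Iterative angle trisection by repeated bisection (a classical scheme;
# the note's exact procedure may differ). Each step quarters the leftover
# angle (two straightedge-and-compass bisections) and adds it to the
# running estimate:
#   theta/4 + theta/16 + theta/64 + ... = theta/3.
def trisect(theta, iterations=20):
    estimate = 0.0
    remainder = theta
    for _ in range(iterations):
        remainder /= 4.0   # two bisections of the leftover angle
        estimate += remainder
    return estimate

approx = trisect(90.0)
# converges geometrically: the error shrinks by a factor of 4 per step,
# so after n iterations it is theta / (3 * 4**n)
```

Each iteration is constructible with straightedge and compass alone, which is why the scheme sidesteps the classical impossibility result: it approximates the trisection to any desired accuracy without ever producing it exactly.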

  8. 33 CFR 162.240 - Tongass Narrows, Alaska; navigation.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Narrows, Alaska; navigation. (a) Definitions. The term “Tongass Narrows” includes the body of water lying... registered length or less, shall exceed a speed of 7 knots in the region of Tongass Narrows bounded to...

  9. 33 CFR 162.240 - Tongass Narrows, Alaska; navigation.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Narrows, Alaska; navigation. (a) Definitions. The term “Tongass Narrows” includes the body of water lying... registered length or less, shall exceed a speed of 7 knots in the region of Tongass Narrows bounded to...

  10. 33 CFR 162.240 - Tongass Narrows, Alaska; navigation.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Narrows, Alaska; navigation. (a) Definitions. The term “Tongass Narrows” includes the body of water lying... registered length or less, shall exceed a speed of 7 knots in the region of Tongass Narrows bounded to...

  11. 33 CFR 162.240 - Tongass Narrows, Alaska; navigation.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Narrows, Alaska; navigation. (a) Definitions. The term “Tongass Narrows” includes the body of water lying... registered length or less, shall exceed a speed of 7 knots in the region of Tongass Narrows bounded to...

  12. The Camera Comes to Court.

    ERIC Educational Resources Information Center

    Floren, Leola

    After the Lindbergh kidnapping trial in 1935, the American Bar Association sought to eliminate electronic equipment from courtroom proceedings. Eventually, all but two states adopted regulations applying that ban to some extent, and a 1965 Supreme Court decision encouraged the banning of television cameras at trials as well. Currently, some states…

  13. High speed multiwire photon camera

    NASA Technical Reports Server (NTRS)

    Lacy, Jeffrey L. (Inventor)

    1989-01-01

    An improved multiwire proportional counter camera having particular utility in the field of clinical nuclear medicine imaging. The detector utilizes direct coupled, low impedance, high speed delay lines, the segments of which are capacitor-inductor networks. A pile-up rejection test is provided to reject confused events otherwise caused by multiple ionization events occurring during the readout window.

  14. High speed multiwire photon camera

    NASA Technical Reports Server (NTRS)

    Lacy, Jeffrey L. (Inventor)

    1991-01-01

An improved multiwire proportional counter camera having particular utility in the field of clinical nuclear medicine imaging. The detector utilizes direct coupled, low impedance, high speed delay lines, the segments of which are capacitor-inductor networks. A pile-up rejection test is provided to reject confused events otherwise caused by multiple ionization events occurring during the readout window.

  15. Stratoscope 2 integrating television camera

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The development, construction, test and delivery of an integrating television camera for use as the primary data sensor on Flight 9 of Stratoscope 2 is described. The system block diagrams are presented along with the performance data, and definition of the interface of the telescope with the power, telemetry, and communication system.

  16. Gamma-ray camera flyby

    SciTech Connect

    2010-01-01

    Animation based on an actual classroom demonstration of the prototype CCI-2 gamma-ray camera's ability to image a hidden radioactive source, a cesium-137 line source, in three dimensions. For more information see http://newscenter.lbl.gov/feature-stories/2010/06/02/applied-nuclear-physics/.

  17. Measuring Distances Using Digital Cameras

    ERIC Educational Resources Information Center

    Kendal, Dave

    2007-01-01

    This paper presents a generic method of calculating accurate horizontal and vertical object distances from digital images taken with any digital camera and lens combination, where the object plane is parallel to the image plane or tilted in the vertical plane. This method was developed for a project investigating the size, density and spatial…

  18. Recent advances in digital camera optics

    NASA Astrophysics Data System (ADS)

    Ishiguro, Keizo

    2012-10-01

The digital camera market has expanded enormously in the last ten years. The zoom lens in particular is the key factor determining camera body size and image quality. Its technologies have built on several analog advances, including methods of aspherical lens manufacturing and mechanisms of image stabilization. Panasonic is one of the pioneers of both technologies. I will review previous trends in zoom lens optics and the original optical technologies of the Panasonic digital camera "LUMIX", as well as the optics in 3D camera systems. Finally, I will offer a view of future trends in digital cameras.

  19. Preliminary Mapping of Permanently Shadowed and Sunlit Regions Using the Lunar Reconnaissance Orbiter Camera (LROC)

    NASA Astrophysics Data System (ADS)

    Speyerer, E.; Koeber, S.; Robinson, M. S.

    2010-12-01

The spin axis of the Moon is tilted by only 1.5° (compared with the Earth's 23.5°), leaving some areas near the poles in permanent shadow while other nearby regions remain sunlit for a majority of the year. Theory, radar data, neutron measurements, and Lunar CRater Observation and Sensing Satellite (LCROSS) observations suggest that volatiles may be present in the cold traps created inside these permanently shadowed regions. Meanwhile, areas of near-permanent illumination are prime locations for future lunar outposts, due to benign thermal conditions and near-constant solar power. The Lunar Reconnaissance Orbiter (LRO) has two imaging systems that provide medium and high resolution views of the poles. During almost every orbit the LROC Wide Angle Camera (WAC) acquires images at 100 m/pixel of the polar region (80° to 90° north and south latitude). In addition, the LROC Narrow Angle Camera (NAC) targets selected regions of interest at 0.7 to 1.5 m/pixel [Robinson et al., 2010]. During the first 11 months of the nominal mission, LROC acquired almost 6,000 WAC images and over 7,300 NAC images of the polar regions (i.e., within 2° of a pole). By analyzing this time series of WAC and NAC images, regions of permanent shadow and of permanent or near-permanent illumination can be quantified. The LROC Team is producing several reduced data products that graphically illustrate the illumination conditions of the polar regions. Illumination movie sequences are being produced that show how the lighting conditions change over a calendar year. Each frame of the movie sequence is a polar stereographic projected WAC image showing the lighting conditions at that moment. With the WAC's wide field of view (~100 km at an altitude of 50 km), each frame has repeat coverage between 88° and 90° at each pole. The same WAC images are also being used to develop multi-temporal illumination maps that show the percentage of time each 100 m × 100 m area is illuminated.
These maps are derived by stacking all the WAC frames, selecting a threshold to determine if the surface is illuminated, and summing the resulting binary images. In addition, mosaics of NAC images are also being produced for regions of interest at a scale of 0.7 to 1.5 m/pixel. The mosaics produced so far have revealed small illuminated surfaces on the scale of tens of meters that were previously thought to be shadowed during that time. The LROC dataset of the polar regions complements previous illumination analyses of Clementine images [Bussey et al., 1999], Kaguya topography [Bussey et al., 2010], and the current efforts underway by the Lunar Orbiter Laser Altimeter (LOLA) Team [Mazarico et al., 2010], and provides an important new dataset for science and exploration. References: Bussey et al. (1999), Illumination conditions at the lunar south pole, Geophysical Research Letters, 26(9), 1187-1190. Bussey et al. (2010), Illumination conditions of the south pole of the Moon derived from Kaguya topography, Icarus, 208, 558-564. Mazarico et al. (2010), Illumination of the lunar poles from the Lunar Orbiter Laser Altimeter (LOLA) Topography Data, paper presented at 41st LPSC, Houston, TX. Robinson et al. (2010), Lunar Reconnaissance Orbiter Camera (LROC) Instrument Overview, Space Sci Rev, 150, 81-124.
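The map-building recipe in the abstract (stack the co-registered WAC frames, threshold each to a lit/shadowed mask, sum the binary images) can be sketched as follows; the threshold value and array shapes are illustrative assumptions:

```python
import numpy as np

def illumination_map(frames, threshold):
    """Stack co-registered frames, threshold each pixel to a binary
    lit/shadowed mask, and sum the masks into a percent-illuminated map.
    The threshold is scene-dependent and assumed here."""
    masks = np.stack([f > threshold for f in frames])  # (n, rows, cols) bool
    return 100.0 * masks.sum(axis=0) / len(frames)

# Toy series: a pixel lit in 3 of 4 frames is 75% illuminated
frames = [np.array([[10.0]]), np.array([[200.0]]),
          np.array([[250.0]]), np.array([[180.0]])]
pct = illumination_map(frames, threshold=50.0)
```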

  20. A Different Angle on Perspective

    ERIC Educational Resources Information Center

    Frantz, Marc

    2012-01-01

    When a plane figure is photographed from different viewpoints, lengths and angles appear distorted. Hence it is often assumed that lengths, angles, protractors, and compasses have no place in projective geometry. Here we describe a sense in which certain angles are preserved by projective transformations. These angles can be constructed with…

  1. Interference-induced angle-independent acoustical transparency

    SciTech Connect

    Qi, Lehua; Yu, Gaokun Wang, Ning; Wang, Xinlong; Wang, Guibo

    2014-12-21

    It is revealed that the Fano-like interference leads to the extraordinary acoustic transmission through a slab metamaterial of thickness much smaller than the wavelength, with each unit cell consisting of a Helmholtz resonator and a narrow subwavelength slit. More importantly, both the theoretical analysis and experimental measurement show that the angle-independent acoustical transparency can be realized by grafting a Helmholtz resonator and a quarter-wave resonator to the wall of a narrow subwavelength slit in each unit cell of a slit array. The observed phenomenon results from the interferences between the waves propagating in the slit, those re-radiated by the Helmholtz resonator, and those re-radiated by the quarter-wave resonator. The proposed design may find its applications in designing angle-independent acoustical filters and controlling the phase of the transmitted waves.

  2. Method for shaping and aiming narrow beams. [sonar mapping and target identification

    NASA Technical Reports Server (NTRS)

    Heyser, R. C. (inventor)

    1981-01-01

A sonar method and apparatus is described which utilizes a linear frequency chirp in a transmitter/receiver having a correlator to synthesize a narrow beamwidth pattern from otherwise broad-beam transducers when there is relative velocity between the transmitter/receiver and the target. The chirp is produced in a generator with bandwidth B and duration T chosen so that the time-bandwidth product TB is increased for a narrower angle. A replica of the chirp produced in the generator is time delayed and Doppler shifted for use as a reference in the receiver for correlation of received chirps from targets. This reference is Doppler shifted to select targets preferentially, thereby not only synthesizing a narrow beam but also aiming it in azimuth and elevation.
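The core of the method, correlating received echoes against a chirp replica so that a large time-bandwidth product yields a sharply peaked output, can be sketched as a matched filter; the chirp parameters below are illustrative, not the patent's values:

```python
import numpy as np

def linear_chirp(bandwidth_hz, duration_s, fs):
    """Linear frequency sweep covering bandwidth_hz over duration_s;
    the time-bandwidth product TB sets how sharply it compresses."""
    t = np.arange(0.0, duration_s, 1.0 / fs)
    k = bandwidth_hz / duration_s            # sweep rate, Hz per second
    return np.cos(2.0 * np.pi * 0.5 * k * t ** 2)

def correlate_with_replica(received, replica):
    """The receiver's correlator: match the echo against a (possibly
    delayed and Doppler-shifted) replica of the transmitted chirp."""
    return np.correlate(received, replica, mode="full")

sig = linear_chirp(bandwidth_hz=200.0, duration_s=1.0, fs=2000.0)
out = correlate_with_replica(sig, sig)
peak_lag = int(np.argmax(out)) - (len(sig) - 1)   # 0 => aligned echo
```

Shifting the replica in time or frequency moves the correlation peak, which is how the beam is effectively aimed.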

  3. Boiling Visualization and Critical Heat Flux Phenomena In Narrow Rectangular Gap

    SciTech Connect

    J. J. Kim; Y. H. Kim; S. J. Kim; S. W. Noh; K. Y. Suh; J. Rempe; F. B. Cheung; S. B. Kim

    2004-12-01

An experimental study was performed to investigate the pool boiling critical heat flux (CHF) on one-dimensional inclined rectangular channels with narrow gaps by changing the orientation of a copper test heater assembly. In a pool of saturated water at atmospheric pressure, the test parameters include gap sizes of 1, 2, 5, and 10 mm, and surface orientation angles from the downward-facing position (180 degrees) to the vertical position (90 degrees).

  4. Lytro camera technology: theory, algorithms, performance analysis

    NASA Astrophysics Data System (ADS)

    Georgiev, Todor; Yu, Zhan; Lumsdaine, Andrew; Goma, Sergio

    2013-03-01

The Lytro camera is the first implementation of a plenoptic camera for the consumer market. We consider it a successful example of the miniaturization aided by the increase in computational power characterizing mobile computational photography. The plenoptic camera approach to radiance capture uses a microlens array as an imaging system focused on the focal plane of the main camera lens. This paper analyzes the performance of the Lytro camera from a system-level perspective, treating the Lytro camera as a black box and using our interpretation of the image data saved by the camera. We present our findings based on our interpretation of the Lytro camera file structure, image calibration and image rendering; in this context, artifacts and final image resolution are discussed.

  5. 21 CFR 892.1110 - Positron camera.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES RADIOLOGY DEVICES Diagnostic Devices § 892.1110 Positron camera. (a) Identification. A positron camera is a...

  6. HDTV light camera in triax version

    NASA Astrophysics Data System (ADS)

    Delmas, F.; Tichit, D.

    1992-08-01

The history of color television cameras has been marked by three major breakthroughs: the possibility of remoting camera heads up to 1 km from the OB van (1971); portable cameras (1976); and the emergence of CCDs (1986). Professionals expect high-definition cameras to offer all the features and benefits connected with remote camera heads and portable operation. The Thomson Broadcast HD 1250 Light camera offers the same kind of operational flexibility as a standard television camera. It is a lightweight, portable unit and can be connected to the OB van by a triax link more than 1 km long. The basic characteristics of this camera are described below; special emphasis is placed on the original features that make the HD 1250 Light a pioneer in its field.

  7. General linear cameras : theory and applications

    E-print Network

    Yu, Jingyi, 1978-

    2005-01-01

    I present a General Linear Camera (GLC) model that unifies many previous camera models into a single representation. The GLC model describes all perspective (pinhole), orthographic, and many multiperspective (including ...

  8. 21 CFR 892.1110 - Positron camera.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ...Positron camera. (a) Identification. A positron camera is a device intended to image the distribution of positron-emitting radionuclides in the body. This generic type of device may include signal analysis and display equipment,...

  9. 21 CFR 892.1110 - Positron camera.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ...Positron camera. (a) Identification. A positron camera is a device intended to image the distribution of positron-emitting radionuclides in the body. This generic type of device may include signal analysis and display equipment,...

  10. 21 CFR 892.1110 - Positron camera.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ...Positron camera. (a) Identification. A positron camera is a device intended to image the distribution of positron-emitting radionuclides in the body. This generic type of device may include signal analysis and display equipment,...

  11. 21 CFR 892.1110 - Positron camera.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ...Positron camera. (a) Identification. A positron camera is a device intended to image the distribution of positron-emitting radionuclides in the body. This generic type of device may include signal analysis and display equipment,...

  12. Understanding food consumption lifecycles using wearable cameras

    E-print Network

    Ng, Kher Hui; Shipp, Victoria; Mortier, Richard; Benford, Steve; Flintham, Martin; Rodden, Tom

    2015-08-20

    in question. In this paper, we describe use of wearable cameras to study motivations and behaviours around food consumption by focusing on two contrasting cultures, Malaysia and the UK. Our findings highlight the potential of wearable cameras to enhance...

  13. Angles in the Sky?

    NASA Astrophysics Data System (ADS)

    Behr, Bradford

    2005-09-01

    Tycho Brahe lived and worked in the late 1500s before the telescope was invented. He made highly accurate observations of the positions of planets, stars, and comets using large angle-measuring devices of his own design. You can use his techniques to observe the sky as well. For example, the degree, a common unit of measurement in astronomy, can be measured by holding your fist at arm's length up to the sky. Open your fist and observe the distance across the sky covered by the width of your pinky fingernail. That is, roughly, a degree! After some practice, and knowing that one degree equals four minutes, you can measure elapsed time by measuring the angle of the distance that the Moon appears to have moved and multiplying that number by four. You can also figure distances and sizes of things. These are not precise measurements, but rough estimates that can give you a "close-enough" answer.
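The 4-minutes-per-degree rule follows from the sky's apparent rotation of 360° per 24 hours, i.e. 15° per hour. A small worked example of the conversion (the 10° fist-width is a common rough figure, assumed here):

```python
# The sky appears to rotate 360 degrees in 24 hours: 15 degrees per
# hour, i.e. 1 degree every 4 minutes -- the rule quoted in the text.
def elapsed_minutes(angle_deg):
    """Rough elapsed time from how far an object appears to have moved."""
    return angle_deg * 4.0

# Assumed rough figure: a fist at arm's length spans about 10 degrees,
# so a star drifting one fist-width means about 40 minutes have passed.
fist_deg = 10.0
minutes = elapsed_minutes(fist_deg)
```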

14. Silicone Contamination Camera Developed for Shuttle Payloads

    NASA Technical Reports Server (NTRS)

    1996-01-01

    On many shuttle missions, silicone contamination from unknown sources from within or external to the shuttle payload bay has been a chronic problem plaguing experiment payloads. There is currently a wide range of silicone usage on the shuttle. Silicones are used to coat the shuttle tiles to enhance their ability to shed rain, and over 100 kg of RTV 560 silicone is used to seal white tiles to the shuttle surfaces. Silicones are also used in electronic components, potting compounds, and thermal control blankets. Efforts to date to identify and eliminate the sources of silicone contamination have not been highly successful and have created much controversy. To identify the sources of silicone contamination on the space shuttle, the NASA Lewis Research Center developed a contamination camera. This specially designed pinhole camera utilizes low-Earth-orbit atomic oxygen to develop a picture that identifies sources of silicone contamination on shuttle-launched payloads. The volatile silicone species travel through the aperture of the pinhole camera, and since volatile silicone species lose their hydrocarbon functionalities under atomic oxygen attack, the silicone adheres to the substrate as SiO_x. This glassy deposit should be spatially arranged in the image of the sources of silicone contamination. To view the contamination image, one can use ultrasensitive thickness measurement techniques, such as scanning variable-angle ellipsometry, to map the surface topography of the camera's substrate. The demonstration of a functional contamination camera would resolve the controversial debate concerning the amount and location of contamination sources, would allow corrective actions to be taken, and would demonstrate a useful tool for contamination documentation on future shuttle payloads, with near negligible effect on cost and weight.

  15. Image quality testing of assembled IR camera modules

    NASA Astrophysics Data System (ADS)

    Winters, Daniel; Erichsen, Patrik

    2013-10-01

Infrared (IR) camera modules for the LWIR (8-12 µm) that combine IR imaging optics with microbolometer focal plane array (FPA) sensors and readout electronics are increasingly becoming a mass-market product. At the same time, steady improvements in sensor resolution in the higher-priced markets raise the requirements on the imaging performance of objectives and the proper alignment between objective and FPA. This puts pressure on camera manufacturers and system integrators to assess the image quality of finished camera modules in a cost-efficient and automated way for quality control or during end-of-line testing. In this paper we present recent development work done in the field of image quality testing of IR camera modules. This technology provides a wealth of additional information in contrast to more traditional test methods like minimum resolvable temperature difference (MRTD), which give only a subjective overall test result. Parameters that can be measured are image quality via the modulation transfer function (MTF) for broadband or with various bandpass filters, on- and off-axis, and optical parameters like e.g. effective focal length (EFL) and distortion. If the camera module allows for refocusing the optics, additional parameters like best focus plane, image plane tilt, auto-focus quality, chief ray angle etc. can be characterized. Additionally, the homogeneity and response of the sensor with the optics can be characterized in order to calculate the appropriate tables for non-uniformity correction (NUC). The technology can also be used to control active alignment methods during mechanical assembly of optics to high resolution sensors. Other important points that are discussed are the flexibility of the technology to test IR modules with different form factors and electrical interfaces and, last but not least, its suitability for fully automated measurements in mass production.
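One common way to obtain the MTF the abstract mentions is as the normalized magnitude of the Fourier transform of a measured line spread function. A simplified sketch of that computation, using synthetic LSFs rather than data from an IR module:

```python
import numpy as np

def mtf_from_lsf(lsf):
    """MTF as the magnitude of the Fourier transform of the line spread
    function, normalized to 1 at zero spatial frequency."""
    spectrum = np.abs(np.fft.rfft(lsf))
    return spectrum / spectrum[0]

# Synthetic LSFs: a delta-like response gives a flat MTF of 1, while a
# broad Gaussian response rolls off at high spatial frequencies.
x = np.arange(64)
delta = np.zeros(64)
delta[32] = 1.0
broad = np.exp(-0.5 * ((x - 32) / 4.0) ** 2)
mtf_sharp = mtf_from_lsf(delta)
mtf_blur = mtf_from_lsf(broad)
```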

  16. Small Angle Neutron Scattering

    SciTech Connect

    Urban, Volker S

    2012-01-01

Small Angle Neutron Scattering (SANS) probes structural details at the nanometer scale in a non-destructive way. This article gives an introduction to scientists who have no prior small-angle scattering knowledge, but who seek a technique that allows elucidating structural information in challenging situations that thwart approaches by other methods. SANS is applicable to a wide variety of materials including metals and alloys, ceramics, concrete, glasses, polymers, composites and biological materials. Isotope and magnetic interactions provide unique methods for labeling and contrast variation to highlight specific structural features of interest. In situ studies of a material's responses to temperature, pressure, shear, magnetic and electric fields, etc., are feasible as a result of the high penetrating power of neutrons. SANS provides statistical information on significant structural features averaged over the probed sample volume, and one can use SANS to quantify with high precision the structural details that are observed, for example, in electron microscopy. Neutron scattering is non-destructive; there is no need to cut specimens into thin sections, and neutrons penetrate deeply, providing information on the bulk material, free from surface effects. The basic principles of a SANS experiment are fairly simple, but the measurement, analysis and interpretation of small angle scattering data involves theoretical concepts that are unique to the technique and that are not widely known. This article includes a concise description of the basics, as well as practical know-how that is essential for a successful SANS experiment.

  17. Primary cerebellopontine angle angiosarcoma.

    PubMed

    Guode, Zhai; Qi, Pang; Hua, Guo; Shangchen, Xu; Hanbin, Wang

    2008-08-01

Primary intracranial angiosarcomas are rare. Only a few cases have been reported in the literature. All cases reported were located in the supratentorial areas. To our knowledge, no cerebellopontine (CP) angle angiosarcoma has been reported. We report a 16-year-old girl who had mild headache, right-sided tinnitus and amblyacousia of 1-year's duration. She later developed abruptly severe headache and vomiting, accompanied by left hemiparesis, numbness, ataxia and bucking, and computerized tomography scan and magnetic resonance imaging were performed. There was a lesion in the right CP angle with haemorrhage and edema. The preoperative diagnosis was neurogenic tumor with haemorrhage. The patient underwent an emergency suboccipital craniectomy, and the lesion was excised completely. Histopathology and immunohistochemistry revealed an angiosarcoma. Postoperative radiotherapy was given. At the time of hospital discharge, she was in better clinical and neurological condition than her preoperative state. She has been followed up for 6 months and is still in excellent condition without any sign of recurrence. This case report highlights that clinicians should be aware of the characteristics of angiosarcoma, and also stresses the need to include angiosarcoma in the differential diagnosis of rare lesions located in the CP angle. PMID:18314334

  18. Measurement of the surface wavelength distribution of narrow-band radiation by a colorimetric method

    SciTech Connect

    Kraiskii, A V; Mironova, T V; Sultanov, T T

    2010-09-10

A method is suggested for determining the wavelength of narrow-band light from a digital photograph of a radiating surface. The digital camera used should be appropriately calibrated. The accuracy of the wavelength measurement is better than 1 nm. The method was tested on the yellow doublet of the mercury spectrum and on the adjacent continuum of an incandescent lamp radiation spectrum. By means of the suggested method, the homogeneity of holographic sensor swelling was studied in stationary and transient cases. (laser applications and other topics in quantum electronics)
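Although the paper's calibration details are not given here, the basic idea of inverting a calibrated color-to-wavelength curve can be sketched with a hypothetical one-dimensional calibration and linear interpolation (all numbers below are invented for illustration):

```python
import numpy as np

# Invented calibration: a color coordinate (a single hue-like value)
# recorded for known narrow-band sources at several wavelengths.
calib_hue = np.array([0.45, 0.38, 0.30, 0.21])           # decreasing
calib_wavelength_nm = np.array([540.0, 560.0, 580.0, 600.0])

def wavelength_from_hue(hue):
    """Invert the calibration curve by linear interpolation; np.interp
    requires increasing x, so the decreasing hue axis is reversed."""
    return np.interp(hue, calib_hue[::-1], calib_wavelength_nm[::-1])

# A measured hue of 0.34 lies halfway between the 560 and 580 nm points
wl = float(wavelength_from_hue(0.34))
```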

  19. The Streak Camera Development at LLE

    SciTech Connect

Jaanimagi, P.A.; Boni, R.; Butler, D.; Ghosh, S.; Donaldson, W.R.; Keck, R.L.

    2005-03-31

    The Diagnostic Development Group at the Laboratory for Laser Energetics has endeavored to build a stand-alone, remotely operated streak camera with comprehensive autofocus and self-calibration capability. Designated as the Rochester Optical Streak System (ROSS), it is a generic streak camera platform, capable of accepting a variety of streak tubes. The system performance is limited by the installed tube's electron optics, not by any camera subsystem. Moreover, the ROSS camera can be photometrically calibrated.

  20. Efficient, Narrow-Pass-Band Optical Filters

    NASA Technical Reports Server (NTRS)

    Sandford, Stephen P.

    1996-01-01

Optical filters with both narrow pass bands and high efficiencies have been fabricated to design specifications. They offer tremendous improvements in performance for a number of optical (including infrared) systems. In fiber-optic and free-space communication systems, the precise frequency discrimination afforded by the filters' narrow pass bands provides higher channel capacities. In active and passive remote sensors like lidar and gas-filter-correlation radiometers, the increased efficiencies afforded by the filters enhance detection of small signals against large background noise. In addition, the sizes, weights, and power requirements of many optical and infrared systems are reduced by taking advantage of the gains in signal-to-noise ratio delivered by the filters.

  1. Bio-inspired hemispherical compound eye camera

    NASA Astrophysics Data System (ADS)

    Xiao, Jianliang; Song, Young Min; Xie, Yizhu; Malyarchuk, Viktor; Jung, Inhwa; Choi, Ki-Joong; Liu, Zhuangjian; Park, Hyunsung; Lu, Chaofeng; Kim, Rak-Hwan; Li, Rui; Crozier, Kenneth B.; Huang, Yonggang; Rogers, John A.

    2014-03-01

Compound eyes in arthropods demonstrate imaging characteristics distinct from those of human eyes, with a wide-angle field of view, low aberrations, high acuity to motion and infinite depth of field. Artificial imaging systems with similar geometries and properties are of great interest for many applications. However, the challenges in building such systems with hemispherical, compound apposition layouts cannot be met through established planar sensor technologies and conventional optics. We present our recent progress in combining optics, materials, mechanics and integration schemes to build fully functional artificial compound eye cameras. Nearly full hemispherical shapes (about 160 degrees) with densely packed artificial ommatidia were realized. The number of ommatidia (180) is comparable to those of the eyes of fire ants and bark beetles. The devices combine elastomeric compound optical elements with deformable arrays of thin silicon photodetectors, which were fabricated in planar geometries and then integrated and elastically transformed to hemispherical shapes. Imaging results and quantitative ray-tracing-based simulations illustrate key features of operation. These general strategies seem to be applicable to other compound eye devices, such as those inspired by moths and lacewings (refracting superposition eyes), lobster and shrimp (reflecting superposition eyes), and houseflies (neural superposition eyes).

  2. Cooling the dark energy camera instrument

    SciTech Connect

    Schmitt, R.L.; Cease, H.; DePoy, D.; Diehl, H.T.; Estrada, J.; Flaugher, B.; Kuhlmann, S.; Onal, Birce; Stefanik, A.; /Fermilab

    2008-06-01

DECam, the camera for the Dark Energy Survey (DES), is undergoing general design and component testing. For an overview see DePoy, et al. in these proceedings. For a description of the imager, see Cease, et al. in these proceedings. The CCD instrument will be mounted at the prime focus of the CTIO Blanco 4m telescope. The instrument temperature will be 173 K with a heat load of 113 W. In similar applications, cooling CCD instruments at the prime focus has been accomplished by three general methods. Liquid nitrogen reservoirs have been constructed to operate in any orientation, pulse tube cryocoolers have been used when tilt angles are limited, and Joule-Thompson or Stirling cryocoolers have been used with smaller heat loads. Gifford-McMahon cooling has been used at the Cassegrain focus but not at the prime focus. For DES, the combined requirements of high heat load, temperature stability, low vibration, operation in any orientation, liquid nitrogen cost and limited available space led to the design of a pumped, closed-loop, circulating nitrogen system. At zenith the instrument will be twelve meters above the pump/cryocooler station. This cooling system is expected to have a 10,000 hour maintenance interval. This paper describes the engineering basis, including the thermal model, unbalanced forces, cooldown time, and the single- and two-phase flow models.

  3. Ortho-Rectification of Narrow Band Multi-Spectral Imagery Assisted by Dslr RGB Imagery Acquired by a Fixed-Wing Uas

    NASA Astrophysics Data System (ADS)

    Rau, J.-Y.; Jhan, J.-P.; Huang, C.-Y.

    2015-08-01

The Miniature Multiple Camera Array (MiniMCA-12) is a frame-based multilens/multispectral sensor composed of 12 lenses with narrow-band filters. Due to its small size and light weight, it is suitable for mounting on an Unmanned Aerial System (UAS) to acquire imagery of high spectral, spatial and temporal resolution for various remote sensing applications. However, because each band spans only 10 nm, the images have low resolution and signal-to-noise ratio, which makes them unsuitable for image matching and digital surface model (DSM) generation. Moreover, since the spectral correlation among the 12 MiniMCA bands is low, it is difficult to perform tie-point matching and aerial triangulation at the same time. In this study, we therefore propose using a DSLR camera to assist automatic aerial triangulation of MiniMCA-12 imagery and to produce a higher-spatial-resolution DSM for MiniMCA-12 ortho-image generation. Depending on the maximum payload weight of the UAS, data from the two sensors can be collected at the same time or individually. In this study, we adopt a fixed-wing UAS carrying a Canon EOS 5D Mark II DSLR camera and a MiniMCA-12 multi-spectral camera. To perform automatic aerial triangulation between the DSLR camera and the MiniMCA-12, we choose one master band from the MiniMCA-12 whose spectral range overlaps that of the DSLR camera. However, because the MiniMCA-12 lenses have different perspective centers and viewing angles, the original 12 channels show a significant band misregistration effect. The first issue encountered is therefore to reduce this misregistration. Since all 12 MiniMCA lenses are frame-based, their spatial offsets are smaller than 15 cm and the images overlap by almost 98%; we thus propose a modified projective transformation (MPT) method, together with two systematic error correction procedures, to register all 12 bands of imagery in the same image space.
Consequently, the 12 bands acquired at the same exposure time share the same interior orientation parameters (IOPs) and exterior orientation parameters (EOPs) after band-to-band registration (BBR). In the aerial triangulation stage, the master band of the MiniMCA-12 is treated as a reference channel to link with the DSLR RGB images: all reference images from the master band and all RGB images are triangulated together in the same coordinate system of ground control points (GCPs). Because the spatial resolution of the RGB images is higher than that of the MiniMCA-12, the GCPs can be marked on the RGB images alone, even when they cannot be recognized on the MiniMCA images. Furthermore, a one-meter gridded digital surface model (DSM) is created from the RGB images and applied to the MiniMCA imagery for ortho-rectification. Quantitative error analyses show that the proposed BBR scheme achieves an average misregistration residual of 0.33 pixels, and that the co-registration errors among the 12 MiniMCA ortho-images, and between the MiniMCA and Canon RGB ortho-images, are all less than 0.6 pixels. The experimental results demonstrate that the proposed method is robust, reliable and accurate for future remote sensing applications.
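The building block of such band-to-band registration is mapping pixel coordinates through a 3×3 projective transform (homography). The following is a generic sketch, not the authors' modified projective transformation, using a pure translation as the simplest possible case:

```python
import numpy as np

def apply_homography(H, pts):
    """Map Nx2 pixel coordinates through a 3x3 projective transform:
    lift to homogeneous coordinates, multiply, divide by the last row."""
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])
    mapped = pts_h @ H.T
    return mapped[:, :2] / mapped[:, 2:3]

# Simplest case: one band offset from the reference band by 2 pixels in
# x, written as a homography. (A real MPT would also model rotation,
# scale and perspective terms estimated from tie points.)
H = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])
corners = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0]])
registered = apply_homography(H, corners)
```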

  4. 16 CFR 501.1 - Camera film.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 16 Commercial Practices 1 2010-01-01 2010-01-01 false Camera film. 501.1 Section 501.1 Commercial... 500 § 501.1 Camera film. Camera film packaged and labeled for retail sale is exempt from the net... should be expressed, provided: (a) The net quantity of contents on packages of movie film and bulk...

  5. 16 CFR 501.1 - Camera film.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 16 Commercial Practices 1 2013-01-01 2013-01-01 false Camera film. 501.1 Section 501.1 Commercial... 500 § 501.1 Camera film. Camera film packaged and labeled for retail sale is exempt from the net... should be expressed, provided: (a) The net quantity of contents on packages of movie film and bulk...

  6. 16 CFR 501.1 - Camera film.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 16 Commercial Practices 1 2011-01-01 2011-01-01 false Camera film. 501.1 Section 501.1 Commercial... 500 § 501.1 Camera film. Camera film packaged and labeled for retail sale is exempt from the net... should be expressed, provided: (a) The net quantity of contents on packages of movie film and bulk...

  7. 16 CFR 501.1 - Camera film.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 16 Commercial Practices 1 2012-01-01 2012-01-01 false Camera film. 501.1 Section 501.1 Commercial... 500 § 501.1 Camera film. Camera film packaged and labeled for retail sale is exempt from the net... should be expressed, provided: (a) The net quantity of contents on packages of movie film and bulk...

  8. 16 CFR 501.1 - Camera film.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 16 Commercial Practices 1 2014-01-01 2014-01-01 false Camera film. 501.1 Section 501.1 Commercial... 500 § 501.1 Camera film. Camera film packaged and labeled for retail sale is exempt from the net... should be expressed, provided: (a) The net quantity of contents on packages of movie film and bulk...

  9. Introducing the Single Camera VTR System.

    ERIC Educational Resources Information Center

    Mattingly, Grayson; Smith, Welby

    This basic manual designed to introduce helical scan videotape recording is written in nontechnical style. The operating principles of videotape recording are explained, and practical standards for selecting basic equipment for a single camera system are suggested. This includes cameras, camera supports, lenses, cables, tape recorders, monitors,…

  10. 21 CFR 892.1110 - Positron camera.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Positron camera. 892.1110 Section 892.1110 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES RADIOLOGY DEVICES Diagnostic Devices § 892.1110 Positron camera. (a) Identification. A positron camera is a device intended to image...

  11. Vehicle Motion Estimation Using an Infrared Camera

    E-print Network

    Schön, Thomas

    Vehicle Motion Estimation Using an Infrared Camera Emil Nilsson Christian Lundquist Thomas B. Sch a far infrared camera, inertial sensors and the vehicle speed. This information is already present of estimating the vehicle motion using measurements from a far infrared (FIR) camera, along with proprioceptive

  12. LDEF yaw and pitch angle estimates

    NASA Technical Reports Server (NTRS)

    Banks, Bruce A.; Gebauer, Linda

    1992-01-01

    Quantification of the LDEF yaw and pitch misorientations is crucial to the knowledge of atomic oxygen exposure of samples placed on LDEF. Video camera documentation of the LDEF spacecraft prior to grapple attachment, atomic oxygen shadows on experiment trays and longerons, and a pinhole atomic oxygen camera placed on LDEF provided sources of documentation of the yaw and pitch misorientation. Based on uncertainty-weighted averaging of data, the LDEF yaw offset was found to be 8.1 plus or minus 0.6 degrees, allowing higher atomic oxygen exposure of row 12 than initially anticipated. The LDEF pitch angle offset was found to be 0.8 plus or minus 0.4 degrees, such that the space end was tipped forward toward the direction of travel. The resulting consequences of the yaw and pitch misorientation of LDEF on the atomic oxygen fluence is a factor of 2.16 increase for samples located on row 12, and a factor of 1.18 increase for samples located on the space end compared to that which would be expected for perfect orientation.
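The "uncertainty-weighted averaging" used to combine the independent misorientation estimates is, in its standard form, an inverse-variance weighted mean. A minimal sketch (the specific weights used for LDEF are not given in the abstract):

```python
import math

def weighted_mean(values, sigmas):
    """Inverse-variance weighted mean of independent measurements and
    its 1-sigma uncertainty. Each measurement is weighted by
    1/sigma^2, so more certain estimates dominate the average."""
    weights = [1.0 / s ** 2 for s in sigmas]
    mean = sum(w * v for w, v in zip(weights, values)) / sum(weights)
    sigma = 1.0 / math.sqrt(sum(weights))
    return mean, sigma
```

Combining, say, a camera-documentation estimate with a shadow-based estimate this way yields both a best value and a formal uncertainty, analogous to the quoted 8.1 ± 0.6 degrees.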

  13. Switchable viewing angle display with a compact directional backlight and striped diffuser.

    PubMed

    Wang, Yi-Jun; Lu, Jian-Gang; Chao, Wei-Chung; Shieh, Han-Ping D

    2015-08-10

    A compact high-directionality backlight module combined with a striped diffuser is proposed to achieve an adjustable viewing angle for eco-display. The micro-prisms on the compact light guide plate guide the emitted rays to the normal viewing angle, whereas a set of striped diffusers scatters the rays to a wide viewing angle. View cones of ±10°/±55° were obtained for the narrow/wide viewing modes with 88%/85% uniformity of spatial luminance, respectively. Compared with the conventional backlight, the optical efficiencies were increased by factors of 1.47 and 1.38 in narrow and wide viewing modes, respectively. In addition, only 5% of power consumption was needed when the backlight worked in the private narrow viewing mode to maintain the same luminance as that of a conventional backlight. PMID:26367992

  14. Combustion pinhole-camera system

    DOEpatents

    Witte, A.B.

    1982-05-19

    A pinhole camera system is described utilizing a sealed optical-purge assembly which provides optical access into a coal combustor or other energy conversion reactors. The camera system basically consists of a focused-purge pinhole optical port assembly, a conventional TV vidicon receiver, and an external variable-density light filter which is coupled electronically to the vidicon automatic gain control (AGC). The key component of this system is the focused-purge pinhole optical port assembly, which utilizes a purging inert gas to keep debris from entering the port and a lens arrangement which transfers the pinhole to the outside of the port assembly. One additional feature of the port assembly is that it is not flush with the interior of the combustor.

  15. Electronographic cameras for space astronomy.

    NASA Technical Reports Server (NTRS)

    Carruthers, G. R.; Opal, C. B.

    1972-01-01

    Magnetically-focused electronographic cameras have been under development at the Naval Research Laboratory for use in far-ultraviolet imagery and spectrography, primarily in astronomical and optical-geophysical observations from sounding rockets and space vehicles. Most of this work has been with cameras incorporating internal optics of the Schmidt or wide-field all-reflecting types. More recently, we have begun development of electronographic spectrographs incorporating an internal concave grating, operating at normal or grazing incidence. We also are developing electronographic image tubes of the conventional end-window-photo-cathode type, for far-ultraviolet imagery at the focus of a large space telescope, with image formats up to 120 mm in diameter.

  16. A 10-μm infrared camera.

    PubMed

    Arens, J F; Jernigan, J G; Peck, M C; Dobson, C A; Kilk, E; Lacy, J; Gaalema, S

    1987-09-15

    An IR camera has been built at the University of California at Berkeley for astronomical observations. The camera has been used primarily for high angular resolution imaging at mid-IR wavelengths. It has been tested at the University of Arizona 61- and 90-in. telescopes near Tucson and the NASA Infrared Telescope Facility on Mauna Kea, HI. In the observations the system has been used as an imager with interference coated and Fabry-Perot filters. These measurements have demonstrated a sensitivity consistent with photon shot noise, showing that the system is limited by the radiation from the telescope and atmosphere. Measurements of read noise, crosstalk, and hysteresis have been made in our laboratory. PMID:20490151

  17. Artificial compound eye zoom camera.

    PubMed

    Duparré, Jacques; Wippermann, Frank; Dannberg, Peter; Bräuer, Andreas

    2008-12-01

    We demonstrate a highly compact image capturing system with variable field of view but without any mechanically moving parts. The camera combines an ultra-thin artificial apposition compound eye with one variable focal length liquid lens. The change of optical power of the liquid lens when applying a voltage results in a change of the magnification of the microlens array imaging system. However, its effect on focusing of the individual microlenses can be neglected due to their small focal length. PMID:19029582

  18. Graphic design of pinhole cameras

    NASA Technical Reports Server (NTRS)

    Edwards, H. B.; Chu, W. P.

    1979-01-01

    The paper describes a graphic technique for the analysis and optimization of pinhole size and focal length. The technique is based on the use of the transfer function of optical elements described by Scott (1959) to construct the transfer function of a circular pinhole camera. This transfer function is the response of a component or system to a pattern of lines having a sinusoidally varying radiance at varying spatial frequencies. Some specific examples of graphic design are presented.
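A common closed-form starting point for the pinhole-size optimization that the graphic technique refines: the geometric blur grows roughly like the pinhole diameter d, while the diffraction (Airy) blur shrinks like 2.44·λ·f/d; equating the two gives an optimum diameter. This heuristic is standard pinhole optics, not the paper's transfer-function analysis:

```python
import math

def optimal_pinhole_diameter(focal_length_m, wavelength_m=550e-9):
    """Heuristic optimum pinhole diameter in meters: the value where
    geometric blur (~d) equals diffraction blur (~2.44*lambda*f/d),
    i.e. d = sqrt(2.44 * lambda * f)."""
    return math.sqrt(2.44 * wavelength_m * focal_length_m)
```

For a 100 mm focal length at 550 nm this gives a diameter of roughly 0.37 mm; a full transfer-function treatment trades this point optimum against the spatial-frequency response actually needed.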

  19. ISO camera array development status

    NASA Technical Reports Server (NTRS)

    Sibille, F.; Cesarsky, C.; Agnese, P.; Rouan, D.

    1989-01-01

    A short outline is given of the Infrared Space Observatory Camera (ISOCAM), one of the 4 instruments onboard the Infrared Space Observatory (ISO), with the current status of its two 32x32 arrays, an InSb charge injection device (CID) and a Si:Ga direct read-out (DRO), and the results of the in orbit radiation simulation with gamma ray sources. A tentative technique for the evaluation of the flat fielding accuracy is also proposed.

  20. Effects of red light cameras on violations and crashes: a review of the international literature.

    PubMed

    Retting, Richard A; Ferguson, Susan A; Hakkert, A Shalom

    2003-03-01

    Red light running is a frequent cause of motor vehicle crashes and injuries. A primary countermeasure for red light running crashes is police traffic enforcement. In recent years, many police agencies have begun using automated red light cameras as a supplement to conventional enforcement methods. The present study reviewed and evaluated available evidence in the international literature regarding the effectiveness of cameras to reduce both red light violations and crashes. Camera enforcement generally reduces violations by an estimated 40-50%. In terms of crash effects, most studies contain methodological flaws that, to varying degrees, either overestimate (failure to adjust for regression to the mean) or underestimate (comparison with nearby signalized intersections affected by cameras) crash effects. Mindful of these limitations, the research generally indicates that camera enforcement can significantly reduce injury crashes at signalized intersections, in particular right-angle injury crashes. Most studies reported increases in rear-end crashes following camera installation. Taken together the studies indicate that, overall, injury crashes, including rear-end collisions, were reduced by 25-30% as a result of camera enforcement. PMID:14522657

  1. Rover mast calibration, exact camera pointing, and camera handoff for visual target tracking

    NASA Technical Reports Server (NTRS)

    Kim, Won S.; Ansar, Adnan I.; Steele, Robert D.

    2005-01-01

    This paper presents three technical elements that we have developed to improve the accuracy of the visual target tracking for single-sol approach-and-instrument placement in future Mars rover missions. An accurate, straightforward method of rover mast calibration is achieved by using a total station, a camera calibration target, and four prism targets mounted on the rover. The method was applied to Rocky8 rover mast calibration and yielded a 1.1-pixel rms residual error. Camera pointing requires inverse kinematic solutions for mast pan and tilt angles such that the target image appears right at the center of the camera image. Two issues were raised. Mast camera frames are in general not parallel to the masthead base frame. Further, the optical axis of the camera model in general does not pass through the center of the image. Despite these issues, we managed to derive non-iterative closed-form exact solutions, which were verified with Matlab routines. Actual camera pointing experiments over 50 random target image points yielded less than 1.3-pixel rms pointing error. Finally, a purely geometric method for camera handoff using stereo views of the target has been developed. Experimental test runs show less than 2.5 pixels error on high-resolution Navcam for Pancam-to-Navcam handoff, and less than 4 pixels error on lower-resolution Hazcam for Navcam-to-Hazcam handoff.
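The idealized core of the camera-pointing problem, before the complications the paper solves, is the pan/tilt inverse kinematics for a boresight aimed at a 3-D target. A minimal sketch under simplifying assumptions (frame axes and function name are illustrative, not from the paper):

```python
import math

def point_camera(target):
    """Idealized pan/tilt angles (radians) that aim a camera boresight
    at a target point given in the masthead frame (x forward, y left,
    z up). The flight algorithm additionally handles camera frames
    that are not parallel to the masthead base frame and an optical
    axis offset from the image center; this sketch omits both effects,
    which is why the paper's closed-form solutions are nontrivial."""
    x, y, z = target
    pan = math.atan2(y, x)                    # rotate about vertical axis
    tilt = math.atan2(z, math.hypot(x, y))    # elevate toward the target
    return pan, tilt
```

With the two neglected effects included, the target no longer projects to the image center under these angles, which motivates the exact solutions derived in the paper.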

  2. The Spacelab Wide Angle Telescope (SWAT)

    NASA Technical Reports Server (NTRS)

    West, R. M.; Gull, T. R.; Henize, K. G.; Bertola, F.

    1979-01-01

    A fast wide angle telescope that will be capable of imaging to the dark-sky limit and in the ultraviolet wavelength region available above the atmosphere is described. The telescope (SWAT) has a resolution comparable to that of the large ground-based Schmidt telescope and a field of at least five degrees. A number of astrophysically important investigations can only be accomplished with such a telescope, e.g., detection of hidden, hot objects like hot white dwarfs and subdwarfs in stellar binary systems, and energetic regions in globular clusters and galaxy nuclei. It permits unique studies of the UV-morphology of extended objects and allows discovery of very faint extensions, halos, jets, and filaments in galaxies. It can contribute to the investigation of dust in the Milky Way and in other galaxies and, with an objective prism, spectra of very faint objects can be obtained. The SWAT will localize objects for further study with the narrow-field Space Telescope.

  3. Dual-mode switching of a liquid crystal panel for viewing angle control

    NASA Astrophysics Data System (ADS)

    Baek, Jong-In; Kwon, Yong-Hoan; Kim, Jae Chang; Yoon, Tae-Hoon

    2007-03-01

    The authors propose a method to control the viewing angle of a liquid crystal (LC) panel using dual-mode switching. To realize both wide viewing angle (WVA) characteristics and narrow viewing angle (NVA) characteristics with a single LC panel, the authors use two different dark states. The LC layer can be aligned homogeneously parallel to the transmission axis of the bottom polarizer for WVA dark state operation, while it can be aligned vertically for NVA dark state operation. The authors demonstrated that viewing angle control can be achieved with a single panel without any loss of contrast at the front.

  4. Unassisted 3D camera calibration

    NASA Astrophysics Data System (ADS)

    Atanassov, Kalin; Ramachandra, Vikas; Nash, James; Goma, Sergio R.

    2012-03-01

    With the rapid growth of 3D technology, 3D image capture has become a critical part of the 3D feature set on mobile phones. 3D image quality is affected by the scene geometry as well as on-the-device processing. An automatic 3D system usually assumes known camera poses accomplished by factory calibration using a special chart. In real life settings, pose parameters estimated by factory calibration can be negatively impacted by movements of the lens barrel due to shaking, focusing, or camera drop. If any of these factors displaces the optical axes of either or both cameras, vertical disparity might exceed the maximum tolerable margin and the 3D user may experience eye strain or headaches. To make 3D capture more practical, one needs to consider unassisted (on arbitrary scenes) calibration. In this paper, we propose an algorithm that relies on detection and matching of keypoints between left and right images. Frames containing erroneous matches, along with frames with insufficiently rich keypoint constellations, are detected and discarded. Roll, pitch, yaw, and scale differences between left and right frames are then estimated. The algorithm performance is evaluated in terms of the remaining vertical disparity as compared to the maximum tolerable vertical disparity.
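The frame-screening step described above can be sketched from matched keypoints alone. This is an illustrative reduction (the thresholds and function name are assumptions, not the paper's values):

```python
import numpy as np

def vertical_disparity_ok(left_pts, right_pts,
                          max_disparity_px=5.0, min_matches=20):
    """Screen a stereo frame pair for 3D capture quality.
    left_pts, right_pts: (N, 2) matched keypoint coordinates (x, y).
    Returns (ok, median vertical disparity in pixels). Frames with too
    few matches or excessive vertical disparity are rejected, mirroring
    the frame-discarding step in the abstract."""
    l = np.asarray(left_pts, dtype=float)
    r = np.asarray(right_pts, dtype=float)
    if len(l) < min_matches:
        return False, float('nan')
    dy = np.abs(l[:, 1] - r[:, 1])   # vertical component only
    med = float(np.median(dy))
    return med <= max_disparity_px, med
```

In a full pipeline, the surviving matches would then feed a least-squares estimate of the roll, pitch, yaw, and scale differences between the two cameras.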

  5. Coaxial fundus camera for ophthalmology

    NASA Astrophysics Data System (ADS)

    de Matos, Luciana; Castro, Guilherme; Castro Neto, Jarbas C.

    2015-09-01

    A fundus camera for ophthalmology is a high-definition device that must provide low-light illumination of the human retina, high resolution at the retina, and reflection-free images. Those constraints make its optical design very sophisticated, but the most difficult requirements to comply with are the reflection-free illumination and the final alignment, due to the high number of non-coaxial optical components in the system. Reflections of the illumination, both in the objective and at the cornea, degrade image quality, and poor alignment makes the sophisticated optical design useless. In this work we developed a totally coaxial optical system for a non-mydriatic fundus camera. The illumination is performed by an LED ring, coaxial with the optical system and composed of IR or visible LEDs. The illumination ring is projected by the objective lens onto the cornea. The objective, LED illuminator, and CCD lens are coaxial, making the final alignment easy to perform. The CCD + capture lens module is a CCTV camera with autofocus and zoom built in, added to a 175 mm focal length doublet corrected for infinity, making the system easy to operate and very compact.

  6. Narrow vision after view-broadening travel.

    PubMed

    Melo, Mariana de Mendonça; Ciriano, Jose P Martinez; van Genderen, Perry J J

    2008-01-01

    Loss of vision is a threatening presentation of disease. We describe a case of acute idiopathic blind spot enlargement in a 26-year-old male traveler who presented with narrow vision after a journey to Indonesia. Although the patient used mefloquine at time of presentation, we were unable to retrieve sound data incriminating mefloquine in this rare eye disorder. PMID:18666929

  7. Narrow-Band Applications of Communications Satellites.

    ERIC Educational Resources Information Center

    Cowlan, Bert; Horowitz, Andrew

    This paper attempts to describe the advantages of "narrow-band" applications of communications satellites for education. It begins by discussing the general controversy surrounding the use of satellites in education, by placing the concern within the larger context of the general debate over the uses of new technologies in education, and by…

  8. ANIR : Atacama Near-Infrared Camera for the 1.0-m miniTAO Telescope

    E-print Network

    Konishi, Masahiro; Tateuchi, Ken; Takahashi, Hidenori; Kitagawa, Yutaro; Kato, Natsuko; Sako, Shigeyuki; Uchimoto, Yuka K; Toshikawa, Koji; Ohsawa, Ryou; Yamamuro, Tomoyasu; Asano, Kentaro; Ita, Yoshifusa; Kamizuka, Takafumi; Komugi, Shinya; Koshida, Shintaro; Manabe, Sho; Matsunaga, Noriyuki; Minezaki, Takeo; Morokuma, Tomoki; Nakashima, Asami; Takagi, Toshinobu; Tanabé, Toshihiko; Uchiyama, Mizuho; Aoki, Tsutomu; Doi, Mamoru; Handa, Toshihiro; Kato, Daisuke; Kawara, Kimiaki; Kohno, Kotaro; Miyata, Takashi; Nakamura, Tomohiko; Okada, Kazushi; Soyano, Takao; Tamura, Yoichi; Tanaka, Masuo; Tarusawa, Ken'ichi; Yoshii, Yuzuru

    2015-01-01

    We have developed a near-infrared camera called ANIR (Atacama Near-InfraRed camera) for the University of Tokyo Atacama Observatory 1.0m telescope (miniTAO) installed at the summit of Cerro Chajnantor (5640 m above sea level) in northern Chile. The camera provides a field of view of 5'.1 $\\times$ 5'.1 with a spatial resolution of 0".298 /pixel in the wavelength range of 0.95 to 2.4 $\\mu$m. Taking advantage of the dry site, the camera is capable of hydrogen Paschen-$\\alpha$ (Pa$\\alpha$, $\\lambda=$1.8751 $\\mu$m in air) narrow-band imaging observations, at which wavelength ground-based observations have been quite difficult due to deep atmospheric absorption mainly from water vapor. We have been successfully obtaining Pa$\\alpha$ images of Galactic objects and nearby galaxies since the first-light observation in 2009 with ANIR. The throughputs at the narrow-band filters ($N1875$, $N191$) including the atmospheric absorption show larger dispersion (~10%) than those at broad-band filters (a few %), indicating that ...

  9. On the absolute calibration of SO2 cameras

    NASA Astrophysics Data System (ADS)

    Lübcke, P.; Bobrowski, N.; Illing, S.; Kern, C.; Alvarez Nieves, J. M.; Vogel, L.; Zielcke, J.; Delgado Granados, H.; Platt, U.

    2013-03-01

    Sulphur dioxide emission rate measurements are an important tool for volcanic monitoring and eruption risk assessment. The SO2 camera technique remotely measures volcanic emissions by analysing the ultraviolet absorption of SO2 in a narrow spectral window between 300 and 320 nm using solar radiation scattered in the atmosphere. The SO2 absorption is selectively detected by mounting band-pass interference filters in front of a two-dimensional, UV-sensitive CCD detector. One important step for correct SO2 emission rate measurements that can be compared with other measurement techniques is a correct calibration. This requires conversion from the measured optical density to the desired SO2 column density (CD). The conversion factor is most commonly determined by inserting quartz cells (cuvettes) with known amounts of SO2 into the light path. Another calibration method uses an additional narrow field-of-view Differential Optical Absorption Spectroscopy system (NFOV-DOAS), which measures the column density simultaneously in a small area of the camera's field-of-view. This procedure combines the very good spatial and temporal resolution of the SO2 camera technique with the more accurate column densities obtainable from DOAS measurements. This work investigates the uncertainty of results gained through the two commonly used, but quite different, calibration methods (DOAS and calibration cells). Measurements with three different instruments, an SO2 camera, a NFOV-DOAS system and an Imaging DOAS (I-DOAS), are presented. We compare the calibration-cell approach with the calibration from the NFOV-DOAS system. The respective results are compared with measurements from an I-DOAS to verify the calibration curve over the spatial extent of the image. The results show that calibration cells, while working fine in some cases, can lead to an overestimation of the SO2 CD by up to 60% compared with CDs from the DOAS measurements. 
Besides these errors of calibration, radiative transfer effects (e.g. light dilution, multiple scattering) can significantly influence the results of both instrument types. The measurements presented in this work were taken at Popocatépetl, Mexico, between 1 March 2011 and 4 March 2011. Average SO2 emission rates between 4.00 and 14.34 kg s-1 were observed.
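The calibration discussed above converts measured optical density to SO2 column density. Whether the reference values come from calibration cells or from a co-located NFOV-DOAS, the conversion is commonly modeled as a linear fit; a minimal sketch (the numbers in the test are illustrative, not from the campaign):

```python
import numpy as np

def fit_calibration(optical_density, column_density):
    """Least-squares linear calibration S = a * tau + b converting SO2
    camera optical density tau to column density S (molecules/cm^2),
    with reference column densities from DOAS or calibration cells."""
    a, b = np.polyfit(optical_density, column_density, 1)
    return a, b
```

Applying the fitted slope and intercept pixel-by-pixel across the image yields the two-dimensional column-density distribution from which emission rates are integrated; the paper's point is that the slope itself differs substantially between the cell and DOAS approaches.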

  10. HHEBBES! All sky camera system: status update

    NASA Astrophysics Data System (ADS)

    Bettonvil, F.

    2015-01-01

    A status update is given of the HHEBBES! all-sky camera system. HHEBBES!, an automatic camera for capturing bright meteor trails, is based on a DSLR camera and a liquid crystal chopper for measuring the angular velocity. The purposes of the system are to a) recover meteorites and b) identify origin/parental bodies. In 2015, two new cameras were rolled out: BINGO! (like HHEBBES!, located in the Netherlands) and POgLED, in Serbia. BINGO! is the first camera equipped with a longer-focal-length fisheye lens, to further increase the accuracy. Several minor improvements have been made, and the data reduction pipeline was used for processing two prominent Dutch fireballs.

  11. Passive Millimeter Wave Camera (PMMWC) at TRW

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Engineers at TRW, Redondo Beach, California, inspect the Passive Millimeter Wave Camera, a weather-piercing camera designed to 'see' through fog, clouds, smoke and dust. Operating in the millimeter wave portion of the electromagnetic spectrum, the camera creates visual-like video images of objects, people, runways, obstacles and the horizon. A demonstration camera (shown in photo) has been completed and is scheduled for checkout tests and flight demonstration. Engineer (left) holds a compact, lightweight circuit board containing 40 complete radiometers, including antenna, monolithic millimeter wave integrated circuit (MMIC) receivers and signal processing and readout electronics that forms the basis for the camera's 1040-element focal plane array.


  13. On the absolute calibration of SO2 cameras

    NASA Astrophysics Data System (ADS)

    Lübcke, P.; Bobrowski, N.; Illing, S.; Kern, C.; Alvarez Nieves, J. M.; Vogel, L.; Zielcke, J.; Delgado Granados, H.; Platt, U.

    2012-09-01

    Sulphur dioxide emission flux measurements are an important tool for volcanic monitoring and eruption risk assessment. The SO2 camera technique remotely measures volcanic emissions by analysing the ultraviolet absorption of SO2 in a narrow spectral window between 305 nm and 320 nm using solar radiation scattered in the atmosphere. The SO2 absorption is selectively detected by mounting band-pass interference filters in front of a two-dimensional, UV-sensitive CCD detector. While this approach is simple and delivers valuable insights into the two-dimensional SO2 distribution, absolute calibration has proven to be difficult. An accurate calibration of the SO2 camera (i.e., conversion from optical density to SO2 column density, CD) is crucial to obtain correct SO2 CDs and flux measurements that are comparable to other measurement techniques and can be used for volcanological applications. The most common approach for calibrating SO2 camera measurements is based on inserting quartz cells (cuvettes) containing known amounts of SO2 into the light path. It has been found, however, that reflections from the windows of the calibration cell can considerably affect the signal measured by the camera. Another possibility for calibration relies on performing simultaneous measurements in a small area of the camera's field-of-view (FOV) by a narrow-field-of-view Differential Optical Absorption Spectroscopy (NFOV-DOAS) system. This procedure combines the very good spatial and temporal resolution of the SO2 camera technique with the more accurate column densities obtainable from DOAS measurements. This work investigates the uncertainty of results gained through the two commonly used, but quite different calibration methods (DOAS and calibration cells). Measurements with three different instruments, an SO2 camera, a NFOV-DOAS system and an Imaging DOAS (IDOAS), are presented. We compare the calibration-cell approach with the calibration from the NFOV-DOAS system. 
The respective results are compared with measurements from an IDOAS to verify the calibration curve over the spatial extent of the image. Our results show that calibration cells can lead to an overestimation of the SO2 CD by up to 60% compared with CDs from the DOAS measurements. Besides these errors of calibration, radiative transfer effects (e.g. light dilution, multiple scattering) can significantly influence the results of both instrument types. These effects can lead to an even more significant overestimation or, depending on the measurement conditions, an underestimation of the true CD. Previous investigations found that possible errors can be more than an order of magnitude. However, the spectral information from the DOAS measurements allows correction for these radiative transfer effects. The measurements presented in this work were taken at Popocatépetl, Mexico, between 1 March 2011 and 4 March 2011. Average SO2 emission rates between 4.00 kg s-1 and 14.34 kg s-1 were observed.

  14. Optimum Projection Angle for Attaining Maximum Distance in a Soccer Punt Kick

    PubMed Central

    Linthorne, Nicholas P.; Patel, Dipesh S.

    2011-01-01

    To produce the greatest horizontal distance in a punt kick the ball must be projected at an appropriate angle. Here, we investigated the optimum projection angle that maximises the distance attained in a punt kick by a soccer goalkeeper. Two male players performed many maximum-effort kicks using projection angles of between 10° and 90°. The kicks were recorded by a video camera at 100 Hz and a 2D biomechanical analysis was conducted to obtain measures of the projection velocity, projection angle, projection height, ball spin rate, and foot velocity at impact. The player's optimum projection angle was calculated by substituting mathematical equations for the relationships between the projection variables into the equations for the aerodynamic flight of a soccer ball. The calculated optimum projection angles were in agreement with the players' preferred projection angles (40° and 44°). In projectile sports, even a small dependence of projection velocity on projection angle is sufficient to produce a substantial shift in the optimum projection angle away from 45°. In the punt kicks studied here, the optimum projection angle was close to 45° because the projection velocity of the ball remained almost constant across all projection angles. This result is in contrast to throwing and jumping for maximum distance, where the projection velocity the athlete is able to achieve decreases substantially with increasing projection angle and so the optimum projection angle is well below 45°. Key points: The optimum projection angle that maximizes the distance of a punt kick by a soccer goalkeeper is about 45°. The optimum projection angle is close to 45° because the projection velocity of the ball is almost the same at all projection angles. 
This result is in contrast to throwing and jumping for maximum distance, where the optimum projection angle is well below 45° because the projection velocity the athlete is able to achieve decreases substantially with increasing projection angle. PMID:24149315
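The key mechanism above, that an angle-dependent projection speed shifts the optimum away from 45°, is easy to reproduce numerically with the vacuum range formula R = v(θ)² sin(2θ)/g. This sketch deliberately ignores aerodynamics and release height, both of which the study's full model includes:

```python
import math

def optimal_angle(speed_of_angle, g=9.81, step_deg=0.1):
    """Grid-search the projection angle (degrees) maximizing the
    vacuum range R = v(theta)^2 * sin(2*theta) / g, where
    speed_of_angle(theta_deg) returns the achievable projection
    speed in m/s at that angle."""
    best_angle, best_range = 0.0, -1.0
    angle = 0.0
    while angle <= 90.0:
        v = speed_of_angle(angle)
        r = v * v * math.sin(2.0 * math.radians(angle)) / g
        if r > best_range:
            best_angle, best_range = angle, r
        angle += step_deg
    return best_angle
```

With a constant speed (the punt-kick case) the search returns about 45°; with a speed that falls off with angle (the throwing/jumping case) the optimum drops well below 45°, matching the contrast drawn in the abstract.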

  15. A Note on Angle Construction

    ERIC Educational Resources Information Center

    Francis, Richard L.

    1978-01-01

    The author investigates the construction of angles (using Euclidean tools) through a numerical approach. He calls attention to the surprising impossibility of constructing the conventional units of angle measure--the degree, minute, second, radian, and mil. (MN)

  16. Mini gamma camera, camera system and method of use

    DOEpatents

    Majewski, Stanislaw (Grafton, VA); Weisenberger, Andrew G. (Grafton, VA); Wojcik, Randolph F. (Yorktown, VA)

    2001-01-01

    A gamma camera comprising essentially and in order from the front outer or gamma ray impinging surface: 1) a collimator, 2) a scintillator layer, 3) a light guide, 4) an array of position sensitive, high resolution photomultiplier tubes, and 5) printed circuitry for receipt of the output of the photomultipliers. There is also described, a system wherein the output supplied by the high resolution, position sensitive photomultipiler tubes is communicated to: a) a digitizer and b) a computer where it is processed using advanced image processing techniques and a specific algorithm to calculate the center of gravity of any abnormality observed during imaging, and c) optional image display and telecommunications ports.
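The center-of-gravity localization mentioned above is, in its simplest form, an intensity-weighted centroid of the counts image. A minimal sketch (the patent's specific algorithm is not reproduced here):

```python
import numpy as np

def center_of_gravity(image):
    """Intensity-weighted centroid (row, col) of a 2-D counts image,
    a standard way to localize a hot spot such as the abnormality
    described in the patent."""
    img = np.asarray(image, dtype=float)
    total = img.sum()
    rows, cols = np.indices(img.shape)
    return (rows * img).sum() / total, (cols * img).sum() / total
```

In practice the centroid would be computed over a region of interest around the detected abnormality rather than the full field of view, so that background counts do not bias the estimate.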

  17. Calibration of angle standards

    NASA Astrophysics Data System (ADS)

    Henrique Brum Vieira, Luiz; Stone, Jack; Viliesid, Miguel; Gastaldi, Bruno R.; Przybylska, Joanna; Chaudhary, K. P.

    2015-01-01

    In 2000, a key comparison, CCL-K3 (optical polygon and angle blocks), was started, piloted by NMISA. Based on it, in 2007 the SIM metrological region started the SIM.L-K3 key comparison, piloted by INMETRO. The results of this regional comparison (RMO key comparison) contribute to the Mutual Recognition Arrangement (MRA) between the national metrology institutes of the Metre Convention. It is linked with the CCL-K3 key comparison via laboratories that participated in both the CIPM and the RMO comparisons. This common participation establishes the link between the comparisons and ensures equivalence of national metrology institutes, according to the MRA between NMIs. The SIM NMIs that took part in the CCL-K3 were NIST, NRC and CENAM. However, NRC withdrew from it. GUM from Poland (EURAMET) and NPLI from India (APMP) were invited to participate in the SIM.L-K3 key comparison. The circulation of the artefacts (a 12-face polygon and 4 angle blocks) started in 2008 and was completed in 2009. Main text. To reach the main text of this paper, click on Final Report. Note that this text is that which appears in Appendix B of the BIPM key comparison database kcdb.bipm.org/. The final report has been peer-reviewed and approved for publication by the CCL, according to the provisions of the CIPM Mutual Recognition Arrangement (CIPM MRA).

  18. Glancing angle RF sheaths

    NASA Astrophysics Data System (ADS)

    D'Ippolito, D. A.; Myra, J. R.

    2013-10-01

    RF sheaths occur in tokamaks when ICRF waves encounter conducting boundaries. The sheath plays an important role in determining the efficiency of ICRF heating, the impurity influxes from the edge plasma, and the plasma-facing component damage. An important parameter in sheath theory is the angle between the equilibrium B field and the wall. Recent work with 1D and 2D sheath models has shown that the rapid variation of this angle around a typical limiter can lead to enhanced sheath potentials and localized power deposition (hot spots) when the B field is near glancing incidence. The physics model used to obtain these results does not include some glancing-angle effects, e.g. possible modification of the angular dependence of the Child-Langmuir law and the role of the magnetic pre-sheath. Here, we report on calculations which explore these effects, with the goal of improving the fidelity of the RF sheath boundary condition used in analytical and numerical calculations. Work supported by US DOE grants DE-FC02-05ER54823 and DE-FG02-97ER54392.

  19. Variable angle correlation spectroscopy

    SciTech Connect

    Lee, Y K

    1994-05-01

    In this dissertation, a novel nuclear magnetic resonance (NMR) technique, variable angle correlation spectroscopy (VACSY), is described and demonstrated with {sup 13}C nuclei in rapidly rotating samples. These experiments focus on one of the basic problems in solid state NMR: how to extract the wealth of information contained in the anisotropic component of the NMR signal while still maintaining spectral resolution. Analysis of the anisotropic spectral patterns from polycrystalline systems reveals information concerning molecular structure and dynamics, yet in all but the simplest of systems, the overlap of spectral patterns from chemically distinct sites renders the spectral analysis difficult if not impossible. One solution to this problem is to perform multi-dimensional experiments where the high-resolution, isotropic spectrum in one dimension is correlated with the anisotropic spectral patterns in the other dimensions. The VACSY technique incorporates the angle between the spinner axis and the static magnetic field as an experimental parameter that may be incremented during the course of the experiment to help correlate the isotropic and anisotropic components of the spectrum. The two-dimensional version of the VACSY experiment is used to extract the chemical shift anisotropy tensor values from multi-site organic molecules, to study molecular dynamics in the intermediate time regime, and to examine the ordering properties of partially oriented samples. The VACSY technique is then extended to three-dimensional experiments to study slow molecular reorientations in a multi-site polymer system.

  20. Measurement of camera image sensor depletion thickness with cosmic rays

    E-print Network

    Vandenbroucke, J; Bravo, S; Jensen, K; Karn, P; Meehan, M; Peacock, J; Plewa, M; Ruggles, T; Santander, M; Schultz, D; Simons, A L; Tosi, D

    2015-01-01

    Camera image sensors can be used to detect ionizing radiation in addition to optical photons. In particular, cosmic-ray muons are detected as long, straight tracks passing through multiple pixels. The distribution of track lengths can be related to the thickness of the active (depleted) region of the camera image sensor through the known angular distribution of muons at sea level. We use a sample of cosmic-ray muon tracks recorded by the Distributed Electronic Cosmic-ray Observatory to measure the thickness of the depletion region of the camera image sensor in a commercial smart phone, the HTC Wildfire S. The track length distribution prefers a cosmic-ray muon angular distribution over an isotropic distribution. Allowing either distribution, we measure the depletion thickness to be between 13.9 μm and 27.7 μm. The same method can be applied to additional image sensor models. Once measured, the thickness can be used to convert track length to incident polar angle on a per-event basis. Combined with ...
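
    The geometry behind this measurement can be sketched with a toy Monte Carlo (the thicknesses and sample size below are invented, not the paper's data): a muon at zenith angle θ crossing a depletion layer of thickness t leaves a horizontal track of length t·tan(θ), and the sea-level zenith distribution is commonly approximated as proportional to cos²(θ).

```python
import math
import random

random.seed(1)

def sample_zenith():
    """Sample a muon zenith angle from the ~cos^2(theta) sea-level
    intensity distribution via rejection sampling (a common approximation)."""
    while True:
        th = random.uniform(0.0, math.pi / 2)
        if random.random() < math.cos(th) ** 2:
            return th

def track_lengths(thickness_um, n=20000):
    """Horizontal track lengths t*tan(theta) left in a depletion layer."""
    return [thickness_um * math.tan(sample_zenith()) for _ in range(n)]

samples_thin = track_lengths(14.0)
samples_thick = track_lengths(28.0)
mean_14 = sum(samples_thin) / len(samples_thin)
mean_28 = sum(samples_thick) / len(samples_thick)
print(mean_14, mean_28)
```

    Thicker layers stretch the whole track-length distribution, which is what lets the measured distribution constrain the thickness.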

  1. Digital cameras with designs inspired by the arthropod eye.

    PubMed

    Song, Young Min; Xie, Yizhu; Malyarchuk, Viktor; Xiao, Jianliang; Jung, Inhwa; Choi, Ki-Joong; Liu, Zhuangjian; Park, Hyunsung; Lu, Chaofeng; Kim, Rak-Hwan; Li, Rui; Crozier, Kenneth B; Huang, Yonggang; Rogers, John A

    2013-05-01

    In arthropods, evolution has created a remarkably sophisticated class of imaging systems, with a wide-angle field of view, low aberrations, high acuity to motion and an infinite depth of field. A challenge in building digital cameras with the hemispherical, compound apposition layouts of arthropod eyes is that essential design requirements cannot be met with existing planar sensor technologies or conventional optics. Here we present materials, mechanics and integration schemes that afford scalable pathways to working, arthropod-inspired cameras with nearly full hemispherical shapes (about 160 degrees). Their surfaces are densely populated by imaging elements (artificial ommatidia), which are comparable in number (180) to those of the eyes of fire ants (Solenopsis fugax) and bark beetles (Hylastes nigrinus). The devices combine elastomeric compound optical elements with deformable arrays of thin silicon photodetectors into integrated sheets that can be elastically transformed from the planar geometries in which they are fabricated to hemispherical shapes for integration into apposition cameras. Our imaging results and quantitative ray-tracing-based simulations illustrate key features of operation. These general strategies seem to be applicable to other compound eye devices, such as those inspired by moths and lacewings (refracting superposition eyes), lobster and shrimp (reflecting superposition eyes), and houseflies (neural superposition eyes). PMID:23636401

  2. The effects of orientation angle, subcooling, heat flux, mass flux, and pressure on bubble growth and detachment in subcooled flow boiling

    E-print Network

    Sugrue, Rosemary M

    2012-01-01

    The effects of orientation angle, subcooling, heat flux, mass flux, and pressure on bubble growth and detachment in subcooled flow boiling were studied using a high-speed video camera in conjunction with a two-phase flow ...

  3. Angle-resolved scattering spectroscopy of explosives using an external cavity quantum cascade laser

    SciTech Connect

    Suter, Jonathan D.; Bernacki, Bruce E.; Phillips, Mark C.

    2012-04-01

    We investigate angle-resolved scattering from solid explosives residues on a car door for non-contact sensing geometries. Illumination from a mid-infrared external cavity quantum cascade laser tuning between 7 and 8 microns was detected with both a sensitive single-point detector and a hyperspectral imaging camera. Spectral scattering phenomena are discussed and possibilities for hyperspectral imaging at large scattering angles are outlined.

  4. Limbus Impact on Off-angle Iris Degradation

    SciTech Connect

    Karakaya, Mahmut; Barstow, Del R; Santos-Villalobos, Hector J; Thompson, Joseph W; Bolme, David S; Boehnen, Chris Bensing

    2013-01-01

    The accuracy of iris recognition depends on the quality of data capture and is negatively affected by several factors such as angle, occlusion, and dilation. Off-angle iris recognition is a new research focus in biometrics that tries to address several issues including corneal refraction, complex 3D iris texture, and blur. In this paper, we present an additional significant challenge that degrades the performance of off-angle iris recognition systems, called the limbus effect. The limbus is the region at the border of the cornea where the cornea joins the sclera. The limbus is a semitransparent tissue that occludes a side portion of the iris plane. The amount of occluded iris texture on the side nearest the camera increases as the image acquisition angle increases. Without considering the role of the limbus effect, it is difficult to design an accurate off-angle iris recognition system. To the best of our knowledge, this is the first work that investigates the limbus effect in detail from a biometrics perspective. Based on results from real images and simulated experiments with real iris texture, the limbus effect increases the Hamming distance score between frontal and off-angle iris images by 0.05 to 0.2, depending upon the limbus height.
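
    Iris matchers typically score a pair of images by the fractional Hamming distance over bits that are unmasked in both codes; occlusion such as the limbus effect removes bits from the comparison and shifts the score. The tiny 8-bit codes below are purely illustrative (real iris codes are long Gabor phase codes), sketched to show the masking mechanics only:

```python
def fractional_hamming(code_a, code_b, mask_a, mask_b):
    """Fraction of disagreeing bits among positions unmasked in both codes."""
    valid = [i for i in range(len(code_a)) if mask_a[i] and mask_b[i]]
    if not valid:
        raise ValueError("no overlapping unmasked bits")
    disagree = sum(code_a[i] != code_b[i] for i in valid)
    return disagree / len(valid)

a = [1, 0, 1, 1, 0, 0, 1, 0]
b = [1, 1, 1, 0, 0, 0, 1, 1]
mask = [1, 1, 1, 1, 1, 1, 0, 0]  # last two bits occluded (e.g. by the limbus)
print(fractional_hamming(a, b, mask, mask))
```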

  5. A state observer for using a slow camera as a sensor for fast control applications

    NASA Astrophysics Data System (ADS)

    Gahleitner, Reinhard; Schagerl, Martin

    2013-03-01

    This contribution concerns a problem that often arises in vision-based control when a camera is used as a sensor for fast control applications, or more precisely, when the sample rate of the control loop is higher than the frame rate of the camera. In control applications for mechanical axes, e.g. in robotics or automated production, a camera and some image processing can be used as a sensor to detect positions or angles. The sample time in these applications is typically in the range of a few milliseconds or less, and this demands the use of a camera with a high frame rate of up to 1000 fps. The presented solution is a special state observer that can work with a slower, and therefore cheaper, camera to estimate the state variables at the higher sample rate of the control loop. To simplify the image processing for the determination of positions or angles and to make it more robust, LED markers are applied to the plant. Simulation and experimental results show that the concept can be used even if the plant is unstable, such as an inverted pendulum.
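
    The idea can be sketched as a multi-rate observer: predict the model state at every control step, and apply a measurement correction only when a camera frame arrives. The double-integrator plant, 1 kHz loop, 50 fps camera, and hand-picked observer gain below are all illustrative assumptions, not the paper's system:

```python
import numpy as np

dt = 0.001                       # 1 kHz control loop
A = np.array([[1.0, dt], [0.0, 1.0]])   # double-integrator plant model
B = np.array([[0.5 * dt**2], [dt]])
C = np.array([[1.0, 0.0]])       # camera measures position only
L = np.array([[0.4], [4.0]])     # observer gain, placed by hand
FRAME_EVERY = 20                 # a camera frame every 20 steps (50 fps)

x = np.array([[0.3], [-0.1]])    # true state (unknown to the observer)
x_hat = np.zeros((2, 1))         # observer estimate

for k in range(2000):
    u = np.array([[0.0]])        # open loop here; a controller would set u
    x = A @ x + B @ u            # true plant
    x_hat = A @ x_hat + B @ u    # fast prediction step every control cycle
    if k % FRAME_EVERY == 0:     # slow correction when an image is processed
        y = C @ x                # camera measurement of the true position
        x_hat = x_hat + L @ (y - C @ x_hat)

print(float(abs(x[0, 0] - x_hat[0, 0])))
```

    Between frames the estimate runs open loop on the model; the periodic corrections keep the estimation error contracting, so the control loop always has a fresh state estimate.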

  6. Three-dimensional location and attitude evaluation for rendezvous and docking operation using a single camera

    NASA Astrophysics Data System (ADS)

    Wang, Zhiling; Losito, S.; Mugnuolo, Raffaele; Pasquariello, Guido

    1993-01-01

    In the automatic rendezvous and docking manoeuvre (RVD) of space activity, determining the 3-D location and attitude between two vehicles is most important. A vision system to perform the docking manipulation in RVD is described in this paper. An improved algorithm is used for calibrating the geometric parameters of a camera fixed on the tracking vehicle off-line. Because the line-of-sight angles from four markers on the target vehicle to the lens center of the camera can be computed according to optical principles and vector theory, the location of the vehicle is obtained by solving a set of nonlinear equations derived from triangulation. The attitude angles of the vehicle are obtained from a transformation matrix from the target frame to the vehicle frame. As the vehicle closes in on the target, sets of markers with different distance intervals, or a list of calibration parameters for cameras with different fields of view, are selected at the proper moment to improve the situation when at least one of the markers leaves the camera's field of view. A series of experiments is presented. The vision system runs on a SUN-4/330 Sparc station equipped with one IT-151 image board and a CCD TV camera. All software is written in the C language.
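
    As a simplified 2-D analogue of this computation (the marker layout, camera position, and initial guess are invented for illustration), a camera position can be recovered from line-of-sight bearing angles to markers at known target locations by solving the nonlinear equations with Gauss-Newton iterations:

```python
import math
import numpy as np

markers = np.array([[0.0, 0.0], [2.0, 0.0], [0.0, 2.0], [2.0, 2.0]])
true_pos = np.array([5.0, 1.0])  # camera position to be recovered

# Bearings from the camera to each marker (would come from image processing).
bearings = np.array([math.atan2(m[1] - true_pos[1], m[0] - true_pos[0])
                     for m in markers])

pos = np.array([3.0, 3.0])       # initial guess
for _ in range(30):
    pred = np.array([math.atan2(m[1] - pos[1], m[0] - pos[0]) for m in markers])
    # Angle residuals, wrapped to (-pi, pi]
    r = np.arctan2(np.sin(bearings - pred), np.cos(bearings - pred))
    # Jacobian of each bearing with respect to the camera position (x, y)
    J = np.array([[(m[1] - pos[1]) / ((m - pos) @ (m - pos)),
                   -(m[0] - pos[0]) / ((m - pos) @ (m - pos))] for m in markers])
    pos = pos + np.linalg.lstsq(J, r, rcond=None)[0]   # Gauss-Newton step

print(pos)
```

    With four markers the system is overdetermined, which is why a least-squares step is used; the real 3-D problem adds attitude angles but follows the same linearize-and-solve pattern.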

  7. Fruit Detectability Analysis for Different Camera Positions in Sweet-Pepper †

    PubMed Central

    Hemming, Jochen; Ruizendaal, Jos; Hofstee, Jan Willem; van Henten, Eldert J.

    2014-01-01

    For robotic harvesting of sweet-pepper fruits in greenhouses a sensor system is required to detect and localize the fruits on the plants. Due to the complex structure of the plant, most fruits are (partially) occluded when an image is taken from one viewpoint only. In this research the effect of multiple camera positions and viewing angles on fruit visibility and detectability was investigated. A recording device was built which allowed the camera to be placed under different azimuth and zenith angles and to be moved horizontally along the crop row. Fourteen camera positions were chosen and the fruit visibility in the recorded images was manually determined for each position. For images taken from one position only, with the criterion of maximum 50% occlusion per fruit, the fruit detectability (FD) was in no case higher than 69%. The best single positions were the front views and looking upwards with a zenith angle of 60°. The FD increased when multiple viewpoint positions were combined. With a combination of the five best positions the maximum FD was 90%. PMID:24681670
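
    The way detectability grows when viewpoints are combined can be sketched as a union over positions: a fruit counts as detectable if at least one position in the combination shows it with at most 50% occlusion. The visibility table below is invented for illustration, not the paper's data:

```python
# fruit id -> camera positions from which it is at most 50% occluded (invented)
visibility = {
    "f1": {"front", "up60"},
    "f2": {"front"},
    "f3": {"side", "up60"},
    "f4": set(),                 # occluded from every position
    "f5": {"side"},
}

def detectability(positions):
    """Percentage of fruits visible from at least one of the positions."""
    seen = sum(1 for views in visibility.values() if views & set(positions))
    return 100.0 * seen / len(visibility)

print(detectability(["front"]))                   # single position
print(detectability(["front", "side", "up60"]))   # union over a combination
```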

  8. A telephoto camera system with shooting direction control by gaze detection

    NASA Astrophysics Data System (ADS)

    Teraya, Daiki; Hachisu, Takumi; Yendo, Tomohiro

    2015-05-01

    For safe driving, it is important for the driver to check traffic conditions, such as traffic lights or traffic signs, as early as possible. If an on-vehicle camera takes images of important objects from a long distance and shows them to the driver, the driver can understand traffic conditions earlier. To take clear images of distant objects, the focal length of the camera must be long. When the focal length is long, an on-vehicle camera does not have a field of view wide enough to check traffic conditions. Therefore, in order to get the necessary images from a long distance, the camera must have a long focal length and a controllable shooting direction. In a previous study, the driver indicated the shooting direction on a displayed image taken by a wide-angle camera, and a direction-controllable camera took a telescopic image and displayed it to the driver. However, that study used a touch panel to indicate the shooting direction, which can disturb driving. We therefore propose a telephoto camera system for driving support whose shooting direction is controlled by the driver's gaze, to avoid disturbing driving. The proposed system is composed of a gaze detector and an active telephoto camera whose shooting direction is controlled. We adopt a non-wearable detection method to avoid hindering driving. The gaze detector measures the driver's gaze by image processing. The shooting direction of the active telephoto camera is controlled by galvanometer scanners, and the direction can be switched within a few milliseconds. Experiments confirmed that the proposed system takes images in the direction of the subject's gaze.

  9. Optimization of the exposure time of aerospace camera through its signal-to-noise ratio

    NASA Astrophysics Data System (ADS)

    Chen, Yuheng; Ji, Yiqun; Zhou, Jiankang; Chen, Xinhua; Shen, Weimin

    2008-12-01

    The aerospace camera developed is the sole functional payload of a microsatellite. The signal-to-noise ratio of the aerospace camera reflects its radiance response and is the parameter most directly associated with the quality of its acquired images. The traditional way to calculate the signal-to-noise ratio of a camera is to substitute the relevant parameters of its subassemblies into derived formulas. This kind of method does not account for the particular characteristics of the components or the specific conditions of the application. The result obtained with a standard uniform source can certainly be used to evaluate the performance of the camera, but it ignores the actual orbital atmospheric conditions and consequently leads to unavoidable data deviation. An atmospheric transmission model is built and the radiation conditions of the aerospace camera in orbit are simulated by means of MODTRAN. Instead of building a noise model based on the electronic devices of the camera to obtain theoretical noise data, and considering the difference between the camera's in-lab and on-orbit noise, we adopt the measured noise data of the CCD camera to calculate the signal-to-noise ratio, so as to approach the real value as closely as possible. The influences of changes in solar altitude angle, earth surface albedo and weather conditions on the signal-to-noise ratio of the camera are quantitatively determined. The resulting signal-to-noise ratio can be used as a basis to evaluate remote sensing imaging quality and to decide the feasible exposure time.
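
    A minimal signal-to-noise model illustrates how an exposure time can be chosen from an SNR target: signal and dark current contribute shot noise, read noise is fixed per frame. The signal rate, dark-current rate, read noise, and SNR target below are assumed values, not those of this camera:

```python
import math

def snr(t, signal_rate=5000.0, dark_rate=50.0, read_noise=20.0):
    """SNR for exposure time t (s); rates in e-/s, read noise in e- rms.
    Shot noise on signal and dark current, plus a fixed read-noise floor."""
    s = signal_rate * t
    return s / math.sqrt(s + dark_rate * t + read_noise**2)

def exposure_for_snr(target, t=1e-4):
    """Shortest exposure (found by doubling) that reaches the target SNR."""
    while snr(t) < target:
        t *= 2.0
    return t

print(snr(0.01))
print(exposure_for_snr(40.0))
```

    In the abstract's setting the signal rate itself would come from the MODTRAN radiance simulation rather than being a fixed constant.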

  10. Focal lengths of Venus Monitoring Camera from limb locations

    NASA Astrophysics Data System (ADS)

    Limaye, Sanjay S.; Markiewicz, W. J.; Krauss, R.; Ignatiev, N.; Roatsch, T.; Matz, K. D.

    2015-08-01

    The Venus Monitoring Camera (VMC) carried by the European Space Agency's Venus Express orbiter (Svedhem et al., 2007) consists of four optical units, each with a separate filter casting an image on a single CCD (Markiewicz et al., 2007a, 2007b). The desire to capture as much of the planet as possible in a single frame during the spacecraft's 24 h, 0.84 eccentricity orbit led to optics with an 18° field of view. Analysis of Venus images obtained by the VMC indicated that the computed limb radius and altitude of haze layers were somewhat inconsistent with prior knowledge and expectations. Possible causes include errors in the knowledge of image geometry, misalignment of the optic axis from the pointing direction, and optical distortion. These were explored and eliminated, leaving deviations of the focal lengths from their ground-based, pre-flight estimates as the most likely reason. We use the location of the planet's limb to estimate the focal length of each camera using images of the planet taken when the orbiter was more than 20,000 km from the planet's center. The method relies on the limb radius being constant at least over a small range of solar zenith angles. We were able to achieve better estimates of the focal lengths for all four cameras and also estimate small offsets to the boresight alignment. An outcome of this analysis is the finding that the slant unit optical depth varies more rapidly with solar zenith angle in the afternoon than in the morning, with the lowest values at local noon. A variation of this level is also observed with latitude. Both are indicative of the presence of overlying haze above the clouds, and the morning-afternoon asymmetry suggests different photochemical processes in destruction and production of the haze.
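
    The core of the limb method can be sketched with a pinhole model: the planet's angular radius asin(R/d) is fixed by geometry, so a limb radius measured in pixels determines the focal length. The cloud-top altitude, pixel pitch, spacecraft distance, and measured limb radius below are illustrative assumptions, not VMC values:

```python
import math

R_VENUS_KM = 6051.8 + 70.0   # solid-body radius plus an assumed ~70 km cloud top
PIXEL_PITCH_MM = 0.009       # assumed 9 micron pixels

def focal_length_mm(limb_radius_px, distance_km):
    """Pinhole estimate: limb radius in pixels over tan(angular radius)."""
    alpha = math.asin(R_VENUS_KM / distance_km)  # angular radius of the limb
    return limb_radius_px * PIXEL_PITCH_MM / math.tan(alpha)

# Example: a limb radius measured at 1365 pixels from 60,000 km
print(focal_length_mm(1365.0, 60000.0))
```

    Inverting this relation is why an error in the assumed focal length shows up directly as an inconsistent limb radius and haze altitude, as described in the abstract.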

  11. Cryogenic mechanism for ISO camera

    NASA Astrophysics Data System (ADS)

    Luciano, G.

    1987-12-01

    The Infrared Space Observatory (ISO) camera configuration, architecture, materials, tribology, motorization, and development status are outlined. The operating temperature is 2 to 3 K, at 2.5 to 18 microns. Selected material is a titanium alloy, with MoS2/TiC lubrication. A stepping motor drives the ball-bearing mounted wheels to which the optical elements are fixed. Model test results are satisfactory, and also confirm the validity of the test facilities, particularly for vibration tests at 4K.

  12. Disequilibrium dihedral angles in dolerite sills

    USGS Publications Warehouse

    Holness, Marian B.; Richardson, Chris; Helz, Rosalind T.

    2012-01-01

    The geometry of clinopyroxene-plagioclase-plagioclase junctions in mafic rocks, measured by the median dihedral angle Θcpp, is created during solidification. In the solidifying Kilauea Iki (Hawaii) lava lake, the wider junctions between plagioclase grains are the first to be filled by pyroxene, followed by the narrower junctions. The final Θcpp, attained when all clinopyroxene-plagioclase-plagioclase junctions are formed, is 78° in the upper crust of the lake, and 85° in the lower solidification front. Θcpp in the 3.5-m-thick Traigh Bhàn na Sgùrra sill (Inner Hebrides) is everywhere 78°. In the Whin Sill (northern England, 38 m thick) and the Portal Peak sill (Antarctica, 129 m thick), Θcpp varies symmetrically, with the lowest values at the margins. The 266-m-thick Basement Sill (Antarctica) has asymmetric variation of Θcpp, attributed to a complex filling history. The chilled margins of the Basement Sill are partially texturally equilibrated, with high Θcpp. The plagioclase grain size in the two widest sills varies asymmetrically, with the coarsest rocks found in the upper third. Both Θcpp and average grain size are functions of model crystallization times. Θcpp increases from 78° to a maximum of ~100° as the crystallization time increases from 1 to 500 yr. Because the use of grain size as a measure of crystallization time is dependent on an estimate of crystal growth rates, dihedral angles provide a more direct proxy for cooling rates in dolerites.

  13. An attentive multi-camera system

    NASA Astrophysics Data System (ADS)

    Napoletano, Paolo; Tisato, Francesco

    2014-03-01

    Intelligent multi-camera systems that integrate computer vision algorithms are not error free, and thus both false positive and false negative detections need to be reviewed by a specialized human operator. Traditional multi-camera systems usually include a control center with a wall of monitors displaying videos from each camera of the network. Nevertheless, as the number of cameras increases, switching from one camera to another becomes hard for a human operator. In this work we propose a new method that dynamically selects and displays the content of one video camera from all the available contents in the multi-camera system. The proposed method is based on a computational model of human visual attention that integrates top-down and bottom-up cues. We believe that this is the first work that tries to use a model of human visual attention for the dynamic selection of the camera view of a multi-camera system. The proposed method has been tested in a given scenario and has demonstrated its effectiveness with respect to other methods and manually generated ground truth. The effectiveness has been evaluated in terms of the number of correct best-views generated by the method with respect to the camera views manually generated by a human operator.

  14. Current Propagation in Narrow Bipolar Pulses

    NASA Astrophysics Data System (ADS)

    Watson, S. S.; Marshall, T. C.

    2005-12-01

    We model the observed electric fields of a particular narrow bipolar pulse (NBP) published in Eack [2004]. We assume an exponential growth of current carriers due to a runaway breakdown avalanche and show that this leads to a corresponding increase in current. With specific input values for discharge altitude, length, current, and propagation velocity, the model does a good job of reproducing the observed near and far electric fields. The ability of the model to reproduce the observed electric fields is an indication that our assumptions concerning the runaway avalanche may be correct, and this indication is further strengthened by the inability of the simple transmission line model to reproduce simultaneously both the near and far electric fields. Eack, K. B. (2004), Electrical characteristics of narrow bipolar events, Geophys. Res. Lett., 31, L20102, doi:10.1029/2004GL021117.

  15. Heterodyne Interferometer Angle Metrology

    NASA Technical Reports Server (NTRS)

    Hahn, Inseob; Weilert, Mark A.; Wang, Xu; Goullioud, Renaud

    2010-01-01

    A compact, high-resolution angle measurement instrument has been developed that is based on a heterodyne interferometer. The common-path heterodyne interferometer metrology is used to measure displacements of a reflective target surface. In the interferometer setup, an optical mask is used to sample the measurement laser beam reflecting back from a target surface. Angular rotations, around two orthogonal axes in a plane perpendicular to the measurement-beam propagation direction, are determined simultaneously from the relative displacement measurement of the target surface. The device is used in a tracking telescope system where pitch and yaw measurements of a flat mirror were simultaneously performed with a sensitivity of 0.1 nrad per second and a measuring range of 0.15 mrad at a working distance of the order of a meter. The nonlinearity of the device is also measured to be less than one percent over the measurement range.

  16. Cerebellopontine Angle Lipoma

    PubMed Central

    Schuhmann, Martin U.; Lüdemann, Wolf O.; Schreiber, Hartwig; Samii, Madjid

    1997-01-01

    Intracranial lipomas in an infratentorial and extra-axial location are extremely rare. The presented case of an extensive lipoma of the cerebellopontine angle (CPA) represents 0.05% of all CPA tumors operated on in our department from 1978 to 1996. The lipoma constitutes an important differential diagnosis because the clinical management differs significantly from other CPA lesions. The clinical presentation and management of the presented case are analyzed in comparison to all previously described cases of CPA lipomas. The etiology and the radiological features of CPA lipomas are reviewed and discussed. CPA lipomas are maldevelopmental lesions that may cause slowly progressive symptoms. Neuroradiology enables a reliable preoperative diagnosis. Attempts of complete lipoma resection usually result in severe neurological deficits. Therefore, we recommend a conservative approach in managing these patients. Limited surgery is indicated if the patient has an associated vascular compression syndrome or suffers from disabling vertigo. PMID:17171031

  17. Dynamic-angle spinning and double rotation of quadrupolar nuclei

    SciTech Connect

    Mueller, K.T. (California Univ., Berkeley, CA. Dept. of Chemistry)

    1991-07-01

    Nuclear magnetic resonance (NMR) spectroscopy of quadrupolar nuclei is complicated by the coupling of the electric quadrupole moment of the nucleus to local variations in the electric field. The quadrupolar interaction is a useful source of information about local molecular structure in solids, but it tends to broaden resonance lines, causing crowding and overlap in NMR spectra. Magic-angle spinning, which is routinely used to produce high resolution spectra of spin-1/2 nuclei like carbon-13 and silicon-29, is incapable of fully narrowing resonances from quadrupolar nuclei when anisotropic second-order quadrupolar interactions are present. Two new sample-spinning techniques are introduced here that completely average the second-order quadrupolar coupling. Narrow resonance lines are obtained and individual resonances from distinct nuclear sites are identified. In dynamic-angle spinning (DAS) a rotor containing a powdered sample is reoriented between discrete angles with respect to the high magnetic field. Evolution under anisotropic interactions at the different angles cancels, leaving only the isotropic evolution of the spin system. In the second technique, double rotation (DOR), a small rotor spins within a larger rotor so that the sample traces out a complicated trajectory in space. The relative orientation of the rotors and the orientation of the larger rotor within the magnetic field are selected to average both first- and second-order anisotropic broadening. The theory of quadrupolar interactions, coherent averaging theory, and motional narrowing by sample reorientation are reviewed with emphasis on the chemical shift anisotropy and second-order quadrupolar interactions experienced by half-odd-integer spin quadrupolar nuclei. The DAS and DOR techniques are introduced and illustrated with application to common quadrupolar systems such as sodium-23 and oxygen-17 nuclei in solids.

  18. A Second Generation Multi-Angle Imaging SpectroRadiometer (MISR-2)

    NASA Technical Reports Server (NTRS)

    Bothwell, Graham; Diner, David J.; Pagano, Thomas S.; Duval, Valerie G.; Beregovski, Yuri; Hovland, Larry E.; Preston, Daniel J.

    2001-01-01

    The Multi-angle Imaging SpectroRadiometer (MISR) has been in Earth orbit since December 1999 on NASA's Terra spacecraft. This instrument provides new ways of looking at the Earth's atmosphere, clouds, and surface for the purpose of understanding the Earth's ecology, environment, and climate. To facilitate the potential future continuation of MISR's multi-angle observations, a study was undertaken in 1999 and 2000 under the Instrument Incubator Program (IIP) of NASA Code Y's Earth Science Technology Office (ESTO) to investigate and demonstrate the feasibility of a successor to MISR that will have greatly reduced size and mass. The kernel of the program was the design, construction, and testing of a highly miniaturized camera, one of the nine that would probably be used on a future spaceborne MISR-like instrument. This demonstrated that the size and mass reduction of the optical system and camera electronics are possible and that filters can be assembled to meet the miniaturized packaging requirements. An innovative, reflective optics design was used, enabling the wavelength range to be extended into the shortwave infrared. This was the smallest all-reflective camera ever produced by the contractor. A study was undertaken to determine the feasibility of implementing nine (multi-angle) cameras within a single structure. This resulted in several possible configurations. It would also be possible to incorporate one of the cameras into an airborne instrument.

  19. A grazing incidence x-ray streak camera for ultrafast, single-shot measurements

    SciTech Connect

    Feng, Jun; Engelhorn, K.; Cho, B.I.; Lee, H.J.; Greaves, M.; Weber, C.P.; Falcone, R.W.; Padmore, H. A.; Heimann, P.A.

    2010-02-18

    An ultrafast x-ray streak camera has been realized using a grazing incidence reflection photocathode. X-rays are incident on a gold photocathode at a grazing angle of 20 degrees and the photoemitted electrons are focused by a large-aperture magnetic solenoid lens. The streak camera has high quantum efficiency, 600 fs temporal resolution, and a 6 mm imaging length in the spectral direction. Its single-shot capability eliminates temporal smearing due to sweep jitter, and allows recording of the ultrafast dynamics of samples that undergo non-reversible changes.

  20. The Dark Energy Camera (DECam)

    E-print Network

    K. Honscheid; D. L. DePoy; for the DES Collaboration

    2008-10-20

    In this paper we describe the Dark Energy Camera (DECam), which will be the primary instrument used in the Dark Energy Survey. DECam will be a 3 sq. deg. mosaic camera mounted at the prime focus of the Blanco 4m telescope at the Cerro Tololo Inter-American Observatory (CTIO). It consists of a large mosaic CCD focal plane, a five element optical corrector, five filters (g,r,i,z,Y), a modern data acquisition and control system and the associated infrastructure for operation in the prime focus cage. The focal plane includes 62 2K x 4K CCD modules (0.27"/pixel) arranged in a hexagon inscribed within the roughly 2.2 degree diameter field of view and 12 smaller 2K x 2K CCDs for guiding, focus and alignment. The CCDs will be 250 micron thick fully-depleted CCDs that have been developed at the Lawrence Berkeley National Laboratory (LBNL). Production of the CCDs and fabrication of the optics, mechanical structure, mechanisms, and control system for DECam are underway; delivery of the instrument to CTIO is scheduled for 2010.

  1. Toward the camera rain gauge

    NASA Astrophysics Data System (ADS)

    Allamano, P.; Croci, A.; Laio, F.

    2015-03-01

    We propose a novel technique based on the quantitative detection of rain intensity from images, i.e., from pictures taken in rainy conditions. The method is fully analytical and based on the fundamentals of camera optics. A rigorous statistical framing of the technique allows one to obtain the rain rate estimates in terms of expected values and associated uncertainty. We show that the method can be profitably applied to real rain events, and we obtain promising results with errors of the order of ±25%. A precise quantification of the method's accuracy will require a more systematic and long-term comparison with benchmark measures. The significant step forward with respect to standard rain gauges resides in the possibility to retrieve measures at very high temporal resolution (e.g., 30 measures per minute) at a very low cost. Perspective applications include the possibility to dramatically increase the spatial density of rain observations by exporting the technique to crowdsourced pictures of rain acquired with cameras and smartphones.
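
    The benefit of the quoted 30 measures per minute can be sketched under the assumption (not stated in the record) that single-image errors are roughly independent:

```python
import math

# per-image relative error quoted in the abstract
per_image_error = 0.25

# 30 estimates per minute; independent errors average down as 1/sqrt(n)
n_per_minute = 30
per_minute_error = per_image_error / math.sqrt(n_per_minute)   # ~0.046
```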

  2. Advantages of improved timing accuracy in PET cameras using LSOscintillator

    SciTech Connect

    Moses, William W.

    2002-12-02

    PET scanners based on LSO have the potential for significantly better coincidence timing resolution than the 6 ns fwhm typically achieved with BGO. This study analyzes the performance enhancements made possible by improved timing as a function of the coincidence time resolution. If 500 ps fwhm coincidence timing resolution can be achieved in a complete PET camera, the following four benefits can be realized for whole-body FDG imaging: 1) The random event rate can be reduced by using a narrower coincidence timing window, increasing the peak NECR by ~50 percent. 2) Using time-of-flight in the reconstruction algorithm will reduce the noise variance by a factor of 5. 3) Emission and transmission data can be acquired simultaneously, reducing the total scan time. 4) Axial blurring can be reduced by using time-of-flight to determine the correct axial plane that each event originated from. While time-of-flight was extensively studied in the 1980s, practical factors limited its effectiveness at that time, and little attention has been paid to timing in PET since then. As these potential improvements are substantial and the advent of LSO PET cameras gives us the means to obtain them without other sacrifices, efforts to improve PET timing should resume after their long dormancy.
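
    The first two benefits follow from simple scaling arguments; a sketch with assumed example values for the singles rates and object diameter (neither is specified in the abstract):

```python
C = 3.0e8  # speed of light, m/s

def randoms_rate(window_s, singles1_hz, singles2_hz):
    """Random coincidence rate R = 2*tau*S1*S2 for coincidence window tau."""
    return 2.0 * window_s * singles1_hz * singles2_hz

def tof_variance_gain(diameter_m, timing_fwhm_s):
    """Approximate noise-variance reduction D / (c*dt/2) from time-of-flight."""
    return diameter_m / (C * timing_fwhm_s / 2.0)

# narrowing the coincidence window from 6 ns to 2 ns cuts randoms threefold
ratio = randoms_rate(2e-9, 1e6, 1e6) / randoms_rate(6e-9, 1e6, 1e6)  # 1/3

# a 40 cm object with 500 ps fwhm timing localizes events to ~7.5 cm along
# the line of response, giving a variance reduction of ~5, as quoted above
gain = tof_variance_gain(0.40, 500e-12)
```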

  3. Wide field and diffraction limited array camera for SIRTF

    NASA Technical Reports Server (NTRS)

    Fazio, G. G.; Koch, D. G.; Melnick, G. J.; Tresch-Fienberg, R. M.; Willner, S. P.; Gezari, D. Y.; Lamb, G.; Shu, P.; Chin, G.; Mccreight, C. R.

    1986-01-01

    The Infrared Array Camera for the Space Infrared Telescope Facility (SIRTF/IRAC) is capable of two-dimensional photometry in either a wide field or diffraction-limited mode over the wavelength interval from 2 to 30 microns. Three different two-dimensional direct readout (DRO) array detectors are being considered: Band 1-InSb or Si:In (2-5 microns) 128 x 128 pixels, Band 2-Si:Ga (5-18 microns) 64 x 64 pixels, and Band 3-Si:Sb (18-30 microns) 64 x 64 pixels. The hybrid DRO readout architecture has the advantages of low read noise, random pixel access with individual readout rates, and nondestructive readout. The scientific goals of IRAC are discussed, which are the basis for several important requirements and capabilities of the array camera: (1) diffraction-limited resolution from 2-30 microns, (2) use of the maximum unvignetted field of view of SIRTF, (3) simultaneous observations within the three infrared spectral bands, and (4) the capability for broad and narrow bandwidth spectral resolution. A strategy has been developed to minimize the total electronic and environmental noise sources to satisfy the scientific requirements.

  4. Wide field and diffraction limited array camera for SIRTF

    NASA Technical Reports Server (NTRS)

    Fazio, G. G.; Koch, D. G.; Melnick, G. J.; Tresch-Fienberg, R. M.; Willner, S. P.; Gezari, D. Y.; Lamb, G.; Shu, P.; Chin, G.; Mccreight, C. R.

    1986-01-01

    The Infrared Array Camera for the Space Infrared Telescope Facility (SIRTF/IRAC) is capable of two-dimensional photometry in either a wide field or diffraction-limited mode over the wavelength interval from 2 to 30 microns. Three different two-dimensional direct readout (DRO) array detectors will be used: Band 1-InSb or Si:In (2-5 microns) 128 x 128 pixels, Band 2-Si:Ga (5-18 microns) 64 x 64 pixels, and Band 3-Si:Sb (18-30 microns) 64 x 64 pixels. The hybrid DRO readout architecture has the advantages of low read noise, random pixel access with individual readout rates, and nondestructive readout. The scientific goals of IRAC are discussed, which are the basis for several important requirements and capabilities of the array camera: (1) diffraction-limited resolution from 2-30 microns, (2) use of the maximum unvignetted field of view of SIRTF, (3) simultaneous observations within the three infrared spectral bands, and (4) the capability for broad and narrow bandwidth spectral resolution. A strategy has been developed to minimize the total electronic and environmental noise sources to satisfy the scientific requirements.

  5. High Speed Digital Camera Technology Review

    NASA Technical Reports Server (NTRS)

    Clements, Sandra D.

    2009-01-01

    A High Speed Digital Camera Technology Review (HSD Review) is being conducted to evaluate the state-of-the-shelf in this rapidly progressing industry. Five HSD cameras supplied by four camera manufacturers participated in a Field Test during the Space Shuttle Discovery STS-128 launch. Each camera was also subjected to Bench Tests in the ASRC Imaging Development Laboratory. Evaluation of the data from the Field and Bench Tests is underway. Representatives from the imaging communities at NASA / KSC and the Optical Systems Group are participating as reviewers. A High Speed Digital Video Camera Draft Specification was updated to address Shuttle engineering imagery requirements based on findings from this HSD Review. This draft specification will serve as the template for a High Speed Digital Video Camera Specification to be developed for the wider OSG imaging community under OSG Task OS-33.

  6. Television camera video level control system

    NASA Technical Reports Server (NTRS)

    Kravitz, M.; Freedman, L. A.; Fredd, E. H.; Denef, D. E. (inventors)

    1985-01-01

    A video level control system is provided which generates a normalized video signal for a camera processing circuit. The video level control system includes a lens iris which provides a controlled light signal to a camera tube. The camera tube converts the light signal provided by the lens iris into electrical signals. A feedback circuit, in response to the electrical signals generated by the camera tube, provides feedback signals to the lens iris and the camera tube. This assures that a normalized video signal is provided in a first illumination range. An automatic gain control loop, which is also responsive to the electrical signals generated by the camera tube, operates in tandem with the feedback circuit. This assures that the normalized video signal is maintained in a second illumination range.

  7. Dependence of astigmatism, far-field pattern, and spectral envelope width on active layer thickness of gain guided lasers with narrow stripe geometry

    SciTech Connect

    Mamine, T.

    1984-06-15

    The effects of active layer thickness on the astigmatism, the angle of far-field pattern width parallel to the junction, and the spectral envelope width of a gain guided laser with a narrow stripe geometry have been investigated analytically and experimentally. It is concluded that a large level of astigmatism, a narrow far-field pattern width, and a rapid convergence of the spectral envelope width are inherent to the gain guided lasers with thin active layers.

  8. Mission report on the Orbiter Camera Payload System (OCPS) Large Format Camera (LFC) and Attitude Reference System (ARS)

    NASA Technical Reports Server (NTRS)

    Mollberg, Bernard H.; Schardt, Bruton B.

    1988-01-01

    The Orbiter Camera Payload System (OCPS) is an integrated photographic system which is carried into earth orbit as a payload in the Space Transportation System (STS) Orbiter vehicle's cargo bay. The major component of the OCPS is a Large Format Camera (LFC), a precision wide-angle cartographic instrument that is capable of producing high resolution stereo photography of great geometric fidelity in multiple base-to-height (B/H) ratios. A secondary, supporting system to the LFC is the Attitude Reference System (ARS), which is a dual lens Stellar Camera Array (SCA) and camera support structure. The SCA is a 70-mm film system which is rigidly mounted to the LFC lens support structure and which, through the simultaneous acquisition of two star fields with each earth-viewing LFC frame, makes it possible to determine precisely the pointing of the LFC optical axis with reference to the earth nadir point. Other components complete the current OCPS configuration as a high precision cartographic data acquisition system. The primary design objective for the OCPS was to maximize system performance characteristics while maintaining a high level of reliability compatible with Shuttle launch conditions and the on-orbit environment. The full-up OCPS configuration was launched on a highly successful maiden voyage aboard the STS Orbiter vehicle Challenger on October 5, 1984, as a major payload aboard mission STS 41-G. This report documents the system design, the ground testing, the flight configuration, and an analysis of the results obtained during the Challenger mission STS 41-G.

  9. The Mars Science Laboratory Engineering Cameras

    NASA Astrophysics Data System (ADS)

    Maki, J.; Thiessen, D.; Pourangi, A.; Kobzeff, P.; Litwin, T.; Scherr, L.; Elliott, S.; Dingizian, A.; Maimone, M.

    2012-09-01

    NASA's Mars Science Laboratory (MSL) Rover is equipped with a set of 12 engineering cameras. These cameras are build-to-print copies of the Mars Exploration Rover cameras described in Maki et al. (J. Geophys. Res. 108(E12): 8071, 2003). Images returned from the engineering cameras will be used to navigate the rover on the Martian surface, deploy the rover robotic arm, and ingest samples into the rover sample processing system. The Navigation cameras (Navcams) are mounted to a pan/tilt mast and have a 45-degree square field of view (FOV) with a pixel scale of 0.82 mrad/pixel. The Hazard Avoidance Cameras (Hazcams) are body-mounted to the rover chassis in the front and rear of the vehicle and have a 124-degree square FOV with a pixel scale of 2.1 mrad/pixel. All of the cameras utilize a 1024×1024 pixel detector and red/near IR bandpass filters centered at 650 nm. The MSL engineering cameras are grouped into two sets of six: one set of cameras is connected to rover computer "A" and the other set is connected to rover computer "B". The Navcams and Front Hazcams each provide similar views from either computer. The Rear Hazcams provide different views from the two computers due to the different mounting locations of the "A" and "B" Rear Hazcams. This paper provides a brief description of the engineering camera properties, the locations of the cameras on the vehicle, and camera usage for surface operations.
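
    The quoted pixel scales can be checked against the FOV and detector format under a uniform-angular-sampling approximation (a reader's estimate; real lens designs, especially the fisheye Hazcams, deviate from uniform sampling, which plausibly explains why the naive Navcam figure comes out slightly below the quoted 0.82 mrad/pixel):

```python
import math

def mean_pixel_scale_mrad(fov_deg, n_pixels):
    """Mean angular pixel scale assuming uniform sampling across the FOV."""
    return math.radians(fov_deg) / n_pixels * 1000.0

navcam_scale = mean_pixel_scale_mrad(45.0, 1024)    # ~0.77 mrad/pixel
hazcam_scale = mean_pixel_scale_mrad(124.0, 1024)   # ~2.11 mrad/pixel
```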

  10. Stationary Camera Aims And Zooms Electronically

    NASA Technical Reports Server (NTRS)

    Zimmermann, Steven D.

    1994-01-01

    Microprocessors select, correct, and orient portions of hemispherical field of view. Video camera pans, tilts, zooms, and provides rotations of images of objects of field of view, all without moving parts. Used for surveillance in areas where movement of camera conspicuous or constrained by obstructions. Also used for closeup tracking of multiple objects in field of view or to break image into sectors for simultaneous viewing, thereby replacing several cameras.

  11. Development of biostereometric experiments. [stereometric camera system

    NASA Technical Reports Server (NTRS)

    Herron, R. E.

    1978-01-01

    The stereometric camera was designed for close-range techniques in biostereometrics. The camera focusing distance of 360 mm to infinity covers a broad field of close-range photogrammetry. The design provides for a separate unit for the lens system and interchangeable backs on the camera for the use of single frame film exposure, roll-type film cassettes, or glass plates. The system incorporates the use of a surface contrast optical projector.

  12. One-angle fluorescence tomography with in-and-out motion

    PubMed Central

    Zeng, Gengsheng L.

    2014-01-01

    The usual tomography is achieved by acquiring measurements around an object with multiple angles. The possibility of obtaining a fluorescence tomographic image from measurements at only one angle is explored. Instead of rotating around the object, the camera (or the objective lens) moves toward (or away from) the object and takes photographs while the camera’s focal plane passes through the object. The volume of stacked two-dimensional pictures forms a blurred three-dimensional image. The true image can be obtained by deconvolving the system’s point spread function. Simplified computer simulations are carried out to verify the feasibility of the proposed method. The computer simulations indicate that it is feasible to obtain a tomographic image by using the in-and-out motion to acquire data. PMID:25520544
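
    The deconvolution step can be sketched with a simple Wiener filter (a 1-D toy under an assumed Gaussian PSF and regularization constant, not the paper's implementation; the volumetric case applies the same formula with 3-D FFTs):

```python
import numpy as np

def wiener_deconvolve(blurred, psf, reg=1e-3):
    """Frequency-domain Wiener deconvolution of a circularly blurred signal."""
    H = np.fft.fft(psf)
    G = np.fft.fft(blurred)
    F = np.conj(H) * G / (np.abs(H) ** 2 + reg)
    return np.real(np.fft.ifft(F))

# toy example: a point source blurred by a Gaussian PSF centered at index 0
n = 64
idx = np.arange(n)
d = np.minimum(idx, n - idx)                # circular distance from index 0
psf = np.exp(-0.5 * (d / 2.0) ** 2)
psf /= psf.sum()

truth = np.zeros(n)
truth[32] = 1.0
blurred = np.real(np.fft.ifft(np.fft.fft(truth) * np.fft.fft(psf)))
restored = wiener_deconvolve(blurred, psf)  # peak sharpens back at index 32
```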

  13. Calibration of hyperspectral close-range pushbroom cameras for plant phenotyping

    NASA Astrophysics Data System (ADS)

    Behmann, Jan; Mahlein, Anne-Katrin; Paulus, Stefan; Kuhlmann, Heiner; Oerke, Erich-Christian; Plümer, Lutz

    2015-08-01

    Hyperspectral sensors are able to detect biological processes of plants which are invisible to the naked eye. Close-range cameras in particular support the identification of biotic and abiotic stress reactions at an early stage. Up to now, their full potential has been only partially realized, because geometrical factors such as leaf angle, curvature, and self-shading overlay the signal of biological processes. Suitable 3D plant models constitute an important step toward removing these factors from the data. Matching the 3D model and the hyperspectral image with sufficient accuracy, even for small leaf veins, is required, but relies on an adequate geometric calibration of hyperspectral cameras. We present a method for the geometric calibration of hyperspectral pushbroom cameras in the close range which enables reliable and reproducible results at sub-pixel scale. This approach extends the linear pushbroom camera model by the ability to model non-linear fractions. Accuracy and reproducibility of the method are validated using a hyperspectral sensor system with two line cameras observing the reflected radiation in the spectral range from 400 to 2500 nm. We point out new potentials arising from the proposed camera calibration, e.g. hyperspectral 3D plant models, which have high potential for crop plant phenotyping.

  14. Multi-camera calibration based on openCV and multi-view registration

    NASA Astrophysics Data System (ADS)

    Deng, Xiao-ming; Wan, Xiong; Zhang, Zhi-min; Leng, Bi-yan; Lou, Ning-ning; He, Shuai

    2010-10-01

    For multi-camera calibration systems, a method based on OpenCV combined with a multi-view registration calibration algorithm is proposed. First, using a Zhang calibration plate (an 8×8 chessboard pattern) and a number of cameras (three industrial-grade CCDs), nine groups of images are shot from different angles, and OpenCV is used to quickly calibrate each camera's intrinsic parameters. Second, based on the corresponding relationship between each camera view, the computation of the rotation matrix and translation matrix is formulated as a constrained optimization problem. According to the Kuhn-Tucker theorem and the properties of the derivative of a matrix-valued function, the formulae for the rotation matrix and translation matrix are deduced using a singular value decomposition algorithm. Afterwards, an iterative method is utilized to obtain the entire coordinate transformation of pair-wise views; thus the precise multi-view registration can be conveniently achieved, yielding the cameras' relative positions (the extrinsic parameters). Experimental results show that the method is practical for multi-camera calibration.
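
    The SVD step for the rotation and translation can be sketched with a standard Kabsch-style least-squares solver (a generic reconstruction consistent with the abstract's description, not necessarily the authors' exact derivation):

```python
import numpy as np

def rigid_transform_svd(P, Q):
    """Least-squares R, t with Q ~ R @ P + t for 3xN corresponding points,
    solved via SVD of the centered covariance matrix."""
    cp = P.mean(axis=1, keepdims=True)
    cq = Q.mean(axis=1, keepdims=True)
    H = (P - cp) @ (Q - cq).T
    U, _, Vt = np.linalg.svd(H)
    # guard against reflections so that det(R) = +1
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t
```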

  15. Spherical Gaussian mixture model and object tracking system for PTZ camera

    NASA Astrophysics Data System (ADS)

    Hwangbo, Seok; Lee, Chan-Su

    2015-05-01

    Recently, the pan-tilt-zoom (PTZ) camera has become widely used in extensive-area surveillance applications. A number of background modeling methods have been proposed within existing object detection and tracking systems. However, conventional background modeling methods for PTZ cameras have difficulty covering an extensive field of view (FOV). This paper presents a novel object tracking system based on a spherical background model for PTZ cameras. The proposed system has two components. The first is the spherical Gaussian mixture model (S-GMM), which learns the background for all view angles of the PTZ camera; the Gaussian parameters of each pixel in the S-GMM are learned and updated. The second is an object tracking system with real-time foreground detection using the S-GMM. The proposed system covers a wide FOV better than a conventional background modeling system for PTZ cameras and is able to track moving objects accurately. We demonstrate the advantages of the proposed S-GMM for an object tracking system using a PTZ camera, and we expect it to support more advanced surveillance applications.

  16. Camera Trapping Wild Cats

    E-print Network

    Mazzotti, Frank

    Caribbean Naturalist No. 17, 2014: Camera Trapping Wild Cats with Landowners in Northern Belize. The journal covers behavior, biogeography, taxonomy, evolution, anatomy, physiology, geology, and related fields.

  17. Advanced High-Definition Video Cameras

    NASA Technical Reports Server (NTRS)

    Glenn, William

    2007-01-01

    A product line of high-definition color video cameras, now under development, offers a superior combination of desirable characteristics, including high frame rates, high resolutions, low power consumption, and compactness. Several of the cameras feature a 3,840 × 2,160-pixel format with progressive scanning at 30 frames per second. The power consumption of one of these cameras is about 25 W. The size of the camera, excluding the lens assembly, is 2 by 5 by 7 in. (about 5.1 by 12.7 by 17.8 cm). The aforementioned desirable characteristics are attained at relatively low cost, largely by utilizing digital processing in advanced field-programmable gate arrays (FPGAs) to perform all of the many functions (for example, color balance and contrast adjustments) of a professional color video camera. The processing is programmed in VHDL so that application-specific integrated circuits (ASICs) can be fabricated directly from the program. ["VHDL" signifies VHSIC Hardware Description Language, a computing language used by the United States Department of Defense for describing, designing, and simulating very-high-speed integrated circuits (VHSICs).] The image-sensor and FPGA clock frequencies in these cameras have generally been much higher than those used in video cameras designed and manufactured elsewhere. Frequently, the outputs of these cameras are converted to other video-camera formats by use of pre- and post-filters.
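
    The raw data rate implied by the quoted format helps motivate the on-camera FPGA processing (the bit depth below is an assumed illustrative value, not from the article):

```python
pixels_per_frame = 3840 * 2160               # 8,294,400 pixels
frame_rate = 30                              # frames per second
pixel_rate = pixels_per_frame * frame_rate   # ~2.5e8 pixels/s

bits_per_pixel = 12                          # assumed raw sample depth
raw_bits_per_s = pixel_rate * bits_per_pixel # ~3.0 Gbit/s of raw sensor data
```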

  18. NARROW-K-BAND OBSERVATIONS OF THE GJ 1214 SYSTEM

    SciTech Connect

    Colón, Knicole D.; Gaidos, Eric

    2013-10-10

    GJ 1214 is a nearby M dwarf star that hosts a transiting super-Earth-size planet, making this system an excellent target for atmospheric studies. Most studies find that the transmission spectrum of GJ 1214b is flat, which favors either a high mean molecular weight or a cloudy/hazy hydrogen (H) rich atmosphere model. Photometry at short wavelengths (<0.7 μm) and in the K band can discriminate the most between these different atmosphere models for GJ 1214b, but current observations do not have sufficiently high precision. We present photometry of seven transits of GJ 1214b through a narrow K-band (2.141 μm) filter with the Wide Field Camera on the 3.8 m United Kingdom Infrared Telescope. Our photometric precision is typically 1.7 × 10⁻³ (for a single transit), comparable with other ground-based observations of GJ 1214b. We measure a planet-star radius ratio of 0.1158 ± 0.0013, which, along with other studies, also supports a flat transmission spectrum for GJ 1214b. Since this does not exclude a scenario where GJ 1214b has an H-rich envelope with heavy elements that are sequestered below a cloud/haze layer, we compare K-band observations with models of H₂ collision-induced absorption in an atmosphere for a range of temperatures. While we find no evidence for deviation from a flat spectrum (slope s = 0.0016 ± 0.0038), an H₂-dominated upper atmosphere (<60 mbar) cannot be excluded. More precise observations at <0.7 μm and in the K band, as well as a uniform analysis of all published data, would be useful for establishing more robust limits on atmosphere models for GJ 1214b.
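
    The quoted radius ratio translates directly into the transit depth, and the per-transit precision gives a rough detection strength (the sqrt(7) stacking gain assumes independent per-transit noise, which the abstract does not state):

```python
import math

ratio = 0.1158                     # quoted planet-star radius ratio
depth = ratio ** 2                 # ~0.0134, i.e. a 1.34% flux drop
sigma_single = 1.7e-3              # quoted single-transit precision
snr_single = depth / sigma_single  # ~7.9 per transit
snr_combined = snr_single * math.sqrt(7)   # ~21 over seven transits
```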

  19. Auger recombination in narrow-gap semiconductor superlattices incorporating antimony

    E-print Network

    Flatte, Michael E.

    Auger recombination in narrow-gap semiconductor superlattices incorporating antimony. C. H. Grein. A comparison is performed between measured and calculated Auger recombination rates for four different narrow-gap superlattices incorporating antimony. Varying amounts of Auger recombination suppression are displayed.

  20. Illumination box and camera system

    DOEpatents

    Haas, Jeffrey S. (San Ramon, CA); Kelly, Fredrick R. (Modesto, CA); Bushman, John F. (Oakley, CA); Wiefel, Michael H. (La Honda, CA); Jensen, Wayne A. (Livermore, CA); Klunder, Gregory L. (Oakland, CA)

    2002-01-01

    A hand portable, field-deployable thin-layer chromatography (TLC) unit and a hand portable, battery-operated unit for development, illumination, and data acquisition of the TLC plates contain many miniaturized features that permit a large number of samples to be processed efficiently. The TLC unit includes a solvent tank, a holder for TLC plates, and a variety of tool chambers for storing TLC plates, solvent, and pipettes. After processing in the TLC unit, a TLC plate is positioned in a collapsible illumination box, where the box and a CCD camera are optically aligned for optimal pixel resolution of the CCD images of the TLC plate. The TLC system includes an improved development chamber for chemical development of TLC plates that prevents solvent overflow.

  1. Explosive Transient Camera (ETC) Program

    NASA Technical Reports Server (NTRS)

    Ricker, George

    1991-01-01

    Since the inception of the ETC program, a wide range of new technologies was developed to support this astronomical instrument. The prototype unit was installed at ETC Site 1. The first partially automated observations were made and some major renovations were later added to the ETC hardware. The ETC was outfitted with new thermoelectrically-cooled CCD cameras and a sophisticated vacuum manifold, which, together, made the ETC a much more reliable unit than the prototype. The ETC instrumentation and building were placed under full computer control, allowing the ETC to operate as an automated, autonomous instrument with virtually no human intervention necessary. The first fully-automated operation of the ETC was performed, during which the ETC monitored the error region of the repeating soft gamma-ray burster SGR 1806-21.

  2. Camera processing with chromatic aberration.

    PubMed

    Korneliussen, Jan Tore; Hirakawa, Keigo

    2014-10-01

    Since the refractive index of the materials commonly used for lenses depends on the wavelength of light, practical camera optics fail to converge light to a single point on the image plane. Known as chromatic aberration, this phenomenon distorts image details by introducing magnification error, defocus blur, and color fringes. Though achromatic and apochromatic lens designs reduce chromatic aberration to a degree, they are complex and expensive, and they do not offer a perfect correction. In this paper, we propose a new postcapture processing scheme designed to overcome these problems computationally. Specifically, the proposed solution comprises a chromatic aberration-tolerant demosaicking algorithm and post-demosaicking chromatic aberration correction. Experiments with simulated and real sensor data verify that the chromatic aberration is effectively corrected. PMID:25163060
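
    The magnification-error component of lateral chromatic aberration can be illustrated with a per-channel radial rescale about the image center (a simplistic nearest-neighbor stand-in, not the paper's demosaicking-integrated method; in practice the scale factor would come from lens characterization):

```python
import numpy as np

def correct_lateral_ca(channel, scale):
    """Radially rescale one color channel about the image center to undo
    per-channel magnification error (nearest-neighbor, illustrative only)."""
    h, w = channel.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    ys, xs = np.mgrid[0:h, 0:w]
    # each output pixel samples the input at a radially scaled position
    src_y = np.clip(np.round(cy + (ys - cy) * scale).astype(int), 0, h - 1)
    src_x = np.clip(np.round(cx + (xs - cx) * scale).astype(int), 0, w - 1)
    return channel[src_y, src_x]
```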

  3. Wide dynamic range video camera

    NASA Technical Reports Server (NTRS)

    Craig, G. D. (inventor)

    1985-01-01

    A television camera apparatus is disclosed in which bright objects are attenuated to fit within the dynamic range of the system, while dim objects are not. The apparatus receives linearly polarized light from an object scene, the light being passed by a beam splitter and focused on the output plane of a liquid crystal light valve. The light valve is oriented such that, with no excitation from the cathode ray tube, all light is rotated 90 deg and focused on the input plane of the video sensor. The light is then converted to an electrical signal, which is amplified and used to excite the CRT. The resulting image is collected and focused by a lens onto the light valve which rotates the polarization vector of the light to an extent proportional to the light intensity from the CRT. The overall effect is to selectively attenuate the image pattern focused on the sensor.

  4. Traffic camera markup language (TCML)

    NASA Astrophysics Data System (ADS)

    Cai, Yang; Bunn, Andrew; Snyder, Kerry

    2012-01-01

    In this paper, we present a novel video markup language for articulating semantic traffic data from surveillance cameras and other sensors. The markup language includes three layers: sensor descriptions, traffic measurement, and application interface descriptions. The multi-resolution-based video codec algorithm enables quality-of-service-aware video streaming according to the data traffic. A set of object detection APIs are developed using Convex Hull and Adaptive Proportion models and 3D modeling. It is found that our approach outperforms 3D modeling and Scale-Invariant Feature Transform (SIFT) algorithms in terms of robustness. Furthermore, our empirical data show that it is feasible to use TCML to facilitate real-time communication between an infrastructure and a vehicle for safer and more efficient traffic control.

  5. HRSC: High resolution stereo camera

    USGS Publications Warehouse

    Neukum, G.; Jaumann, R.; Basilevsky, A.T.; Dumke, A.; Van Gasselt, S.; Giese, B.; Hauber, E.; Head, J. W., III; Heipke, C.; Hoekzema, N.; Hoffmann, H.; Greeley, R.; Gwinner, K.; Kirk, R.; Markiewicz, W.; McCord, T.B.; Michael, G.; Muller, Jan-Peter; Murray, J.B.; Oberst, J.; Pinet, P.; Pischel, R.; Roatsch, T.; Scholten, F.; Willner, K.

    2009-01-01

    The High Resolution Stereo Camera (HRSC) on Mars Express has delivered a wealth of image data, amounting to over 2.5 TB from the start of the mapping phase in January 2004 to September 2008. In that time, more than a third of Mars was covered at a resolution of 10-20 m/pixel in stereo and colour. After five years in orbit, HRSC is still in excellent shape, and it could continue to operate for many more years. HRSC has proven its ability to close the gap between the low-resolution Viking image data and the high-resolution Mars Orbiter Camera images, leading to a global picture of the geological evolution of Mars that is now much clearer than ever before. Derived highest-resolution terrain model data have closed major gaps and provided an unprecedented insight into the shape of the surface, which is paramount not only for surface analysis and geological interpretation, but also for combination with and analysis of data from other instruments, as well as in planning for future missions. This chapter presents the scientific output from data analysis and high-level data processing, complemented by a summary of how the experiment is conducted by the HRSC team members working in geoscience, atmospheric science, photogrammetry and spectrophotometry. Many of these contributions have been or will be published in peer-reviewed journals and special issues. They form a cross-section of the scientific output, either by summarising the new geoscientific picture of Mars provided by HRSC or by detailing some of the topics of data analysis concerning photogrammetry, cartography and spectral data analysis.

  6. Development of narrow gap welding technology for extremely thick steel

    NASA Astrophysics Data System (ADS)

    Imai, K.; Saito, T.; Okumura, M.

    In the field of extremely thick steel, various narrow gap welding methods were developed on the basis of former welding methods and are used in practice. It is important to develop and improve automatic narrow gap welding, J edge preparation by gas cutting, the prevention of welding defects, wires for narrow gap welding and so on in order to expand the scope of application of the method. Narrow gap welding technologies are described, based on new concepts developed by Nippon Steel Corporation.

  7. The Critical Angle Can Override the Brewster Angle

    ERIC Educational Resources Information Center

    Froehle, Peter H.

    2009-01-01

    As a culminating activity in their study of optics, my students investigate polarized light and the Brewster angle. In this exercise they encounter a situation in which it is impossible to measure the Brewster angle for light reflecting from a particular surface. This paper describes the activity and explains the students' observations.

  8. Theodolite with CCD Camera for Safe Measurement of Laser-Beam Pointing

    NASA Technical Reports Server (NTRS)

    Crooke, Julie A.

    2003-01-01

    The simple addition of a charge-coupled-device (CCD) camera to a theodolite makes it safe to measure the pointing direction of a laser beam. The present state of the art requires this to be a custom addition because theodolites are manufactured without CCD cameras as standard or even optional equipment. A theodolite is an alignment telescope equipped with mechanisms to measure azimuth and elevation angles to the sub-arcsecond level. When measuring the angular pointing direction of a Class II laser with a theodolite, one could place a calculated amount of neutral-density (ND) filtering in front of the theodolite's telescope. One could then safely view and measure the laser's boresight through the theodolite's telescope without great risk to one's eyes. This method, usable for a Class II visible-wavelength laser, should not even be attempted for a Class IV laser, and it is not applicable for an infrared (IR) laser. If one chooses insufficient attenuation or forgets to use the filters, then looking at the laser beam through the theodolite could cause instant blindness. The CCD camera is already commercially available: a small, inexpensive, black-and-white CCD circuit-board-level camera. An interface adaptor was designed and fabricated to mount the camera onto the eyepiece of the specific theodolite's viewing telescope. Other equipment needed for operation of the camera includes power supplies, cables, and a black-and-white television monitor. The picture displayed on the monitor is equivalent to what one would see when looking directly through the theodolite. An additional advantage afforded by a cheap black-and-white CCD camera is that it is sensitive to infrared as well as visible light. Hence, one can use the camera coupled to a theodolite to measure the pointing of an infrared as well as a visible laser.

  9. Secure Digital Camera (11/18/2006)

    E-print Network

    Mohanty, Saraju P.

    Secure Digital Camera. Saraju P. Mohanty, VLSI Design: smohanty@cse.unt.edu. Outline of the Talk: Digital Rights Management (DRM); Secure Digital Camera (SDC), one of our Digital Integrated Circuit solutions; Invisible

  10. Secure Digital Camera (1/15/2007)

    E-print Network

    Mohanty, Saraju P.

    Secure Digital Camera. Saraju P. Mohanty, VLSI Design: smohanty@cse.unt.edu. Outline of the Talk: Digital Rights Management (DRM); Secure Digital Camera (SDC), one of our Digital Integrated Circuit solutions; Invisible

  11. Camera self-calibration from translation by referring to a known camera.

    PubMed

    Zhao, Bin; Hu, Zhaozheng

    2015-09-01

    This paper presents a novel linear method for camera self-calibration by referring to a known (or calibrated) camera. The method requires at least three images: two generated by the uncalibrated camera under pure translation, and one generated by the known reference camera. We first propose a method to compute the infinite homography from scene depths. Based on this, we use the two images generated by translating the uncalibrated camera to recover scene depths, which are further used to linearly compute the infinite homography between an arbitrary uncalibrated image and the image from the known camera. With the known camera as reference, the computed infinite homography is readily decomposed for camera calibration. The proposed self-calibration method has been tested with simulation and real image data. Experimental results demonstrate that the method is practical and accurate. The pure translation required by the method is much more maneuverable than some of the strict motions required in the literature, such as pure rotation. The proposed self-calibration method has good potential for solving online camera calibration problems, which has important applications, especially for multicamera and zooming camera systems. PMID:26368906
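
    The decomposition step can be sketched in a few lines. Assuming the infinite homography H maps the uncalibrated image to the reference image as H = K_ref R inv(K_u) (a standard formulation, not necessarily the paper's exact notation), the unknown intrinsics K_u follow from transferring the image of the absolute conic:

    ```python
    import numpy as np

    def calibrate_from_infinite_homography(H, K_ref):
        """Recover the unknown intrinsics K_u from the infinite homography H
        (mapping the uncalibrated image to the reference image as
        H = K_ref @ R @ inv(K_u)) and the known reference intrinsics K_ref."""
        # The image of the absolute conic transfers as omega_u = H^T omega_ref H,
        # where omega = inv(K K^T).
        omega_ref = np.linalg.inv(K_ref @ K_ref.T)
        omega_u = H.T @ omega_ref @ H
        B = np.linalg.inv(omega_u)        # dual conic = K_u K_u^T (up to scale)
        B /= B[2, 2]                      # fix scale: (K K^T)[2,2] = 1
        # Extract upper-triangular K_u from B = K K^T via a "flipped" Cholesky.
        P = np.fliplr(np.eye(3))
        L = np.linalg.cholesky(P @ B @ P) # lower-triangular factor
        K = P @ L @ P                     # upper triangular
        return K / K[2, 2]
    ```

    The flipped Cholesky trick recovers the upper-triangular calibration matrix from the dual image of the absolute conic, since an ordinary Cholesky factorization would yield a lower-triangular factor.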

  12. Small angle neutron scattering

    NASA Astrophysics Data System (ADS)

    Cousin, Fabrice

    2015-10-01

    Small Angle Neutron Scattering (SANS) is a technique for probing the 3-D structure of materials on a typical size range from ~1 nm up to ~a few 100 nm, the information obtained being statistically averaged over a sample whose volume is ~1 cm3. This very rich technique enables a full structural characterization of a given object of nanometric dimensions (radius of gyration, shape, volume or mass, fractal dimension, specific area…) through determination of the form factor, as well as a description of how objects are organized within a continuous medium, and therefore of the interactions between them, through determination of the structure factor. The specific properties of neutrons (the possibility of tuning the scattering intensity by isotopic substitution, sensitivity to magnetism, negligible absorption, low energy of the incident neutrons) make the technique particularly interesting in the fields of soft matter, biophysics, magnetic materials and metallurgy. In particular, contrast variation methods allow the extraction of information that cannot be obtained by any other experimental technique. This course is divided in two parts. The first is devoted to the principles of SANS: basics (formalism, coherent versus incoherent scattering, the notion of an elementary scatterer), form factor analysis (I(q→0), Guinier regime, intermediate regime, Porod regime, polydisperse systems), structure factor analysis (second virial coefficient, integral equations, characterization of aggregates), and contrast variation methods (how to create contrast in a homogeneous system, matching in ternary systems, extrapolation to zero concentration, Zero Averaged Contrast). It is illustrated with representative examples.
    The second part describes the experimental aspects of SANS to guide users in future experiments: description of a SANS spectrometer, resolution of the spectrometer, optimization of spectrometer configurations, optimization of sample characteristics prior to measurement (thickness, volume, hydrogen content…), standard measurements to be made, and the principles of data reduction.
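
    As a concrete illustration of the form-factor analysis mentioned above, the Guinier regime gives I(q) ≈ I(0) exp(−q²Rg²/3) for q·Rg ≲ 1.3, so a straight-line fit of ln I against q² yields the radius of gyration. A minimal sketch (function name is illustrative):

    ```python
    import numpy as np

    def guinier_fit(q, intensity):
        """Estimate the radius of gyration Rg and forward scattering I(0)
        from the Guinier regime, where ln I(q) = ln I(0) - (Rg**2 / 3) * q**2,
        by a linear fit of ln I against q**2 (valid only for q*Rg < ~1.3)."""
        slope, intercept = np.polyfit(q**2, np.log(intensity), 1)
        return np.sqrt(-3.0 * slope), np.exp(intercept)
    ```

    In practice one restricts the fit window to the low-q points satisfying the Guinier condition before extracting Rg.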

  13. 33 CFR 162.240 - Tongass Narrows, Alaska; navigation.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Register citations affecting § 162.240, see the List of CFR Sections Affected, which appears in the Finding... Narrows, Alaska; navigation. (a) Definitions. The term “Tongass Narrows” includes the body of water lying... registered length or less, shall exceed a speed of 7 knots in the region of Tongass Narrows bounded to...

  14. 33 CFR 83.09 - Narrow channels (Rule 9).

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 33 Navigation and Navigable Waters 1 2010-07-01 2010-07-01 false Narrow channels (Rule 9). 83.09... Narrow channels (Rule 9). (a) Keeping near to outer limit of channel or fairway which lies on vessel's starboard side; exception. (1) A vessel proceeding along the course of a narrow channel or fairway...

  15. 2. Photocopied July 1971 from photostat Jordan Narrows Folder #1, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. Photocopied July 1971 from photostat Jordan Narrows Folder #1, Engineering Department, Utah Power and Light Co., Salt Lake City, Utah. JORDAN NARROWS STATION. PLAN AND SECTION. - Salt Lake City Water & Electrical Power Company, Jordan Narrows Hydroelectric Plant, Jordan River, Riverton, Salt Lake County, UT

  16. Narrow linewidth operation of the RILIS titanium: Sapphire laser at ISOLDE/CERN

    NASA Astrophysics Data System (ADS)

    Rothe, S.; Fedosseev, V. N.; Kron, T.; Marsh, B. A.; Rossel, R. E.; Wendt, K. D. A.

    2013-12-01

    A narrow linewidth operating mode for the Ti:sapphire laser of the CERN ISOLDE Resonance Ionization Laser Ion Source (RILIS) has been developed. This satisfies the laser requirements for the programme of in-source resonance ionization spectroscopy measurements and improves the selectivity for isomer separation using RILIS. A linewidth reduction from typically 10 GHz down to 1 GHz was achieved by the intra-cavity insertion of a second (thick) Fabry-Pérot etalon. Reliable operation during a laser scan was achieved through motorized control of the tilt angle of each etalon. A scanning, stabilization and mode cleaning procedure was developed and implemented in LabVIEW. The narrow linewidth operation was confirmed in a high resolution spectroscopy study of francium isotopes by the Collinear Resonance Ionization Spectroscopy experiment. The resulting laser scans demonstrate the suitability of the laser, in terms of linewidth, spectral purity and stability for high resolution in-source spectroscopy and isomer selective ionization using the RILIS.

  17. Three-dimensional simulation method of fish-eye lens distortion for a vehicle backup rear-view camera.

    PubMed

    Kim, Daehee; Paik, Joonki

    2015-07-01

    Recently, various cameras have been embedded in vehicles for driver safety and convenience. In this context, backup rear-view cameras have attracted increasing attention for the parking convenience they give drivers. Preinstallation of a rear-view camera requires the calibration of a wide-angle lens, such as a fish-eye lens, and the registration of guidelines to the three-dimensional (3D) scene. The proposed method provides a novel simulation of the optical distortion of a wide-angle lens in a vehicle rear-view camera. It consists of three steps: (i) generation of the 3D virtual space, (ii) field-number-based viewing angle estimation, and (iii) distorted image generation in the 3D space. The major contribution of this work is the lens specification-based simulation of 3D distortion for accurate and efficient preinstallation of vehicle rear-view cameras. The proposed simulation method can also be used to design general optical imaging systems for intelligent surveillance and medical imaging. PMID:26367163
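
    A minimal sketch of the kind of wide-angle distortion being simulated: the equidistant ("f-theta") model, a common idealization of fish-eye lenses, maps a ray at angle θ from the optical axis to image radius r = f·θ instead of the pinhole r = f·tan θ. This model is an assumption for illustration, not necessarily the projection used in the paper:

    ```python
    import numpy as np

    def project_equidistant(point_cam, f, cx, cy):
        """Project a 3D point (camera coordinates, z forward) with the
        equidistant fish-eye model: image radius r = f * theta, where
        theta is the angle between the ray and the optical axis."""
        x, y, z = point_cam
        theta = np.arctan2(np.hypot(x, y), z)   # off-axis angle of the ray
        phi = np.arctan2(y, x)                  # azimuth around the axis
        r = f * theta
        return cx + r * np.cos(phi), cy + r * np.sin(phi)
    ```

    Unlike the pinhole model, r grows only linearly with θ, which is what lets a fish-eye lens cover fields of view approaching (or exceeding) 180 degrees within a finite image.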

  18. Upgrading a CCD camera for astronomical use 

    E-print Network

    Lamecker, James Frank

    1993-01-01

    Existing charge-coupled device (CCD) video cameras have been modified to be used for astronomical imaging on telescopes in order to improve imaging times over those of photography. An astronomical CCD camera at the Texas A&M Observatory would...

  19. Swallowable Camera To Help Diagnose Esophagus Disorders

    E-print Network

    Chiao, Jung-Chih

    knbc.com — Swallowable Camera To Help Diagnose Esophagus Disorders. POSTED: 10:57 am PST December 21 ... to diagnose esophagus disorders like acid reflux. A patient swallows the disposable miniature camera ... URL: http://www.knbc.com/print/14907252/detail

  20. Solid State Replacement of Rotating Mirror Cameras

    SciTech Connect

    Frank, A M; Bartolick, J M

    2006-08-25

    Rotating mirror cameras have been the mainstay of mega-frame-per-second imaging for decades. There is still no electronic camera that can match a film-based rotary mirror camera for the combination of frame count, speed, resolution and dynamic range. The rotary mirror cameras are predominantly used in the range of 0.1 to 100 microseconds per frame, for 25 to more than a hundred frames. Electron-tube gated cameras dominate the sub-microsecond regime but are frame-count limited. Video cameras are pushing into the microsecond regime but are resolution limited by the high data rates. An all-solid-state architecture, dubbed ''In-situ Storage Image Sensor'' or ''ISIS'' by Prof. Goji Etoh, has made its first appearance on the market, and its evaluation is discussed. Recent work at Lawrence Livermore National Laboratory has concentrated both on evaluation of the presently available technologies and on exploring the capabilities of the ISIS architecture. It is clear that, although there is presently no single-chip camera that can simultaneously match the rotary mirror cameras, the ISIS architecture has the potential to approach their performance.

  1. Creating and Using a Camera Obscura

    ERIC Educational Resources Information Center

    Quinnell, Justin

    2012-01-01

    The camera obscura (Latin for "darkened room") is the earliest optical device and goes back over 2500 years. The small pinhole or lens at the front of the room allows light to enter and this is then "projected" onto a screen inside the room. This differs from a camera, which projects its image onto light-sensitive material. Originally images were…

  2. Making a room-sized camera obscura

    NASA Astrophysics Data System (ADS)

    Flynt, Halima; Ruiz, Michael J.

    2015-01-01

    We describe how to convert a room into a camera obscura as a project for introductory geometrical optics. The view for our camera obscura is a busy street scene set against a beautiful mountain skyline. We include a short video with project instructions, ray diagrams and delightful moving images of cars driving on the road outside.

  3. Digital Cameras in the K-12 Classroom.

    ERIC Educational Resources Information Center

    Clark, Kenneth; Hosticka, Alice; Bedell, Jacqueline

    This paper discusses the use of digital cameras in K-12 education. Examples are provided of the integration of the digital camera and visual images into: reading and writing; science, social studies, and mathematics; projects; scientific experiments; desktop publishing; visual arts; data analysis; computer literacy; classroom atmosphere; and…

  4. Solid state replacement of rotating mirror cameras

    NASA Astrophysics Data System (ADS)

    Frank, Alan M.; Bartolick, Joseph M.

    2007-01-01

    Rotating mirror cameras have been the mainstay of mega-frame-per-second imaging for decades. There is still no electronic camera that can match a film-based rotary mirror camera for the combination of frame count, speed, resolution and dynamic range. The rotary mirror cameras are predominantly used in the range of 0.1 to 100 microseconds per frame, for 25 to more than a hundred frames. Electron-tube gated cameras dominate the sub-microsecond regime but are frame-count limited. Video cameras are pushing into the microsecond regime but are resolution limited by the high data rates. An all-solid-state architecture, dubbed 'In-situ Storage Image Sensor' or 'ISIS' by Prof. Goji Etoh, has made its first appearance on the market, and its evaluation is discussed. Recent work at Lawrence Livermore National Laboratory has concentrated both on evaluation of the presently available technologies and on exploring the capabilities of the ISIS architecture. It is clear that, although there is presently no single-chip camera that can simultaneously match the rotary mirror cameras, the ISIS architecture has the potential to approach their performance.

  5. COBRA ATD multispectral camera response model

    NASA Astrophysics Data System (ADS)

    Holmes, V. Todd; Kenton, Arthur C.; Hilton, Russell J.; Witherspoon, Ned H.; Holloway, John H., Jr.

    2000-08-01

    A new multispectral camera response model has been developed in support of the US Marine Corps (USMC) Coastal Battlefield Reconnaissance and Analysis (COBRA) Advanced Technology Demonstration (ATD) Program. This analytical model accurately estimates the response of five Xybion intensified IMC 201 multispectral cameras used for COBRA ATD airborne minefield detection. The camera model design is based on a series of camera response curves generated through optical laboratory tests performed by the Naval Surface Warfare Center, Dahlgren Division, Coastal Systems Station (CSS). Data-fitting techniques were applied to these measured response curves to obtain nonlinear expressions that estimate digitized camera output as a function of irradiance, intensifier gain, and exposure. This COBRA camera response model proved to be accurate, stable over a wide range of parameters, analytically invertible, and relatively simple. The model was subsequently incorporated into the COBRA sensor performance evaluation and research analysis modeling toolbox in order to enhance COBRA modeling and simulation capabilities. Details of the camera model design and comparisons of modeled response to measured experimental data are presented.
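
    The data-fitting step described above can be illustrated with a toy power-law response, DN = dark + a * E**g, fitted by linear regression in log space. The functional form and names here are hypothetical stand-ins for the COBRA response expressions, which the abstract does not give:

    ```python
    import numpy as np

    def fit_power_response(exposure, dn, dark_level):
        """Fit a hypothetical power-law camera response DN = dark + a * E**g
        to measured (exposure E, output DN) pairs by straight-line regression
        in log space; returns (a, g). Illustrative only, not the COBRA model."""
        g, log_a = np.polyfit(np.log(exposure), np.log(dn - dark_level), 1)
        return np.exp(log_a), g
    ```

    A log-space fit of this kind is also analytically invertible, one of the properties the abstract highlights: E = ((DN - dark) / a)**(1/g).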

  6. Thermal Cameras in School Laboratory Activities

    ERIC Educational Resources Information Center

    Haglund, Jesper; Jeppsson, Fredrik; Hedberg, David; Schönborn, Konrad J.

    2015-01-01

    Thermal cameras offer real-time visual access to otherwise invisible thermal phenomena, which are conceptually demanding for learners during traditional teaching. We present three studies of students' conduction of laboratory activities that employ thermal cameras to teach challenging thermal concepts in grades 4, 7 and 10-12. Visualization of…

  7. Cameras Monitor Spacecraft Integrity to Prevent Failures

    NASA Technical Reports Server (NTRS)

    2014-01-01

    The Jet Propulsion Laboratory contracted Malin Space Science Systems Inc. to outfit Curiosity with four of its cameras using the latest commercial imaging technology. The company parlayed the knowledge gained from working with NASA into an off-the-shelf line of cameras, along with a digital video recorder, designed to help troubleshoot problems that may arise on satellites in space.

  8. Single chip camera active pixel sensor

    NASA Technical Reports Server (NTRS)

    Shaw, Timothy (Inventor); Pain, Bedabrata (Inventor); Olson, Brita (Inventor); Nixon, Robert H. (Inventor); Fossum, Eric R. (Inventor); Panicacci, Roger A. (Inventor); Mansoorian, Barmak (Inventor)

    2003-01-01

    A totally digital single-chip camera includes communications circuitry to operate most of its structure in serial communication mode. The digital single-chip camera includes a D/A converter for converting an input digital word into an analog reference signal. The chip includes all of the necessary circuitry for operating the chip using a single pin.

  9. Distributed Localization of Networked Cameras Stanislav Funiak

    E-print Network

    Guestrin, Carlos

    of a camera and, a few moments later, the same object is observed by another camera; if we knew the trajectory an independent localization system like GPS, the trajectory is unknown. However, if we knew the poses

  10. INTEGRATED NAVIGATION OF CAMERAS FOR AUGMENTED REALITY

    E-print Network

    Gustafsson, Fredrik

    INTEGRATED NAVIGATION OF CAMERAS FOR AUGMENTED REALITY Thomas B. Schön and Fredrik Gustafsson)@isy.liu.se Abstract: In augmented reality, the position and orientation of the camera must be estimated very filter, Inertial navigation, Augmented reality, Computer vision, Feature extraction 1. INTRODUCTION

  11. Depth Estimation Using a Sliding Camera.

    PubMed

    Ge, Kailin; Hu, Han; Feng, Jianjiang; Zhou, Jie

    2016-02-01

    Image-based 3D reconstruction technology is widely used in different fields. Conventional algorithms are mainly based on stereo matching between two or more fixed cameras, and high accuracy can only be achieved with a large camera array, which is expensive and inconvenient in many applications. Another popular choice is structure-from-motion with arbitrarily placed camera(s); however, because of its many degrees of freedom, its computational cost is heavy and its accuracy is limited. In this paper, we propose a novel depth estimation algorithm using a sliding camera system. By analyzing the geometric properties of the camera system, we design a camera pose initialization algorithm that works reliably with only a small number of feature points and is robust to noise. For pixels corresponding to different depths, an adaptive iterative algorithm is proposed to choose optimal frames for stereo matching, which takes advantage of continuously pose-changing imaging and greatly reduces computation time as well. The proposed algorithm can also be easily extended to handle less constrained situations (such as a camera mounted on a moving robot or vehicle). Experimental results on both synthetic and real-world data have illustrated the effectiveness of the proposed algorithm. PMID:26685238
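
    For intuition on why a translating (sliding) camera constrains depth, the rectified two-view case reduces to the classic relation Z = f·B/d, with focal length f in pixels, baseline B in meters, and disparity d in pixels. A minimal sketch of that relation, not the paper's adaptive multi-frame algorithm:

    ```python
    def depth_from_disparity(f_px, baseline_m, disparity_px):
        """Depth of a scene point from two positions of a camera translated
        along a rail: for a rectified image pair, Z = f * B / d, with focal
        length f in pixels, baseline B in meters, disparity d in pixels."""
        return f_px * baseline_m / disparity_px
    ```

    A sliding camera provides many baselines from one track, so near points can be matched across short baselines and far points across long ones, which is the frame-selection freedom the adaptive algorithm exploits.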

  12. Wrist Camera Orientation for Effective Telerobotic Orbital Replaceable Unit (ORU) Changeout

    NASA Technical Reports Server (NTRS)

    Jones, Sharon Monica; Aldridge, Hal A.; Vazquez, Sixto L.

    1997-01-01

    The Hydraulic Manipulator Testbed (HMTB) is the kinematic replica of the Flight Telerobotic Servicer (FTS). One use of the HMTB is to evaluate advanced control techniques for accomplishing robotic maintenance tasks on board the Space Station. Most maintenance tasks involve direct manipulation of the robot by a human operator, for whom high-quality visual feedback is important for precise control. An experiment was conducted in the Systems Integration Branch at the Langley Research Center to compare several configurations of the manipulator wrist camera for providing visual feedback during an Orbital Replaceable Unit changeout task. Several variables were considered, such as wrist camera angle, camera focal length, target location, and lighting. Each study participant performed the maintenance task using eight combinations of the variables based on a Latin square design. The results of this experiment and conclusions based on the collected data are presented.

  13. 1024 x 768 XGA uncooled camera core achieves new levels of performance in a small package

    NASA Astrophysics Data System (ADS)

    Alicandro, C. J.; DeMarco, R. W.

    2011-06-01

    An uncooled XGA camera core has been developed for multiple thermal imaging applications that require longer detection range and wider fields of view. The design challenge is to maintain high performance while optimizing for size, weight, and power (SWAP). Utilizing a combination of low-power electronic designs, proprietary calibration methods, and a new 17 µm pitch high-performance amorphous silicon (a-Si) microbolometer, a rugged multi-purpose SWAP-optimized XGA camera core has been designed. The result is a camera core that has been shown to deliver far better detection range and angle-of-view performance than previous uncooled solutions, with frame rates of 30 Hz in XGA mode and 60 Hz in VGA mode.

  14. Fast soft x-ray camera observation of fast and slow reconnection events on NSTX

    NASA Astrophysics Data System (ADS)

    Stratton, Brentley

    2005-10-01

    Reconnection events on the National Spherical Torus Experiment (NSTX) are studied using data from a new soft x-ray camera diagnostic. The camera has a wide-angle tangential view of the plasma and can capture 300 images per discharge at rates up to 500,000 frames per second. Two classes of m=n=1 reconnection events are seen: events such as sawteeth and internal reconnection events (IREs) characterized by rapid (~200 µs) reconnection, and events in which reconnection occurs on a much slower time scale (tens of ms) with a similar spatial structure. The mode evolution is reconstructed from the fast soft x-ray camera data. Nonlinear resistive MHD modeling with the M3D code and PEST code stability analysis is used to predict the growth rates and island structures of the fast and slow events, with the goal of understanding the conditions that lead to the two types of events.

  15. Be Foil "Filter Knee Imaging" NSTX Plasma with Fast Soft X-ray Camera

    SciTech Connect

    B.C. Stratton; S. von Goeler; D. Stutman; K. Tritz; L.E. Zakharov

    2005-08-08

    A fast soft x-ray (SXR) pinhole camera has been implemented on the National Spherical Torus Experiment (NSTX). This paper presents observations and describes the Be foil Filter Knee Imaging (FKI) technique for reconstructions of an m/n=1/1 mode on NSTX. The SXR camera has a wide-angle (28°) field of view of the plasma. The camera images nearly the entire diameter of the plasma and a comparable region in the vertical direction. SXR photons pass through a beryllium foil and are imaged by a pinhole onto a P47 scintillator deposited on a fiber-optic faceplate. An electrostatic image intensifier demagnifies the visible image by 6:1 to match it to the size of the charge-coupled device (CCD) chip. A pair of lenses couples the image to the CCD chip.

  16. Narrow field electromagnetic sensor system and method

    DOEpatents

    McEwan, T.E.

    1996-11-19

    A narrow field electromagnetic sensor system and method of sensing a characteristic of an object provide the capability to realize a characteristic of an object such as density, thickness, or presence, for any desired coordinate position on the object. One application is imaging. The sensor can also be used as an obstruction detector or an electronic trip wire with a narrow field without the disadvantages of impaired performance when exposed to dirt, snow, rain, or sunlight. The sensor employs a transmitter for transmitting a sequence of electromagnetic signals in response to a transmit timing signal, a receiver for sampling only the initial direct RF path of the electromagnetic signal while excluding all other electromagnetic signals in response to a receive timing signal, and a signal processor for processing the sampled direct RF path electromagnetic signal and providing an indication of the characteristic of an object. Usually, the electromagnetic signal is a short RF burst and the obstruction must provide a substantially complete eclipse of the direct RF path. By employing time-of-flight techniques, a timing circuit controls the receiver to sample only the initial direct RF path of the electromagnetic signal while not sampling indirect path electromagnetic signals. The sensor system also incorporates circuitry for ultra-wideband spread spectrum operation that reduces interference to and from other RF services while allowing co-location of multiple electronic sensors without the need for frequency assignments. 12 figs.
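
    The time-of-flight gating described in the patent abstract can be made concrete: for a transmitter and receiver separated by distance d, the direct RF path arrives after t = d/c, so the receiver samples only at that delay and rejects later indirect (multipath) arrivals. A minimal sketch of the delay calculation (function name is illustrative):

    ```python
    def direct_path_gate_delay_ns(separation_m):
        """Receive-gate delay (in nanoseconds) for sampling only the direct
        RF path between a transmitter and receiver separated by separation_m:
        one-way time of flight t = d / c. Later multipath arrivals fall
        outside this sample gate and are excluded."""
        c = 299_792_458.0  # speed of light in vacuum, m/s
        return separation_m / c * 1e9
    ```

    At typical sensor geometries the gate is only nanoseconds wide, which is why an obstruction must substantially eclipse the direct path to register.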

  17. Narrow field electromagnetic sensor system and method

    DOEpatents

    McEwan, Thomas E. (Livermore, CA)

    1996-01-01

    A narrow field electromagnetic sensor system and method of sensing a characteristic of an object provide the capability to realize a characteristic of an object such as density, thickness, or presence, for any desired coordinate position on the object. One application is imaging. The sensor can also be used as an obstruction detector or an electronic trip wire with a narrow field without the disadvantages of impaired performance when exposed to dirt, snow, rain, or sunlight. The sensor employs a transmitter for transmitting a sequence of electromagnetic signals in response to a transmit timing signal, a receiver for sampling only the initial direct RF path of the electromagnetic signal while excluding all other electromagnetic signals in response to a receive timing signal, and a signal processor for processing the sampled direct RF path electromagnetic signal and providing an indication of the characteristic of an object. Usually, the electromagnetic signal is a short RF burst and the obstruction must provide a substantially complete eclipse of the direct RF path. By employing time-of-flight techniques, a timing circuit controls the receiver to sample only the initial direct RF path of the electromagnetic signal while not sampling indirect path electromagnetic signals. The sensor system also incorporates circuitry for ultra-wideband spread spectrum operation that reduces interference to and from other RF services while allowing co-location of multiple electronic sensors without the need for frequency assignments.

  18. Studies of narrow autoionizing resonances in gadolinium

    SciTech Connect

    Bushaw, Bruce A.; Nortershauser, W.; Blaum, K.; Wendt, Klaus

    2003-06-30

    The autoionization (AI) spectrum of gadolinium between the first and second limits has been investigated by triple-resonance excitation with high-resolution cw lasers. A large number of narrow AI resonances have been observed and assigned total angular momentum J values. The resonances are further divided into members of AI Rydberg series converging to the second limit or other ''interloping'' levels. Fine structure in the Rydberg series has been identified and interpreted in terms of Jc j coupling. A number of detailed studies have been performed on the interloping resonances; these include lifetime determination by lineshape analysis, isotope shifts, hyperfine structure, and photoionization saturation parameters. The electronic structure of the interloping levels is discussed in terms of these studies. Linewidths generally decrease with increasing total angular momentum, and the J = 7 resonances are extremely narrow, with Lorentzian widths ranging from < 1 MHz up to 157 MHz. The strongest resonances are found to have cross-sections of ~10⁻¹² cm² and photoionization can be saturated with powers available from cw diode lasers.

  19. Effects of camera location on the reconstruction of 3D flare trajectory with two cameras

    NASA Astrophysics Data System (ADS)

    Özsaraç, Seçkin; Yeşilkaya, Muhammed

    2015-05-01

    Flares are used as valuable electronic warfare assets in the battle against infrared-guided missiles. The trajectory of the flare is one of the most important factors determining the effectiveness of the countermeasure. Reconstruction of the three-dimensional (3D) position of a point seen by multiple cameras is a common problem. Camera placement, camera calibration, determination of corresponding pixels between the images of different cameras, and the triangulation algorithm all affect the performance of 3D position estimation. In this paper, we specifically investigate the effects of camera placement on flare trajectory estimation performance by simulation. First, the 3D trajectories of a flare and of the aircraft that dispenses it are generated with simple motion models. Then, we place two virtual ideal pinhole camera models at different locations. Assuming the cameras track the aircraft perfectly, the view vectors of the cameras are computed. Afterwards, using the view vector of each camera and the 3D position of the flare, the image-plane coordinates of the flare on both cameras are computed using the field of view (FOV) values. To increase the fidelity of the simulation, we have used two sources of error: one models the uncertainties in the determination of the camera view vectors, i.e., the orientations of the cameras are measured noisily; the second models the imperfections of corresponding-pixel determination of the flare between the two cameras. Finally, the 3D position of the flare is estimated from the corresponding pixel indices, view vectors, and FOVs of the cameras by triangulation. All the processes mentioned so far are repeated for different relative camera placements so that the optimum estimation error performance is found for the given aircraft and flare trajectories.
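
    The triangulation step can be sketched with the ray-midpoint method: given each camera's center and its (noisy) unit view vector toward the flare, solve for the points of closest approach of the two rays and take their midpoint. This is one standard choice of triangulation; the abstract does not specify the exact algorithm used:

    ```python
    import numpy as np

    def triangulate_midpoint(c1, d1, c2, d2):
        """Estimate a 3D point from two camera centers c1, c2 and view
        vectors d1, d2: find the closest-approach points of the two rays
        c1 + t1*d1 and c2 + t2*d2, and return their midpoint."""
        d1 = d1 / np.linalg.norm(d1)
        d2 = d2 / np.linalg.norm(d2)
        # Least squares for the ray parameters: c1 + t1*d1 ≈ c2 + t2*d2
        A = np.stack([d1, -d2], axis=1)
        (t1, t2), *_ = np.linalg.lstsq(A, c2 - c1, rcond=None)
        return 0.5 * ((c1 + t1 * d1) + (c2 + t2 * d2))
    ```

    With noisy view vectors the rays are skew rather than intersecting, and the midpoint of their common perpendicular is the natural estimate; camera placements that keep the rays far from parallel make this estimate far less sensitive to orientation noise, which is exactly the placement effect the paper studies.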

  20. ARNICA, the Arcetri Near-Infrared Camera

    NASA Astrophysics Data System (ADS)

    Lisi, F.; Baffa, C.; Bilotti, V.; Bonaccini, D.; del Vecchio, C.; Gennari, S.; Hunt, L. K.; Marcucci, G.; Stanga, R.

    1996-04-01

    ARNICA (ARcetri Near-Infrared CAmera) is the imaging camera for the near-infrared bands between 1.0 and 2.5 microns that the Arcetri Observatory has designed and built for the Infrared Telescope TIRGO located at Gornergrat, Switzerland. We describe the mechanical and optical design of the camera, and report on the astronomical performance of ARNICA as measured during the commissioning runs at the TIRGO (December, 1992 to December 1993), and an observing run at the William Herschel Telescope, Canary Islands (December, 1993). System performance is defined in terms of efficiency of the camera+telescope system and camera sensitivity for extended and point-like sources. (SECTION: Astronomical Instrumentation)

  1. Development of high-speed video cameras

    NASA Astrophysics Data System (ADS)

    Etoh, Takeharu G.; Takehara, Kohsei; Okinaka, Tomoo; Takano, Yasuhide; Ruckelshausen, Arno; Poggemann, Dirk

    2001-04-01

    Presented in this paper is an outline of the R&D activities on high-speed video cameras that have been carried out at Kinki University for more than ten years and are currently proceeding as an international cooperative project with the University of Applied Sciences Osnabrück and other organizations. Extensive market research has been done (1) on users' requirements for high-speed multi-framing and video cameras, by questionnaires and hearings, and (2) on the current availability of cameras of this sort, by searches of journals and websites. Both support the necessity of developing a high-speed video camera of more than 1 million fps. A video camera of 4,500 fps with parallel readout was developed in 1991. A video camera with triple sensors was developed in 1996; the sensor is the same one developed for the previous camera. The frame rate is 50 million fps for triple-framing and 4,500 fps for triple-light-wave framing, including color image capturing. The idea of a video camera of 1 million fps with an ISIS, In-situ Storage Image Sensor, was first proposed in 1993 and has been continuously improved. A test sensor was developed in early 2000 and successfully captured images at 62,500 fps. Currently, design of a prototype ISIS is under way, and, hopefully, it will be fabricated in the near future. Epoch-making cameras in the history of high-speed video camera development by others are also briefly reviewed.

  2. An oscillating motion of a red blood cell and a neutrally buoyant particle in Poiseuille flow in a narrow channel

    E-print Network

    Shi, Lingling; Pan, Tsorng-Whay; Glowinski, Roland

    2013-01-01

    Two motions, oscillation and vacillating breathing (swing), of a red blood cell have been observed in bounded Poiseuille flows (Phys. Rev. E 85, 16307 (2012)). To understand these motions, we have studied the oscillating motion of a neutrally buoyant rigid particle of the same shape in Poiseuille flow in a narrow channel and found that the crucial point is that the particle interacts with the Poiseuille flow while its mass center moves up and down within the channel's central region. Since the mass center of the cell migrates toward the channel's central region, the oscillating motion of its inclination angle is similar to the aforementioned motion as long as the cell keeps its elongated shape. But as the up-and-down oscillation of the cell's mass center damps out, the oscillating motion of the inclination angle also damps out and the inclination angle approaches a fixed value.

  3. Spinning angle optical calibration apparatus

    DOEpatents

    Beer, Stephen K. (Morgantown, WV); Pratt, II, Harold R. (Morgantown, WV)

    1991-01-01

    An optical calibration apparatus is provided for calibrating and reproducing spinning angles in cross-polarization, nuclear magnetic resonance spectroscopy. An illuminated magnifying apparatus enables optical setting and accurate reproduction of spinning "magic angles" in cross-polarization, nuclear magnetic resonance spectroscopy experiments. A reference mark scribed on an edge of a spinning-angle test sample holder is illuminated by a light source and viewed through a magnifying scope. When the "magic angle" of a sample material used as a standard is attained by varying the angular position of the sample holder, the coordinate position of the reference mark relative to a graduation or graduations on a reticle in the magnifying scope is noted. Thereafter, the spinning "magic angle" of a test material having similar nuclear properties to the standard is attained by returning the sample holder to the originally noted coordinate position.
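
The "magic angle" this apparatus reproduces is the standard NMR value arccos(1/√3) ≈ 54.74°, the angle at which the 3cos²θ − 1 dipolar term vanishes. A minimal numeric check (the value is standard NMR background, not stated in the patent text):

```python
import math

# Magic angle: the angle theta at which (3*cos^2(theta) - 1) vanishes,
# i.e. theta = arccos(1/sqrt(3))
magic_angle_deg = math.degrees(math.acos(1.0 / math.sqrt(3.0)))
print(round(magic_angle_deg, 4))  # 54.7356
```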

  4. Performance characterization of commercially available uncooled micro-bolometer thermal cameras for varying camera temperatures

    NASA Astrophysics Data System (ADS)

    Minnaar, I. J.

    2014-06-01

    Uncooled infrared (IR) microbolometer cameras are gaining popularity in a variety of military and commercial applications due to their simplicity, compactness and reduced cost when compared to photon detectors. Three commercially available IR microbolometer cameras have been investigated for use in a system. The cameras have been characterized in terms of camera response and noise as a function of camera temperature with the aim of modelling the cameras for use in simulation. Ideally, the camera systems, consisting of a detector, electronics, and optics, should be modelled from a low-level physical point of view and measurements should be performed for verification. However, the detector and electronic design parameters are not available for the commercially acquired cameras, and a black-box approach to the systems was adopted for modelling and characterization. The black-box approach entails empirical mathematical modelling of the camera response and noise through measurements and subsequent data analysis. A 3D noise model was employed to characterize camera noise in terms of orthogonal noise components, and an empirical temperature-dependent model was deduced for each component. The method of modelling through measurement is discussed, and the accuracy of the empirical noise models in particular is shown. The cameras are also compared in terms of measured noise performance.
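
The 3D noise model mentioned above decomposes a (frames × rows × columns) image cube into orthogonal noise components. A minimal sketch of the idea, reduced to just two components (temporal and spatial) on synthetic data; the paper's full decomposition and its temperature-dependent fits are not reproduced here:

```python
import numpy as np

def noise_components(cube):
    """Split a (frames, rows, cols) image cube into temporal and
    spatial noise estimates (a simplified 3D-noise decomposition)."""
    temporal = cube.std(axis=0, ddof=1).mean()   # mean per-pixel temporal sigma
    spatial = cube.mean(axis=0).std(ddof=1)      # sigma of the temporal-mean image
    return temporal, spatial

rng = np.random.default_rng(0)
fixed_pattern = rng.normal(0.0, 2.0, size=(64, 64))   # spatial non-uniformity, sigma = 2
cube = 100.0 + fixed_pattern + rng.normal(0.0, 1.0, size=(500, 64, 64))  # temporal sigma = 1
t, s = noise_components(cube)
print(round(t, 2), round(s, 2))  # recovers roughly (1.0, 2.0)
```

Averaging over frames suppresses the temporal term, so the two components can be estimated separately from one recorded cube.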

  5. The Hyperspectral Stereo Camera Project

    NASA Astrophysics Data System (ADS)

    Griffiths, A. D.; Coates, A. J.

    2006-12-01

    The MSSL Hyperspectral Stereo Camera (HSC) is developed from Beagle2 stereo camera heritage. Replacing the filter wheels with liquid crystal tuneable filters (LCTFs) turns each eye into a compact hyperspectral imager. Hyperspectral imaging is defined here as acquiring tens to hundreds of images in 10-20 nm spectral bands. Combined, these bands form an image `cube' (with wavelength as the third dimension) allowing a detailed spectrum to be extracted at any pixel position. An LCTF is conceptually similar to the Fabry-Perot tuneable filter design, but instead of physical separation, the variable refractive index of the liquid crystal etalons is used to define the wavelength of interest. For 10 nm bandwidths, LCTFs are available covering the 400-720 nm and 650-1100 nm ranges. The resulting benefits include reduced imager mechanical complexity, no limitation on the number of filter wavelengths available, and the ability to change the wavelengths of interest in response to new findings as the mission proceeds. LCTFs are currently commercially available from two US companies, Scientific Solutions Inc. and Cambridge Research Inc. (CRI); CRI distributes the `Varispec' LCTFs used in the HSC. Currently, hyperspectral imagers in Earth orbit can prospect for minerals, detect camouflaged military equipment and determine the species and state of health of crops. We therefore believe this instrument shows great promise for a wide range of investigations in the planetary science domain (below). MSSL will integrate and test the HSC development model at representative Martian temperatures (to determine the power required to prevent the liquid crystals freezing). Additionally, a full radiometric calibration is required to determine the HSC sensitivity.
    The second phase of the project is to demonstrate (in a ground-based lab) the benefit of much higher spectral resolution to the following Martian scientific investigations:
    - Determination of the mineralogy of rocks and soil
    - Detection of water vapour due to its absorption of sunlight at 935 nm
    - Measurement of the dust optical density as a function of wavelength
    - Determination of the solar spectrum at the surface as a function of time of day
    - The search for putative biological pigments (i.e. chlorophyll and carotenoids)
    The detector for this proposal is a commercial low-noise CCD imaging chip providing sufficient SNR to detect the tens of ppm of water vapour present in the Martian atmosphere.
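
The image-cube geometry described above (wavelength as the third dimension, a spectrum at every pixel) can be sketched with a synthetic cube; the band grid matches the 650-1100 nm Varispec range quoted in the abstract, but the cube contents and the dip position are illustrative, not HSC data:

```python
import numpy as np

# Hypothetical 10 nm bands spanning the 650-1100 nm Varispec range
wavelengths = np.arange(650, 1101, 10)
cube = np.ones((len(wavelengths), 32, 32))   # image "cube": (band, y, x)
cube[wavelengths == 930] *= 0.8              # synthetic absorption dip near 935 nm

spectrum = cube[:, 16, 16]                   # detailed spectrum at one pixel position
deepest_band = int(wavelengths[np.argmin(spectrum)])
print(deepest_band)  # 930
```

Scanning the LCTF through the bands fills the first axis one image at a time; slicing along that axis afterwards recovers a per-pixel spectrum.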

  6. Demonstration of three-dimensional imaging based on handheld Compton camera

    NASA Astrophysics Data System (ADS)

    Kishimoto, A.; Kataoka, J.; Nishiyama, T.; Taya, T.; Kabuki, S.

    2015-11-01

    Compton cameras are detectors capable of performing measurements across a wide energy range for medical imaging applications, such as nuclear medicine and ion-beam therapy. In previous work, we developed a handheld Compton camera to identify environmental radiation hotspots. This camera consists of a 3D position-sensitive scintillator array and multi-pixel photon counter arrays. In this work, we reconstructed the 3D image of a source via list-mode maximum likelihood expectation maximization and demonstrated the imaging performance of the handheld Compton camera. Based on both simulation and experiment, we confirmed that multi-angle data acquisition of the imaging region significantly improved the spatial resolution of the reconstructed image in the direction perpendicular to the detector. The experimental spatial resolutions in the X, Y, and Z directions at the center of the imaging region were 6.81 mm ± 0.13 mm, 6.52 mm ± 0.07 mm and 6.71 mm ± 0.11 mm (FWHM), respectively. The multi-angle data acquisition results show the potential of reconstructing 3D source images.
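
List-mode MLEM for a Compton camera depends on the detector geometry, but the core multiplicative update is the same as in binned MLEM. A toy sketch on a 2-voxel, 2-measurement system (the system matrix and source values are illustrative, not from the paper):

```python
import numpy as np

def mlem(A, counts, n_iter=1000):
    """Plain MLEM: x <- x * A^T(y / Ax) / A^T(1)."""
    x = np.ones(A.shape[1])          # flat initial estimate
    sens = A.sum(axis=0)             # sensitivity (A^T applied to ones)
    for _ in range(n_iter):
        proj = A @ x                 # forward projection
        x *= (A.T @ (counts / np.maximum(proj, 1e-12))) / sens
    return x

A = np.array([[0.8, 0.2],            # toy system matrix: detection
              [0.3, 0.7]])           # probabilities per source voxel
true_x = np.array([5.0, 1.0])
y = A @ true_x                       # noiseless measured counts
est = mlem(A, y)
print(np.round(est, 2))
```

With consistent, noiseless data the iteration converges to the true nonnegative source distribution; with real list-mode data each event contributes its own Compton-cone row of the system matrix.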

  7. Modeling of a slanted-hole collimator in a compact endo-cavity gamma camera.

    NASA Astrophysics Data System (ADS)

    Kamuda, Mark; Cui, Yonggang; Lall, Terry; Ionson, Jim; Camarda, Giuseppe S.; Hossain, Anwar; Yang, Ge; Roy, Utpal N.; James, Ralph B.

    2013-09-01

    Having the ability to take an accurate 3D image of a tumor greatly helps doctors diagnose it and then create a treatment plan for a patient. One way to accomplish molecular imaging is to inject a radioactive tracer into a patient and then measure the gamma rays emitted from regions with high uptake of the tracer, viz., the cancerous tissues. In large, expensive PET- or SPECT-imaging systems, 3D imaging is easily accomplished by rotating the gamma-ray detectors and then employing software to reconstruct the 3D images from the multiple 2D projections at different angles of view. However, this method is impractical in a very compact imaging system due to anatomical considerations, e.g., the transrectal gamma camera under development at Brookhaven National Laboratory (BNL) for detection of intra-prostatic tumors. The camera uses pixelated cadmium zinc telluride (CdZnTe or CZT) detectors with a matched parallel-hole collimator. Our research investigated the possibility of using a collimator with slanted holes to create 3D pictures of a radioactive source. The underlying concept is to take 2D projection images at different angles of view by adjusting the slant angle of the collimator, then to use the 2D projection images to reconstruct the 3D image. To do this, we first simulated the response of a pixelated CZT detector to radiation sources placed in the field of view of the camera. Then, we formulated an algorithm that uses the simulation results as prior knowledge to estimate the distribution of a shaped source from its 2D projection images. From the results of the simulation, we measured the spatial resolution of the camera as ~7 mm at a depth of 13.85 mm when using a detector with 2.46-mm pixel pitch and a collimator with a 60° slant angle.
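
The depth information in the slanted-hole scheme comes from simple parallax: through holes slanted at angle θ from the detector normal, a source at depth d projects onto the detector displaced laterally by d·tan θ, so views at different slant angles encode depth. A sketch using the depth and slant angle quoted in the abstract:

```python
import math

def projection_shift_mm(depth_mm, slant_deg):
    """Lateral shift of a source's 2D projection when imaged through
    a collimator whose holes are slanted slant_deg from the normal."""
    return depth_mm * math.tan(math.radians(slant_deg))

# Depth and slant angle from the abstract (13.85 mm, 60 degrees)
shift = projection_shift_mm(13.85, 60.0)
print(round(shift, 2))  # 23.99
```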

  8. Small-angle electron scattering of magnetic fine structures.

    PubMed

    Togawa, Yoshihiko

    2013-06-01

    Magnetic structures in magnetic artificial lattices and chiral magnetic orders in chiral magnets have been quantitatively analyzed in reciprocal space by means of the small-angle electron scattering (SAES) method. Lorentz deflection due to magnetic moments and Bragg diffraction due to periodicity are simultaneously recorded at angles of the order of, or less than, 1 × 10^-6 rad, using a camera length of more than 100 m. The present SAES method, together with TEM real-space imaging methods such as in-situ Lorentz microscopy, is very powerful for analyzing magnetic fine structures in magnetic materials. Indeed, the existence of both a chiral helimagnetic structure and a chiral magnetic soliton lattice in the chiral magnet CrNb3S6 has been successfully verified for the first time using these complementary methods. PMID:23674342
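
The quoted angular sensitivity follows directly from camera-length geometry: a scattering angle θ appears on the detector as a displacement r = L·θ, so a long camera length turns a tiny angle into a measurable shift. A one-line check (the 0.1 mm detector displacement is an illustrative value, not from the paper):

```python
camera_length_m = 100.0        # camera length of more than 100 m (abstract)
displacement_m = 0.1e-3        # 0.1 mm spot shift on the detector (illustrative)
angle_rad = displacement_m / camera_length_m   # small-angle approximation
print(f"{angle_rad:.1e}")  # 1.0e-06
```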

  9. Smart Camera Technology Increases Quality

    NASA Technical Reports Server (NTRS)

    2004-01-01

    When it comes to real-time image processing, everyone is an expert. People begin processing images at birth and rapidly learn to control their responses through the real-time processing of the human visual system. The human eye captures an enormous amount of information in the form of light images. In order to keep the brain from becoming overloaded with all the data, portions of an image are processed at a higher resolution than others, such as a traffic light changing colors. In the same manner, image processing products strive to extract the information stored in light in the most efficient way possible. Digital cameras available today capture millions of pixels worth of information from incident light. However, at frame rates of more than a few frames per second, existing digital interfaces are overwhelmed. All the user can do is store several frames to memory until that memory is full; subsequent information is then lost. New technology pairs existing digital interface technology with an off-the-shelf complementary metal oxide semiconductor (CMOS) imager to provide more than 500 frames per second of specialty image processing. The result is a cost-effective detection system unlike any other.
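
The interface bottleneck described above is easy to quantify: raw throughput is pixels × frame rate × bytes per pixel. A sketch with illustrative numbers (a 1-Mpixel, 8-bit imager at the quoted 500 frames per second; the sensor format is an assumption, not from the article):

```python
pixels = 1_000_000        # 1 Mpixel imager (illustrative)
fps = 500                 # frames per second (rate quoted in the text)
bytes_per_pixel = 1       # 8-bit greyscale
throughput_mb_s = pixels * fps * bytes_per_pixel / 1e6
print(throughput_mb_s)  # 500.0
```

Half a gigabyte per second of raw pixels is why on-camera ("smart") processing, rather than streaming every frame over the interface, becomes attractive.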

  10. True three-dimensional camera

    NASA Astrophysics Data System (ADS)

    Kornreich, Philipp; Farell, Bart

    2013-01-01

    An imager that can measure the distance from each pixel to the point on the object that is in focus at the pixel is described. This is accomplished by short photo-conducting lightguides at each pixel. In the eye the rods and cones are the fiber-like lightguides. The device uses ambient light that is only coherent in spherical shell-shaped light packets of thickness of one coherence length. Modern semiconductor technology permits the construction of lightguides shorter than a coherence length of ambient light. Each of the frequency components of the broad band light arriving at a pixel has a phase proportional to the distance from an object point to its image pixel. Light frequency components in the packet arriving at a pixel through a convex lens add constructively only if the light comes from the object point in focus at this pixel. The light in packets from all other object points cancels. Thus the pixel receives light from one object point only. The lightguide has contacts along its length. The lightguide charge carriers are generated by the light patterns. These light patterns, and thus the photocurrent, shift in response to the phase of the input signal. Thus, the photocurrent is a function of the distance from the pixel to its object point. Applications include autonomous vehicle navigation and robotic vision. Another application is a crude teleportation system consisting of a camera and a three-dimensional printer at a remote location.
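
The "coherence length of ambient light" that bounds the lightguide length is commonly estimated as l_c ≈ λ²/Δλ. A sketch with illustrative broadband-visible numbers (the wavelength and bandwidth below are assumptions, not values from the paper):

```python
wavelength_nm = 550.0     # center of the visible band (illustrative)
bandwidth_nm = 300.0      # broadband ambient light (illustrative)
coherence_length_nm = wavelength_nm**2 / bandwidth_nm
print(round(coherence_length_nm, 1))  # roughly 1000 nm, i.e. ~1 micron
```

A micron-scale coherence length is what makes lightguides "shorter than a coherence length" feasible with modern semiconductor feature sizes.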

  11. Xochicalco: Tlayohualchieliztli or camera obscura

    NASA Astrophysics Data System (ADS)

    Cornejo-Rodríguez, A.; Vázquez-Montiel, S.; Granados-Agustín, F.; Gale, D.; Diamant, R.; Espinasa-Perena, R.; Cruz, J. L.; Fernández-Guasti, M.

    2011-08-01

    Xochicalco is an archaeological site located in the state of Morelos in central Mexico. It flourished from A.D. 600 to 900 with numerous multicultural elements. There are several underground rooms carved into the hillside. In particular, one room has a shaft with a hole in the roof whose orientation towards the zenith supports its astronomical purpose. Our hypothesis is that the place was used as a tlayohualchieliztli, or camera obscura, for astronomical observations. This would be the first evidence of a pre-Columbian image-forming device. To explore the feasibility of this assertion, the conditions required to produce an image were studied. The aperture diameter at the top of the shaft is far too large to be used as a "pinhole", but it may have been covered with a screen containing a smaller bore-hole. We work out the optimum aperture size. A study of the portion of the sky that could be observed through the shaft, given its orientation, was also undertaken. The two most intense celestial objects should produce bright enough images, suggesting that observation of the sun took place during the day and observation of the moon at night. Amate paper or cloth could have been used to directly draw the position of celestial objects.
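
The "optimum aperture size" trade-off (geometric blur shrinks with the hole, diffraction blur grows) is often estimated with Rayleigh's rule of thumb d ≈ 1.9√(fλ). A sketch with a hypothetical shaft length, since the actual room dimensions are not given in this abstract:

```python
import math

def optimal_pinhole_diameter_m(focal_length_m, wavelength_m=550e-9):
    """Rayleigh's rule of thumb for the sharpest pinhole image."""
    return 1.9 * math.sqrt(focal_length_m * wavelength_m)

d_mm = optimal_pinhole_diameter_m(5.0) * 1000   # hypothetical 5 m shaft
print(round(d_mm, 2))  # about a 3 mm bore-hole
```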

  12. Indium nitride: A narrow gap semiconductor

    SciTech Connect

    Wu, J.; Walukiewicz, W.; Yu, K.M.; Ager III, J.W.; Haller, E.E.; Lu, H.; Schaff, W.J.

    2002-08-14

    The optical properties of wurtzite InN grown on sapphire substrates by molecular-beam epitaxy have been characterized by optical absorption, photoluminescence, and photomodulated reflectance techniques. All three characterization techniques show an energy gap for InN between 0.7 and 0.8 eV, much lower than the commonly accepted value of 1.9 eV. The photoluminescence peak energy is found to be sensitive to the free-electron concentration of the sample. The peak energy exhibits a very weak hydrostatic pressure dependence and a small, anomalous blueshift with increasing temperature. The bandgap energies of In-rich InGaN alloys were found to be consistent with the narrow gap of InN. The bandgap bowing parameter was determined to be 1.43 eV for InGaN.
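
The bowing parameter quoted above enters the standard quadratic alloy relation E_g(x) = x·E_g(InN) + (1−x)·E_g(GaN) − b·x(1−x). A sketch using b = 1.43 eV from the abstract; the endpoint gaps below (0.77 eV for InN, within the 0.7-0.8 eV range reported, and 3.42 eV for GaN) are assumed values, not figures quoted in this record:

```python
def ingan_gap_ev(x_in, eg_inn=0.77, eg_gan=3.42, b=1.43):
    """Quadratic bowing expression for the In_x Ga_(1-x) N bandgap (eV)."""
    return x_in * eg_inn + (1 - x_in) * eg_gan - b * x_in * (1 - x_in)

print(round(ingan_gap_ev(0.5), 4))  # 1.7375
```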

  13. Ultra-narrow metallic armchair graphene nanoribbons

    PubMed Central

    Kimouche, Amina; Ervasti, Mikko M.; Drost, Robert; Halonen, Simo; Harju, Ari; Joensuu, Pekka M.; Sainio, Jani; Liljeroth, Peter

    2015-01-01

    Graphene nanoribbons (GNRs)—narrow stripes of graphene—have emerged as promising building blocks for nanoelectronic devices. Recent advances in bottom-up synthesis have allowed production of atomically well-defined armchair GNRs with different widths and doping. While all experimentally studied GNRs have exhibited wide bandgaps, theory predicts that every third armchair GNR (widths of N=3m+2, where m is an integer) should be nearly metallic with a very small bandgap. Here, we synthesize the narrowest possible GNR belonging to this family (five carbon atoms wide, N=5). We study the evolution of the electronic bandgap and orbital structure of GNR segments as a function of their length using low-temperature scanning tunnelling microscopy and density-functional theory calculations. Already GNRs with lengths of 5 nm reach almost metallic behaviour with ~100 meV bandgap. Finally, we show that defects (kinks) in the GNRs do not strongly modify their electronic structure. PMID:26658960
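
The nearly metallic family targeted here is defined by the width rule N = 3m + 2. A quick enumeration of its narrowest members, whose first entry (N = 5) is the ribbon synthesized in this work:

```python
# Widths of armchair GNRs predicted to be nearly metallic: N = 3m + 2
metallic_widths = [3 * m + 2 for m in range(1, 6)]
print(metallic_widths)  # [5, 8, 11, 14, 17]
```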

  14. Soft disks in a narrow channel

    E-print Network

    D. Mukamel; H. A. Posch

    2008-12-19

    The pressure components of "soft" disks in a two-dimensional narrow channel are analyzed in the dilute gas regime using the Mayer cluster expansion and molecular dynamics. Channels with either periodic or reflecting boundaries are considered. It is found that when the two-body potential, u(r), is singular at some distance r_0, the dependence of the pressure components on the channel width exhibits a singularity at one or more channel widths which are simply related to r_0. In channels with periodic boundary conditions and for potentials which are discontinuous at r_0, the transverse and longitudinal pressure components exhibit a 1/2 and 3/2 singularity, respectively. Continuous potentials with a power law singularity result in weaker singularities of the pressure components. In channels with reflecting boundary conditions the singularities are found to be weaker than those corresponding to periodic boundaries.

  16. Nondecaying Hydrodynamic Interactions along Narrow Channels

    NASA Astrophysics Data System (ADS)

    Misiunas, Karolis; Pagliara, Stefano; Lauga, Eric; Lister, John R.; Keyser, Ulrich F.

    2015-07-01

    Particle-particle interactions are of paramount importance in every multibody system as they determine the collective behavior and coupling strength. Many well-known interactions such as electrostatic, van der Waals, or screened Coulomb interactions, decay exponentially or with negative powers of the particle spacing r . Similarly, hydrodynamic interactions between particles undergoing Brownian motion decay as 1 /r in bulk, and are assumed to decay in small channels. Such interactions are ubiquitous in biological and technological systems. Here we confine two particles undergoing Brownian motion in narrow, microfluidic channels and study their coupling through hydrodynamic interactions. Our experiments show that the hydrodynamic particle-particle interactions are distance independent in these channels. This finding is of fundamental importance for the interpretation of experiments where dense mixtures of particles or molecules diffuse through finite length, water-filled channels or pore networks.

  17. 15 CFR 743.3 - Thermal imaging camera reporting.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ...2013-01-01 2013-01-01 false Thermal imaging camera reporting. 743...REGULATIONS SPECIAL REPORTING § 743.3 Thermal imaging camera reporting. (a) General requirement. Exports of thermal imaging cameras must be reported...

  18. 15 CFR 743.3 - Thermal imaging camera reporting.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ...2014-01-01 2014-01-01 false Thermal imaging camera reporting. 743...REPORTING AND NOTIFICATION § 743.3 Thermal imaging camera reporting. (a) General requirement. Exports of thermal imaging cameras must be reported...

  19. 15 CFR 743.3 - Thermal imaging camera reporting.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ...2012-01-01 2012-01-01 false Thermal imaging camera reporting. 743...REGULATIONS SPECIAL REPORTING § 743.3 Thermal imaging camera reporting. (a) General requirement. Exports of thermal imaging cameras must be reported...

  20. Design and analysis of a two-dimensional camera array

    E-print Network

    Yang, Jason C. (Jason Chieh-Sheng), 1977-

    2005-01-01

    I present the design and analysis of a two-dimensional camera array for virtual studio applications. It is possible to substitute conventional cameras and motion control devices with a real-time, light field camera array. ...

  1. The SALSA Project - High-End Aerial 3d Camera

    NASA Astrophysics Data System (ADS)

    Rüther-Kindel, W.; Brauchle, J.

    2013-08-01

    The ATISS measurement drone, developed at the University of Applied Sciences Wildau, is an electrically powered motor glider with a maximum take-off weight of 25 kg, including a payload capacity of 10 kg. Two 2.5 kW engines enable ultra-short take-off procedures, and the motor glider design results in a 1 h endurance. The concept of ATISS is based on the idea of strictly separating aircraft and payload functions, which makes ATISS a very flexible research platform for miscellaneous payloads. ATISS is equipped with an autopilot for autonomous flight patterns but remains under permanent pilot control from the ground. On the basis of ATISS, the project SALSA was undertaken. The aim was to integrate a system for digital terrain modelling. Instead of a laser scanner, a new design concept was chosen based on two synchronized high-resolution digital cameras, one in a fixed nadir orientation and the other in an oblique orientation. Thus, images of every object on the ground are taken from different view angles. This new measurement camera system, MACS-TumbleCam, was developed at the German Aerospace Center DLR Berlin-Adlershof especially for the ATISS payload concept. A special advantage in comparison to laser scanning is that, instead of a cloud of points, a surface including texture is generated, and a high-end inertial orientation system can be omitted. The first test flights show a ground resolution of 2 cm and a height resolution of 3 cm, which underlines the extraordinary capabilities of ATISS and the MACS measurement camera system.
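
The quoted 2 cm ground resolution follows from the usual ground-sample-distance relation GSD = pixel pitch × flying height / focal length. The pitch, altitude, and focal length below are illustrative values chosen to reproduce that figure, not MACS-TumbleCam specifications:

```python
def ground_sample_distance_m(pixel_pitch_m, altitude_m, focal_length_m):
    """Footprint of one pixel on flat ground for a nadir-looking camera."""
    return pixel_pitch_m * altitude_m / focal_length_m

# Hypothetical: 5 micron pixels, 200 m altitude, 50 mm lens
gsd_cm = ground_sample_distance_m(5e-6, 200.0, 0.05) * 100
print(round(gsd_cm, 2))  # 2.0
```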

  2. Automatic calibration of laser range cameras using arbitrary planar surfaces

    SciTech Connect

    Baker, J.E.

    1994-06-01

    Laser Range Cameras (LRCs) are powerful tools for many robotic/computer perception activities. They can provide accurate range images and perfectly registered reflectance images of the target scene, useful for constructing reliably detailed 3-D world maps and target characterizations. An LRC's output is an array of distances obtained by scanning a laser over the scene. To accurately interpret this data, the angular definition of each pixel, i.e., the 3-D direction corresponding to each distance measurement, must be known. This angular definition is a function of the camera's intrinsic design and unique implementation characteristics, e.g., actual mirror positions, axes of rotation, angular velocities, etc. Typically, the range data is converted to Cartesian coordinates by calibration-parameterized, non-linear transformation equations. Unfortunately, typical LRC calibration techniques are manual, intensive, and inaccurate. Common techniques involve imaging carefully orchestrated artificial targets and manually measuring actual distances and relative angles to infer the correct calibration parameter values. This paper presents an automated method which uses Genetic Algorithms to search for calibration parameter values and possible transformation equations which combine to maximize the planarity of user-specified sub-regions of the image(s). This method permits calibration to be based on an arbitrary plane, without precise knowledge of the LRC's mechanical precision, intrinsic design, or its relative positioning to the target. Furthermore, this method permits rapid, remote, and on-line recalibration - important capabilities for many robotic systems. Empirical validation of this system has been performed using two different LRC systems and has led to significant improvement in image accuracy while reducing the calibration time by orders of magnitude.
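
The planarity objective at the heart of the GA search can be sketched as an RMS plane-fit residual over a sub-region's 3-D points: the flatter the reconstructed patch, the better the candidate calibration. This is a generic least-squares plane fit, not the paper's actual fitness function:

```python
import numpy as np

def planarity_rms(points):
    """RMS distance of 3-D points from their least-squares plane
    (smaller = flatter; usable as a fitness term for a GA)."""
    centered = points - points.mean(axis=0)
    # The right singular vector with the smallest singular value
    # is the normal of the best-fit plane through the centroid.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    normal = vt[-1]
    return float(np.sqrt(np.mean((centered @ normal) ** 2)))

# Points lying exactly on the plane z = 2x - y + 3 give ~zero residual
plane_pts = np.array([[x, y, 2 * x - y + 3.0]
                      for x in range(5) for y in range(5)])
rms = planarity_rms(plane_pts)
print(rms)
```

A GA would evaluate this metric on the points produced by each candidate parameter set and keep the parameter vectors that drive the residual toward zero.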

  3. High-power narrow-vertical-divergence photonic band crystal laser diodes with optimized epitaxial structure

    SciTech Connect

    Liu, Lei; Qu, Hongwei; Liu, Yun; Zhang, Yejin; Zheng, Wanhua; Wang, Yufei; Qi, Aiyi

    2014-12-08

    900 nm longitudinal photonic band crystal (PBC) laser diodes with optimized epitaxial structure are fabricated. For the same calculated fundamental-mode divergence, stronger mode discrimination is achieved by a quasi-periodic index modulation in the PBC waveguide than by a periodic one. Experiments show that the introduction of an over 5.5 μm-thick PBC waveguide contributes only a 10% increment of the internal loss for the laser diodes. For broad-area PBC lasers, output powers of 5.75 W under continuous-wave test and over 10 W under quasi-continuous-wave test are reported. The vertical divergence angles are 10.5° at full width at half maximum and 21.3° with 95% power content, in conformity with the simulated angles. Such a device shows promise for high-power, narrow-vertical-divergence laser emission from single diode lasers and laser bars.

  4. [Research Award providing funds for a tracking video camera

    NASA Technical Reports Server (NTRS)

    Collett, Thomas

    2000-01-01

    The award provided funds for a tracking video camera. The camera has been installed and the system calibrated. It has enabled us to follow in real time the tracks of individual wood ants (Formica rufa) within a 3 m square arena as they navigate singly indoors, guided by visual cues. To date we have been using the system on two projects. The first is an analysis of the navigational strategies that ants use when guided by an extended landmark (a low wall) to a feeding site. After a brief training period, ants are able to keep a defined distance and angle from the wall, using their memory of the wall's height on the retina as a controlling parameter. By training with walls of one height and length and testing with walls of different heights and lengths, we can show that ants adjust their distance from the wall so as to keep the wall at the retinal height that they learned during training. Thus, their distance from the base of a tall wall is greater than it is from the training wall, and the distance is shorter when the wall is low. The stopping point of the trajectory is defined precisely by the angle that the far end of the wall makes with the trajectory. Thus, ants walk further if the wall is extended in length and not so far if the wall is shortened. These experiments represent the first case in which the controlling parameters of an extended trajectory can be defined with some certainty. They raise many questions for future research that we are now pursuing.
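
If the controlling parameter is the wall's retinal height, the distance an ant should adopt for a new wall follows from keeping the elevation angle α = atan(h/d) at its trained value. A sketch with hypothetical training and test wall dimensions (the numbers are illustrative, not measurements from the study):

```python
import math

def matched_distance_m(trained_height, trained_distance, test_height):
    """Distance that keeps a wall at the retinal height learned in training."""
    alpha = math.atan2(trained_height, trained_distance)  # trained elevation angle
    return test_height / math.tan(alpha)

# Hypothetical: trained on a 0.10 m wall at 0.50 m; tested on a 0.20 m wall
d = matched_distance_m(0.10, 0.50, 0.20)
print(round(d, 3))  # twice as far as in training
```

Doubling the wall height doubles the matched distance, reproducing the observation that ants stand further from taller walls and closer to lower ones.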

  5. Management of mandibular angle fracture.

    PubMed

    Braasch, Daniel Cameron; Abubaker, A Omar

    2013-11-01

    Fractures through the angle of the mandible are one of the most common facial fractures. The management of such fractures has been controversial, however. This controversy is related to the anatomic relations and complex biomechanical aspects of the mandibular angle. The debate has become even more heated since the evolution of rigid fixation and the ability to provide adequate stability of the fractured segments. This article provides an overview of the special anatomic and biomechanical features of the mandibular angle and their impact on the management of these fractures. PMID:24183373

  6. Clementine High Resolution Camera Mosaicking Project

    NASA Technical Reports Server (NTRS)

    1998-01-01

    This report constitutes the final report for NASA Contract NASW-5054. This project processed Clementine I high resolution images of the Moon, mosaicked these images together, and created a 22-disk set of compact disk read-only memory (CD-ROM) volumes. The mosaics were produced through semi-automated registration and calibration of the high resolution (HiRes) camera's data against the geometrically and photometrically controlled Ultraviolet/Visible (UV/Vis) Basemap Mosaic produced by the US Geological Survey (USGS). The HiRes mosaics were compiled from non-uniformity corrected, 750 nanometer ("D") filter high resolution nadir-looking observations. The images were spatially warped using the sinusoidal equal-area projection at a scale of 20 m/pixel for sub-polar mosaics (below 80 deg. latitude) and using the stereographic projection at a scale of 30 m/pixel for polar mosaics. Only images with emission angles less than approximately 50 deg. were used. Images from non-mapping cross-track slews, which tended to have large SPICE errors, were generally omitted. The locations of the resulting image population were found to be offset from the UV/Vis basemap by up to 13 km (0.4 deg.). Geometric control was taken from the 100 m/pixel global and 150 m/pixel polar USGS Clementine Basemap Mosaics compiled from the 750 nm Ultraviolet/Visible Clementine imaging system. Radiometric calibration was achieved by removing the image nonuniformity dominated by the HiRes system's light intensifier. Also provided are offset and scale factors, achieved by a fit of the HiRes data to the corresponding photometrically calibrated UV/Vis basemap, that approximately transform the 8-bit HiRes data to photometric units. The sub-polar mosaics are divided into tiles that cover approximately 1.75 deg. of latitude and span the longitude range of the mosaicked frames. Images from a given orbit are map projected using the orbit's nominal central latitude.
Polar mosaics are tiled into squares 2250 pixels on a side, which spans approximately 2.2 deg. Two mosaics are provided for each pole: one corresponding to data acquired while periapsis was in the south, the other while periapsis was in the north. The CD-ROMs also contain ancillary data files that support the HiRes mosaic. These files include browse images with UV/Vis context stored in a Joint Photographic Experts Group (JPEG) format, index files ('imgindx.tab' and 'srcindx.tab') that tabulate the contents of the CD, and documentation files.

  7. Two degree of freedom camera mount

    NASA Technical Reports Server (NTRS)

    Ambrose, Robert O. (Inventor)

    2003-01-01

    A two degree of freedom camera mount. The camera mount includes a socket, a ball, a first linkage and a second linkage. The socket includes an interior surface and an opening. The ball is positioned within an interior of the socket. The ball includes a coupling point for rotating the ball relative to the socket and an aperture for mounting a camera. The first and second linkages are rotatably connected to the socket and slidably connected to the coupling point of the ball. Rotation of the linkages with respect to the socket causes the ball to rotate with respect to the socket.

  8. New Modular Camera No Ordinary Joe

    NASA Technical Reports Server (NTRS)

    2003-01-01

    Although dubbed 'Little Joe' for its small-format characteristics, a new wavefront sensor camera has proved that it is far from coming up short when paired with high-speed, low-noise applications. SciMeasure Analytical Systems, Inc., a provider of cameras and imaging accessories for use in biomedical research and industrial inspection and quality control, is the eye behind Little Joe's shutter, manufacturing and selling the modular, multi-purpose camera worldwide to advance fields such as astronomy, neurobiology, and cardiology.

  9. Determining camera parameters for round glassware measurements

    NASA Astrophysics Data System (ADS)

    Baldner, F. O.; Costa, P. B.; Gomes, J. F. S.; Filho, D. M. E. S.; Leta, F. R.

    2015-01-01

    Nowadays there are many types of accessible cameras, including digital single-lens reflex ones. Although these cameras are not usually employed in machine vision applications, they can be an interesting choice. However, they offer many user-selectable parameters, and it may be difficult to choose the settings that yield images with the needed metrological quality. This paper proposes a methodology to select a set of parameters that will supply a machine vision system with images of the needed quality, considering the measurement requirements of laboratory glassware.

  10. Print spectral reflectance estimation using trichromatic camera

    NASA Astrophysics Data System (ADS)

    Harouna S., Aboubacar; Bringier, Benjamin; Khoudeir, Majdi

    2015-04-01

    This paper deals with print quality control through a spectral color measurement. The aim is to estimate the spectral reflectance curve of each pixel of a printed sheet for a spectral matching with the reference image. The proposed method consists of performing a spectral characterization of the complete chain, which includes the printing system and a digital trichromatic camera. First, the spectral printer model is presented and verified by experiments. Then, the camera spectral sensitivity curves are estimated through the capture of a color chart whose spectral reflectance curves have been previously measured. Finally, the spectral printer model is used to estimate the print spectral reflectance curves from camera responses.
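The reconstruction step above can be illustrated with a generic linear camera model c = S r, where S holds the three channel sensitivities and r is the sampled reflectance spectrum. The Gaussian sensitivity curves below are invented for illustration; the paper instead estimates the real curves from a measured color chart and uses a printer model.

```python
import numpy as np

wavelengths = np.linspace(400, 700, 31)  # nm, 10 nm sampling (assumed grid)

def gaussian(mu, sigma):
    """Hypothetical smooth channel sensitivity curve."""
    return np.exp(-0.5 * ((wavelengths - mu) / sigma) ** 2)

# Invented R, G, B sensitivities (rows of the 3 x 31 system matrix S).
S = np.stack([gaussian(600, 40), gaussian(540, 40), gaussian(460, 40)])

def reconstruct(c, lam=1e-3):
    """Tikhonov-regularized minimum-norm spectrum from a camera response c:
    r_hat = S^T (S S^T + lam I)^-1 c."""
    G = S @ S.T + lam * np.eye(3)
    return S.T @ np.linalg.solve(G, c)

# Simulated measurement of a smooth reflectance spectrum.
r_true = 0.5 + 0.3 * np.sin(wavelengths / 50.0)
c = S @ r_true
r_hat = reconstruct(c)
```

With only three channels the inversion is underdetermined, so r_hat is a smooth minimum-norm estimate that reproduces the camera response but not every spectral detail; this is exactly the metamerism limitation discussed in record 18 below.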

  11. Fuzzy logic control for camera tracking system

    NASA Technical Reports Server (NTRS)

    Lea, Robert N.; Fritz, R. H.; Giarratano, J.; Jani, Yashvant

    1992-01-01

    A concept utilizing fuzzy theory has been developed for a camera tracking system to provide support for proximity operations and traffic management around the Space Station Freedom. Fuzzy sets and fuzzy logic based reasoning are used in a control system which utilizes images from a camera and generates required pan and tilt commands to track and maintain a moving target in the camera's field of view. This control system can be implemented on a fuzzy chip to provide an intelligent sensor for autonomous operations. Capabilities of the control system can be expanded to include approach, handover to other sensors, caution and warning messages.
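A toy one-axis fuzzy controller in the spirit of the concept described above. The membership functions and output values are invented for illustration; they are not the actual rule base of the camera tracking system.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def pan_rate(offset):
    """Map target offset from image center, normalized to [-1, 1],
    to a pan rate command (deg/s) via fuzzy inference."""
    # Fuzzify the offset into three linguistic terms.
    neg  = tri(offset, -1.5, -1.0, 0.0)
    zero = tri(offset, -1.0,  0.0, 1.0)
    pos  = tri(offset,  0.0,  1.0, 1.5)
    # Rule consequents as output singletons, defuzzified by weighted average.
    weights = [neg, zero, pos]
    outputs = [-10.0, 0.0, 10.0]
    total = sum(weights)
    return sum(w * o for w, o in zip(weights, outputs)) / total if total else 0.0
```

A centered target commands zero rate, a fully offset target commands full rate, and intermediate offsets blend smoothly between rules; an analogous rule set would drive the tilt axis.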

  12. Two detector, active digital holographic camera for 3D imaging and digital holographic interferometry

    NASA Astrophysics Data System (ADS)

    Żak, Jakub; Kujawińska, Małgorzata; Józwik, Michał

    2015-09-01

    In this paper we present the novel design and proof of concept of an active holographic camera consisting of two array detectors and a Liquid Crystal on Silicon (LCOS) Spatial Light Modulator (SLM). The device allows sequential or simultaneous capture of two Fresnel holograms of a 3D object/scene. The two-detector configuration provides an increased viewing angle for the camera, allows capture of two double-exposure holograms with different sensitivity vectors, and even facilitates capturing a synthetic-aperture hologram for static objects. The LCOS SLM, located in a reference arm, serves as an active element that enables phase shifting and proper pointing of the reference beams towards both detectors, in a configuration which allows miniaturization of the camera. The laboratory model of the camera has been tested in different modes of operation, namely capture and reconstruction of a 3D scene and double-exposure holographic interferometry applied to an engineering object under load. The future extension of the camera's functionalities to Fourier hologram capture is discussed.

  13. Keyboard before Head Tracking Depresses User Success in Remote Camera Control

    NASA Astrophysics Data System (ADS)

    Zhu, Dingyun; Gedeon, Tom; Taylor, Ken

    In remote mining, operators of complex machinery have more tasks or devices to control than they have hands. For example, operating a rock breaker requires two-handed joystick control to position and fire the jackhammer, leaving camera control either to automation or requiring the operator to switch between controls. We modelled such a teleoperated setting by performing experiments using a simple physical game analogue, being a half-size table soccer game with two handles. The complex camera angles of the mining application were modelled by obscuring the direct view of the play area and the use of a Pan-Tilt-Zoom (PTZ) camera. The camera control was via either a keyboard or via head tracking using two different sets of head gestures called “head motion” and “head flicking” for turning camera motion on/off. Our results show that the head motion control was able to provide a comparable performance to using a keyboard, while head flicking was significantly worse. In addition, the sequence of use of the three control methods is highly significant. It appears that use of the keyboard first depresses successful use of the head tracking methods, with significantly better results when one of the head tracking methods was used first. Analysis of the qualitative survey data collected confirms that the worst-performing method was disliked by participants. Surprisingly, use of that worst method as the first control method significantly enhanced performance using the other two control methods.

  14. Thermal design and simulation of an attitude-varied space camera

    NASA Astrophysics Data System (ADS)

    Wang, Chenjie; Yang, Wengang; Feng, Liangjie; Li, XuYang; Wang, Yinghao; Fan, Xuewu; Wen, Desheng

    2015-10-01

    An attitude-varied space camera changes attitude continually while it is working; its large-angle attitude changes over short times lead to significant changes of the heat flux. Moreover, the complicated inner heat sources, other payloads and the satellite platform also bring thermal coupling effects to the space camera. For a space camera located on a two-dimensional rotating platform, a detailed thermal design is accomplished by means of thermal isolation, thermal transmission and temperature compensation, etc. The ultimate simulation cases of both high temperature and low temperature are then chosen, considering the obscuration by the satellite platform and other payloads, as well as the heat flux analysis of the light entrance and the radiator surface of the camera. NEVADA and SindaG are used to establish the simulation model of the camera and the analysis is carried out. The results indicate that, under both passive and active thermal control, the temperature of the optical components is 20+/-1°C, both their radial and axial temperature gradients are less than 0.3°C, the temperature of the main structural components is 20+/-2°C, and the temperature fluctuation of the focal plane assemblies is 3.0-9.5°C. The simulation shows that the thermal control system can meet the needs of the mission, and the thermal design is efficient and reasonable.

  15. Robust gaze-tracking method by using frontal-viewing and eye-tracking cameras

    NASA Astrophysics Data System (ADS)

    Cho, Chul Woo; Lee, Ji Woo; Lee, Eui Chul; Park, Kang Ryoung

    2009-12-01

    Gaze-tracking technology is used to obtain the position of a user's viewpoint and a new gaze-tracking method is proposed based on a wearable goggle-type device, which includes an eye-tracking camera and a frontal viewing camera. The proposed method is novel in five ways compared to previous research. First, it can track the user's gazing position, allowing for the natural facial and eye movements by using frontal viewing and an eye-tracking camera. Second, an eye gaze position is calculated using a geometric transform, based on the mapping function among three rectangular regions. These are a rectangular region defined by the four pupil centers detected when a user gazes at the four corners of a monitor, a distorted monitor region observed by the frontal viewing camera, and an actual monitor region, respectively. Third, a facial gaze position is estimated based on the geometric center and the four internal angles of the monitor region detected by the frontal viewing camera. Fourth, a final gaze position is obtained by using the weighted summation of the eye and the facial gazing positions. Fifth, since a simple 2-D method is used to obtain the gazing position instead of a complicated 3-D method, the proposed method can be operated at real-time speeds. Experimental results show that the root mean square (rms) error of gaze estimation is less than 1 deg.
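The four-corner geometric mapping described above can be illustrated with a standard direct linear transform (DLT) homography from the quadrilateral of pupil centers to monitor pixel coordinates. The pupil coordinates below are invented, and the paper's exact mapping function may differ.

```python
import numpy as np

def homography(src, dst):
    """Solve the 3x3 matrix H with dst ~ H @ src for 4 point pairs (DLT)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The homography is the null vector of A, taken from the SVD.
    _, _, Vt = np.linalg.svd(np.array(A, dtype=float))
    return Vt[-1].reshape(3, 3)

def map_point(H, p):
    """Apply H in homogeneous coordinates and dehomogenize."""
    x, y, w = H @ np.array([p[0], p[1], 1.0])
    return x / w, y / w

# Hypothetical pupil centers recorded while gazing at the four monitor corners:
pupils  = [(210, 150), (340, 158), (332, 250), (205, 244)]
monitor = [(0, 0), (1920, 0), (1920, 1080), (0, 1080)]
H = homography(pupils, monitor)
```

The final gaze estimate in the paper then blends this eye-based position with the facial gaze position by a weighted sum; with exactly four correspondences the homography reproduces the corner points exactly.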

  16. Texturecam: A Smart Camera for Microscale, Mesoscale, and Deep Space Applications

    NASA Technical Reports Server (NTRS)

    Abbey, William; Allwood, Abigail; Bekker, Dmitriy; Bornstein, Benjamin; Cabrol, Nathalie A.; Castano, Rebecca; Chien, Steve A.; Doubleday, Joshua; Estlin, Tara; Foil, Greydon; Fuchs, Thomas; Howarth, Daniel; Ortega, Kevin; Thompson, David R.; Wagstaff, Kiri L.

    2013-01-01

    The TextureCam project is developing a 'smart camera' that can classify geologic surfaces in planetary images. This would allow autonomous spacecraft to collect data opportunistically during intervals between communications with Earth, such as during long traverses. Its surface classifications can identify new targets that were not anticipated in advance. The spacecraft might use this information to target these features with high-resolution instruments such as spectrometers and narrow-field cameras. Classifications could also inform data 'triage' decisions, identifying high value images for prioritized downlink. Finally, the surface classification can serve as compressed maps of image content. Each of these strategies can improve the science data returned at each command cycle and speed reconnaissance during site survey and astrobiology investigation. Our first year of development has completed the image analysis algorithms and validated them in software tests. Here we survey these initial results and explore several application areas relevant to Mars and beyond.

  17. Interface circuit design and control system programming for an EMCCD camera based on Camera Link

    NASA Astrophysics Data System (ADS)

    Li, Bin-hua; Rao, Xiao-hui; Yan, Jia; Li, Da-lun; Zhang, Yi-gong

    2013-08-01

    This paper presents an appropriate solution for self-developed EMCCD cameras based on Camera Link. A new interface circuit used to connect an embedded processor Nios II to the serial communication port of Camera Link in the camera is designed, and a simplified structure diagram is shown. To implement functions of the circuit, in the hardware design, it is necessary to add a universal serial communication component to the Nios II when building the processor and its peripheral components in the Altera SOPC development environment. In the software design, we use C language to write a UART interrupt response routine for instructions and data receiving and transmitting, and a camera control program in the slave computer (Nios II), employ a Sapera LT development library and VC++ to write a serial communication routine, a camera control and image acquisition program in the host computer. The developed camera can be controlled by the host PC, the camera status can return to the PC, and a huge amount of image data can be uploaded at a high speed through a Camera Link cable. A flow chart of the serial communication and camera control program in Nios II is given, and two operating interfaces in the PC are shown. Some design and application skills are described in detail. The test results indicate that the interface circuit and the control programs that we have developed are feasible and reliable.

  18. Differentiating Biological Colours with Few and Many Sensors: Spectral Reconstruction with RGB and Hyperspectral Cameras

    PubMed Central

    Garcia, Jair E.; Girard, Madeline B.; Kasumovic, Michael; Petersen, Phred; Wilksch, Philip A.; Dyer, Adrian G.

    2015-01-01

    Background The ability to discriminate between two similar or progressively dissimilar colours is important for many animals as it allows for accurately interpreting visual signals produced by key target stimuli or distractor information. Spectrophotometry objectively measures the spectral characteristics of these signals, but is often limited to point samples that could underestimate spectral variability within a single sample. Algorithms for RGB images and digital imaging devices with many more than three channels, hyperspectral cameras, have been recently developed to produce image spectrophotometers to recover reflectance spectra at individual pixel locations. We compare a linearised RGB and a hyperspectral camera in terms of their individual capacities to discriminate between colour targets of varying perceptual similarity for a human observer. Main Findings (1) The colour discrimination power of the RGB device is dependent on colour similarity between the samples whilst the hyperspectral device enables the reconstruction of a unique spectrum for each sampled pixel location independently from their chromatic appearance. (2) Uncertainty associated with spectral reconstruction from RGB responses results from the joint effect of metamerism and spectral variability within a single sample. Conclusion (1) RGB devices give a valuable insight into the limitations of colour discrimination with a low number of photoreceptors, as the principles involved in the interpretation of photoreceptor signals in trichromatic animals also apply to RGB camera responses. (2) The hyperspectral camera architecture provides means to explore other important aspects of colour vision like the perception of certain types of camouflage and colour constancy where multiple, narrow-band sensors increase resolution. PMID:25965264

  19. A slanting light-guide analog decoding high resolution detector for positron emission tomography camera

    SciTech Connect

    Wong, W.H.; Jing, M.; Bendriem, B.; Hartz, R.; Mullani, N.; Gould, K.L.; Michel, C.

    1987-02-01

    Current high resolution PET cameras require the scintillation crystals to be much narrower than the smallest available photomultipliers. In addition, the large number of photomultiplier channels constitutes the major component cost in the camera. Recent new designs use the Anger camera type of analog decoding method to obtain higher resolution and lower cost by using the relatively large photomultipliers. An alternative approach to improve the resolution and cost factors has been proposed, with a system of slanting light-guides between the scintillators and the photomultipliers. In the Anger camera schemes, the scintillation light is distributed to several neighboring photomultipliers which then determine the scintillation location. In the slanting light-guide design, the scintillation is metered and channeled to only two photomultipliers for the decision making. This paper presents the feasibility and performance achievable with the slanting light-guide detectors. With a crystal/photomultiplier ratio of 6/1, the intrinsic resolution was found to be 4.0 mm using the first non-optimized prototype light-guides on BGO crystals. The axial resolution will be about 5-6 mm.

  20. Traceability of a CCD-Camera System for High-Temperature Measurements

    NASA Astrophysics Data System (ADS)

    Bünger, L.; Anhalt, K.; Taubert, R. D.; Krüger, U.; Schmidt, F.

    2015-08-01

    A CCD camera, which has been specially equipped with narrow-band interference filters in the visible spectral range for temperature measurements above 1200 K, was characterized with respect to its temperature response traceable to ITS-90 and with respect to absolute spectral radiance responsivity. The calibration traceable to ITS-90 was performed at a high-temperature blackbody source using a radiation thermometer as a transfer standard. Use of Planck's law and the absolute spectral radiance responsivity of the camera system allows the determination of the thermodynamic temperature. For the determination of the absolute spectral radiance responsivity, a monochromator-based setup with a supercontinuum white-light laser source was developed. The CCD-camera system was characterized with respect to the dark-signal non-uniformity, the photo-response non-uniformity, the non-linearity, and the size-of-source effect. The influence of these parameters on the calibration and measurement was evaluated and is considered in the uncertainty budget. The results of the two different calibration schemes for the investigated temperature range from 1200 K to 1800 K are in good agreement within the expanded uncertainty. The uncertainty for the absolute spectral responsivity of the camera is 0.56%.
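Determining thermodynamic temperature from an absolute radiance measurement amounts to inverting Planck's law. The monochromatic sketch below is a simplification: a real narrow-band filter measurement requires integrating the responsivity over the filter passband.

```python
import math

# Radiation constants (CODATA values, rounded).
C1L = 1.191042972e-16  # 2*h*c^2 in W*m^2/sr, first radiation constant for radiance
C2 = 1.438777e-2       # h*c/k in m*K, second radiation constant

def planck_radiance(T, wl):
    """Blackbody spectral radiance in W/(m^2 sr m) at wavelength wl (m), temperature T (K)."""
    return C1L / (wl**5 * (math.exp(C2 / (wl * T)) - 1.0))

def temperature_from_radiance(L, wl):
    """Exact monochromatic inversion of Planck's law:
    T = c2 / (wl * ln(1 + c1L / (wl^5 * L)))."""
    return C2 / (wl * math.log(1.0 + C1L / (wl**5 * L)))
```

The round trip radiance -> temperature is exact for a single wavelength, which is why the absolute spectral radiance responsivity of the camera is sufficient, together with Planck's law, to recover thermodynamic temperature.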

  1. InfraCAM (trade mark): A Hand-Held Commercial Infrared Camera Modified for Spaceborne Applications

    NASA Technical Reports Server (NTRS)

    Manitakos, Daniel; Jones, Jeffrey; Melikian, Simon

    1996-01-01

    In 1994, Inframetrics introduced the InfraCAM(TM), a high resolution hand-held thermal imager. As the world's smallest, lightest and lowest-power PtSi-based infrared camera, the InfraCAM is ideal for a wide range of industrial, nondestructive testing, surveillance and scientific applications. In addition to numerous commercial applications, the light weight and low power consumption of the InfraCAM make it extremely valuable for adaptation to spaceborne applications. Consequently, the InfraCAM has been selected by NASA Lewis Research Center (LeRC) in Cleveland, Ohio, for use as part of the DARTFire (Diffusive and Radiative Transport in Fires) spaceborne experiment. In this experiment, a solid fuel is ignited in a low gravity environment. The combustion period is recorded by both visible and infrared cameras. The infrared camera measures the emission from polymethyl methacrylate (PMMA) and combustion products in six distinct narrow spectral bands. Four cameras successfully completed all qualification tests at Inframetrics and at NASA Lewis. They are presently being used for ground-based testing in preparation for space flight in the fall of 1995.

  2. The Narrow-Line Region of Narrow-Line Seyfert 1 Galaxies

    NASA Astrophysics Data System (ADS)

    Rodríguez-Ardila, A.; Binette, Luc; Pastoriza, Miriani G.; Donzelli, Carlos J.

    2000-08-01

    This work studies the optical emission-line properties and physical conditions of the narrow-line region (NLR) of seven narrow-line Seyfert 1 galaxies (NLS1's) for which high signal-to-noise ratio spectroscopic observations were available. The resolution is 340 km s-1 (at Hβ) over the wavelength interval 3700-9500 Å, enabling us to separate the broad and narrow components of the permitted emission lines. Our results show that the flux carried by the narrow component of Hβ is, on average, 50% of the total line flux. As a result, the [O III] λ5007/Hβ ratio emitted in the NLR varies from 1 to 5, instead of the universally adopted value of 10. This has strong implications for the required spectral energy distribution that ionizes the NLR gas. Photoionization models that consider a NLR composed of a combination of matter-bounded and ionization-bounded clouds are successful at explaining the low [O III] λ5007/Hβ ratio and the weakness of low-ionization lines of NLS1's. Variation of the relative proportion of these two types of clouds nicely reproduces the dispersion of narrow-line ratios found among the NLS1 sample. Assuming similar physical model parameters for both NLS1's and the normal Seyfert 1 galaxy NGC 5548, we show that the observed differences of emission-line ratios between these two groups of galaxies can be explained, to a first approximation, in terms of the shape of the input ionizing continuum. Narrow emission-line ratios of NLS1's are better reproduced by a steep power-law continuum in the EUV-soft X-ray region, with spectral index α ~ -2. Flatter spectral indices (α ~ -1.5) match the observed line ratios of NGC 5548 but are unable to provide a good match to the NLS1 ratios. This result is consistent with ROSAT observations of NLS1's, which show that these objects are characterized by steeper power-law indices than those of Seyfert 1 galaxies with strong broad optical lines. Based on observations made at CASLEO.
Complejo Astronómico El Leoncito (CASLEO) is operated under agreement between the Consejo Nacional de Investigaciones Científicas y Técnicas de la República Argentina and the National Universities of La Plata, Córdoba and San Juan.

  3. Wide-angle Tangential Viewing System for DIII-D

    NASA Astrophysics Data System (ADS)

    Lasnier, C. J.; Allen, S. L.; Fenstermacher, M. E.; Hill, D. N.; Weber, T. R.

    2011-10-01

    We are designing a wide-angle tangential viewing system for DIII-D, with co-registered views in the visible and IR. We will examine toroidal and poloidal asymmetries of wall heating and particle flux during ELMs, magnetic perturbations, and disruptions; toroidal and poloidal mode structure of ELMs; poloidal distribution of particle flow velocities, and others. The system will simultaneously view the inner wall, outer wall, and upper and lower divertors, and will have an independent 3X optical zoom capability in visible and IR. Various parts of the image may be viewed at 3X magnification by translating the camera(s) vertically and laterally in the image plane. For IR we have a FLIR SC6000HS 3-5 μm camera, and for visible a Phantom V7.3. Both have high frame rate capability. Visible wavelength and neutral density filters may be selected, or interferometric flow measurement optics may be substituted for the filter system. This system was inspired by a design by CEA Cadarache for JET, and is similar to a system designed by LLNL for ITER upper ports. This work performed under the auspices of the US Department of Energy under ARRAY 2005290 and DE-AC52-07NA27344.

  4. Narrow band imaging and long slit spectroscopy of UGC 5101

    NASA Astrophysics Data System (ADS)

    Stanga, R. M.; Mannucci, F.; Rodríguez Espinosa, J. M.

    1993-01-01

    UGC 5101 (z = 0.04; D is approximately equal to 240 Mpc) is one of the so-called Ultraluminous IRAS sources. Two important properties of the members of this group are their L(sub IR) greater than or equal to 10(exp 12) solar luminosities, and their space density in the universe out to z < 0.1, which is equal to or even larger than the space density of the quasars. Further noteworthy features of the Ultraluminous IRAS sources are their being morphologically peculiar and the fact that they all seem to host active nuclei in their centers. We have observed UGC 5101 in an effort to study the interplay between the gas ionized by the central active nucleus and the gas ionized by other processes, which may hold important clues to the understanding of the entire picture of this object. In particular these other ionizing processes could well be massive stars formed recently after the galactic encounter, and shocks possibly also related to the galaxy collision. The data that we discuss were obtained between Dec. 1989 and Jan. 1992 with the WHT 4.2 m telescope using the two-arm spectrograph ISIS. Several spectral frames were obtained at three different position angles: PA 84--along the tail of the galaxy; PA 32--along the dust lane; and PA 110. The blue spectra are centered on the H beta line, while the red spectra are centered on the H alpha line. In the configuration we used for the long slit spectra, the spectral scale was 0.74 Å per pixel, and the spatial scale was 0.37 arcsec per pixel; we also observed the H alpha region with a spectral scale of 0.37 Å per pixel, at position angle 84. The narrow band images were obtained at the auxiliary port of ISIS, with a scale of 0.2 arcsec per pixel, and were centered at the H alpha wavelength and on the adjacent continuum. The H alpha images and the spectra support the following model.
UGC 5101 hosts an active nucleus; the NLR extends up to about 1.5 kpc and shows a complex velocity field, superimposed on the rotation curve of the galaxy. Besides the NLR, two bright cones are visible in the H alpha image that extend up to 3 kpc along PA 32. The long slit spectra at PA 32 show that the velocity field of the gas in these regions is peculiar, while the ionization structure of the gas is similar to that of the NLR.

  5. Relativistic Transformation of Solid Angle.

    ERIC Educational Resources Information Center

    McKinley, John M.

    1980-01-01

    Rederives the relativistic transformations of light intensity from compact sources (stars) to show where and how the transformation of a solid angle contributes. Discusses astrophysical and other applications of the transformations. (Author/CS)

  6. Preliminary Design of ARIES-Devasthal Faint Object Spectrograph and Camera

    E-print Network

    Mondal, Soumen; Singh, Mahendra

    2009-01-01

    We present here the preliminary design of ARIES-Devasthal Faint Object Spectrograph and Camera (ADFOSC), which is a multi-mode instrument for both imaging and spectroscopy. ADFOSC is the first-generation instrument to be mounted at the axial port of the Cassegrain focus on our new 3.6m optical telescope to be installed at Devasthal, Nainital. The main design goals are: capability for broad- and narrow-band imaging, low-to-medium resolution spectroscopy, and imaging polarimetry. The operating wavelength range will be from 360 to 1000 nm and the instrument will have remote-control capability.

  7. Angles of multivariable root loci

    NASA Technical Reports Server (NTRS)

    Thompson, P. M.; Stein, G.; Laub, A. J.

    1982-01-01

    A generalized eigenvalue problem is demonstrated to be useful for computing the multivariable root locus, particularly when obtaining the arrival angles to finite transmission zeros. The multivariable root loci are found for a linear, time-invariant output feedback problem. The problem is then employed to compute a closed-loop eigenstructure. The method of computing angles on the root locus is demonstrated, and the method is extended to a multivariable optimal root locus.
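A minimal numerical sketch of an output-feedback root locus, traced by sweeping the gain and computing closed-loop eigenvalues. The two-state matrices are an invented example; the paper's generalized-eigenvalue formulation for arrival angles to transmission zeros is not reproduced here.

```python
import numpy as np

# Invented example system x' = Ax + Bu, y = Cx with output feedback u = -k*F*y.
A = np.array([[0.0, 1.0], [-2.0, -3.0]])
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])
F = np.array([[1.0]])

def locus(gains):
    """Closed-loop eigenvalues of A - k*B@F@C for each gain k in the sweep."""
    return [np.linalg.eigvals(A - k * B @ F @ C) for k in gains]

branches = locus(np.linspace(0.0, 20.0, 81))
```

At k = 0 the locus starts at the open-loop poles (here -1 and -2); the angles of departure and arrival discussed in the abstract are the limiting directions of these eigenvalue branches as the gain varies.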

  8. Vacuum compatible miniature CCD camera head

    DOEpatents

    Conder, Alan D. (Tracy, CA)

    2000-01-01

    A charge-coupled device (CCD) camera head which can replace film for digital imaging of visible light, ultraviolet radiation, and soft to penetrating x-rays, such as within a target chamber where laser-produced plasmas are studied. The camera head is small, capable of operating both in and out of a vacuum environment, and is versatile. The CCD camera head uses PC boards with an internal heat sink connected to the chassis for heat dissipation, which allows for close (0.04 in., for example) stacking of the PC boards. Integration of this CCD camera head into existing instrumentation provides a substantial enhancement of diagnostic capabilities for studying high energy density plasmas, for a variety of military, industrial, and medical imaging applications.

  9. X-Ray Shadowgraph Camera Design

    SciTech Connect

    Edward J. McCrea; Michael J. Doman; Randy A. Rohde

    1999-01-01

    An imaging camera that is used with X-ray radiography systems in high-explosive experiments has been built and fielded. The camera uses a 40-mm diameter Micro-Channel Plate Intensifier (MCPI) for optical gain and photographic film for image recording. A scintillating screen is placed in the normal location of the X-ray film pack. The camera system views the screen and records the image. The sensitivity of the MCPI to light makes the camera design sensitive to small details that a film pack need not consider. The X-ray image recording system was designed and built for situations where the film pack of the X-ray shadowgraph is not retrievable after the experiment. The system has been used in a number of experiments.

  10. Innovative camera system developed for Sprint vehicle

    SciTech Connect

    Not Available

    1985-04-01

    A new inspection system for the Sprint 101 ROV eliminates parallax errors because all three camera modules use a single lens for viewing. Parallax is the apparent displacement of an object when it is viewed from two points not in the same line of sight. The central camera is a Pentax 35-mm single lens reflex with a 28-mm lens. It comes with 250-shot film cassettes, an automatic film wind-on, and a data chamber display. An optical transfer assembly on the stills camera viewfinder transmits the image to one of the two video camera modules. The video picture transmitted to the surface is exactly the same as the stills photo. The surface operator can adjust the focus by viewing the video display.

  11. Real time moving scene holographic camera system

    NASA Technical Reports Server (NTRS)

    Kurtz, R. L. (inventor)

    1973-01-01

    A holographic motion picture camera system producing resolution of front surface detail is described. The system utilizes a beam of coherent light and means for dividing the beam into a reference beam for direct transmission to a conventional movie camera and two reflection signal beams for transmission to the movie camera by reflection from the front side of a moving scene. The system is arranged so that critical parts of the system are positioned on the foci of a pair of interrelated, mathematically derived ellipses. The camera has the theoretical capability of producing motion picture holograms of projectiles moving at speeds as high as 900,000 cm/sec (about 21,450 mph).

  12. Projective minimal analysis of camera geometry

    E-print Network

    Romano, Raquel Andrea

    2002-01-01

    This thesis addresses the general problem of how to find globally consistent and accurate estimates of multiple-view camera geometry from uncalibrated imagery of an extended scene. After decades of study, the classic problem ...

  13. Selecting the Right Camera for Your Desktop.

    ERIC Educational Resources Information Center

    Rhodes, John

    1997-01-01

    Provides an overview of camera options and selection criteria for desktop videoconferencing. Key factors in image quality are discussed, including lighting, resolution, and signal-to-noise ratio; and steps to improve image quality are suggested. (LRW)

  14. Daytime Aspect Camera for Balloon Altitudes

    NASA Technical Reports Server (NTRS)

    Dietz, Kurt L.; Ramsey, Brian D.; Alexander, Cheryl D.; Apple, Jeff A.; Ghosh, Kajal K.; Swift, Wesley R.

    2002-01-01

    We have designed, built, and flight-tested a new star camera for daytime guiding of pointed balloon-borne experiments at altitudes around 40 km. The camera and lens are commercially available, off-the-shelf components, but require a custom-built baffle to reduce stray light, especially near the sunlit limb of the balloon. This new camera, which operates in the 600- to 1000-nm region of the spectrum, successfully provides daytime aspect information of approx. 10 arcsec resolution for two distinct star fields near the galactic plane. The detected scattered-light backgrounds show good agreement with the Air Force MODTRAN models used to design the camera, but the daytime stellar magnitude limit was lower than expected due to longitudinal chromatic aberration in the lens. Replacing the commercial lens with a custom-built lens should allow the system to track stars in any arbitrary area of the sky during the daytime.

  15. Distributed Calibration of Smart Cameras John Jannotti

    E-print Network

    Jannotti, John

    Distributed Calibration of Smart Cameras John Jannotti Department of Computer Science Brown University jj@cs.brown.edu Jie Mao Department of Computer Science Brown University jmao@cs.brown.edu Abstract

  16. Ground Robot Navigation using Uncalibrated Cameras

    E-print Network

    Teller, Seth

    Precise calibration of camera intrinsic and extrinsic parameters, while often useful, is difficult to obtain during field operation and presents scaling issues for multi-robot systems. We demonstrate a vision-based approach ...

  17. Transport properties of a narrow self-affine fracture

    NASA Astrophysics Data System (ADS)

    Auradou, H.; Drazer, G.; Hulin, J. P.; Koplik, J.

    2003-04-01

    The main objective of our work is to determine the transport properties of an individual fracture. Specifically, we focus on properties such as conductivity, permeability and dispersion that characterize transport processes in a fracture. We are performing numerical and experimental (field and laboratory) investigations of the problem, focusing on the relationship between the geometry of the free space of the fracture and its transport properties. We present our experimental [1] and numerical [2] results on the dispersion of a solute in a narrow fracture, where narrow means that the aperture or gap size of the fracture is smaller than the vertical fluctuations of the walls of the fracture over the system size. In the experiments, the fracture consists of two complementary and transparent casts of a rough granite fracture surface. The topographies of the fracture surfaces are obtained using a profilometer that allows a detailed analysis of the statistical properties of the heights of the surfaces. The surface roughness is found to be "self-affine", meaning that the amplitude of the roughness increases (statistically) with the size of the fracture as a power law with exponent H. The latter, named the Hurst or roughness exponent, is found to be close to 0.8 for all the granite fracture surfaces. The experimental setup allowed us to control both the opening of the fracture and the lateral shift (if any) between the two opposite matching surfaces. A steady radial flow of a transparent fluid is first established in the fracture. The same fluid, but dyed, is then injected into the flow and its spreading is recorded using a video camera. We have shown that the initial flat shape of the invading front is increasingly distorted over a broad range of length scales and contains many details with self-affine properties characterized by an exponent close to that characterizing the heights of the fracture walls.
A numerical study of the dispersion process using the lattice Boltzmann method has also been performed. The lattice Boltzmann method is well suited to simulating fluid flow in irregular geometries, and allowed us to investigate the detailed properties of the flow field. In particular, we were able to simulate the fully 3D velocity field, which is critical for fractures without lateral shift, since in that case the aperture is constant everywhere and the standard lubrication approximation fails to describe the behavior of the flow. We also studied the statistical properties of the details of the fronts, and a relationship between the fracture wall roughness exponent and the geometry of the distorted fronts is derived. Finally, we will also discuss the influence of the direction of the shift between fracture surfaces on the dispersivity of tracer particles and on the permeability of the fracture. Preliminary results indicate that the spreading of a solute is larger when the mean flow is perpendicular to the direction of the shift, which corresponds to the direction of highest permeability. [1] H. AURADOU, J.P. HULIN AND S. ROUX, Experimental study of miscible displacement fronts in rough self-affine fractures. Phys. Rev. E 63, 066306. [2] G. DRAZER, J. KOPLIK, Transport in rough self-affine fracture. Phys. Rev. E 66, 026303.
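The self-affine scaling described in this record can be illustrated with a short sketch (a toy reconstruction, not the authors' analysis code): it synthesizes a 1-D profile with a prescribed Hurst exponent H = 0.8 via spectral synthesis (power spectrum ~ k^-(1+2H)) and then recovers H from the power-law growth of the height-increment spread, sigma(lag) ~ lag^H.

```python
import numpy as np

def synth_profile(n, H, seed=0):
    """Spectral synthesis of a 1-D self-affine profile with Hurst exponent H.
    For a self-affine line, the power spectrum scales as k^-(1 + 2H)."""
    rng = np.random.default_rng(seed)
    k = np.fft.rfftfreq(n)
    amp = np.zeros_like(k)
    amp[1:] = k[1:] ** (-(1 + 2 * H) / 2)      # amplitude ~ sqrt(power)
    phase = rng.uniform(0, 2 * np.pi, len(k))  # random phases
    return np.fft.irfft(amp * np.exp(1j * phase), n)

def hurst_from_increments(h):
    """Estimate H from sigma(lag) ~ lag^H using height increments."""
    lags = np.unique(np.logspace(0, np.log10(len(h) // 8), 12).astype(int))
    sig = [np.std(h[lag:] - h[:-lag]) for lag in lags]
    slope, _ = np.polyfit(np.log(lags), np.log(sig), 1)
    return slope

profile = synth_profile(2**14, H=0.8)
print(round(hurst_from_increments(profile), 2))  # typically close to 0.8
```

The same increment-scaling analysis applied to measured profilometer heights is one standard way to obtain the roughness exponent the abstract quotes as about 0.8.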

  18. Automated Target-Free Network Orientation and Camera Calibration

    NASA Astrophysics Data System (ADS)

    Stamatopoulos, C.; Fraser, C. S.

    2014-05-01

    Automated close-range photogrammetric network orientation and camera calibration has traditionally been associated with the use of coded targets in the object space to allow for an initial relative orientation (RO) and subsequent spatial resection of the images. However, over the last decade, advances coming mainly from the computer vision (CV) community have allowed for fully automated orientation via feature-based matching techniques. There are a number of advantages in such methodologies for various types of applications, as well as for cases where the use of artificial targets might not be possible or preferable, for example when attempting calibration from low-level aerial imagery, as with UAVs, or when calibrating long-focal-length lenses where small image scales call for inconveniently large coded targets. While there are now a number of CV-based algorithms for multi-image orientation within narrow-baseline networks, with accompanying open-source software, from a photogrammetric standpoint the results are typically disappointing, as the metric integrity of the resulting models is generally poor, or even unknown. The objective addressed in this paper is target-free automatic multi-image orientation, maintaining metric integrity, within networks that incorporate wide-baseline imagery. The focus is on both the development of a methodology that overcomes the shortcomings that can be present in current CV algorithms, and on the photogrammetric priorities and requirements that exist in current processing pipelines. This paper also reports on the application of the proposed methodology to automated target-free camera self-calibration and discusses the process via practical examples.

  19. A stereoscopic lens for digital cinema cameras

    NASA Astrophysics Data System (ADS)

    Lipton, Lenny; Rupkalvis, John

    2015-03-01

    Live-action stereoscopic feature films are, for the most part, produced using a costly post-production process to convert planar cinematography into stereo-pair images and are only occasionally shot stereoscopically using bulky dual-cameras that are adaptations of the Ramsdell rig. The stereoscopic lens design described here might very well encourage more live-action image capture because it uses standard digital cinema cameras and workflow to save time and money.

  20. Compact Optical Technique for Streak Camera Calibration

    SciTech Connect

    Curt Allen; Terence Davies; Frans Janson; Ronald Justin; Bruce Marshall; Oliver Sweningsen; Perry Bell; Roger Griffith; Karla Hagans; Richard Lerche

    2004-04-01

    The National Ignition Facility is under construction at the Lawrence Livermore National Laboratory for the U.S. Department of Energy Stockpile Stewardship Program. Optical streak cameras are an integral part of the experimental diagnostics instrumentation. To accurately reduce data from the streak cameras a temporal calibration is required. This article describes a technique for generating trains of precisely timed short-duration optical pulses that are suitable for temporal calibrations.

  1. CMOS Camera Array With Onboard Memory

    NASA Technical Reports Server (NTRS)

    Gat, Nahum

    2009-01-01

    A compact CMOS (complementary metal oxide semiconductor) camera system has been developed with high resolution (1.3 Megapixels), a USB (universal serial bus) 2.0 interface, and an onboard memory. Exposure times, and other operating parameters, are sent from a control PC via the USB port. Data from the camera can be received via the USB port and the interface allows for simple control and data capture through a laptop computer.

  2. Cooperative resonance linewidth narrowing in a planar metamaterial

    E-print Network

    S. D. Jenkins; J. Ruostekoski

    2012-09-17

    We theoretically analyze the experimental observations of a spectral line collapse in a metamaterial array of asymmetric split ring resonators [Fedotov et al., Phys. Rev. Lett. 104, 223901 (2010)]. We show that the ensemble of closely-spaced resonators exhibits cooperative response, explaining the observed system-size dependent narrowing of the transmission resonance linewidth. We further show that this cooperative narrowing depends sensitively on the lattice spacing and that significantly stronger narrowing could be achieved in media with suppressed ohmic losses.

  3. Low light performance of digital cameras

    NASA Astrophysics Data System (ADS)

    Hultgren, Bror; Hertel, Dirk

    2009-01-01

    Photospace data previously measured on large image sets have shown that a high percentage of camera-phone pictures are taken under low-light conditions. Corresponding image quality measurements linked the lowest quality to these conditions, and subjective analysis of image quality failure modes identified image blur as the most important contributor to image quality degradation. Camera phones without flash have to manage a trade-off when adjusting shutter time to low-light conditions. The shutter time has to be long enough to avoid extreme underexposure, but short enough that hand-held picture taking is still possible without excessive motion blur. There is still a lack of quantitative data on motion blur. Camera phones often do not record basic operating parameters such as shutter speed in their image metadata, and when recorded, the data are often inaccurate. We introduce a device and process for tracking camera motion and measuring its Point Spread Function (PSF). Vision-based metrics are introduced to assess the impact of camera motion on image quality so that the low-light performance of different cameras can be compared. Statistical distributions of user variability will be discussed.
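The abstract's step from a tracked camera trajectory to a motion PSF can be sketched in a few lines (a toy illustration, not the authors' device; the hand-shake trajectory below is invented and stands in for real tracker samples): each time sample of the line of sight deposits equal exposure at its nearest pixel, and the normalized dwell histogram is the blur kernel.

```python
import numpy as np

def motion_psf(traj_xy, size=15):
    """Rasterize a camera trajectory (pixel offsets, sampled uniformly in
    time during the exposure) into a blur kernel: each sample deposits one
    unit of exposure at its nearest pixel; normalize at the end."""
    psf = np.zeros((size, size))
    c = size // 2
    for x, y in traj_xy:
        ix, iy = int(np.rint(x)) + c, int(np.rint(y)) + c
        if 0 <= ix < size and 0 <= iy < size:
            psf[iy, ix] += 1.0
    return psf / psf.sum()

# hypothetical hand-shake: slow horizontal drift plus a small vertical wobble
t = np.linspace(0, 1, 200)
traj = np.column_stack([4 * t - 2, 0.5 * np.sin(6 * t)])
psf = motion_psf(traj)
print(psf.shape, round(float(psf.sum()), 6))
```

Convolving a sharp test image with such a kernel is one way to turn tracked motion into the vision-based blur metrics the abstract mentions.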

  4. Low light performance of digital still cameras

    NASA Astrophysics Data System (ADS)

    Wueller, Dietmar

    2013-03-01

    The major difference between a dSLR camera, a consumer camera, and a camera in a mobile device is the sensor size. The sensor size is also related to the overall system size, including the lens. As sensors get smaller, the individual light-sensitive areas also get smaller, leaving less light falling onto each of the pixels. This requires higher signal amplification, which leads to higher noise levels or to other problems that may occur due to denoising algorithms. These problems become more visible in low-light conditions because of the lower signal levels. The fact that camera sensitivity decreases with shrinking sensor size makes customers ask for a standardized way to measure the low-light performance of cameras. The CEA (Consumer Electronics Association) together with ANSI has addressed this for camcorders in the CEA-639 [1] standard. The ISO technical committee 42 (photography) is currently also considering a potential standard on this topic for still-picture cameras. This paper is part of the preparation work for this standardization activity and addresses the differences compared to camcorders, as well as additional problems with noise reduction that have emerged over the past few years. The result of this paper is a proposed test procedure with a few open questions that have to be answered in future work.
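The sensor-size argument can be made quantitative with a shot-noise back-of-the-envelope (illustrative numbers only, not taken from the paper): per-pixel signal scales with pixel area, while noise is the root sum of shot noise and read noise, so small pixels lose SNR fastest exactly in low light.

```python
import math

def snr_db(pixel_pitch_um, photons_per_um2, read_noise_e=3.0, qe=0.6):
    """Shot-noise-limited per-pixel SNR in dB. Signal scales with pixel
    area; noise variance = shot variance (= signal, in electrons) plus
    read-noise variance. All parameter values are illustrative."""
    signal = qe * photons_per_um2 * pixel_pitch_um ** 2   # electrons
    noise = math.sqrt(signal + read_noise_e ** 2)
    return 20 * math.log10(signal / noise)

# same dim scene on a ~1.1 um phone pixel vs a ~6 um dSLR pixel
for pitch in (1.1, 6.0):
    print(pitch, round(snr_db(pitch, photons_per_um2=50), 1))
```

With these assumed numbers the larger pixel gains roughly 15 dB of SNR, which is the effect driving the call for a standardized low-light test.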

  5. Performance comparison of streak camera recording systems

    SciTech Connect

    Derzon, M.; Barber, T.

    1995-07-01

    Streak camera based diagnostics are vital to the inertial confinement fusion program at Sandia National Laboratories. Performance characteristics of various readout systems coupled to an EGG-AVO streak camera were analyzed and compared to scaling estimates. The purpose of the work was to determine the limits of streak camera performance and the optimal fielding conditions for the Amador Valley Operations (AVO) streak camera systems. The authors measured streak camera limitations in spatial resolution and sensitivity. Streak camera limits on spatial resolution are greater than 18 lp/mm at 4% contrast. However, it will be difficult to make use of any resolution greater than this because of high-spatial-frequency variation in the photocathode sensitivity. They have measured a signal to noise of 3,000 with 0.3 mW/cm^2 of 830 nm light at a 10 ns/mm sweep speed. They have compared lens coupling systems with and without micro-channel plate intensifiers, and systems using film or charge coupled device (CCD) readout. There were no conditions where film was found to be an improvement over the CCD readout. For these source sizes, systems using a CCD readout without an intensifier have resolution comparable to those with an intensifier, at a nominal factor-of-3 cost in signal to noise. Estimates of the signal-to-noise for different light coupling methods show how performance can be improved.

  6. Calibration of multi-camera photogrammetric systems

    NASA Astrophysics Data System (ADS)

    Detchev, I.; Mazaheri, M.; Rondeel, S.; Habib, A.

    2014-11-01

    Due to the low cost and off-the-shelf availability of consumer-grade cameras, multi-camera photogrammetric systems have become a popular means for 3D reconstruction. These systems can be used in a variety of applications such as infrastructure monitoring, cultural heritage documentation, biomedicine, mobile mapping, as-built architectural surveys, etc. In order to ensure that the required precision is met, a system calibration must be performed prior to the data collection campaign. This system calibration should be performed as efficiently as possible, because it may need to be repeated many times. Multi-camera system calibration involves the estimation of the interior orientation parameters of each involved camera and the estimation of the relative orientation parameters among the cameras. This paper first reviews a method for multi-camera system calibration with built-in relative orientation constraints. A system stability analysis algorithm is then presented which can be used to assess different system calibration outcomes. The paper explores the required calibration configuration for a specific system in two situations: major calibration (when both the interior orientation parameters and relative orientation parameters are estimated), and minor calibration (when the interior orientation parameters are known a priori and only the relative orientation parameters are estimated). In both situations, system calibration results are compared using the system stability analysis methodology.
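The relative-orientation constraint the abstract describes amounts to this: each secondary camera's pose at any epoch is the reference camera's pose composed with one fixed inter-camera transform. A minimal sketch with hypothetical poses (not the paper's formulation or data):

```python
import numpy as np

def pose(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

def secondary_pose(T_ref_world, T_rel):
    """Relative-orientation constraint: the secondary camera's world pose is
    the reference camera's world pose composed with one fixed rig transform,
    shared by every epoch."""
    return T_ref_world @ T_rel

# toy rig: secondary camera 0.5 m to the right of the reference, same attitude
T_rel = pose(np.eye(3), [0.5, 0.0, 0.0])
T_ref = pose(np.eye(3), [10.0, 2.0, 0.0])   # reference pose at some epoch
print(secondary_pose(T_ref, T_rel)[:3, 3])
```

In a bundle adjustment with this constraint built in, only T_rel (one per secondary camera) is estimated rather than a free pose per camera per epoch, which is what makes the system calibration efficient to repeat.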

  7. Europe's space camera unmasks a cosmic gamma-ray machine

    NASA Astrophysics Data System (ADS)

    1996-11-01

    The new-found neutron star is the visible counterpart of a pulsating radio source, Pulsar 1055-52. It is a mere 20 kilometres wide. Although the neutron star is very hot, at about a million degrees C, very little of its radiant energy takes the form of visible light. It emits mainly gamma-rays, an extremely energetic form of radiation. By examining it at visible wavelengths, astronomers hope to figure out why Pulsar 1055-52 is the most efficient generator of gamma-rays known so far, anywhere in the Universe. The Faint Object Camera found Pulsar 1055-52 in near ultraviolet light at 3400 angstroms, a little shorter in wavelength than the violet light at the extremity of the human visual range. Roberto Mignani, Patrizia Caraveo and Giovanni Bignami of the Istituto di Fisica Cosmica in Milan, Italy, report its optical identification in a forthcoming issue of Astrophysical Journal Letters (1 January 1997). The formal name of the object is PSR 1055-52.
Evading the glare of an adjacent star
The Italian team had tried since 1988 to spot Pulsar 1055-52 with two of the most powerful ground-based optical telescopes in the Southern Hemisphere. These were the 3.6-metre Telescope and the 3.5-metre New Technology Telescope of the European Southern Observatory at La Silla, Chile. Unfortunately an ordinary star 100,000 times brighter lay in almost the same direction in the sky, separated from the neutron star by only a thousandth of a degree. The Earth's atmosphere defocused the star's light sufficiently to mask the glimmer from Pulsar 1055-52. The astronomers therefore needed an instrument in space. The Faint Object Camera offered the best precision and sensitivity to continue the hunt. Devised by European astronomers to complement the American wide field camera in the Hubble Space Telescope, the Faint Object Camera has a relatively narrow field of view. 
It intensifies the image of a faint object by repeatedly accelerating electrons from photo-electric films, so as to produce brighter flashes when the electrons hit a phosphor screen. Since Hubble's launch in 1990, the Faint Object Camera has examined many different kinds of cosmic objects, from the moons of Jupiter to remote galaxies and quasars. When the space telescope's optics were corrected at the end of 1993, the Faint Object Camera immediately celebrated the event with the discovery of primeval helium in intergalactic gas. In their search for Pulsar 1055-52, the astronomers chose a near-ultraviolet filter to sharpen the Faint Object Camera's vision and reduce the adjacent star's huge advantage in intensity. In May 1996, the Hubble Space Telescope operators aimed at the spot which radio astronomers had indicated as the source of the radio pulsations of Pulsar 1055-52. The neutron star appeared precisely in the centre of the field of view, and it was clearly separated from the glare of the adjacent star. At magnitude 24.9, Pulsar 1055-52 was comfortably within the power of the Faint Object Camera, which can see stars 20 times fainter still. "The Faint Object Camera is the instrument of choice for looking for neutron stars," says Giovanni Bignami, speaking on behalf of the Italian team. "Whenever it points to a judiciously selected neutron star it detects the corresponding visible or ultraviolet light. The Faint Object Camera has now identified three neutron stars in that way, including Pulsar 1055-52, and it has examined a few that were first detected by other instruments."
Mysteries of the neutron stars
The importance of the new result can be gauged by the tally of only eight neutron stars seen so far at optical wavelengths, compared with about 760 known from their radio pulsations, and about 21 seen emitting X-rays. 
Since the first pulsar was detected by radio astronomers in Cambridge, England, nearly 30 years ago, theorists have come to recognize neutron stars as fantastic objects. They are veritable cosmic laboratories in which Nature reveals the behaviour of matter under extreme stress, just one step short of a black hole. A neutron star is created by the force

  8. Analysis of volume holographic storage allowing large-angle illumination

    NASA Astrophysics Data System (ADS)

    Shamir, Joseph

    2005-05-01

    Advanced technological developments have stimulated renewed interest in volume holography for applications such as information storage and wavelength multiplexing for communications and laser beam shaping. In these and many other applications, the information-carrying wave fronts usually possess narrow spatial-frequency bands, although they may propagate at large angles with respect to each other or a preferred optical axis. Conventional analytic methods are not capable of properly analyzing the optical architectures involved. For mitigation of the analytic difficulties, a novel approximation is introduced to treat narrow spatial-frequency band wave fronts propagating at large angles. This approximation is incorporated into the analysis of volume holography based on a plane-wave decomposition and Fourier analysis. As a result of the analysis, the recently introduced generalized Bragg selectivity is rederived for this more general case and is shown to provide enhanced performance for the above indicated applications. The power of the new theoretical description is demonstrated with the help of specific examples and computer simulations. The simulations reveal some interesting effects, such as coherent motion blur, that were predicted in an earlier publication.
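For reference, the ordinary plane-wave Bragg matching condition that the generalized selectivity discussed above extends reads, in standard textbook notation (not reproduced from this paper):

```latex
% Plane-wave Bragg matching in a volume grating (standard textbook form):
% a readout wave with wave vector k_i diffracts efficiently into k_d only
% when the grating vector K closes the vector triangle, with both waves
% on the shell of radius 2*pi*n/lambda.
\[
  \mathbf{k}_d = \mathbf{k}_i + \mathbf{K},
  \qquad
  |\mathbf{k}_d| = |\mathbf{k}_i| = \frac{2\pi n}{\lambda}
\]
```

The narrow-band, large-angle wave fronts considered in the abstract are exactly the regime where treating each beam as a single plane wave in this condition breaks down, motivating the authors' approximation.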

  9. Unified framework for recognition, localization and mapping using wearable cameras.

    PubMed

    Vázquez-Martín, Ricardo; Bandera, Antonio

    2012-08-01

    Monocular approaches to simultaneous localization and mapping (SLAM) have recently addressed with success the challenging problem of the fast computation of dense reconstructions from a single, moving camera. Thus, while these approaches initially relied on the detection of a reduced set of interest points to estimate the camera position and the map, they are now able to reconstruct dense maps from a handheld camera while the camera coordinates are simultaneously computed. However, these maps of 3-dimensional points usually remain meaningless, that is, with no memorable items and without providing a way of encoding spatial relationships between objects and paths. In humans and mobile robotics, landmarks play a key role in the internalization of a spatial representation of an environment. They are memorable cues that can serve to define a region of the space or the location of other objects. In a topological representation of the space, landmarks can be identified and located according to their structural, perceptive or semantic significance and distinctiveness. On the other hand, landmarks may be difficult to locate in a metric representation of the space. Restricted to the domain of visual landmarks, this work describes an approach where the map resulting from a point-based, monocular SLAM is annotated with the semantic information provided by a set of distinguished landmarks. Both features are obtained from the image. Hence, they can be linked by associating to each landmark all those point-based features that are superimposed on the landmark in a given image (key-frame). Visual landmarks will be obtained by means of an object-based, bottom-up attention mechanism, which will extract from the image a set of proto-objects. These proto-objects may not always correspond to natural objects, but they will typically constitute significant parts of these scene objects and can be appropriately annotated with semantic information. 
Moreover, they will be affine covariant regions, that is, they will be invariant to affine transformation, being detected under different viewing conditions (view-point angle, rotation, scale, etc.). Monocular SLAM will be solved using the accurate parallel tracking and mapping (PTAM) framework by Klein and Murray in Proceedings of IEEE/ACM international symposium on mixed and augmented reality, 2007. PMID:22806676
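The association step described above, attaching point-based map features to the landmark they are superimposed on in a key-frame, reduces to a point-in-region test. A minimal sketch (the data layout, axis-aligned landmark boxes and pixel coordinates, is a hypothetical simplification of the affine covariant regions used in the paper):

```python
def link_points_to_landmarks(points_px, landmarks):
    """Associate each map point (via its pixel projection in a key-frame)
    with every landmark proto-object whose bounding box contains it.
    landmarks: {name: (x0, y0, x1, y1)} boxes in the same key-frame."""
    links = {name: [] for name in landmarks}
    for i, (u, v) in enumerate(points_px):
        for name, (x0, y0, x1, y1) in landmarks.items():
            if x0 <= u <= x1 and y0 <= v <= y1:
                links[name].append(i)   # point i lies on this landmark
    return links

landmarks = {"door": (100, 50, 180, 300), "poster": (220, 80, 300, 160)}
points = [(120, 200), (250, 100), (400, 400)]   # projected SLAM map points
print(link_points_to_landmarks(points, landmarks))
# → {'door': [0], 'poster': [1]}
```

In the full system the regions are affine covariant rather than axis-aligned boxes, but the linking logic, semantic label attached to whichever 3-D points project inside the region, is the same idea.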

  10. Aerosol retrieval from twilight photographs taken by a digital camera

    NASA Astrophysics Data System (ADS)

    Saito, M.; Iwabuchi, H.

    2014-12-01

    Twilight sky, one of the most beautiful sights seen in our daily life, varies day by day, because atmospheric components such as ozone and aerosols also vary day by day. Recent studies have revealed the effects of tropospheric aerosols on the twilight sky. In this study, we develop a new algorithm for aerosol retrievals from twilight photographs taken by a digital single-lens reflex camera at solar zenith angles of 90° to 96° with an interval of 1°. A radiative transfer model taking the spherical-shell atmosphere, multiple scattering and refraction into account is used as the forward model, and optimal estimation is used as the inversion calculation to infer the aerosol optical and radiative properties. The sensitivity tests show that the tropospheric (stratospheric) aerosol optical thickness is responsible for the distribution of twilight sky color and brightness near the horizon (at viewing angles of 10° to 20°), and that the aerosol size distribution is responsible for the angular distribution of brightness near the solar direction. The AOTs are inferred with small uncertainties and agree very well with those from the Skyradiometer. In this conference, several case studies using the algorithm will be shown.
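The optimal-estimation inversion mentioned above follows the standard Rodgers formulation: the retrieved state is the prior pulled toward the measurement, with each weighted by its inverse covariance. A one-variable toy sketch (the Jacobian, covariances and brightness value below are invented for illustration, not the paper's values):

```python
import numpy as np

def oe_update(x_a, S_a, y, S_e, K, F_xa):
    """One optimal-estimation (Gauss-Newton) update for a linearized forward
    model F: x_hat minimizes the prior- and measurement-weighted misfit."""
    Sa_inv = np.linalg.inv(S_a)               # inverse prior covariance
    Se_inv = np.linalg.inv(S_e)               # inverse measurement covariance
    S_hat = np.linalg.inv(Sa_inv + K.T @ Se_inv @ K)   # posterior covariance
    x_hat = x_a + S_hat @ K.T @ Se_inv @ (y - F_xa)
    return x_hat, S_hat

# toy 1-state retrieval: AOT prior 0.2 +/- 0.1, one sky-brightness measurement
K = np.array([[2.0]])                 # d(brightness)/d(AOT), hypothetical
x_a, S_a = np.array([0.2]), np.array([[0.1 ** 2]])
y, S_e = np.array([0.9]), np.array([[0.05 ** 2]])
F_xa = K @ x_a                        # linear forward model at the prior
x_hat, S_hat = oe_update(x_a, S_a, y, S_e, K, F_xa)
print(round(float(x_hat[0]), 3))      # pulled from 0.2 toward y/K = 0.45
```

In the actual retrieval the state vector holds aerosol optical and size parameters, the measurements are the photographed sky radiances over the 90° to 96° zenith-angle scans, and the Jacobian K comes from the spherical-shell radiative transfer model.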

  11. Tunable pulsed narrow bandwidth light source

    DOEpatents

    Powers, Peter E. (Dayton, OH); Kulp, Thomas J. (Livermore, CA)

    2002-01-01

    A tunable pulsed narrow bandwidth light source and a method of operating a light source are provided. The light source includes a pump laser, first and second non-linear optical crystals, a tunable filter, and light pulse directing optics. The method includes the steps of operating the pump laser to generate a pulsed pump beam characterized by a nanosecond pulse duration and arranging the light pulse directing optics so as to (i) split the pulsed pump beam into primary and secondary pump beams; (ii) direct the primary pump beam through an input face of the first non-linear optical crystal such that a primary output beam exits from an output face of the first non-linear optical crystal; (iii) direct the primary output beam through the tunable filter to generate a sculpted seed beam; and direct the sculpted seed beam and the secondary pump beam through an input face of the second non-linear optical crystal such that a secondary output beam characterized by at least one spectral bandwidth on the order of about 0.1 cm.sup.-1 and below exits from an output face of the second non-linear optical crystal.

  12. Narrow bandpass cryogenic filter for microwave measurements.

    PubMed

    Ivanov, B I; Klimenko, D N; Sultanov, A N; Il'ichev, E; Meyer, H-G

    2013-05-01

    An ultra-wide stopband hairpin bandpass filter with integrated nonuniform transmission lines was designed and fabricated for highly sensitive measurements at cryogenic temperatures down to millikelvin and a frequency range of 10 Hz-10 GHz. The scattering matrices of the filter were characterized at T = 4.2 K. The filter provides a stopband from 10 Hz to 2.2 GHz and from 2.3 GHz to 10 GHz with more than 50 dB and 40 dB of amplitude suppression, respectively. The center frequency of the passband is f0 = 2.25 GHz with a bandwidth Δf = 80 MHz. The maximum insertion loss in the passband is 4 dB. The filter has a 50 Ω input and output impedance, SubMiniature version A connector termination, and a significantly reduced form factor. The wide stopband frequency range and narrow passband, in conjunction with the small dimensions, make the filter suitable for use as part of a highly sensitive readout for superconducting quantum circuits, such as superconducting quantum bits and cryogenic parametric amplifiers. PMID:23742575
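The quoted passband figures imply a loaded quality factor Q = f0/Δf, a quick check of how selective the filter is (arithmetic on the abstract's numbers, nothing more):

```python
f0 = 2.25e9   # passband centre frequency, Hz (from the abstract)
bw = 80e6     # passband bandwidth, Hz (from the abstract)
q = f0 / bw   # loaded quality factor implied by these figures
print(q)      # → 28.125
```

A loaded Q of about 28 is modest for a planar hairpin design, consistent with the abstract's emphasis being on the ultra-wide stopband rather than on an extremely narrow passband.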

  13. Latitude dependence of narrow bipolar pulse emissions

    NASA Astrophysics Data System (ADS)

    Ahmad, M. R.; Esa, M. R. M.; Cooray, V.; Baharudin, Z. A.; Hettiarachchi, P.

    2015-06-01

    In this paper, we present a comparative study on the occurrence of narrow bipolar pulses (NBPs) and other forms of lightning flashes across various geographical areas ranging from northern regions to the tropics. As the latitude decreased from Uppsala, Sweden (59.8°N) to South Malaysia (1.5°N), the percentage of NBP emissions relative to the total number of lightning flashes increased significantly from 0.13% to 12%. Occurrences of positive NBPs were more common than negative NBPs at all observed latitudes. However, as latitudes decreased, the negative NBP emissions increased significantly from 20% (Uppsala, Sweden) to 45% (South Malaysia). Factors involving mixed-phase region elevations and vertical extents of thundercloud tops are invoked to explain the observed results. These factors are fundamentally latitude dependent. Our results suggest that the NBP emission rate is not a useful measure to monitor thunderstorm severity because regular tropical thunderstorms, where relatively high NBP emissions occur, lack suitable conditions to become severe (i.e., there is modest convective available potential energy and a lack of baroclinity in such regions). Observations of significantly high negative NBP occurrences together with very rare occurrences of positive cloud-to-ground flashes and isolated breakdown pulses in tropical thunderstorms are indicative of a stronger negative screening layer magnitude and weaker lower positive charge region magnitude than those in northern regions.

  14. The Meteorological Setting of Narrow Bipolar Events

    NASA Astrophysics Data System (ADS)

    Stanley, M. A.; Suszcynsky, D. M.; Heavner, M. J.

    2003-12-01

    Narrow Bipolar Events (NBEs) are an impulsive form of electrical breakdown in storms which emits strong VHF radiation. It is well known that these events can be readily detected by VHF receivers in orbit and thus may provide a highly practical means to globally monitor storm activity. However, relatively little is known about how NBEs relate to the convective phase of storms and of how good a predictor they are of severe weather events such as large hail, damaging winds, and tornadoes. On June 10, 2002, numerous energetic NBEs were detected over Kansas by the Los Alamos National Laboratory Edot array, which is primarily located in Florida. These NBEs were also detected by a VHF receiver on-board the SVN 54 GPS satellite. The NBEs were associated with severe thunderstorms which produced softball size hail exceeding 11 centimeters in diameter and a weak F0 tornado. In another case study, several F2 tornadic Florida storms were analyzed for March, 2001. Unlike the Kansas storms, the NBEs of the Florida tornadic storms were spread out over a much wider area and exhibited considerable variability in both frequency of occurrence and predominant polarity of vertical charge transfer. To further explore the significance of the NBE rate variability, we will analyze NEXRAD radar volume scans in conjunction with Edot 3-dimensional locations to better understand how NBEs correlate with the thunderstorm life-cycle.

  15. Preliminary analysis on faint luminous lightning events recorded by multiple high speed cameras

    NASA Astrophysics Data System (ADS)

    Alves, J.; Saraiva, A. V.; Pinto, O.; Campos, L. Z.; Antunes, L.; Luz, E. S.; Medeiros, C.; Buzato, T. S.

    2013-12-01

    The objective of this work is the study of some faint luminous events produced by lightning flashes that were recorded simultaneously by multiple high-speed cameras during the previous RAMMER (Automated Multi-camera Network for Monitoring and Study of Lightning) campaigns. The RAMMER network is composed of three fixed cameras and one mobile color camera separated by, on average, distances of 13 kilometers. They were located in the Paraiba Valley (in the cities of São José dos Campos and Caçapava), SP, Brazil, arranged in a quadrilateral shape centered on the São José dos Campos region. This configuration allowed RAMMER to see a thunderstorm from different angles, registering the same lightning flashes simultaneously with multiple cameras. Each RAMMER sensor is composed of a triggering system and a Phantom version 9.1 high-speed camera, which is set to operate at a frame rate of 2,500 frames per second, with a Nikkor lens (model AF-S DX 18-55 mm 1:3.5-5.6 G) in the stationary sensors and a lens of model AF-S ED 24 mm 1:1.4 in the mobile sensor. All videos were GPS (Global Positioning System) time stamped. For this work we used a data set collected on four days of manual RAMMER operation during the 2012 and 2013 campaigns. On Feb. 18th the data set comprises 15 flashes recorded by two cameras and 4 flashes recorded by three cameras. On Feb. 19th a total of 5 flashes was registered by two cameras and 1 flash by three cameras. On Feb. 22nd we obtained 4 flashes registered by two cameras. Finally, on March 6th two cameras recorded 2 flashes. The analysis in this study proposes an evaluation methodology for faint luminous lightning events, such as continuing current. Problems in the temporal measurement of the continuing current can introduce imprecision into the optical analysis, so this work aims to evaluate the effect of distance on this parameter with this preliminary data set. 
In the cases that include the color camera we analyzed the RGB channels (red, green, blue) and compared them with the data provided by the black-and-white cameras for the same event, as well as the influence of these parameters on the luminous intensity of the flashes. In two peculiar cases, the data obtained at one site showed a stroke, some continuing current during the interval between strokes, and then a subsequent stroke; the other site, however, showed that the subsequent stroke was in fact an M-component, since the continuing current had not vanished after its parent stroke. Such events lead to ambiguous classifications when based only on visual analysis of high-speed camera footage, and they are examined in this work.
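The comparison between the color camera's RGB channels and the black-and-white cameras' intensities can be illustrated with a standard luma conversion. The BT.601 weights below are a common convention, not the one stated in the abstract; this is a sketch only:

```python
def rgb_to_luma(r, g, b):
    """Approximate perceived luminance from RGB (ITU-R BT.601 weights)."""
    return 0.299 * r + 0.587 * g + 0.114 * b

def is_saturated(r, g, b, full_scale=255):
    """A saturated channel hints the flash exceeded the sensor's range."""
    return max(r, g, b) >= full_scale

print(round(rgb_to_luma(255, 255, 255)))  # -> 255
```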

  16. Location accuracy evaluation of lightning location systems using natural lightning flashes recorded by a network of high-speed cameras

    NASA Astrophysics Data System (ADS)

    Alves, J.; Saraiva, A. C. V.; Campos, L. Z. D. S.; Pinto, O., Jr.; Antunes, L.

    2014-12-01

This work presents a method for evaluating the location accuracy of all Lightning Location Systems (LLS) in operation in southeastern Brazil, using natural cloud-to-ground (CG) lightning flashes. This is done with a network of multiple high-speed cameras (the RAMMER network) installed in the Paraiba Valley region, SP, Brazil. The RAMMER network (Automated Multi-camera Network for Monitoring and Study of Lightning) is composed of four high-speed cameras operating at 2,500 frames per second. Three stationary black-and-white (B&W) cameras were situated in the cities of São José dos Campos and Caçapava. A fourth, color camera was mobile (installed in a car) but operated from a fixed location during the observation period within the city of São José dos Campos. The average distance between cameras was 13 kilometers. Each RAMMER sensor position was chosen so that the network could observe the same lightning flash from different angles, and all recorded videos were GPS (Global Positioning System) time stamped, allowing comparison of events between cameras and the LLS. Each RAMMER sensor is basically composed of a computer, a Phantom v9.1 high-speed camera, and a GPS unit. The lightning cases analyzed in the present work were observed by at least two cameras; their positions were visually triangulated and the results compared with the BrasilDAT network during the summer seasons of 2011/2012 and 2012/2013. The visual triangulation method is presented in detail. The calibration procedure showed an accuracy of 9 meters between the precise GPS position of a triangulated object and the result of the visual triangulation method. Lightning return stroke positions estimated with the visual triangulation method were compared with LLS locations; differences between solutions were no greater than 1.8 km.
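The visual triangulation step can be sketched as a two-ray intersection in a local planar frame. The geometry below (east/north coordinates, azimuth bearings measured clockwise from north) is an illustrative simplification; the paper's actual procedure and coordinate handling are not reproduced here:

```python
import math

def triangulate(p1, az1_deg, p2, az2_deg):
    """Intersect two azimuth bearings from camera positions p1, p2,
    given in local x=east, y=north metres.  Returns the (x, y)
    intersection, or None for parallel bearings."""
    # Direction vectors: azimuth 0 = +y (north), 90 = +x (east)
    d1 = (math.sin(math.radians(az1_deg)), math.cos(math.radians(az1_deg)))
    d2 = (math.sin(math.radians(az2_deg)), math.cos(math.radians(az2_deg)))
    denom = d1[0] * d2[1] - d1[1] * d2[0]   # 2D cross product d1 x d2
    if abs(denom) < 1e-12:
        return None  # bearings are parallel: no unique intersection
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    t = (dx * d2[1] - dy * d2[0]) / denom   # distance along ray 1
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])

# Cameras 13 km apart; flash due north of camera 1, north-west of camera 2.
print(triangulate((0, 0), 0.0, (13000, 0), 315.0))  # near (0, 13000)
```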

  17. Characteristics of slug flow in narrow rectangular channels under vertical condition

    NASA Astrophysics Data System (ADS)

    Wang, Yang; Yan, Changqi; Sun, Licheng; Xing, Dianchuan; Yan, Chaoxing; Tian, Daogui

    2013-07-01

Gas-liquid slug flow is widely encountered in many industrial applications, and a detailed understanding of the hydrodynamics of the gas slug is important for modeling slug flow. Non-intrusive flow visualization with a high-speed video camera system was applied to study the characteristics of slug flow in a vertical narrow rectangular channel (3.25×40 mm²). Ideal Taylor bubbles are rarely observed; most gas slugs are deformed, and much more severely at high liquid superficial velocity. The liquid film thicknesses on the left and right narrow sides surrounding the gas slug are divergent and wavy, but this has only a weak effect on liquid film velocity. The gas and liquid velocities, as well as the gas slug length, have a significant effect on the separating liquid film thickness. The separating liquid film velocity decreases with increasing gas superficial velocity at low liquid velocity, and increases with increasing liquid superficial velocity. At high liquid velocity (jL ≥ 1.204 m/s) the film stops descending and gas superficial velocity has no significant effect on the separating film velocity, which is mainly determined by the liquid flow rate. The shape of the slug nose has a significant effect on its velocity, while the effect of its length is very weak. The Ishii and Jones-Zuber drift-flux correlation predicts slug velocity well, except at low liquid superficial velocity, where the calculated drift velocity is less than the experimental values.
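The drift-flux form used for the slug velocity comparison can be sketched as u = C0·j + v_d. The coefficients below (Ishii's distribution parameter for rectangular ducts and the Jones-Zuber drift velocity) are illustrative choices, not necessarily the exact correlation used in the paper:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def slug_velocity(j_g, j_l, width, rho_l=998.0, rho_g=1.2):
    """Drift-flux estimate of slug (Taylor bubble) velocity, u = C0*j + v_d.
    C0 follows Ishii's form for rectangular ducts and v_d the Jones-Zuber
    drift velocity; both are illustrative assumptions."""
    j = j_g + j_l                                       # total superficial velocity
    c0 = 1.35 - 0.35 * math.sqrt(rho_g / rho_l)         # distribution parameter
    v_d = 0.23 * math.sqrt(G * width * (rho_l - rho_g) / rho_l)  # drift velocity
    return c0 * j + v_d

# 40 mm wide channel, j_g = 0.5 m/s, j_l = 1.0 m/s
print(round(slug_velocity(0.5, 1.0, 0.040), 3))
```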

  18. The Narrow Line Region of the Seyfert 2 galaxy Mrk 78. An infrared view

    E-print Network

    C. Ramos Almeida; A. M. Pérez García; J. A. Acosta-Pulido; J. M. Rodríguez Espinosa; R. Barrena; A. Manchado

    2006-03-16

We report near-infrared spectroscopic data for the Seyfert 2 galaxy Mrk 78, taken with the LIRIS near-infrared camera/spectrometer at the William Herschel Telescope (WHT). The long-slit spectra clearly show extended emission. The resolution and depth of the near-infrared spectra allow the examination of its morphology and ionization regions, and a direct comparison with similarly deep visible spectra. The emission-line ratios obtained are used to derive the extinction towards the nucleus. The detection of strong features such as [Fe II], H2, hydrogen recombination lines, and the coronal [Si VI] 1.962 μm line is used to study the kinematics and excitation mechanisms at work in Mrk 78, revealing that, despite the strong radio-jet interaction present in this object, photoionization from the active nucleus dominates the narrow-line region emission, while UV fluorescence is the source of the H2 emission. Lines with extended emission yield velocity distributions with an amplitude of about 600 km/s, the consequence of an eastern lobe moving away from us and a western lobe with the opposite contribution. We used the photoionization code CLOUDY to model a typical narrow-line region, to derive the ionization parameter, and to compare our spectral data with diagnostic diagrams.

  19. Observations of positive narrow bipolar pulses

    NASA Astrophysics Data System (ADS)

    Karunarathne, Sumedhe; Marshall, Thomas C.; Stolzenburg, Maribeth; Karunarathna, Nadeeka

    2015-07-01

Waveforms of 226 positive narrow bipolar pulses (NBPs) were obtained with five to eight stations of E-change meters covering an area of 70 × 100 km². The NBPs had typical average parameters: 10-90% rise time of 2.6 μs, full width at half maximum of 2.8 μs, zero-cross time of 9.9 μs, and range-normalized amplitude at 100 km of 11.0 V/m. Four main types of positive NBP waveforms were identified: Type A had a simple bipolar waveform with a positive peak and a negative overshoot peak (1% of NBPs), Type B had extra peak(s) superimposed on the overshoot peak (67%), Type C had extra peak(s) on or just after the main positive peak (13%), and Type D had extra peak(s) before the main positive peak (19%). Regardless of type, each NBP waveform maintained its basic shape across a range of 10 to 130 km from its origin. NBP locations, obtained with a time-of-arrival technique, seemed unrestricted in their horizontal distribution (except for Type C), while NBP altitudes ranged from 7 to 19 km with an average of 13 km. Estimated peak currents were 2-126 kA with an average of 30 kA. Isolation of NBPs from other lightning events was determined using both temporal (>660 ms) and spatial (>10 km) criteria; 37% of NBPs were isolated, 38% occurred within 660 ms before a flash, 19% occurred within flashes, and 11% occurred within 660 ms after a flash. The total RMS power radiated by NBPs within the 1 kHz-2.5 MHz bandwidth ranged from 5.0 × 10⁶ to 6.1 × 10⁸ W with an average of 7.8 × 10⁷ W.
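Range normalization of the E-change amplitudes, assuming the radiation-field term dominates so that the amplitude falls off as 1/r, can be sketched as:

```python
def normalize_to_100km(e_measured, range_km):
    """Range-normalize a measured E-change amplitude to 100 km,
    assuming a 1/r (radiation field) fall-off."""
    return e_measured * (range_km / 100.0)

# A 22 V/m pulse measured at 50 km normalizes to 11 V/m at 100 km.
print(normalize_to_100km(22.0, 50.0))  # -> 11.0
```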

  20. NIR Narrow- and Broad-Band Study of the SSA 22 Field

    NASA Astrophysics Data System (ADS)

    Tamura, Naoyuki; Ohta, Kouji; Maihara, Toshinori; Iwamuro, Fumihide; Motohara, Kentaro; Takata, Tadafumi; Iye, Masanori

    2001-08-01

Deep narrow- and broad-band near-infrared imaging observations of the central 2'×2' region of the SSA 22 field were made with the near-infrared camera (CISCO) attached to the Subaru Telescope. Using a narrow-band filter centered at 2.033 μm, [O III] λ5007 emitters at z ≈ 3.06 ± 0.02 were searched for, to examine star-forming activity in an over-density region where a clustering of Lyman Break Galaxies (LBGs) and Lyman α emitter candidates around z = 3.09 was reported, though the targeted redshift is slightly different from that of the peak of the over-density region. Although one emitter candidate at z = 3.06 was detected, it is likely to be located at a redshift between 1 and 2, judging from multi-band photometry. Another emission-line object was detected in a second narrow-band ("off-band") filter centered at 2.120 μm, identified with a galaxy at z = 0.132 (the emission line being Paschen α). The K'-band imaging data revealed the presence of 12 Extremely Red Objects (EROs) with I814−K' ≥ 4. The distribution of the EROs does not seem to coincide with that of the Lyman Break Galaxies or Lyman α emitters at z ~ 3. The magnitudes and colors of the EROs are not consistent with those of passively evolving massive elliptical galaxies at z ~ 3. Candidates for counterparts of the submm sources detected with SCUBA are found; no EROs are found around the submm sources down to our magnitude limit.
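The narrow-band selection translates directly into a target redshift: the filter's central wavelength divided by the line's rest wavelength, minus one. A minimal check that [O III] λ5007 observed through a 2.033 μm filter corresponds to z ≈ 3.06:

```python
def emitter_redshift(lambda_obs_um, lambda_rest_angstrom):
    """Redshift at which a rest-frame line lands in a narrow-band filter
    (observed wavelength in microns, rest wavelength in Angstroms)."""
    return lambda_obs_um * 1e4 / lambda_rest_angstrom - 1.0

# [O III] 5007 A through the 2.033 um narrow-band filter:
print(round(emitter_redshift(2.033, 5007.0), 2))  # -> 3.06
```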

  1. Atomic-resolution defect contrast in low angle annular dark-field STEM

    SciTech Connect

    Phillips, Patrick J.; De Graef, M.; Kovarik, Libor; Agrawal, A.; Windl, W.; Mills, M. J.

    2012-05-01

While traditional high-resolution STEM is performed by exclusively collecting electrons that have been scattered to high angles (i.e., HAADF), the present contribution focuses on small-angle scattered electrons, as in low-angle annular dark-field (LAADF) STEM. This imaging mode allows one to image defect contrast while maintaining directly interpretable atomic resolution. By simply adjusting the microscope camera length, and thus the acceptance angle of the annular detector, it is possible to transition between Z-contrast and defect contrast. Both LAADF and HAADF experimental and computational results are discussed with regard to zone-axis imaging of a γ/γ′ Ni-superalloy; various length scales are explored. Electron de-channeling is observed when the probe is placed over defected regions of the crystal.

  2. A modified captive bubble method for determining advancing and receding contact angles

    NASA Astrophysics Data System (ADS)

    Xue, Jian; Shi, Pan; Zhu, Lin; Ding, Jianfu; Chen, Qingmin; Wang, Qingjun

    2014-03-01

In this work, a modification of the captive bubble method is proposed for measuring advancing and receding contact angles. The modification adds a pressure chamber with a pressure control system to the original experimental setup, which consists of an optical angle meter with a high-speed CCD camera, a temperature control system, and a computer. A series of samples with highly hydrophilic, hydrophilic, hydrophobic, and superhydrophobic surfaces was prepared. The advancing and receding contact angles of the highly hydrophilic, hydrophilic, and hydrophobic samples measured with the new method were comparable to the results of the traditional sessile drop method. This shows that the method overcomes the limitations of the traditional captive bubble method and yields a smaller measurement error. However, owing to the nature of the captive bubble technique, the method is only suitable for surfaces with advancing or receding contact angles below 130°.

  3. Foveated fisheye lens design using an angle-variant distortion projection function

    NASA Astrophysics Data System (ADS)

    Samy, Ahmed Mahmoud; Gao, Zhishan

    2015-11-01

The light projection function is a major area of interest in the design of ultra-wide-angle cameras. In this paper, we introduce a novel ultra-wide-angle projection function characterized by an angle-variant distortion model similar to the space-variant resolution of the human retina. Its projection peculiarities are compared with the classical equidistant fisheye projection function to illustrate the benefits of our model for real-time tracking and ultra-wide-angle imaging applications. The new projection model produced accurate results with uncomplicated distortion control using a Zemax user-defined macro, and the inverted model was successfully used to correct distorted images. The paper also presents the design of an original 170° fast foveated fisheye lens that provides more than 52% undistorted image area with high-resolution performance over the entire field of view.
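The contrast with the classical equidistant projection can be sketched as follows. The equidistant mapping r = f·θ is standard; the "foveated" variant below is a hypothetical angle-variant form chosen only to illustrate retina-like compression at large field angles, not the paper's actual model:

```python
import math

def equidistant(theta, f):
    """Classical equidistant fisheye: image radius r = f * theta."""
    return f * theta

def foveated(theta, f, k=0.5):
    """Illustrative angle-variant projection: near-equidistant on axis,
    progressively compressed at large field angles (retina-like).
    The functional form and coefficient k are assumptions."""
    return f * theta / (1.0 + k * theta * theta)

f = 2.0  # focal length, mm
for deg in (0, 30, 60, 85):
    t = math.radians(deg)
    print(deg, round(equidistant(t, f), 3), round(foveated(t, f), 3))
```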

  4. Diffusion-Induced Ramsey Narrowing Yanhong Xiao,1

    E-print Network

    Walsworth, Ronald L.

Diffusion-Induced Ramsey Narrowing. Yanhong Xiao, Irina Novikova, David F. Phillips, and Ronald L. Walsworth (February 2006). Diffusion-induced Ramsey narrowing is characterized and identified as a general phenomenon in which diffusion of coherence in and out of an interaction region, such as a laser beam, induces spectral narrowing.

  5. 1. Photocopied July 1971 from Photo 745, Jordan Narrows Folder ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    1. Photocopied July 1971 from Photo 745, Jordan Narrows Folder #1, Engineering Department, Utah Power and Light Co., Salt Lake City, Utah. JORDAN STATION, JULY 2, 1909. GENERAL VIEW. - Salt Lake City Water & Electrical Power Company, Jordan Narrows Hydroelectric Plant, Jordan River, Riverton, Salt Lake County, UT

  6. 3. Photocopied July 1971 from Photo 741, Jordan Narrows Folder ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. Photocopied July 1971 from Photo 741, Jordan Narrows Folder #1, Engineering Department, Utah Power and Light Co., Salt Lake City, Utah. INTERIOR VIEW, JULY 2, 1909. - Salt Lake City Water & Electrical Power Company, Jordan Narrows Hydroelectric Plant, Jordan River, Riverton, Salt Lake County, UT

  7. Biomass production of sugarcane on narrow-rows in Florida

    SciTech Connect

    Cayton, J.E.; Eiland, B.R.

    1981-01-01

    Sugarcane production for biomass was examined on three narrow-row patterns in Florida. Equipment and production methods were modified for planting, spraying and harvesting the narrow-row patterns. No large increases in yields of vigorous varieties were found when compared to those from conventional rows. Some increases were observed in varieties which have low stalk populations. 4 refs.

  8. Bayesian Face Recognition and Perceptual Narrowing in Face-Space

    ERIC Educational Resources Information Center

    Balas, Benjamin

    2012-01-01

    During the first year of life, infants' face recognition abilities are subject to "perceptual narrowing", the end result of which is that observers lose the ability to distinguish previously discriminable faces (e.g. other-race faces) from one another. Perceptual narrowing has been reported for faces of different species and different races, in…

  9. A new mathematical explanation of the Tacoma Narrows Bridge collapse

    E-print Network

A new mathematical explanation of the Tacoma Narrows Bridge collapse. Gianni Arioli, Filippo [...]. The spectacular collapse of the Tacoma Narrows Bridge, which occurred in 1940, has attracted the attention [...]. A new mathematical model for the study of the dynamical behavior of suspension bridges is presented which provides [...]

  10. Mobile phone camera benchmarking: combination of camera speed and image quality

    NASA Astrophysics Data System (ADS)

    Peltoketo, Veli-Tapani

    2014-01-01

When a mobile phone camera is tested and benchmarked, the significance of quality metrics is widely acknowledged, and methods also exist to evaluate camera speed; for example, ISO 15781 defines several measurements of camera system delays. However, speed or rapidity metrics of the mobile phone camera system have not been combined with the quality metrics, even though camera speed has become an increasingly important performance feature. This work comprises several tasks. First, the most important image quality metrics are collected from standards and papers. Second, the speed-related metrics of a mobile phone camera system are collected from standards and papers, and novel speed metrics are identified. Third, combinations of the quality and speed metrics are validated using mobile phones on the market; the measurements are made against the application programming interfaces of different operating systems. Finally, the results are evaluated and conclusions drawn. The result of this work gives detailed benchmarking results for mobile phone camera systems on the market. The paper also proposes a combined benchmarking metric that includes both quality and speed parameters.
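A combined quality-and-speed benchmark can be sketched as a normalized weighted mean. The ranges and weights below are arbitrary placeholders, not the metric the paper proposes:

```python
def combined_score(quality_metrics, speed_metrics, w_quality=0.5):
    """Illustrative combined camera benchmark: normalize each metric to
    [0, 1] against a reference range, then take a weighted mean of the
    quality and speed averages.  Weights and ranges are assumptions."""
    def normalize(metrics):
        # metrics: list of (value, worst, best); handles inverted ranges
        # such as delays, where a smaller value is better (best < worst).
        scores = [(v - worst) / (best - worst) for v, worst, best in metrics]
        return sum(scores) / len(scores)
    q = normalize(quality_metrics)
    s = normalize(speed_metrics)
    return w_quality * q + (1.0 - w_quality) * s

# Quality: SNR 38 dB in [20, 45]; speed: shot-to-shot delay 0.8 s in [3, 0.2]
print(round(combined_score([(38, 20, 45)], [(0.8, 3.0, 0.2)]), 3))
```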

  11. Improving Situational Awareness in Camera Surveillance by Combining Top View Maps with Camera Footages

    E-print Network

    Theune, Mariët

Improving Situational Awareness in Camera Surveillance by Combining Top View Maps with Camera Footages. Keywords: situational awareness, top view map, field of view. 1. INTRODUCTION. 1.1 Available Technology. [...] of the area being monitored. This in turn is expected to enhance the situational awareness of the operator, which [...]

  12. Robust Camera Calibration Tool for Video Surveillance Camera in Urban Environment

    E-print Network

    Southern California, University of

[...] and Ram Nevatia, Institute for Robotics and Intelligent Systems, University of Southern California, Los Angeles. Surveillance systems such as smart rooms and security systems are prevailing nowadays. Camera calibration information (e.g. camera position, orientation, and focal length) is very useful for various surveillance systems because it can [...]

  13. Characterization of SWIR cameras by MRC measurements

    NASA Astrophysics Data System (ADS)

    Gerken, M.; Schlemmer, H.; Haan, Hubertus A.; Siemens, Christofer; Münzberg, M.

    2014-05-01

Cameras for the SWIR wavelength range are becoming more and more important because of their better observation range in daylight operation under adverse weather conditions (haze, fog, rain). In order to choose the most suitable SWIR camera, or to qualify a camera for a given application, characterization by means of the Minimum Resolvable Contrast (MRC) concept is favorable, as the MRC comprises all relevant properties of the instrument. With the MRC known for a given camera, the achievable observation range can be calculated for every combination of target size, illumination level, and weather condition. MRC measurements in the SWIR band can largely follow the guidelines for MRC measurements of a visual camera. Typically, measurements are performed with a set of resolution targets (e.g. the USAF 1951 target) manufactured with different contrast values from 50% down to less than 1%. For a given illumination level, the achievable spatial resolution is measured for each target; the resulting curve shows the minimum contrast necessary to resolve the structure of a target as a function of spatial frequency. To perform MRC measurements for SWIR cameras, first the irradiation parameters have to be given in radiometric rather than photometric units, which are limited to the visible range; to do so, SWIR illumination levels for typical daylight and twilight conditions have to be defined. Second, a radiation source with appropriate emission in the SWIR range (e.g. an incandescent lamp) is required, and the irradiance has to be measured in W/m² instead of lux (lumen/m²). Third, the contrast values of the targets have to be recalibrated for the SWIR range because they typically differ from the values determined for the visual range.
Measured MRC values of three cameras are compared with the specified performance data of the devices, and the results of a multi-band in-house designed Vis-SWIR camera system are discussed.
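The target contrast referred to above is the usual Michelson definition, which applies unchanged whether the bright and dark levels are given photometrically (lux) or radiometrically (W/m²) for the SWIR recalibration:

```python
def michelson_contrast(l_max, l_min):
    """Contrast of a bar target from its bright and dark levels.
    The same definition applies to radiometric SWIR measurements,
    where irradiance is given in W/m^2 rather than lux."""
    return (l_max - l_min) / (l_max + l_min)

# A target printed for 50% visual contrast may measure differently in SWIR:
print(michelson_contrast(3.0, 1.0))  # -> 0.5
```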

  14. PANIC: the new panoramic NIR camera for Calar Alto

    NASA Astrophysics Data System (ADS)

    Baumeister, Harald; Alter, Matthias; Cárdenas Vázquez, M. Concepción; Fernandez, Matilde; Fried, Josef; Helmling, Jens; Huber, Armin; Ibáñez Mengual, Jose-Miguel; Rodríguez Gómez, Julio F.; Laun, Werner; Lenzen, Rainer; Mall, Ulrich; Naranjo, Vianak; Ramos, Jose-Ricardo; Rohloff, Ralf-Rainer; García Segura, Antonio; Storz, Clemens; Ubierna, Marcos; Wagner, Karl

    2008-07-01

PANIC is a wide-field NIR camera currently under development for the Calar Alto observatory (CAHA) in Spain. It uses a mosaic of four Hawaii-2RG detectors and covers the spectral range from 0.8 to 2.5 μm (z to K band). The field of view is 30×30 arcmin. The instrument can be used at the 2.2 m telescope (0.45 arcsec/pixel, 0.5×0.5 degree FOV) and at the 3.5 m telescope (0.23 arcsec/pixel, 0.25×0.25 degree FOV). The operating temperature is about 77 K, achieved by liquid nitrogen cooling. The cryogenic optics comprises three flat folding mirrors with diameters up to 282 mm and nine lenses with diameters between 130 mm and 255 mm. A compact filter unit can carry up to 19 filters distributed over four filter wheels, and narrow-band (1%) filters can be used. The instrument has a diameter of 1.1 m and is about 1 m long. The weight limit of 400 kg at the 2.2 m telescope requires a lightweight cryostat design: the aluminium vacuum vessel and radiation shield have wall thicknesses of only 6 mm and 3 mm, respectively.

  15. Localization using RGB-D cameras orthoimages

    NASA Astrophysics Data System (ADS)

    Mittet, M.-A.; Landes, T.; Grussenmeyer, P.

    2014-06-01

3D cameras are a new generation of sensors increasingly used in geomatics. Their main advantages are their handiness, their price, and their ability to produce range images or point clouds in real time. They are used in many areas, and the use of this kind of sensor has grown especially since the Kinect (Microsoft) arrived on the market. This paper presents a new localization system based exclusively on the combination of several 3D cameras on a mobile platform. It is planned that the platform will move on sidewalks, acquire the environment, and enable the determination of the most appropriate routes for disabled persons. The paper presents the key features of our approach as well as promising solutions to the challenging task of localization based on 3D cameras. We give examples of mobile trajectories estimated exclusively from 3D camera acquisitions, and we evaluate the accuracy of the calculated trajectory against a reference trajectory obtained with a total station.
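The accuracy evaluation against the total-station reference reduces to a pointwise trajectory comparison. Below is a minimal planimetric RMSE between time-synchronized trajectories (an assumed error measure; the paper may report a different statistic):

```python
import math

def trajectory_rmse(estimated, reference):
    """Planimetric RMSE between an estimated trajectory and a
    time-synchronized reference (e.g. from a total station).
    Both are lists of (x, y) points at matching epochs."""
    assert len(estimated) == len(reference)
    sq = [(xe - xr) ** 2 + (ye - yr) ** 2
          for (xe, ye), (xr, yr) in zip(estimated, reference)]
    return math.sqrt(sum(sq) / len(sq))

est = [(0.0, 0.1), (1.0, 1.1), (2.0, 2.1)]   # constant 0.1 m cross-track error
ref = [(0.0, 0.0), (1.0, 1.0), (2.0, 2.0)]
print(round(trajectory_rmse(est, ref), 3))  # -> 0.1
```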

  16. Camera Calibration with Radial Variance Component Estimation

    NASA Astrophysics Data System (ADS)

    Mélykuti, B.; Kruck, E. J.

    2014-11-01

Camera calibration plays an increasingly important role today. Beside true digital aerial survey cameras, the photogrammetric market is dominated by a large number of non-metric digital cameras mounted on UAVs or other low-weight flying platforms. In-flight calibration of these systems plays a significant role in considerably enhancing the geometric accuracy of survey photos. Photo measurements are expected to be more precise at the center of images than along the edges or in the corners. Using statistical methods, the accuracy of photo measurements was analyzed as a function of the distance of points from the image center, yielding a curve of measurement precision as a function of photo radius. A large number of camera types was tested with well-distributed point measurements in image space. The tests lead to the general conclusion of a functional connection between accuracy and radial distance, and provide a method to check and enhance the geometric capability of cameras with respect to these results.
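The precision-versus-radius curve can be approximated empirically by binning measurement residuals by their distance from the image center. This is a sketch of the statistical idea only, not the variance component estimation actually used:

```python
import math

def radial_precision(radii, residuals, nbins=5):
    """Empirical precision vs. radial distance: bin measurement residuals
    by distance from the image centre and report the sample std-dev per
    bin.  Returns a list of (bin centre radius, sigma) pairs."""
    r_max = max(radii)
    bins = [[] for _ in range(nbins)]
    for r, e in zip(radii, residuals):
        i = min(int(nbins * r / r_max), nbins - 1)   # clamp r == r_max
        bins[i].append(e)
    curve = []
    for i, b in enumerate(bins):
        if len(b) > 1:
            m = sum(b) / len(b)
            s = math.sqrt(sum((x - m) ** 2 for x in b) / (len(b) - 1))
            curve.append(((i + 0.5) * r_max / nbins, s))
    return curve

# Synthetic residuals whose spread grows with radius:
radii = [i / 999 for i in range(1000)]
resid = [(1 + r) * (1 if i % 2 == 0 else -1) for i, r in enumerate(radii)]
for r_c, sigma in radial_precision(radii, resid):
    print(round(r_c, 2), round(sigma, 2))
```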

  17. Single tube gamma camera for scintimammography.

    PubMed

    Pani, R; Scopinaro, F; Pellegrini, R; Soluri, A; Pergola, A; De Vincentis, G; Ierardi, M; Weinberg, I N

    1997-01-01

The development of large-area Position Sensitive Photomultiplier Tubes (PSPMT) by Hamamatsu is opening new imaging possibilities in nuclear medicine. In particular, the realization of the 8" PSPMT prototype represents the first important technological advance since the invention of the Anger camera. A PSPMT virtually integrates hundreds of PMTs into one, allowing the creation of dedicated detectors. A single tube gamma camera based on a 5" PSPMT dedicated to scintimammography is presented and discussed in this work. To optimize the gamma camera response, two different scintillating arrays were tested: YAP:Ce and CsI(Tl). Their overall size covers the whole active area of the photocathode, with a crystal pixel size of 2 mm x 2 mm. The detection efficiency was comparable to that of an Anger camera. The best result was obtained with the CsI(Tl) scintillator: an intrinsic spatial resolution of 1.6 mm FWHM and a relative energy resolution of 17% FWHM. A single tube gamma camera operating in the same radiological projection as X-ray mammography offers new imaging possibilities in scintimammography. PMID:9179212

  18. Design of Endoscopic Capsule With Multiple Cameras.

    PubMed

    Gu, Yingke; Xie, Xiang; Li, Guolin; Sun, Tianjia; Wang, Dan; Yin, Zheng; Zhang, Pengfei; Wang, Zhihua

    2014-11-01

In order to reduce the miss rate of wireless capsule endoscopy, this paper proposes a new endoscopic capsule system with multiple cameras. A master-slave architecture, including an efficient bus architecture and a four-level clock management architecture, is applied in the Multiple Cameras Endoscopic Capsule (MCEC). To cover more area of the gastrointestinal tract wall at low power, multiple cameras with a smart image capture strategy, including movement-sensitive control and camera selection, are used in the MCEC. To reduce the data transfer bandwidth and power consumption, and so prolong the MCEC's working life, a low-complexity image compressor with PSNR 40.7 dB and compression rate 86% is implemented. A chipset is designed and implemented for the MCEC, and a six-camera endoscopic capsule prototype is implemented using the chipset. With the smart image capture strategy, the coverage rate of the MCEC prototype reaches 98% and its power consumption is only about 7.1 mW. PMID:25376042
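The two compressor figures quoted (PSNR and compression rate) have standard definitions that can be computed directly; the frames and byte counts below are made-up examples:

```python
import math

def psnr(original, reconstructed, peak=255.0):
    """Peak signal-to-noise ratio between an original frame and its
    compressed/decompressed version (flat lists of pixel values)."""
    mse = sum((a - b) ** 2 for a, b in zip(original, reconstructed)) / len(original)
    if mse == 0:
        return float('inf')
    return 10.0 * math.log10(peak * peak / mse)

def compression_rate(raw_bytes, compressed_bytes):
    """Fraction of data removed by the compressor."""
    return 1.0 - compressed_bytes / raw_bytes

# A uniform error of 2 grey levels gives MSE = 4:
print(round(psnr([100] * 64, [102] * 64), 1))
# 100 bytes compressed to 14 bytes is an 86% compression rate:
print(round(compression_rate(100, 14), 2))
```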

  19. Experimental Study of Flooding in Vertical Narrow Rectangular channels

    NASA Astrophysics Data System (ADS)

    Li, X. C.; Sun, Z. N.

    2010-03-01

In this paper, counter-current gas-liquid two-phase flow and the onset of flooding in vertical narrow rectangular channels were studied. The onset of flooding was identified by the condition at which liquid on the channel wall begins to move partially above the liquid injection section. In order to study the flow pattern during counter-current flow and determine the conditions associated with the onset of flooding, the flow pattern and pressure drop were investigated in visual experiments. In addition, the flooding phenomena in vertical narrow rectangular channels were compared with those in conventional channels. The results show that the flow characteristics and pressure-drop trends in vertical narrow rectangular channels are similar to those in conventional channels. However, the maximum pressure drop in vertical narrow rectangular channels appeared at complete carry-up after flooding, whereas in conventional channels it appeared at the onset of flooding.

  20. Acceptance/Operational Test Report for Tank 241-AN-104 camera and camera purge control system

    SciTech Connect

    Castleberry, J.L.

    1995-11-01

This Acceptance/Operational Test Procedure (ATP/OTP) documents the satisfactory operation of the camera purge panel, purge control panel, color camera system, and associated control components destined for installation; final acceptance of the complete system will be performed in the field. The purge panel and purge control panel are tested for the safety interlock, which shuts down the camera and pan-and-tilt unit inside the tank vapor space on loss of purge pressure, and for performing the correct purge volume exchanges required by NFPA 496. The procedure is separated into seven sections. This Acceptance/Operational Test Report documents the successful acceptance and operability testing of the 241-AN-104 camera system and camera purge control system.