Science.gov

Sample records for camera narrow angle

  1. Flight Calibration of the LROC Narrow Angle Camera

    NASA Astrophysics Data System (ADS)

    Humm, D. C.; Tschimmel, M.; Brylow, S. M.; Mahanti, P.; Tran, T. N.; Braden, S. E.; Wiseman, S.; Danton, J.; Eliason, E. M.; Robinson, M. S.

    2015-09-01

    Characterization and calibration are vital for instrument commanding and image interpretation in remote sensing. The Lunar Reconnaissance Orbiter Camera Narrow Angle Camera (LROC NAC) takes 500-Mpixel greyscale images of lunar scenes at 0.5 meters/pixel. It uses two nominally identical line scan cameras for a larger crosstrack field of view. Stray light, spatial crosstalk, and nonlinearity were characterized using flight images of the Earth and the lunar limb; these are important for imaging shadowed craters, studying ~1-meter-size objects, and photometry, respectively. Background, nonlinearity, and flatfield corrections have been implemented in the calibration pipeline. An eight-column pattern in the background is corrected. The detector is linear for DN = 600-2000, but a signal-dependent additive correction is required and applied for DN < 600. A predictive model of detector temperature and dark level was developed to command the dark level offset; this avoids images with a cutoff at DN = 0 and minimizes quantization error in companding. Absolute radiometric calibration is derived from comparison of NAC images with ground-based images taken with the Robotic Lunar Observatory (ROLO) at much lower spatial resolution but with the same photometric angles.
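
    The correction chain this abstract describes (background subtraction, a signal-dependent additive term below the linear range, flat-fielding) can be sketched per pixel as follows. This is a minimal illustration: the exponential form of the low-signal correction and the `a`/`tau` values are assumptions, not the actual LROC pipeline parameters.

```python
import math

DN_LINEAR_MIN = 600  # the detector is linear for DN = 600-2000 (per the abstract)

def calibrate_pixel(dn_raw, background, flat, a=5.0, tau=150.0):
    """Illustrative NAC-style correction chain for one pixel:
    background subtraction, a signal-dependent additive term below
    the linear range, then flat-field normalization. The exponential
    form and the a/tau values are assumptions, not LROC parameters."""
    dn = dn_raw - background              # dark-level / background removal
    if dn < DN_LINEAR_MIN:
        dn += a * math.exp(-dn / tau)     # additive low-signal correction
    return dn / flat                      # flat-field division

# A pixel well inside the linear range needs no additive term:
print(calibrate_pixel(1100, 100, 1.0))   # 1000.0
```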

  2. Lunar Reconnaissance Orbiter Camera Narrow Angle Cameras: Laboratory and Initial Flight Calibration

    NASA Astrophysics Data System (ADS)

    Humm, D. C.; Tschimmel, M.; Denevi, B. W.; Lawrence, S.; Mahanti, P.; Tran, T. N.; Thomas, P. C.; Eliason, E.; Robinson, M. S.

    2009-12-01

    The Lunar Reconnaissance Orbiter Camera (LROC) has two identical Narrow Angle Cameras (NACs). Each NAC is a monochrome pushbroom scanner, providing images with a pixel scale of 50 cm from a 50-km orbit. A single NAC image has a swath width of 2.5 km and a length of up to 26 km. The NACs are mounted to acquire side-by-side imaging for a combined swath width of 5 km. The NAC is designed to fully characterize future human and robotic landing sites in terms of scientific and resource merit, trafficability, and hazards. The North and South poles will be mapped at 1-meter-scale poleward of 85.5 degrees latitude. Stereo coverage is achieved by pointing the NACs off-nadir, which requires planning in advance. Read noise is 91 and 93 e- and the full well capacity is 334,000 and 352,000 e- for NAC-L and NAC-R respectively. Signal-to-noise ranges from 42 for low-reflectance material with 70 degree illumination to 230 for high-reflectance material with 0 degree illumination. Longer exposure times and 2x binning are available to further increase signal-to-noise with loss of spatial resolution. Lossy data compression from 12 bits to 8 bits uses a companding table selected from a set optimized for different signal levels. A model of focal plane temperatures based on flight data is used to command dark levels for individual images, optimizing the performance of the companding tables and providing good matching of the NAC-L and NAC-R images even before calibration. The preliminary NAC calibration pipeline includes a correction for nonlinearity at low signal levels with an offset applied for DN>600 and a logistic function for DN<600. Flight images taken on the limb of the Moon provide a measure of stray light performance. Averages over many lines of images provide a measure of flat field performance in flight. These are comparable with laboratory data taken with a diffusely reflecting uniform panel.
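
    The lossy 12-to-8-bit companding described above can be illustrated with a lookup table. The square-root law below is an illustrative choice (it matches the quantization step to shot-noise growth); the real LROC tables are mission-specific and selected per expected signal level.

```python
def build_companding_table(full_scale=4095, levels=256):
    """12-bit DN -> 8-bit code lookup table using a square-root law.
    Illustrative only; not the actual LROC companding table set."""
    return [round((levels - 1) * (dn / full_scale) ** 0.5)
            for dn in range(full_scale + 1)]

table = build_companding_table()
# Low signals get finer quantization steps than high signals:
low_step = table[100] - table[0]
high_step = table[4095] - table[3995]
```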

  3. Two Years of Digital Terrain Model Production Using the Lunar Reconnaissance Orbiter Narrow Angle Camera

    NASA Astrophysics Data System (ADS)

    Burns, K.; Robinson, M. S.; Speyerer, E.; LROC Science Team

    2011-12-01

    One of the primary objectives of the Lunar Reconnaissance Orbiter Camera (LROC) is to gather stereo observations with the Narrow Angle Camera (NAC). These stereo observations are used to generate digital terrain models (DTMs). The NAC has a pixel scale of 0.5 to 2.0 meters but was not designed for stereo observations, and thus requires the spacecraft to roll off-nadir to acquire these images. Slews interfere with the data collection of the other instruments, so opportunities are currently limited to four per day. Arizona State University has produced DTMs from 95 stereo pairs for 11 Constellation Project (CxP) sites (Aristarchus, Copernicus crater, Gruithuisen domes, Hortensius domes, Ina D-caldera, Lichtenberg crater, Mare Ingenii, Marius hills, Reiner Gamma, South Pole-Aitken Rim, Sulpicius Gallus) as well as 30 other regions of scientific interest (including: Bhabha crater, highest and lowest elevation points, Highland Ponds, Kugler Anuchin, Linne Crater, Planck Crater, Slipher crater, Sears Crater, Mandel'shtam Crater, Virtanen Graben, Compton/Belkovich, Rumker Domes, King Crater, Luna 16/20/23/24 landing sites, Ranger 6 landing site, Wiener F Crater, Apollo 11/14/15/17, fresh craters, impact melt flows, Larmor Q crater, Mare Tranquillitatis pit, Hansteen Alpha, Moore F Crater, and Lassell Massif). To generate DTMs, the USGS ISIS software and SOCET SET® from BAE Systems are used. To increase the absolute accuracy of the DTMs, data obtained from the Lunar Orbiter Laser Altimeter (LOLA) are used to coregister the NAC images and define the geodetic reference frame. NAC DTMs have been used in the examination of several sites, e.g. Compton-Belkovich, Marius Hills and Ina D-caldera [1-3]. LROC will continue to acquire high-resolution stereo images throughout the science phase of the mission and any extended mission opportunities, thus providing a vital dataset for scientific research as well as future human and robotic exploration. [1] B.L. Jolliff (2011) Nature Geoscience, in press. [2] Lawrence et al. (2011) LPSC XLII, Abst 2228. [3] Garry et al. (2011) LPSC XLII, Abst 2605.

  4. Narrow Angle movie

    NASA Technical Reports Server (NTRS)

    1999-01-01

    This brief three-frame movie of the Moon was made from three Cassini narrow-angle images as the spacecraft passed by the Moon on the way to its closest approach to Earth on August 17, 1999. The purpose of this particular set of images was to calibrate the spectral response of the narrow-angle camera and to test its 'on-chip summing mode' data compression technique in flight. From left to right, they show the Moon in the green, blue and ultraviolet regions of the spectrum in 40, 60 and 80 millisecond exposures, respectively. All three images have been scaled so that the brightness of Crisium basin, the dark circular region in the upper right, is the same in each image. The spatial scale in the blue and ultraviolet images is 1.4 miles per pixel (2.3 kilometers). The original scale in the green image (which was captured in the usual manner and then reduced in size by 2x2 pixel summing within the camera system) was 2.8 miles per pixel (4.6 kilometers). It has been enlarged for display to the same scale as the other two. The imaging data were processed and released by the Cassini Imaging Central Laboratory for Operations (CICLOPS) at the University of Arizona's Lunar and Planetary Laboratory, Tucson, AZ.
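
    The 2x2 on-chip pixel summing and the brightness scaling described in this record can be sketched as follows; this is a minimal illustration, not Cassini flight software.

```python
def bin_2x2(image):
    """Reduce an image 2x by summing each 2x2 block of pixels,
    mimicking the on-chip summing mode described above."""
    h, w = len(image), len(image[0])
    return [[image[r][c] + image[r][c + 1] +
             image[r + 1][c] + image[r + 1][c + 1]
             for c in range(0, w, 2)]
            for r in range(0, h, 2)]

def scale_to_reference(image, measured, reference):
    """Scale an image so a chosen region's measured brightness matches
    a reference value (as done here with the Crisium basin)."""
    k = reference / measured
    return [[px * k for px in row] for row in image]

img = [[1, 2, 3, 4],
       [5, 6, 7, 8],
       [9, 10, 11, 12],
       [13, 14, 15, 16]]
print(bin_2x2(img))  # [[14, 22], [46, 54]]
```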

    Photo Credit: NASA/JPL/Cassini Imaging Team/University of Arizona

    Cassini, launched in 1997, is a joint mission of NASA, the European Space Agency and Italian Space Agency. The mission is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Space Science, Washington DC. JPL is a division of the California Institute of Technology, Pasadena, CA.

  5. NFC - Narrow Field Camera

    NASA Astrophysics Data System (ADS)

    Koukal, J.; Srba, J.; Gorková, S.

    2015-01-01

    We have been introducing a low-cost CCTV video system for faint meteor monitoring and here we describe the first results from 5 months of two-station operations. Our system called NFC (Narrow Field Camera) with a meteor limiting magnitude around +6.5mag allows research on trajectories of less massive meteoroids within individual parent meteor showers and the sporadic background. At present 4 stations (2 pairs with coordinated fields of view) of NFC system are operated in the frame of CEMeNt (Central European Meteor Network). The heart of each NFC station is a sensitive CCTV camera Watec 902 H2 and a fast cinematographic lens Meopta Meostigmat 1/50 - 52.5 mm (50 mm focal length and fixed aperture f/1.0). In this paper we present the first results based on 1595 individual meteors, 368 of which were recorded from two stations simultaneously. This data set allows the first empirical verification of theoretical assumptions for NFC system capabilities (stellar and meteor magnitude limit, meteor apparent brightness distribution and accuracy of single station measurements) and the first low mass meteoroid trajectory calculations. Our experimental data clearly showed the capabilities of the proposed system for low mass meteor registration and for calculations based on NFC data to lead to a significant refinement in the orbital elements for low mass meteoroids.

  6. High-resolution topomapping of candidate MER landing sites with Mars Orbiter Camera narrow-angle images

    USGS Publications Warehouse

    Kirk, R.L.; Howington-Kraus, E.; Redding, B.; Galuszka, D.; Hare, T.M.; Archinal, B.A.; Soderblom, L.A.; Barrett, J.M.

    2003-01-01

    We analyzed narrow-angle Mars Orbiter Camera (MOC-NA) images to produce high-resolution digital elevation models (DEMs) in order to provide the topographic and slope information needed to assess the safety of candidate landing sites for the Mars Exploration Rovers (MER), and we assessed the accuracy of our results with a variety of tests. The mapping techniques developed also support geoscientific studies and can be used with all present and planned Mars-orbiting scanner cameras. Photogrammetric analysis of MOC stereopairs yields DEMs with 3-pixel (typically 10 m) horizontal resolution, vertical precision consistent with ±0.22-pixel matching errors (typically a few meters), and slope errors of 1-3°. These DEMs are controlled to the Mars Orbiter Laser Altimeter (MOLA) global data set and are consistent with it at the limits of resolution. Photoclinometry yields DEMs with single-pixel (typically ~3 m) horizontal resolution and submeter vertical precision. Where the surface albedo is uniform, the dominant error is a 10-20% relative uncertainty in the amplitude of topography and slopes after "calibrating" photoclinometry against a stereo DEM to account for the influence of atmospheric haze. We mapped portions of seven candidate MER sites and the Mars Pathfinder site. Safety of the final four sites (Elysium, Gusev, Isidis, and Meridiani) was assessed by mission engineers by simulating landings on our DEMs of "hazard units" mapped in the sites, with results weighted by the probability of landing on those units; summary slope statistics show that most hazard units are smooth, with only small areas of etched terrain in Gusev crater posing a slope hazard.
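
    The relation between matching error and vertical precision quoted above follows a standard photogrammetric rule of thumb, sketched below. The base-to-height ratio used in the example is illustrative, not a value taken from the paper.

```python
def expected_vertical_precision(gsd_m, match_err_px, base_to_height):
    """Rule-of-thumb stereo error propagation: matching error in
    ground units divided by the base-to-height ratio of the pair.
    Standard photogrammetry; the B/H value below is illustrative."""
    return match_err_px * gsd_m / base_to_height

# 3 m/pixel MOC-NA imagery, 0.22-pixel matching error, B/H = 0.4:
sigma_z = expected_vertical_precision(3.0, 0.22, 0.4)  # 1.65 m
```

A wider stereo convergence (larger B/H) directly tightens the vertical precision for the same matching error, which is why off-nadir stereo geometry matters.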

  7. Methane Band and Continuum Band Imaging of Titan's Atmosphere Using Cassini ISS Narrow Angle Camera Pictures from the CURE/Cassini Imaging Project

    NASA Astrophysics Data System (ADS)

    Shitanishi, Jennifer; Gillam, S. D.

    2009-05-01

    The study of Titan's atmosphere, which bears resemblance to early Earth's, may help us understand more of our own. Constructing a Monte Carlo model of Titan's atmosphere is helpful to achieve this goal. Methane (MT) and continuum band (CB) images of Titan taken by the CURE/Cassini Imaging Project using the Cassini Narrow Angle Camera (NAC) were analyzed. They were scheduled by Cassini Optical Navigation. Images were obtained at phase angles of 53°, 112°, 161°, and 165°. They include 22 total MT1 (center wavelength at 619 nm), MT2 (727 nm), MT3 (889 nm), CB1 (635 nm), CB2 (751 nm), and CB3 (938 nm) images. They were reduced with previously written scripts using the National Optical Astronomy Observatory Image Reduction and Analysis Facility (IRAF) scientific analysis suite. Corrections for horizontal and vertical banding and cosmic ray hits were made. The MT images were registered with corresponding CB images to ensure that subsequently measured flux ratios came from the same parts of the atmosphere. Preliminary DN limb-to-limb scans and loci of the haze layers will be presented, along with accurate estimates of the sub-spacecraft point on each picture. Flux ratios (FMT/FCB = Q0) along the scans and total absorption coefficients along the lines of sight from the spacecraft through the pixels (and into Titan) will also be presented.
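
    The flux-ratio measurement Q0 = FMT/FCB along a registered scan reduces to a per-pixel division, sketched below. The function name and the near-zero guard are ours, not from the paper.

```python
def flux_ratio_scan(mt_row, cb_row, floor=1e-6):
    """Per-pixel flux ratio Q0 = F_MT / F_CB along one registered row;
    registration guarantees both rows sample the same part of the
    atmosphere. `floor` (our choice) guards against near-zero sky."""
    return [mt / cb if cb > floor else 0.0
            for mt, cb in zip(mt_row, cb_row)]

ratios = flux_ratio_scan([10.0, 4.0, 0.0], [20.0, 8.0, 0.0])  # [0.5, 0.5, 0.0]
```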

  8. Photogrammetric measurement of 3D freeform millimetre-sized objects with micro features: an experimental validation of the close-range camera calibration model for narrow angles of view

    NASA Astrophysics Data System (ADS)

    Percoco, Gianluca; Sánchez Salmerón, Antonio J.

    2015-09-01

    The measurement of millimetre- and micro-scale features is performed by high-cost systems based on technologies with narrow working ranges that accurately control the position of the sensors. Photogrammetry would lower the costs of 3D inspection of micro-features and would also be applicable to the inspection of non-removable micro parts of large objects. Unfortunately, the behaviour of photogrammetry when applied to micro-features is not known. In this paper, the authors address these issues towards the application of digital close-range photogrammetry (DCRP) at the micro-scale, taking into account research papers in the literature stating that an angle of view (AOV) of around 10° is the lower limit for the application of the traditional pinhole close-range calibration model (CRCM) on which DCRP is based. First, a general calibration procedure is introduced, with the aid of an open-source software library, to calibrate narrow-AOV cameras with the CRCM. Subsequently, the procedure is validated using a reflex camera with a 60 mm macro lens equipped with extension tubes (20 and 32 mm), achieving magnification of up to approximately 2x, to verify the literature findings with experimental photogrammetric 3D measurements of millimetre-sized objects with micro-features. The limitations of the laser printing technology used to produce the two-dimensional pattern on common paper were overcome using an accurate pattern manufactured with a photolithographic process. The results of the experimental activity prove that the CRCM is valid for AOVs down to 3.4° and that DCRP results are comparable with those of existing, more expensive commercial techniques.
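
    The angles of view at stake here follow from the pinhole model, AOV = 2·atan(w/2f). The sketch below shows how adding extension roughly narrows the AOV; the 23.6 mm sensor width is an assumed APS-C-sized format, and treating extension as added focal length is an approximation, not the paper's calibration.

```python
import math

def angle_of_view_deg(sensor_width_mm, focal_length_mm):
    """Pinhole-model horizontal angle of view: AOV = 2*atan(w / 2f)."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# Adding ~52 mm of extension to a 60 mm macro lens (magnification ~2x)
# roughly doubles the effective focal length and narrows the AOV.
# The 23.6 mm sensor width is an assumed APS-C-sized format:
wide = angle_of_view_deg(23.6, 60)        # ~22 degrees
narrow = angle_of_view_deg(23.6, 60 + 52) # ~12 degrees
```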

  9. Narrow-angle astrometry with PRIMA

    NASA Astrophysics Data System (ADS)

    Sahlmann, J.; Ségransan, D.; Mérand, A.; Zimmerman, N.; Abuter, R.; Chazelas, B.; Delplancke, F.; Henning, T.; Kaminski, A.; Köhler, R.; Launhardt, R.; Mohler, M.; Pepe, F.; Queloz, D.; Quirrenbach, A.; Reffert, S.; Schmid, C.; Schuhler, N.; Schulze-Hartung, T.

    2012-07-01

    The Extrasolar Planet Search with PRIMA project (ESPRI) aims at characterising and detecting extrasolar planets by measuring the host star's reflex motion using the narrow-angle astrometry capability of the PRIMA facility at the Very Large Telescope Interferometer. A first functional demonstration of the astrometric mode was achieved in early 2011. This marked the start of the astrometric commissioning phase with the purpose of characterising the instrument's performance, which ultimately has to be sufficient for exoplanet detection. We show results obtained from the observation of bright visual binary stars, which serve as test objects to determine the instrument's astrometric precision, its accuracy, and the plate scale. Finally, we report on the current status of the ESPRI project, in view of starting its scientific programme.

  10. Inflight Calibration of the Lunar Reconnaissance Orbiter Camera Wide Angle Camera

    NASA Astrophysics Data System (ADS)

    Mahanti, P.; Humm, D. C.; Robinson, M. S.; Boyd, A. K.; Stelling, R.; Sato, H.; Denevi, B. W.; Braden, S. E.; Bowman-Cisneros, E.; Brylow, S. M.; Tschimmel, M.

    2015-09-01

    The Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) has acquired more than 250,000 images of the illuminated lunar surface and over 190,000 observations of space and non-illuminated Moon since 1 January 2010. These images, along with images from the Narrow Angle Camera (NAC) and other Lunar Reconnaissance Orbiter instrument datasets, are enabling new discoveries about the morphology, composition, and geologic/geochemical evolution of the Moon. Characterizing the inflight WAC system performance is crucial to scientific and exploration results. Pre-launch calibration of the WAC provided a baseline characterization that was critical for early targeting and analysis. Here we present an analysis of WAC performance from the inflight data. In the course of our analysis we compare and contrast with the pre-launch performance wherever possible and quantify the uncertainty related to various components of the calibration process. We document the absolute and relative radiometric calibration, point spread function, and scattered light sources and provide estimates of sources of uncertainty for spectral reflectance measurements of the Moon across a range of imaging conditions.

  11. Techniques of coagulation laser prostatectomy for narrow divergence angle fibers.

    PubMed

    Milam, D F

    1996-01-01

    Although laser prostatectomy has become an accepted technique for the treatment of obstructive prostatism, considerable debate remains about which laser prostatectomy method to use in various treatment situations. This article discusses the different methods of noncontact side-firing coagulation laser prostatectomy using narrow divergence angle fibers (< 30 degrees). Static treatment strategies that have been successfully employed with widely divergent beams are not appropriate for fibers producing narrow divergence angle beams. Narrow divergence angle beams produce a small diameter spot on the prostatic urethra and far larger power density. Additionally, neodymium/yttrium aluminum garnet light scatters relatively poorly within prostatic tissue. Most light continues along the path of through transmission until ultimate tissue absorption and conversion into heat. The width and total volume of the coagulated lesion are therefore limited when using a narrow divergence angle fiber to produce static lesions. Probe movement is essential. Rapid (> 2 mm/s) probe movement produces only superficial coagulation. An initial dwell period of approximately 3 s is also important to maximize coagulated tissue volume. Scanning strategies where the fiber is moved through the prostatic urethra in longitudinal and radial directions are discussed and compared. Radial and longitudinal scanning methods produce similar coagulation defects. Treatment using a rocking motion within a limited volume of tissue may increase coagulation depth. No technique is ideal for all clinical situations. Vaporization prostatectomy or contact laser transurethral incision of the prostate is appropriate for primary treatment of glands < 30 g or as adjunctive therapy to facilitate early catheter removal. Alternative treatment methods are compared to noncontact coagulation prostatectomy. PMID:9118400

  12. Associations between Narrow Angle and Adult Anthropometry: The Liwan Eye Study

    PubMed Central

    Jiang, Yuzhen; He, Mingguang; Friedman, David S.; Khawaja, Anthony P.; Lee, Pak Sang; Nolan, Winifred P.; Yin, Qiuxia; Foster, Paul J.

    2015-01-01

    Purpose To assess the associations between narrow angle and adult anthropometry. Methods Chinese adults aged 50 years and older were recruited from a population-based survey in the Liwan District of Guangzhou, China. Narrow angle was defined as the posterior trabecular meshwork not visible under static gonioscopy in at least three quadrants (i.e. a circumference of at least 270°). Logistic regression models were used to examine the associations between narrow angle and anthropometric measures (height, weight and body mass index, BMI). Results Among the 912 participants, lower weight, shorter height, and lower BMI were significantly associated with narrower angle width (tests for trend: mean angle width in degrees vs weight p<0.001; vs height p<0.001; vs BMI p = 0.012). In univariate analyses, shorter height, lower weight and lower BMI were all significantly associated with greater odds of narrow angle. The crude association between height and narrow angle was largely attributable to a stronger association with age and sex. Lower BMI and weight remained significantly associated with narrow angle after adjustment for height, age, sex, axial ocular biometric measures and education. In analyses stratified by sex, the association between BMI and narrow angle was only observed in women. Conclusion Lower BMI and weight were associated with significantly greater odds of narrow angle after adjusting for age, education, axial ocular biometric measures and height. The odds of narrow angle increased 7% per 1 unit decrease in BMI. This association was most evident in women. PMID:24707840
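
    The reported 7%-per-unit figure compounds multiplicatively over larger BMI differences, as odds ratios from a logistic model do. A small sketch, with the 5-unit decrease chosen purely as an example:

```python
def odds_multiplier(percent_per_unit, units):
    """Compound a per-unit odds change multiplicatively, as odds
    ratios from a logistic regression model combine."""
    return (1 + percent_per_unit / 100.0) ** units

# 7% higher odds of narrow angle per 1-unit BMI decrease, compounded
# over a hypothetical 5-unit decrease:
factor = odds_multiplier(7, 5)  # ~1.40
```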

  13. Improved wide-angle, fisheye and omnidirectional camera calibration

    NASA Astrophysics Data System (ADS)

    Urban, Steffen; Leitloff, Jens; Hinz, Stefan

    2015-10-01

    In this paper an improved method for calibrating wide-angle, fisheye and omnidirectional imaging systems is presented. We extend the calibration procedure proposed by Scaramuzza et al. by replacing the residual function and joint refinement of all parameters. In doing so, we achieve a more stable, robust and accurate calibration (up to factor 7) and can reduce the number of necessary calibration steps from five to three. After introducing the camera model and highlighting the differences from the current calibration procedure, we perform a comprehensive performance evaluation using several data sets and show the impact of the proposed calibration procedure on the calibration results.

  14. Spectral data of specular reflectance, narrow-angle transmittance and angle-resolved surface scattering of materials for solar concentrators

    PubMed Central

    Good, Philipp; Cooper, Thomas; Querci, Marco; Wiik, Nicolay; Ambrosetti, Gianluca; Steinfeld, Aldo

    2015-01-01

    The spectral specular reflectance of conventional and novel reflective materials for solar concentrators is measured with an acceptance angle of 17.5 mrad over the wavelength range 300-2500 nm at incidence angles 15-60° using a spectroscopic goniometry system. The same experimental setup is used to determine the spectral narrow-angle transmittance of semi-transparent materials for solar collector covers at incidence angles 0-60°. In addition, the angle-resolved surface scattering of reflective materials is recorded by an area-scan CCD detector over the spectral range 350-1050 nm. A comprehensive summary, discussion, and interpretation of the results are included in the associated research article "Spectral reflectance, transmittance, and angular scattering of materials for solar concentrators" in Solar Energy Materials and Solar Cells. PMID:26862556

  15. Spectral data of specular reflectance, narrow-angle transmittance and angle-resolved surface scattering of materials for solar concentrators.

    PubMed

    Good, Philipp; Cooper, Thomas; Querci, Marco; Wiik, Nicolay; Ambrosetti, Gianluca; Steinfeld, Aldo

    2016-03-01

    The spectral specular reflectance of conventional and novel reflective materials for solar concentrators is measured with an acceptance angle of 17.5 mrad over the wavelength range 300-2500 nm at incidence angles 15-60° using a spectroscopic goniometry system. The same experimental setup is used to determine the spectral narrow-angle transmittance of semi-transparent materials for solar collector covers at incidence angles 0-60°. In addition, the angle-resolved surface scattering of reflective materials is recorded by an area-scan CCD detector over the spectral range 350-1050 nm. A comprehensive summary, discussion, and interpretation of the results are included in the associated research article "Spectral reflectance, transmittance, and angular scattering of materials for solar concentrators" in Solar Energy Materials and Solar Cells. PMID:26862556

  16. Narrow Field-of-View Visual Odometry Based on a Focused Plenoptic Camera

    NASA Astrophysics Data System (ADS)

    Zeller, N.; Quint, F.; Stilla, U.

    2015-03-01

    In this article we present a new method for visual odometry based on a focused plenoptic camera. This method fuses the depth data gained by a monocular Simultaneous Localization and Mapping (SLAM) algorithm with that received from a focused plenoptic camera. Our algorithm uses the depth data and the totally focused images supplied by the plenoptic camera to run a real-time semi-dense direct SLAM algorithm. Based on this combined approach, the scale ambiguity of a monocular SLAM system can be overcome. Furthermore, the additional light-field information greatly improves the tracking capabilities of the algorithm, making visual odometry possible even for narrow field of view (FOV) cameras. Tracking is not the only step that profits from the additional light-field information: by accumulating the depth information over multiple tracked images, the depth accuracy of the focused plenoptic camera can also be greatly improved. This approach reduces the depth error by one order of magnitude compared with that of a single light-field image.
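
    One standard way to model how accumulating depth over tracked images shrinks the per-image error is inverse-variance fusion, sketched below. This is our illustration of the principle; the paper's exact update rule is not reproduced here.

```python
def fuse_depth(estimates):
    """Inverse-variance weighted fusion of (depth, variance) estimates
    from successively tracked images: each new observation lowers the
    fused variance, which is how multi-view accumulation reduces the
    per-image depth error."""
    weight_sum = sum(1.0 / var for _, var in estimates)
    depth = sum(d / var for d, var in estimates) / weight_sum
    return depth, 1.0 / weight_sum

# Two equally uncertain observations halve the fused variance:
depth, var = fuse_depth([(2.0, 0.1), (2.2, 0.1)])  # ~2.1, 0.05
```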

  17. Improved iris localization by using wide and narrow field of view cameras for iris recognition

    NASA Astrophysics Data System (ADS)

    Kim, Yeong Gon; Shin, Kwang Yong; Park, Kang Ryoung

    2013-10-01

    Biometrics is a method of identifying individuals by their physiological or behavioral characteristics. Among other biometric identifiers, iris recognition has been widely used for various applications that require a high level of security. When a conventional iris recognition camera is used, the size and position of the iris region in a captured image vary according to the X, Y positions of a user's eye and the Z distance between a user and the camera. Therefore, the searching area of the iris detection algorithm is increased, which can inevitably decrease both the detection speed and accuracy. To solve these problems, we propose a new method of iris localization that uses wide field of view (WFOV) and narrow field of view (NFOV) cameras. Our study is new as compared to previous studies in the following four ways. First, the device used in our research acquires three images, one each of the face and both irises, using one WFOV and two NFOV cameras simultaneously. The relation between the WFOV and NFOV cameras is determined by simple geometric transformation without complex calibration. Second, the Z distance (between a user's eye and the iris camera) is estimated based on the iris size in the WFOV image and anthropometric data of the size of the human iris. Third, the accuracy of the geometric transformation between the WFOV and NFOV cameras is enhanced by using multiple matrices of the transformation according to the Z distance. Fourth, the searching region for iris localization in the NFOV image is significantly reduced based on the detected iris region in the WFOV image and the matrix of geometric transformation corresponding to the estimated Z distance. Experimental results showed that the performance of the proposed iris localization method is better than that of conventional methods in terms of accuracy and processing time.
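
    The Z-distance estimate from iris size in the WFOV image described above is a pinhole-projection calculation, sketched here. The 11.8 mm iris diameter is a typical anthropometric value; the paper's actual constants and camera model are not reproduced.

```python
def estimate_z_mm(focal_px, iris_diameter_px, iris_diameter_mm=11.8):
    """Pinhole-projection range estimate: Z = f * D_real / d_image.
    11.8 mm is a typical anthropometric human iris diameter (an
    assumption here, not a constant quoted from the paper)."""
    return focal_px * iris_diameter_mm / iris_diameter_px

# A larger apparent iris means the user is closer to the camera:
near = estimate_z_mm(1000, 40)  # 295.0 mm
far = estimate_z_mm(1000, 20)   # 590.0 mm
```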

  18. 11. 22"x34" original blueprint, Variable-Angle Launcher, 'SIDE VIEW CAMERA WOOD ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    11. 22"x34" original blueprint, Variable-Angle Launcher, 'SIDE VIEW CAMERA WOOD FRAME SUPERSTRUCTURE' drawn at 1/2"=1'-0". (BUORD Sketch # 209125). - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  19. 12. 22"x34" original blueprint, Variable-Angle Launcher, 'GENERAL SIDE VIEW CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    12. 22"x34" original blueprint, Variable-Angle Launcher, 'GENERAL SIDE VIEW CAMERA STATIONS' drawn at 1/2"=1'-0". (BUORD Sketch # 209111). - Variable Angle Launcher Complex, Camera Stations, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  20. 12. 22"x34" original vellum, Variable-Angle Launcher, 'SIDE VIEW CAMERA TRACK ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    12. 22"x34" original vellum, Variable-Angle Launcher, 'SIDE VIEW CAMERA TRACK H-20 BRIDGE MODIFICATIONS' drawn at 3/16"=1'-0" and 1/2"=1'-0". (BUORD Sketch # 208784, PAPW 907). - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  21. 10. 22"x34" original blueprint, Variable-Angle Launcher, 'SIDE VIEW CAMERA CAR-STEEL ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    10. 22"x34" original blueprint, Variable-Angle Launcher, 'SIDE VIEW CAMERA CAR-STEEL FRAME AND AXLES' drawn at 1/2"=1'-0". (BUORD Sketch # 209124). - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  22. 13. 22"x34" original vellum, Variable-Angle Launcher, 'SIDE-VIEW CAMERA CAR TRACK ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    13. 22"x34" original vellum, Variable-Angle Launcher, 'SIDE-VIEW CAMERA CAR TRACK DETAILS' drawn at 1/4"=1'-0" (BUORD Sketch # 208078, PAPW 908). - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  23. High-Precision Narrow Angle Astrometry with a Space-Borne Interferometer

    NASA Technical Reports Server (NTRS)

    Milman, Mark H.; Murphy, Dave

    2008-01-01

    This paper develops an observing and processing scheme for narrow angle astrometry using a single baseline interferometer without the aid of "grid" stars to characterize the interferometer baseline vector in inertial space. The basic concept derives from the recognition that over a narrow field the set of fundamental unknown instrument parameters that arise because the interferometer baseline vector has large uncertainties (since there are no grid star measurements) is indistinguishable from a particular set of unobservable errors in the determination of star positions within the field. Reference stars within the narrow field of regard are used to circumvent the unobservable modes. Feasibility of the approach is demonstrated through analysis and example simulations.

  24. The ESPRI project: narrow-angle astrometry with VLTI-PRIMA

    NASA Astrophysics Data System (ADS)

    Launhardt, R.; Henning, T.; Queloz, D.; Quirrenbach, A.; Delplancke, F.; Elias, N. M.; Pepe, F.; Reffert, S.; Ségransan, D.; Setiawan, J.; Tubbs, R.; ESPRI Consortium

    2008-07-01

    We describe the ongoing hardware and software developments that shall enable the ESO VLTI to perform narrow-angle differential delay astrometry in the K-band with an accuracy of up to 10 µarcsec. The ultimate goal of these efforts is to perform an astrometric search for extrasolar planets around nearby stars.

  25. The design and fabrication of a wide-angle 905 nm narrow-band filter

    NASA Astrophysics Data System (ADS)

    Shi, Baohua; Li, Zaijin; Li, Hongyu; Qu, Yi

    2014-12-01

    All-dielectric thin-film narrow band filters are widely used in laser systems owing to their excellent optical performance, manufacturability, and environmental adaptability. However, 905 nm infrared semiconductor laser systems have a large divergence angle, so we designed a 905 nm narrow band filter for a wide entrance light cone. The center-wavelength shift caused by the entrance light cone angle seriously degrades the filter's spectral selectivity. To reduce these impacts, a modified dielectric-film narrowband filter is designed. Changes of the transmission characteristics under oblique incidence of a Gaussian beam with uneven illumination are analyzed, and the relationship between the angle of incidence and the central wavelength shift is solved quantitatively. A 905 nm narrowband filter for 30° incidence was fabricated. Between 880 nm and 950 nm the average transmittance is above 90%, and in the cut-off band the average transmittance is below 1%.
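The blue shift of a thin-film filter's center wavelength with angle of incidence follows the standard effective-index approximation. A minimal sketch, where the effective index value (2.0) is an illustrative assumption, not a parameter from the paper:

```python
import math

def shifted_center_wavelength(lambda0_nm, aoi_deg, n_eff):
    """Center wavelength of a thin-film bandpass filter at oblique
    incidence, via the standard effective-index approximation:
    lambda(theta) = lambda0 * sqrt(1 - (sin(theta)/n_eff)**2)."""
    s = math.sin(math.radians(aoi_deg)) / n_eff
    return lambda0_nm * math.sqrt(1.0 - s * s)

# A 905 nm filter blue-shifts as the angle of incidence grows;
# n_eff = 2.0 is an assumed value for illustration only.
shift_30deg = 905.0 - shifted_center_wavelength(905.0, 30.0, 2.0)
```

This shift (tens of nanometers at 30° incidence) is why a filter intended for a wide light cone must either be designed with a broadened passband or with a high effective index to suppress the angular dependence.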

  6. Wide-Angle, Reflective Strip-Imaging Camera

    NASA Technical Reports Server (NTRS)

    Vaughan, Arthur H.

    1992-01-01

    Proposed camera images a thin, striplike portion of a field of view 180 degrees wide. Hemispherical concave reflector forms image onto optical fibers, which transfer it to a strip of photodetectors or a spectrograph. Advantages include little geometric distortion, achromatism, and ease of athermalization. Uses include surveillance of clouds, coarse mapping of terrain, measurements of bidirectional reflectance distribution functions of aerosols, imaging spectrometry, oceanography, and exploration of planets.

  7. Angle-insensitive and narrow band grating filter with a gradient-index layer.

    PubMed

    Zheng, Gaige; Cong, Jiawei; Xu, Linhua; Su, Wei

    2014-10-15

    We demonstrate the design of an efficient angle-insensitive guided mode resonance filter (GMRF), with narrow bandwidth and low sideband reflection, for TE-polarized waves. The reflection properties of the multilayer structure have been studied, and the results verify that the thin-film design of the gradient-index layer is important for the realization of an angle-insensitive filter. Various gradient coefficients of the thin film have distinct effects on the reflection spectrum. For an increasing incident angle, although the line-shape symmetry becomes less perfect, the positions of the resonant peak remain the same. The GMRF proposed here has many desirable attributes that lend themselves to an excellent platform for devices such as lasers, detectors, filters, and sensors. PMID:25361122

  8. On-Orbit Geometric Calibration of the Lunar Reconnaissance Orbiter Wide Angle Camera

    NASA Astrophysics Data System (ADS)

    Speyerer, E. J.; Wagner, R.; Robinson, M. S.

    2013-12-01

    Lunar Reconnaissance Orbiter (LRO) is equipped with a single Wide Angle Camera (WAC) [1] designed to collect monochromatic and multispectral observations of the lunar surface. Cartographically accurate image mosaics and stereo image based terrain models require that the position of each pixel in a given image be tied to a corresponding point on the lunar surface with a high degree of accuracy and precision. The Lunar Reconnaissance Orbiter Camera (LROC) team initially characterized the WAC geometry prior to launch at the Malin Space Science Systems calibration facility. After lunar orbit insertion, the LROC team recognized spatially varying geometric offsets between color bands. These misregistrations made analysis of the color data problematic and showed that refinements to the pre-launch geometric analysis were necessary. The geometric parameters that define the WAC optical system were characterized from statistics gathered from co-registering over 84,000 image pairs. For each pair, we registered all five visible WAC bands to a precisely rectified Narrow Angle Camera (NAC) image (accuracy <15 m) [2] to compute key geometric parameters. In total, we registered 2,896 monochrome and 1,079 color WAC observations to nearly 34,000 NAC observations and collected over 13.7 million data points across the visible portion of the WAC CCD. Using the collected statistics, we refined the relative pointing (yaw, pitch and roll), effective focal length, principal point coordinates, and radial distortion coefficients. This large dataset also revealed spatial offsets between bands after orthorectification due to chromatic aberrations in the optical system. As white light enters the optical system, the light bends at different magnitudes as a function of wavelength, causing a single incident ray to disperse in a spectral spread of color [3,4]. This lateral chromatic aberration effect, also known as 'chromatic difference in magnification' [5], introduces variation in the effective focal length for each WAC band. Secondly, tangential distortions caused by minor decentering in the optical system altered the derived exterior orientation parameters for each 14-line WAC band. We computed the geometric parameter sets separately for each band to characterize the lateral chromatic aberrations and the decentering components in the WAC optical system. With this approach, we negated the need for additional tangential terms in the distortion model, reducing the number of computations during image orthorectification and therefore expediting the process. We undertook a similar process to refine the geometry of the UV bands (321 and 360 nm), except that we registered each UV band to the orthorectified visible bands of the same WAC observation (the visible bands have resolutions 4 times greater than the UV). The resulting 7-band camera model with refined geometric parameters enables map projection with sub-pixel accuracy. References: [1] Robinson et al. (2010) Space Sci. Rev. 150, 81-124 [2] Wagner et al. (2013) Lunar Sci Forum [3] Mahajan, V.N. (1998) Optical Imaging and Aberrations [4] Fiete, R.D. (2013), Manual of Photogrammetry, pp. 359-450 [5] Brown, D.C. (1966) Photogrammetric Eng. 32, 444-462.
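The two ingredients the calibration refines can be sketched together: a Brown-style radial distortion correction and a band-specific effective focal length, which is how lateral chromatic aberration enters the camera model. All coefficients and pixel values below are illustrative assumptions, not actual LROC WAC parameters:

```python
import math

def undistort_radial(x, y, k1, k2):
    """Remove radial distortion (Brown model, two coefficients)
    from normalized focal-plane coordinates."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

def pixel_to_angle(col, f_band_px, cx, k1=0.0, k2=0.0):
    """Cross-track look angle for one pixel of a line-scan band,
    using a band-specific effective focal length in pixels."""
    x = (col - cx) / f_band_px
    x, _ = undistort_radial(x, 0.0, k1, k2)
    return math.degrees(math.atan(x))

# Two bands with slightly different effective focal lengths
# (lateral chromatic aberration) map the same pixel to slightly
# different look angles; all numbers here are invented.
a_415 = pixel_to_angle(1000, f_band_px=6000.0, cx=512.0)
a_689 = pixel_to_angle(1000, f_band_px=6010.0, cx=512.0)
```

Solving for a separate parameter set per band, as the abstract describes, absorbs exactly this per-wavelength focal-length difference without adding tangential terms to the distortion model.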

  9. Potential of long-baseline infrared interferometry for narrow-angle astrometry

    NASA Technical Reports Server (NTRS)

    Shao, M.; Colavita, M. M.

    1992-01-01

    Narrow-angle astrometry has many astrophysical applications, from the measurement of parallaxes to the search for planets around nearby stars. Ground-based long-focus telescopes with photoelectric detectors have achieved accuracies of about 2-3 milliarcsec in 1 h. This accuracy is limited primarily by the atmosphere, and is consistent with models of atmospheric turbulence. However, applying these turbulence models to observations with long-baseline interferometers yields astrometric errors that are far smaller than can be achieved with long-focus telescopes. The predictions for the ultimate accuracy of ground-based narrow-angle astrometry using long-baseline IR (2.2 micron) stellar interferometers are very promising. With the excellent seeing at a high altitude site like Mauna Kea, the atmospheric limit for a 1 h astrometric measurement is expected to be of the order of 10 microarcsec. This two-order-of-magnitude improvement over conventional measurements is due to two effects. One is that a long-baseline IR interferometer can find useful reference stars very near an arbitrary target star, so that the atmospherically-induced motions of both stars are highly correlated. The second is that the baseline length can be much larger than the separation of the stellar beams in the turbulent atmosphere, resulting in a reduction in astrometric error with increasing baseline length.

  10. Calibration of a trinocular system formed with wide angle lens cameras.

    PubMed

    Ricolfe-Viala, Carlos; Sanchez-Salmeron, Antonio-Jose; Valera, Angel

    2012-12-01

    To obtain 3D information over large areas, wide angle lens cameras are used to reduce the number of cameras as much as possible. However, since the images are highly distorted, errors in point correspondences increase and the 3D information can be erroneous. To increase the amount of data extracted from the images and to improve the 3D information, trinocular sensors are used. In this paper a calibration method for a trinocular sensor formed with wide angle lens cameras is proposed. First, pixel locations in the images are corrected using a set of constraints that define the image formation in a trinocular system. Once the pixel locations are corrected, the lens distortion and the trifocal tensor are computed. PMID:23262716
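Correcting pixel locations in highly distorted wide-angle images typically means inverting a radial distortion model, which has no closed form and is usually done iteratively. A minimal sketch under an assumed one-coefficient model (the paper's constraint-based correction is more elaborate):

```python
def distort(xu, yu, k1):
    """Apply one-coefficient radial distortion to ideal
    (undistorted) normalized image coordinates."""
    r2 = xu * xu + yu * yu
    s = 1.0 + k1 * r2
    return xu * s, yu * s

def undistort(xd, yd, k1, iters=20):
    """Invert the distortion by fixed-point iteration: repeatedly
    re-estimate the undistorted point from the current guess."""
    xu, yu = xd, yd
    for _ in range(iters):
        r2 = xu * xu + yu * yu
        s = 1.0 + k1 * r2
        xu, yu = xd / s, yd / s
    return xu, yu
```

Round-tripping a point through `distort` then `undistort` recovers the original coordinates to high precision for moderate distortion, which is the basic sanity check for any such correction step.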

  11. Cross-diffractive optical elements for wide angle geometric camera calibration.

    PubMed

    Thibault, Simon; Arfaoui, Aymen; Desaulniers, Pierre

    2011-12-15

    Diffractive optical elements (DOEs) can generate multiple two-dimensional (2D) diffraction grids that can be used to calibrate cameras for photogrammetry. However, several factors limit the accuracy and the functionality of this technique. One of the most important is the DOE fabrication itself. A large DOE with wide 2D fan-out grids is very difficult and costly to develop. Consequently, the calibration is limited to small aperture cameras and/or limited angles. To overcome these problems, we present a low cost solution. We propose to use two large, commercially available, crossed phase DOEs that generate 15 × 15 equally spaced dots. As the DOEs are not perfect, the unwanted secondary diffractive orders are used as calibration targets to expand the calibration field of view. We show that the use of the primary and secondary diffractive orders provides a valuable calibration tool for wide angle aerial cameras. PMID:22179878

  12. Narrow-angle tail radio sources and evidence for radial orbits in Abell clusters

    NASA Technical Reports Server (NTRS)

    O'Dea, Christopher P.; Owen, Frazer N.; Sarazin, Craig L.

    1986-01-01

    Published observational data on the tail orientations (TOs) of 60 narrow-angle-tail (NAT) radio sources in Abell clusters of galaxies are analyzed statistically using a maximum-likelihood approach. The results are presented in a table, and it is found that the observed TO distributions in the whole sample and in subsamples of morphologically regular NATs and NATs with pericentric distances d greater than 500 kpc are consistent with isotropic orbits, whereas the TOs for NATs with d less than 500 kpc are consistent with highly radial orbits. If radial orbits were observed near the centers of other types of cluster galaxies as well, it could be inferred that violent relaxation during cluster formation was incomplete, and that clusters form by spherical collapse and secondary infall, as proposed by Gunn (1977).

  13. Narrow Angle Wide Spectral Range Radiometer Design FEANICS/REEFS Radiometer Design Report

    NASA Technical Reports Server (NTRS)

    Camperchioli, William

    2005-01-01

    A critical measurement for the Radiative Enhancement Effects on Flame Spread (REEFS) microgravity combustion experiment is the net radiative flux emitted from the gases and from the solid fuel bed. These quantities are measured using a set of narrow angle, wide spectral range radiometers. The radiometers are required to have an angular field of view of 1.2 degrees and measure over the spectral range of 0.6 to 30 microns, which presents a challenging design effort. This report details the design of this radiometer system including field of view, radiometer response, radiometric calculations, temperature effects, error sources, baffling and amplifiers. This report presents some radiometer specific data but does not present any REEFS experiment data.
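For scale, the solid angle subtended by the required 1.2-degree field of view can be computed from the standard cone formula; this is a generic geometric check, not a value quoted in the report:

```python
import math

def cone_solid_angle_sr(full_angle_deg):
    """Solid angle (steradians) of a cone with the given full
    apex angle: Omega = 2*pi*(1 - cos(half_angle))."""
    half = math.radians(full_angle_deg / 2.0)
    return 2.0 * math.pi * (1.0 - math.cos(half))

# A 1.2-degree field of view subtends only about 3.4e-4 sr,
# which is why careful baffling and high-gain amplification matter.
omega = cone_solid_angle_sr(1.2)
```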

  14. In situ measurements of particle friction angles in steep, narrow channels

    NASA Astrophysics Data System (ADS)

    Prancevic, J.; Lamb, M. P.

    2013-12-01

    The persistent observation that sediment requires increased fluid stresses to move on steeper channels has inspired a wide range of explanations, which can loosely be divided into those that invoke increased grain stability (friction angle, φ) and those that require altered flow hydraulics in steep channels. Measurements of bulk fluid forces over a wide range of channel slopes (θ ≤ 22°) have been obtained using laboratory flume experiments that can control for grain stability and show that altered flow hydraulics do play a role in increased critical shear stress. However, measurements of grain stability are almost all limited to channel slopes less than a few degrees. These friction angle studies have been conducted by tilting a fixed gravel bed with a single loose particle until dislodgment, or by directly measuring the forces required to dislodge a particle using a load cell. The latter methodology is less common but offers the advantage of quickly measuring the friction angles of in situ grains in natural river channels. Indeed, it has enabled the collection of extremely large datasets at low slopes [e.g., Johnston et al., 1998]. We are adding to this dataset with measurements from several natural steep channels in the San Gabriel Mountains, CA to test if the particle friction angle changes systematically as a function of slope or width-to-grain size ratio (W/D50), which is thought to determine the propensity for particle jamming. Using a load cell that records peak forces we measure the minimum force required to pull a particle from its pocket in the downstream direction and the particle weight. Particles are sampled over a regular grid and we record the percentage of the particle buried by fines and the qualitative degree of interlocking. Preliminary results from three sites with bed slopes of θ = 2.9°, 3.2°, and 9.0° suggest that the at-a-site variability in friction angle is much higher than between-site variability, and that median values do not vary in a consistent manner with bed slope (φ = 51°, 67°, and 65°, respectively). At an individual site the degree of interlocking is the primary control on particle friction angle. However, the degree of interlocking was not higher in the steep (θ = 9.0°), narrow (W/D50 = 12.5) channel. This indicates that increased grain stability may not play a crucial role in increasing the threshold shear stresses required for sediment motion on very steep slopes.

  15. Topview stereo: combining vehicle-mounted wide-angle cameras to a distance sensor array

    NASA Astrophysics Data System (ADS)

    Houben, Sebastian

    2015-03-01

    The variety of vehicle-mounted sensors required to fulfill a growing number of driver assistance tasks has become a substantial factor in automobile manufacturing cost. We present a stereo distance method exploiting the overlapping fields of view of a multi-camera fisheye surround view system, as used for near-range vehicle surveillance tasks, e.g. in parking maneuvers. Hence, we aim at creating a new input signal from sensors that are already installed. Particular properties of wide-angle cameras (e.g. strongly varying resolution across the image) demand an adaptation of the image processing pipeline to several problems that do not arise in classical stereo vision performed with cameras carefully designed for that purpose. We introduce the algorithms for rectification, correspondence analysis, and regularization of the disparity image, discuss reasons for and avoidance of the identified caveats, and present first results on a prototype topview setup.

  16. The first demonstration of the concept of "narrow-FOV Si/CdTe semiconductor Compton camera"

    NASA Astrophysics Data System (ADS)

    Ichinohe, Yuto; Uchida, Yuusuke; Watanabe, Shin; Edahiro, Ikumi; Hayashi, Katsuhiro; Kawano, Takafumi; Ohno, Masanori; Ohta, Masayuki; Takeda, Shin'ichiro; Fukazawa, Yasushi; Katsuragawa, Miho; Nakazawa, Kazuhiro; Odaka, Hirokazu; Tajima, Hiroyasu; Takahashi, Hiromitsu; Takahashi, Tadayuki; Yuasa, Takayuki

    2016-01-01

    The Soft Gamma-ray Detector (SGD), to be deployed on board the ASTRO-H satellite, has been developed to provide the highest-sensitivity observations of celestial sources in the energy band of 60-600 keV by employing a detector concept which uses a Compton camera whose field of view is restricted by a BGO shield to a few degrees (narrow-FOV Compton camera). In this concept, the background from outside the FOV can be heavily suppressed by constraining the incident direction of the gamma ray reconstructed by the Compton camera to be consistent with the narrow FOV. We demonstrate, for the first time, the validity of the concept using background data taken during the thermal vacuum test and the low-temperature environment test of the flight model of SGD on the ground. We show that the measured background level is suppressed to less than 10% by combining event rejection using the anti-coincidence trigger of the active BGO shield with Compton event reconstruction techniques. More than 75% of the signals from the field of view are retained against the background rejection, which clearly demonstrates the improvement in signal-to-noise ratio. The estimated effective area of 22.8 cm² meets the mission requirement even though not all of the operational parameters of the instrument have been fully optimized yet.

  17. Research on influence of space-borne stereo mapping camera's time synchronization precision on attitude angle errors

    NASA Astrophysics Data System (ADS)

    Wu, Xingxing; Liu, Jinguo; Long, Kehui; Zhou, Huaide; Zhang, Boyan

    2014-11-01

    The mapping precision of a space-borne stereo mapping camera is primarily determined by the attitude angle errors of the satellite, and time synchronization errors of the camera introduce extra attitude angle errors. In this paper a model of a space-borne stereo mapping camera was established in Satellite Tool Kit (STK) to obtain the regularity of attitude angles changing with time. The influence of the camera's time synchronization precision on attitude angle errors was analyzed by combining this regularity with sampling theory. As a result, a digitized model relating extra attitude angle errors to time synchronization errors of the space-borne stereo mapping camera was put forward. In validation experiments, real attitude angle data of a stereo mapping satellite were collected and the extra attitude angle errors caused by specific time synchronization errors were obtained. Results of the experiments and analysis indicated that the extra attitude angle error could be reduced from 0.01939 arc seconds to 0.00003879 arc seconds as the time synchronization precision was improved from 1 ms to 20 µs.

  18. Narrow Angle Diversity using ACTS Ka-band Signal with Two USAT Ground Stations

    NASA Technical Reports Server (NTRS)

    Kalu, A.; Emrich, C.; Ventre, J.; Wilson, W.; Acosta, R.

    1998-01-01

    Two ultra small aperture terminal (USAT) ground stations, separated by 1.2 km in a narrow angle diversity configuration, received a continuous Ka-band tone sent from Cleveland Link Evaluation Terminal (LET). The signal was transmitted to the USAT ground stations via NASA's Advanced Communications Technology Satellite (ACTS) steerable beam. Received signal power at the two sites was measured and analyzed. A dedicated datalogger at each site recorded time-of-tip data from tipping bucket rain gauges, providing rain amount and instantaneous rain rate. WSR-88D data was also obtained for the collection period. Eleven events with ground-to-satellite slant-path precipitation and resultant signal attenuation were observed during the data collection period. Fade magnitude and duration were compared at the two sites and diversity gain was calculated. These results exceeded standard diversity gain model predictions by several decibels. Rain statistics from tipping bucket data and from radar data were also compared to signal attenuation. The nature of Florida's subtropical rainfall, specifically its impact on signal attenuation at the sites, was addressed.
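Diversity gain is conventionally computed by comparing, at equal exceedance probability, the single-site attenuation with the attenuation of the best of the two sites. A toy empirical estimator is sketched below; the actual experiment worked from measured beacon power, and standard prediction models (e.g. Hodge's) are more involved:

```python
def attenuation_at_probability(series_db, p):
    """Attenuation level (dB) exceeded for a fraction p of the
    time, estimated from an empirical time series."""
    s = sorted(series_db)
    return s[int((1.0 - p) * len(s))]

def diversity_gain(site1_db, site2_db, p=0.01):
    """Diversity gain at probability p: single-site attenuation
    minus joint (best-of-two-sites) attenuation at the same p."""
    joint = [min(a, b) for a, b in zip(site1_db, site2_db)]
    return (attenuation_at_probability(site1_db, p)
            - attenuation_at_probability(joint, p))
```

With site separations on the order of a kilometer, convective rain cells often affect only one path at a time, so the joint statistic fades far less than either site alone, which is exactly the effect the experiment quantified.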

  19. Large-angle pinhole gamma camera with depth-of-interaction detector for contamination monitoring

    NASA Astrophysics Data System (ADS)

    Baek, Cheol-Ha; Kim, Hyun-Il; Hwang, Ji Yeon; Jung An, Su; Kim, Kwang Hyun; Kwak, Sung-Woo; Chung, Yong Hyun

    2011-08-01

    The gamma camera system was designed for monitoring medical areas such as a radiopharmaceutical preparation lab or a patient waiting room (after source injection) in a division of nuclear medicine. However, gamma cameras equipped with a large-angle pinhole collimator and a thick monolithic crystal suffer from degradation of the spatial resolution at the periphery due to the parallax error of obliquely incident photons. To improve the uniformity of the spatial resolution across the field of view (FOV), we proposed a three-layer crystal detector with a maximum-likelihood position-estimation (MLPE) method, which can measure depth-of-interaction (DOI) information. The aim of this study was to develop and evaluate the performance of the new detector experimentally. The proposed detector employed three layers of monolithic CsI(Tl) crystals, each 50.0 × 50.0 × 2.0 mm³, and a large-angle pinhole collimator with an acceptance angle of 120°. The bottom surface of the third layer was directly coupled to an 8 × 8 channel position-sensitive photomultiplier tube (PSPMT, Hamamatsu H8500C). The PSPMT was read out using a resistive charge divider, which multiplexes 64 anodes into 8(X)+8(Y) channels. A Gaussian-based MLPE method was implemented using experimentally measured detector response functions (DRFs). A Tc-99m point source was imaged at different positions with and without DOI measurements. Experimental results showed that without DOI information the spatial resolution degraded gradually as the source moved from the center to the periphery of the FOV, but the DOI detector showed a marked improvement in spatial resolution, especially off-center, by correcting the parallax error. The new detector with DOI capability proved to characterize the gamma event position reliably, with high and uniform spatial resolution, so that the large-angle pinhole gamma camera could be a useful tool in contamination monitoring.
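The Gaussian-based MLPE step can be sketched as follows: given detector response functions (per-channel Gaussian means and widths measured at calibration positions), pick the position that maximizes the log-likelihood of the measured channel signals. All numbers below are invented for illustration, not taken from the paper:

```python
import math

def mlpe_position(measured, drfs):
    """Return the calibration position whose detector response
    function (per-channel Gaussian mean/sigma) maximizes the
    log-likelihood of the measured channel signals."""
    best_pos, best_ll = None, -math.inf
    for pos, channels in drfs.items():
        ll = 0.0
        for m, (mu, sigma) in zip(measured, channels):
            ll += -0.5 * ((m - mu) / sigma) ** 2 - math.log(sigma)
        if ll > best_ll:
            best_pos, best_ll = pos, ll
    return best_pos

# Toy DRFs for three calibration positions, two readout channels
# each; all values are illustrative.
drfs = {
    (0, 0):  [(100.0, 10.0), (20.0, 5.0)],
    (10, 0): [(60.0, 10.0), (60.0, 5.0)],
    (20, 0): [(20.0, 10.0), (100.0, 5.0)],
}
best = mlpe_position([58.0, 62.0], drfs)
```

In the real detector the search runs over a dense grid of (x, y, layer) hypotheses, which is what lets the DOI layers disambiguate obliquely incident photons at the edge of the FOV.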

  20. Synthesizing wide-angle and arbitrary view-point images from a circular camera array

    NASA Astrophysics Data System (ADS)

    Fukushima, Norishige; Yendo, Tomohiro; Fujii, Toshiaki; Tanimoto, Masayuki

    2006-02-01

    We propose an Image-Based Rendering (IBR) technique using a circular camera array. By recording the scene from a surrounding circle of viewpoints, we can synthesize more dynamic arbitrary-viewpoint images as well as wide-angle, panorama-like images. The method is based on Ray-Space, an image-based rendering representation similar to the Light Field. Ray-Space describes each ray by the position (x, y) and direction (θ, φ) at which it crosses a reference plane. When the cameras are arranged on a circle, the locus of a point, which corresponds to a straight line in the Epipolar Plane Image (EPI) for a linear camera arrangement, becomes a sine curve. Although this description is compact, determining which pixel of which camera to use during rendering becomes complicated. We therefore re-describe the space by camera position (s, t) and pixel position (u, v), as in the Light Field, expressing the camera position in polar coordinates (r, θ) to stay close to the Ray-Space description. The locus of a point then becomes a complicated periodic function with period 2π, but rendering becomes easy to handle. From such a space, just as with a linear arrangement, arbitrary-viewpoint images are synthesized purely from the geometric relationship between cameras. Moreover, taking advantage of the rays converging on the center of the circle, we propose a technique for generating wide-angle, panorama-like images: since rays of all directions through the same position are recorded redundantly, this synthesis becomes possible. The discussion so far assumes a densely arranged camera array that satisfies plenoptic sampling; we now turn to the discrete case, where the sampling condition is not met.
    When cameras are arranged on a straight line and an image is synthesized, a focus-like effect appears in spite of the pinhole camera model. This effect, peculiar to the Light Field when sampling is incomplete, is called a synthetic aperture. We have synthesized all-in-focus images by applying a procedure called an "adaptive filter", which builds a fully viewpoint-dependent disparity map centered on the desired viewpoint. The same phenomenon occurs with a circular arrangement, so we extend the adaptive filter to circular camera arrangements and synthesize all-in-focus images there as well. Although problems arise in the circular case, such as the epipolar lines no longer being parallel, we show that the extension follows from geometric information alone. With this method, we succeeded in synthesizing wide-angle and arbitrary-viewpoint images from both discretely sampled and fully sampled camera arrays.

  1. Mesosphere light scattering depolarization during the Perseids activity epoch by wide-angle polarization camera measurements

    NASA Astrophysics Data System (ADS)

    Ugolnikov, Oleg S.; Maslov, Igor A.

    2014-03-01

    The paper describes a study of the scattered radiation field in the mesosphere based on wide-angle polarization camera (WAPC) measurements of the twilight sky background and a single-scattering separation procedure. Mid-August observations in 2012 and 2013 show a decrease of the single-scattering polarization value, probably related to Perseid meteor dust in the upper mesosphere. The effect correlates with the activity of a tiny fraction of the Perseid shower. Polarization and temperature analysis allows estimation of the altitude of the dust layer and characterization of the polarization of dust scattering.

  2. Development of soft x-ray large solid angle camera onboard WF-MAXI

    NASA Astrophysics Data System (ADS)

    Kimura, Masashi; Tomida, Hiroshi; Ueno, Shiro; Kawai, Nobuyuki; Yatsu, Yoichi; Arimoto, Makoto; Mihara, Tatehiro; Serino, Motoko; Tsunemi, Hiroshi; Yoshida, Atsumasa; Sakamoto, Takanori; Kohmura, Takayoshi; Negoro, Hitoshi

    2014-07-01

    Wide-Field MAXI (WF-MAXI) is planned to be installed on the Japanese Experiment Module "Kibo" Exposed Facility of the International Space Station (ISS). WF-MAXI consists of two types of cameras, the Soft X-ray Large Solid Angle Camera (SLC) and the Hard X-ray Monitor (HXM). HXM comprises multi-channel arrays of CsI scintillators coupled with avalanche photodiodes (APDs), covering the energy range of 20 - 200 keV. SLC is an array of CCDs, an evolved version of MAXI/SSC. Instead of the slit and collimator of SSC, SLC is equipped with a coded mask, extending its field of view to 20% of the sky at any given time and its location determination accuracy to a few arcminutes. In order to achieve a larger effective area, the number of CCD chips and the size of each chip will be larger than those of SSC. We are planning to use 59 x 31 mm² CCD chips provided by Hamamatsu Photonics. Each camera will be equipped with 16 CCDs, and a total of 4 cameras will be installed in WF-MAXI. Since SLC utilizes X-ray CCDs, it must include an active cooling system for the CCDs. Instead of a Peltier cooler, we use mechanical coolers of the kind also employed in Astro-H; in this way we can cool the CCDs down to -100°C. The ISS orbits the Earth in 90 minutes; therefore a point source moves 4 arcminutes per second. In order to achieve the location determination accuracy, we need fast readout from the CCDs. The pulse heights are stacked into a single row along the vertical direction: charge is transferred continuously, so the spatial information along the vertical direction is lost and replaced with precise arrival time information. Currently we are building an experimental model of the camera body, including the CCDs and their electronics. In this paper, we show the development status of SLC.

  3. A New Approach to Micro-arcsecond Astrometry with SIM Allowing Early Mission Narrow Angle Measurements of Compelling Astronomical Targets

    NASA Technical Reports Server (NTRS)

    Shaklan, Stuart; Pan, Xiaopei

    2004-01-01

    The Space Interferometry Mission (SIM) is capable of detecting and measuring the mass of terrestrial planets around stars other than our own. It can measure the mass of black holes and the visual orbits of radio and x-ray binary sources. SIM makes possible a new level of understanding of complex astrophysical processes. SIM achieves its high precision in the so-called narrow-angle regime. This is defined by a 1 degree diameter field in which the position of a target star is measured with respect to a set of reference stars. The observation is performed in two parts: first, SIM observes a grid of stars that spans the full sky. After a few years, repeated observations of the grid allow one to determine the orientation of the interferometer baseline. Second, throughout the mission, SIM periodically observes in the narrow-angle mode. Every narrow-angle observation is linked to the grid to determine the precise attitude and length of the baseline. The narrow angle process demands patience. It is not until five years after launch that SIM achieves its ultimate accuracy of 1 microarcsecond. The accuracy is degraded by a factor of approx. 2 at mid-mission. Our work proposes a technique for narrow angle astrometry that does not rely on the measurement of grid stars. This technique, called Gridless Narrow Angle Astrometry (GNAA) can obtain microarcsecond accuracy and can detect extra-solar planets and other exciting objects with a few days of observation. It can be applied as early as during the first six months of in-orbit calibration (IOC). The motivations for doing this are strong. First, and obviously, it is an insurance policy against a catastrophic mid-mission failure. Second, at the start of the mission, with several space-based interferometers in the planning or implementation phase, NASA will be eager to capture the public's imagination with interferometric science. 
Third, early results and a technique that can duplicate those results throughout the mission will give the analysts important experience in the proper use and calibration of SIM.

  4. Limitations of the narrow-angle convergent pair. [of Viking Orbiter photographs for triangulation and topographic mapping

    NASA Technical Reports Server (NTRS)

    Arthur, D. W. G.

    1977-01-01

    Spatial triangulations and topographies of the Martian surface derived from Viking Orbiter pictures depend on the use of symmetric narrow-angle convergent pairs. The overlap in each pair is close to 100 percent and the ground principal points virtually coincide. The analysis of this paper reveals a high degree of indeterminacy in such pairs and at least in part explains the rather disappointing precision of the associated spatial triangulations.

  5. Empirical Photometric Normalization for the Seven Band UV-VIS Lunar Reconnaissance Orbiter Wide Angle Camera

    NASA Astrophysics Data System (ADS)

    Boyd, A. K.; Robinson, M. S.; Nuno, R. G.; Sato, H.

    2014-12-01

    We present results of a near-global (80°S to 80°N) seven-color Wide Angle Camera (WAC) photometric normalization and color analysis. Over 100,000 WAC color observations were calibrated to reflectance (radiance factor: IoF), and photometric angles (i, e, g), latitude, and longitude were calculated and stored for each WAC pixel. Photometric angles were calculated using the WAC GLD100 [1], and the six-dimensional data set (3 spatial and 3 photometric) was reduced to three dimensions by photometrically normalizing the IoF with a global, wavelength-dependent, 3rd-order multivariate polynomial. The multispectral mosaic was normalized to a standard viewing geometry (incidence angle = 30°, emission angle = 0°, phase angle = 30°). The WAC has a 60° cross-track field of view in color mode, which allows the acquisition of a near-global data set each month; however, the phase angle can change by as much as 60° across each image. These large changes in viewing geometry present challenges to the required photometric normalization. In the ratio of the 321 nm and 689 nm wavelengths, the Moon has a standard deviation of less than 3% in the highlands and 7% globally; thus, to allow confident identification of true color differences, the photometric normalization must be precise. Pyroclastic deposits in Marius Hills, Sinus Aestuum, and Mare Serenitatis are among the least reflective materials, with 643 nm normalized reflectance values less than 0.036. Low-reflectance deposits are generally concentrated close to the equator on the nearside, whereas high-reflectance materials are dispersed globally. The highest-reflectance materials occur at Giordano Bruno and Virtanen craters and are attributed to exposure of immature materials. Immature ejecta has a shallower spectral slope (321 nm to 689 nm) than the mean highlands spectrum, and UV weathering characteristics can be seen when comparing Copernican ejecta of different ages [2]. Copernican ejecta is found to have 643 nm reflectance values greater than 0.36 in some areas, so reflectance on the Moon spans a 10x range from the least to the most reflective materials. The new empirical normalized reflectance presented here correlates with an independent Hapke-model-based normalization [3] with an R-squared value of 0.985. [1] Scholten et al. LPSC XVII (2011) [2] Denevi et al. JGR Planets (2014) [3] Sato et al. JGR Planets (2014)
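
The normalization step described above can be sketched as follows: fit the reflectance as a 3rd-order multivariate polynomial in the photometric angles, then rescale each sample to the standard geometry. This is a minimal illustration with a generic monomial basis and a single band; the paper's exact polynomial form and per-wavelength fitting are not given here and the basis choice is an assumption.

```python
import numpy as np

def design_matrix(i, e, g, order=3):
    """All monomials i**a * e**b * g**c with a + b + c <= order (degrees)."""
    i, e, g = (np.asarray(v, dtype=float) for v in (i, e, g))
    cols = []
    for p in range(order + 1):
        for a in range(p + 1):
            for b in range(p - a + 1):
                cols.append(i**a * e**b * g**(p - a - b))
    return np.column_stack(cols)

def normalize_iof(iof, i, e, g, i0=30.0, e0=0.0, g0=30.0):
    """Fit IoF(i, e, g) by least squares and rescale every sample to the
    standard viewing geometry (i0, e0, g0)."""
    A = design_matrix(i, e, g)
    coef, *_ = np.linalg.lstsq(A, np.asarray(iof, dtype=float), rcond=None)
    ref = design_matrix([i0], [e0], [g0]) @ coef   # model at standard geometry
    return iof * ref[0] / (A @ coef)
```

In the real pipeline a separate fit would be made for each of the seven WAC bands; here one band stands in for all of them.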

  6. Small-angle approximation to the transfer of narrow laser beams in anisotropic scattering media

    NASA Technical Reports Server (NTRS)

    Box, M. A.; Deepak, A.

    1981-01-01

    The broadening of a laser beam traversing an anisotropic scattering medium, and the signal power detected, were examined using the small-angle approximation to the radiative transfer equation, in which photons suffering large-angle deflections are neglected. To obtain tractable answers, simple Gaussian and non-Gaussian functions were assumed for the scattering phase function. Two other approximate approaches employed in the field to further simplify the small-angle approximation solutions are described, and the results obtained by one of them are compared with those obtained using the small-angle approximation. An exact method for obtaining the contribution of each higher order of scattering to the radiance field is examined, but no results are presented.
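
For a Gaussian phase function the small-angle approximation admits a well-known closed-form moment solution (a standard multiple-scattering result, not taken from this paper): the angular and spatial variances of the beam grow with optical depth \(\tau\) as

```latex
\langle \theta^2 \rangle = \tau \, \langle \theta_s^2 \rangle,
\qquad
\langle r^2 \rangle = \tfrac{1}{3}\, \tau \, \langle \theta_s^2 \rangle \, z^2 ,
```

where \(\langle \theta_s^2 \rangle\) is the mean-square single-scattering deflection and \(z\) the penetration depth. Neglecting large-angle deflections is what keeps these moments finite and the solution tractable.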

  7. Automatic screening of narrow anterior chamber angle and angle-closure glaucoma based on slit-lamp image analysis by using support vector machine.

    PubMed

    Theeraworn, C; Kongprawechnon, W; Kondo, T; Bunnun, P; Nishihara, A; Manassakorn, A

    2013-01-01

    At present, Van Herick's method is the standard technique used to screen for a Narrow Anterior Chamber Angle (NACA) and Angle-Closure Glaucoma (ACG). It can identify a patient who suffers from NACA or ACG by considering the width of the peripheral anterior chamber depth (PACD) and the corneal thickness. However, the screening result of this method often varies among ophthalmologists. Therefore, an automatic screening of NACA and ACG based on slit-lamp image analysis using a Support Vector Machine (SVM) is proposed. The SVM automatically generates the classification model used to classify a case as angle-closure likely or angle-closure unlikely, improving the accuracy of the screening result. To develop the classification model, the widths of the PACD and the corneal thickness measured at several positions are selected as features. A statistical analysis is also used in the PACD and corneal thickness estimation to reduce the error from reflection on the cornea. In this study, the generated models, evaluated by 5-fold cross-validation, give better results than classification by Van Herick's method. PMID:24111078
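
The train-and-evaluate loop described above can be sketched with scikit-learn. Everything below is synthetic and illustrative: the feature layout (four measurement positions per quantity), the numeric ranges, and the Van Herick-style labeling rule are assumptions standing in for the paper's measured data and ophthalmologist grading.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(42)
n = 200
# synthetic features: PACD width and corneal thickness at 4 positions (mm)
pacd = rng.normal(0.30, 0.10, size=(n, 4))
cornea = rng.normal(0.55, 0.05, size=(n, 4))
X = np.hstack([pacd, cornea])
# Van Herick-style stand-in label: shallow PACD relative to corneal
# thickness suggests angle closure (1 = angle-closure likely)
y = (pacd.mean(axis=1) / cornea.mean(axis=1) < 0.5).astype(int)

# SVM classifier scored with 5-fold cross-validation, as in the abstract
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, X, y, cv=5)
print(scores.mean())
```

Standardizing the features before the RBF kernel matters here because PACD and thickness live on different scales.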

  8. The Mars observer camera

    NASA Technical Reports Server (NTRS)

    Malin, M. C.; Danielson, G. E.; Ingersoll, A. P.; Masursky, H.; Veverka, J.; Soulanille, T.; Ravine, M.

    1987-01-01

    A camera designed to operate under the extreme constraints of the Mars Observer Mission was selected by NASA in April, 1986. Contingent upon final confirmation in mid-November, the Mars Observer Camera (MOC) will begin acquiring images of the surface and atmosphere of Mars in September-October 1991. The MOC incorporates both a wide angle system for low resolution global monitoring and intermediate resolution regional targeting, and a narrow angle system for high resolution selective surveys. Camera electronics provide control of image clocking and on-board, internal editing and buffering to match whatever spacecraft data system capabilities are allocated to the experiment. The objectives of the MOC experiment follow.

  9. Measurement of the Atmospheric Limit to Narrow Angle Interferometric Astrometry Using the Mark III Stellar Interferometer

    NASA Astrophysics Data System (ADS)

    Colavita, M. M.

    1994-03-01

    Measurements were made with the Mark III stellar interferometer in order to verify predictions for the accuracy of very-narrow-angle interferometric astrometry. The Mark III was modified to observe simultaneously, on its 12-m baseline, the phase of the fringe packets of the primary and secondary of the long-period visual binary star α Gem. The residuals of the phase difference between primary and secondary were analyzed for 6 data segments taken over two nights. Examination of the Allan variances of the data out to a measurement limit of 8 min indicates that the error is white, as predicted. The mean fluctuations of the residuals correspond to an astrometric accuracy of 21 μas/√h, which is in good agreement with the predictions of atmospheric models. An accurate separation for α Gem was also determined: 3".2811 ± 0".01 at position angle 73°.23 ± 0°.15 for B1992.9589.
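
The white-noise diagnosis above has a simple numerical signature: for white residuals the Allan variance falls as 1/τ, so the astrometric accuracy improves as 1/√T (hence a figure quoted per √h). A minimal sketch, with an illustrative noise level and series length rather than the Mark III data:

```python
import numpy as np

def allan_variance(x, tau):
    """Non-overlapping Allan variance of series x at averaging length tau."""
    n = len(x) // tau
    means = x[: n * tau].reshape(n, tau).mean(axis=1)
    return 0.5 * np.mean(np.diff(means) ** 2)

rng = np.random.default_rng(1)
white = rng.normal(0.0, 21.0, 100_000)  # white phase-difference residuals

av1 = allan_variance(white, 1)
av100 = allan_variance(white, 100)
print(av100 / av1)  # ~0.01: variance drops by the averaging factor
```

A flat behavior of τ·AVAR(τ), as seen here, is what "the error is white" means operationally.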

  10. Utilizing Angled O-Ball Narrow-Diameter Implants to Solve the Restorative Challenge Posed by Alveolar Resorption: A Case Report.

    PubMed

    Patel, Paresh B

    2015-09-01

    An angled narrow-diameter implant has been introduced for use in cases where the atrophic edentulous ridge is wide enough to accommodate narrow-diameter implants but the necessary implant angulations would make it impossible to fabricate an esthetically acceptable overdenture. A case is described in which such implants were placed and restored. PMID:26355445

  11. Development of a large-angle pinhole gamma camera with depth-of-interaction capability for small animal imaging

    NASA Astrophysics Data System (ADS)

    Baek, C.-H.; An, S. J.; Kim, H.-I.; Choi, Y.; Chung, Y. H.

    2012-01-01

    A large-angle gamma camera was developed for imaging small animal models used in medical and biological research. A simulation study shows that a large field-of-view (FOV) system provides higher sensitivity than a typical pinhole gamma camera by reducing the distance between the pinhole and the object. However, such a gamma camera suffers from degraded spatial resolution in the peripheral region due to parallax error from obliquely incident photons. We propose a new method to measure the depth of interaction (DOI) using three layers of monolithic scintillators to reduce the parallax error. The detector module consists of three layers of monolithic CsI(Tl) crystals with dimensions of 50.0 × 50.0 × 2.0 mm³, a Hamamatsu H8500 PSPMT, and a large-angle pinhole collimator with an acceptance angle of 120°. The 3-dimensional event positions were determined by the maximum-likelihood position-estimation (MLPE) algorithm and a pre-generated look-up table (LUT). The spatial resolution (FWHM) of a Co-57 point-like source was measured at different source positions with the conventional method (Anger logic) and with DOI information. We show that high sensitivity can be achieved without degradation of spatial resolution using a large-angle pinhole gamma camera: this system can be used as a small-animal imaging tool.
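
A minimal 2-D sketch of MLPE with a pre-generated LUT: the Gaussian light-response model, sensor layout, and 5 mm grid pitch below are assumptions for illustration only, and the real system extends the estimate to a third dimension (three crystal layers) to obtain DOI.

```python
import numpy as np

def make_lut(grid, sensors, width=6.0):
    """Expected light fraction on each sensor for each candidate position
    (assumed Gaussian light response; not the authors' detector model)."""
    d = np.linalg.norm(grid[:, None, :] - sensors[None, :, :], axis=2)
    resp = np.exp(-(d / width) ** 2)
    return resp / resp.sum(axis=1, keepdims=True)

def mlpe(counts, lut):
    """Index of the LUT entry maximizing the Poisson log-likelihood."""
    expected = lut * counts.sum()
    loglik = (counts * np.log(expected + 1e-12) - expected).sum(axis=1)
    return int(np.argmax(loglik))

# candidate positions on a 50 x 50 mm crystal face, 5 mm pitch
xs = np.arange(0.0, 50.1, 5.0)
grid = np.array([(x, y) for x in xs for y in xs])
sensors = np.array([(x, y) for x in (5.0, 20.0, 35.0, 45.0)
                    for y in (5.0, 20.0, 35.0, 45.0)])
lut = make_lut(grid, sensors)

rng = np.random.default_rng(7)
true_idx = 60                               # grid point (25, 25)
counts = rng.poisson(lut[true_idx] * 2000)  # ~2000 detected photons
print(grid[mlpe(counts, lut)])
```

With realistic photon counts the likelihood strongly discriminates neighboring grid points, which is why a LUT search suffices in place of a continuous fit.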

  12. A triple axis double crystal multiple reflection camera for ultra small angle X-ray scattering

    NASA Astrophysics Data System (ADS)

    Lambard, Jacques; Lesieur, Pierre; Zemb, Thomas

    1992-06-01

    To extend the domain of small-angle X-ray scattering requires multiple-reflection crystals to collimate the beam. A double-crystal, triple-axis X-ray camera using multiple-reflection channel-cut crystals is described. Procedures for measuring the desmeared scattering cross-section on an absolute scale are described, as well as measurements from several typical samples: fibrils of collagen, 0.3 μm diameter silica spheres, 0.16 μm diameter interacting latex spheres, porous lignite coal, liquid crystals in a surfactant-water system, and a colloidal crystal of 0.32 μm diameter silica spheres.

  13. Numerical simulations of the bending of narrow-angle-tail radio jets by ram pressure or pressure gradients

    NASA Technical Reports Server (NTRS)

    Soker, Noam; Sarazin, Craig L.; O'Dea, Christopher P.

    1988-01-01

    Three-dimensional numerical hydrodynamic simulations are used to study the bending of radio jets. The simulations are compared with observations of jets in narrow-angle-tail radio sources. Two mechanisms for the observed bending are considered: direct bending of quasi-continuous jets by ram pressure from intergalactic gas and bending by pressure gradients in the interstellar gas of the host galaxy, the pressure gradients themselves being the result of ram pressure by intergalactic gas. It is shown that the pressure gradients are much less effective in bending jets, implying that the jets have roughly 30 times lower momentum fluxes if they are bent by this mechanism. Ram-pressure bending produces jets with 'kidney-shaped' cross sections; when observed from the side, these jets appear to have diffuse extensions on the downstream side. On the other hand, pressure-gradient bending causes the jets to be densest near their upstream side.
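
The competition between the two mechanisms can be framed with the standard ram-pressure bending estimate (a textbook balance, not taken from the simulations themselves): equating the centripetal force per unit length on the jet with the ram pressure acting across its width \(h\) gives

```latex
\frac{\rho_j v_j^2}{R_{\rm bend}} \;\sim\; \frac{\rho_{\rm igm}\, v_{\rm gal}^2}{h},
```

so the bending radius scales directly with the jet momentum flux \(\rho_j v_j^2\). A jet bent equally sharply by the weaker galactic pressure gradients must therefore carry a proportionally smaller momentum flux, which is the sense of the factor of roughly 30 quoted above.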

  14. A New Lunar Atlas: Mapping the Moon with the Wide Angle Camera

    NASA Astrophysics Data System (ADS)

    Speyerer, E.; Robinson, M. S.; Boyd, A.; Sato, H.

    2012-12-01

    The Lunar Reconnaissance Orbiter (LRO) spacecraft launched in June 2009 and began systematically mapping the lunar surface, providing a priceless dataset for the planetary science community and future mission planners. From 20 September 2009 to 11 December 2011, the spacecraft was in a nominal 50 km polar orbit, except for two month-long periods when a series of spacecraft maneuvers enabled low-altitude flyovers (as low as 22 km) of key exploration and scientifically interesting targets. One of the instruments, the Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) [1], captured nearly continuous synoptic views of the illuminated lunar surface. The WAC is a 7-band (321, 360, 415, 566, 604, 643, 689 nm) push-frame imager with a field of view of 60° in color mode and 90° in monochrome mode. This broad field of view enables the WAC to reimage nearly 50% (at the equator, where the orbit tracks are spaced furthest apart) of the terrain it imaged in the previous orbit. The visible bands of map-projected WAC images have a pixel scale of 100 m, while the UV bands have a pixel scale of 400 m due to 4x4 pixel on-chip binning that increases signal-to-noise. The nearly circular polar orbit and short (two-hour) orbital periods enable seamless mosaics of broad areas of the surface with uniform lighting and resolution. In March 2011, the LROC team released the first version of the global monochrome (643 nm) morphologic map [2], which was comprised of 15,000 WAC images collected over three periods. With the over 130,000 WAC images collected while the spacecraft was in the 50 km orbit, a new set of mosaics is being produced by the LROC team and will be released to the Planetary Data System. These new maps include an updated morphologic map with an improved set of images (limiting illumination variations and gores due to off-nadir observations for other instruments) and a new photometric correction derived from the LROC WAC dataset. In addition, a higher-sun (lower incidence angle) mosaic will also be released; this map has minimal shadows and highlights albedo differences. Seamless regional WAC mosaics acquired under multiple lighting geometries (sunlight coming from the east, overhead, and west) will also be produced for key areas of interest. These new maps use the latest terrain model (LROC WAC GLD100) [3], updated spacecraft ephemerides provided by the LOLA team [4], and an improved WAC distortion model [5] to provide accurate placement of each WAC pixel on the lunar surface. References: [1] Robinson et al. (2010) Space Sci. Rev. [2] Speyerer et al. (2011) LPSC, #2387. [3] Scholten et al. (2012) JGR. [4] Mazarico et al. (2012) J. of Geodesy [5] Speyerer et al. (2012) ISPRS Congress.
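
The signal-to-noise gain from the 4x4 on-chip binning mentioned above follows directly from shot-noise statistics: summing 16 Poisson pixels multiplies the signal by 16 but the noise by only √16 = 4. A quick illustrative simulation (the photon count per native pixel is an assumption, not a WAC detector number):

```python
import numpy as np

rng = np.random.default_rng(0)
mean_counts = 25.0  # photons per native UV pixel (illustrative)
pixels = rng.poisson(mean_counts, size=160_000).astype(float)

snr_single = pixels.mean() / pixels.std()
binned = pixels.reshape(-1, 16).sum(axis=1)  # 4x4 = 16-pixel bins
snr_binned = binned.mean() / binned.std()

print(snr_binned / snr_single)  # ~4: sqrt of the binning factor
```

The cost is the 4x coarser pixel scale quoted for the UV bands (400 m versus 100 m).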

  15. Erratum: The Wide Angle Camera of the ROSETTA Mission [Mem.SAIt 74, 434-435 (2003)]

    NASA Astrophysics Data System (ADS)

    Barbieri, C.; Fornasier, S.; Verani, S.; Bertini, I.; Lazzarin, M.; Rampazzi, F.; Cremonese, G.; Ragazzoni, R.; Marzari, F.; Angrilli, F.; Bianchini, G. A.; Debei, S.; Dececco, M.; Guizzo, G.; Parzianello, G.; Ramous, P.; Saggin, B.; Zaccariotto, M.; da Deppo, V.; Naletto, G.; Nicolosi, G.; Pelizzo, M. G.; Tondello, G.; Brunello, P.; Peron, F.

    The authors acknowledge that the paper fails to convey the correct information about the respective contributions and roles of the partners of the OSIRIS consortium. In particular, the hardware contributions to the Wide Angle Camera of the Max-Planck-Institut für Sonnensystemforschung, MPS (Katlenburg-Lindau, Germany, formerly MPAe), of the Instituto de Astrofisica de Andalucia (Granada, Spain), of the Department of Astronomy and Space Physics of Uppsala University (DASP), and of the ESA Research and Scientific Support Department (ESA/RSSD) were either not mentioned or incorrectly presented. The overall responsibility (PI-ship) of MPS (MPAe) for OSIRIS, and hence for the Wide Angle Camera, is not correctly mentioned either. The correct information is given in the paper by Keller et al. (2006, Space Science Review, in press). The authors take this opportunity to acknowledge that the activity of the Italian team has been partly supported by the Italian Space Agency ASI through a contract to CISAS.

  16. Early direct-injection, low-temperature combustion of diesel fuel in an optical engine utilizing a 15-hole, dual-row, narrow-included-angle nozzle.

    SciTech Connect

    Gehrke, Christopher R.; Radovanovic, Michael S.; Milam, David M.; Martin, Glen C.; Mueller, Charles J.

    2008-04-01

    Low-temperature combustion of diesel fuel was studied in a heavy-duty, single-cylinder optical engine employing a 15-hole, dual-row, narrow-included-angle nozzle (10 holes × 70° and 5 holes × 35°) with 103-μm-diameter orifices. This nozzle configuration provided the spray targeting necessary to contain the direct-injected diesel fuel within the piston bowl for injection timings as early as 70° before top dead center. Spray-visualization movies, acquired using a high-speed camera, show that impingement of liquid fuel on the piston surface can result when the in-cylinder temperature and density at the time of injection are sufficiently low. Seven single- and two-parameter sweeps around a 4.82-bar gross indicated mean effective pressure load point were performed to map the sensitivity of the combustion and emissions to variations in injection timing, injection pressure, equivalence ratio, simulated exhaust-gas recirculation, intake temperature, intake boost pressure, and load. High-speed movies of natural luminosity were acquired by viewing through a window in the cylinder wall and through a window in the piston to provide quasi-3D information about the combustion process. These movies revealed that advanced combustion phasing resulted in intense pool fires within the piston bowl after the end of significant heat release. These pool fires are a result of fuel films created when the injected fuel impinged on the piston surface. The emissions results showed a strong correlation with pool-fire activity. Smoke and NOx emissions rose steadily as pool-fire intensity increased, whereas HC and CO showed a dramatic increase with near-zero pool-fire activity.

  17. Miniature Wide-Angle Lens for Small-Pixel Electronic Camera

    NASA Technical Reports Server (NTRS)

    Mouroulis, Pantazis; Blazejewski, Edward

    2009-01-01

    A proposed wide-angle lens would be especially well suited for an electronic camera in which the focal plane is occupied by an image sensor with small pixels. The design of the lens is intended to satisfy requirements for compactness, high image quality, and reasonably low cost, while addressing issues peculiar to the operation of small-pixel image sensors. Hence, this design is expected to enable the development of a new generation of compact, high-performance electronic cameras. The example lens shown has a 60° field of view and a relative aperture (f-number) of 3.2. The main issues affecting the design are also discussed.

  18. A Wide-Angle Camera for the Mobile Asteroid Surface Scout (MASCOT) on Hayabusa-2

    NASA Astrophysics Data System (ADS)

    Schmitz, N.; Koncz, A.; Jaumann, R.; Hoffmann, H.; Jobs, D.; Kachlicki, J.; Michaelis, H.; Mottola, S.; Pforte, B.; Schroeder, S.; Terzer, R.; Trauthan, F.; Tschentscher, M.; Weisse, S.; Ho, T.-M.; Biele, J.; Ulamec, S.; Broll, B.; Kruselburger, A.; Perez-Prieto, L.

    2014-04-01

    JAXA's Hayabusa-2 mission, an asteroid sample-return mission, is scheduled for launch in December 2014 for a rendezvous with the C-type asteroid 1999 JU3 in 2018. MASCOT, the Mobile Asteroid Surface Scout [1], is a small lander designed to deliver ground truth for the orbiter's remote measurements, support the selection of sampling sites, and provide context for the returned samples. MASCOT's main objective is to investigate the landing site's geomorphology; the internal structure, texture, and composition of the regolith (dust, soil, and rocks); and the thermal, mechanical, and magnetic properties of the surface. MASCOT comprises a payload of four scientific instruments: camera, radiometer, magnetometer, and hyperspectral microscope. The camera (MASCOT CAM) was designed and built by DLR's Institute of Planetary Research, together with Airbus DS Germany.

  19. Fovea-stereographic: a projection function for ultra-wide-angle cameras

    NASA Astrophysics Data System (ADS)

    Samy, Ahmed Mahmoud; Gao, Zhishan

    2015-04-01

    A new ultra-wide-angle projection function called fovea-stereographic is described. It is characterized by the relative relationship between the radial distortion level and the object field-of-view (FOV) angle, creating a high-resolution wide foveal image with adequate peripheral information to be processed within a limited computational time. The paper also provides the design results of a fast fovea-stereographic fisheye lens system with a 170° FOV that yields a more than 58.8% (100°) high-resolution central foveal image and at least 15% more peripheral information than other projection functions. Our lens distortion curve, in addition to its modulation transfer function, produces a high-resolution projection for real-time tracking and image-transmission applications.

  20. On an assessment of surface roughness estimates from lunar laser altimetry pulse-widths for the Moon from LOLA using LROC narrow-angle stereo DTMs.

    NASA Astrophysics Data System (ADS)

    Muller, Jan-Peter; Poole, William

    2013-04-01

    Neumann et al. [1] proposed that laser altimetry pulse-widths could be employed to derive "within-footprint" surface roughness, as opposed to surface roughness estimated between laser altimetry pierce-points, such as the example for Mars [2] and, more recently, from the 4-pointed star-shaped LOLA (Lunar Reconnaissance Orbiter Laser Altimeter) onboard NASA's LRO [3]. Since 2009, LOLA has been collecting extensive global laser altimetry data with a 5 m footprint and ~25 m between the 5 points in a star shape. To assess how well surface roughness (defined as simple RMS after slope correction) derived from LROC matches surface roughness derived from LOLA footprints, publicly released LROC-NA (LRO Camera Narrow Angle) 1 m Digital Terrain Models (DTMs) were employed to measure the surface roughness directly within each 5 m footprint. A set of 20 LROC-NA DTMs were examined. Initially, the match-up between the LOLA data and the LROC-NA orthorectified images (ORIs) is assessed visually to ensure that the co-registration is better than the LOLA footprint resolution. For each LOLA footprint, the pulse-width geolocation is retrieved and used to "cookie-cut" the surface roughness and slopes derived from the LROC-NA DTMs. The investigation, which includes data from a variety of different landforms, shows little, if any, correlation between surface roughness estimated from DTMs and LOLA pulse-widths at sub-footprint scale. Perceptible correlation between LOLA and the LROC DTMs appears only at baselines of 40-60 m for surface roughness and 20 m for slopes. [1] Neumann et al. Mars Orbiter Laser Altimeter pulse width measurements and footprint-scale roughness. Geophysical Research Letters (2003) vol. 30 (11), paper 1561. DOI: 10.1029/2003GL017048 [2] Kreslavsky and Head. Kilometer-scale roughness of Mars: results from MOLA data analysis. J Geophys Res (2000) vol. 105 (E11) pp. 26695-26711. [3] Rosenburg et al. Global surface slopes and roughness of the Moon from the Lunar Orbiter Laser Altimeter. Journal of Geophysical Research (2011) vol. 116, paper E02001. DOI: 10.1029/2010JE003716 [4] Chin et al. Lunar Reconnaissance Orbiter Overview: The Instrument Suite and Mission. Space Science Reviews (2007) vol. 129 (4) pp. 391-419
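
The "RMS after slope correction" measure used above can be sketched as a plane fit plus residual RMS inside a DTM window. The axis-aligned square window below is a simplification of the actual LOLA footprint shape, and the synthetic terrain is illustrative:

```python
import numpy as np

def footprint_roughness(dtm, row, col, size=5):
    """RMS residual after best-fit plane removal in a size x size pixel
    window of a 1 m/pixel DTM (the 'cookie-cut' roughness)."""
    patch = dtm[row:row + size, col:col + size].astype(float)
    yy, xx = np.mgrid[0:size, 0:size]
    A = np.column_stack([xx.ravel(), yy.ravel(), np.ones(size * size)])
    coef, *_ = np.linalg.lstsq(A, patch.ravel(), rcond=None)
    resid = patch.ravel() - A @ coef
    return float(np.sqrt(np.mean(resid ** 2)))

# synthetic DTM: regional tilt plus 0.3 m RMS random relief; the plane fit
# removes the tilt, so the returned value reflects only the relief
rng = np.random.default_rng(3)
yy, xx = np.mgrid[0:100, 0:100]
dtm = 0.2 * xx + 0.1 * yy + rng.normal(0.0, 0.3, (100, 100))
print(footprint_roughness(dtm, 40, 40))
```

Removing the plane first is what separates small-scale roughness from the regional slope that would otherwise dominate the RMS.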

  1. Matching the best viewing angle in depth cameras for biomass estimation based on poplar seedling geometry.

    PubMed

    Andújar, Dionisio; Fernández-Quintanilla, César; Dorado, José

    2015-01-01

    In energy crops for biomass production, a proper plant structure is important to optimize wood yields. A precise crop characterization in early stages may contribute to the choice of proper cropping techniques. This study assesses the potential of the Microsoft Kinect for Windows v.1 sensor to determine the best viewing angle of the sensor for estimating plant biomass based on poplar seedling geometry. Kinect Fusion algorithms were used to generate a 3D point cloud from the depth video stream. The sensor was mounted in different positions facing the tree in order to obtain depth (RGB-D) images from different angles. Individuals of two different ages, i.e., one month and one year old, were scanned. Four different viewing angles were compared: top view (0°), 45° downwards view, front view (90°) and ground upwards view (-45°). The ground truth used to validate the sensor readings consisted of a destructive sampling in which the height, leaf area and biomass (dry weight basis) were measured in each individual plant. The depth image models agreed well with the 45°, 90° and -45° measurements in one-year poplar trees. Good correlations (0.88 to 0.92) between dry biomass and the area measured with the Kinect were found. In addition, plant height was accurately estimated with an error of a few centimeters. The comparison between different viewing angles revealed that top views showed poorer results because the top leaves occluded the rest of the tree. However, the other views led to good results. Conversely, small poplars showed better correlations with actual parameters from the top view (0°). Therefore, although the Microsoft Kinect for Windows v.1 sensor provides good opportunities for biomass estimation, the viewing angle must be chosen taking into account the developmental stage of the crop and the desired parameters. The results of this study indicate that Kinect is a promising tool for rapid canopy characterization, i.e., for estimating crop biomass production, with several important advantages: low cost, low power needs, and a high frame rate when dynamic measurements are required. PMID:26053748

  2. Matching the Best Viewing Angle in Depth Cameras for Biomass Estimation Based on Poplar Seedling Geometry

    PubMed Central

    Andújar, Dionisio; Fernández-Quintanilla, César; Dorado, José

    2015-01-01

    In energy crops for biomass production, a proper plant structure is important to optimize wood yields. A precise crop characterization in early stages may contribute to the choice of proper cropping techniques. This study assesses the potential of the Microsoft Kinect for Windows v.1 sensor to determine the best viewing angle of the sensor for estimating plant biomass based on poplar seedling geometry. Kinect Fusion algorithms were used to generate a 3D point cloud from the depth video stream. The sensor was mounted in different positions facing the tree in order to obtain depth (RGB-D) images from different angles. Individuals of two different ages, i.e., one month and one year old, were scanned. Four different viewing angles were compared: top view (0°), 45° downwards view, front view (90°) and ground upwards view (−45°). The ground truth used to validate the sensor readings consisted of a destructive sampling in which the height, leaf area and biomass (dry weight basis) were measured in each individual plant. The depth image models agreed well with the 45°, 90° and −45° measurements in one-year poplar trees. Good correlations (0.88 to 0.92) between dry biomass and the area measured with the Kinect were found. In addition, plant height was accurately estimated with an error of a few centimeters. The comparison between different viewing angles revealed that top views showed poorer results because the top leaves occluded the rest of the tree. However, the other views led to good results. Conversely, small poplars showed better correlations with actual parameters from the top view (0°). Therefore, although the Microsoft Kinect for Windows v.1 sensor provides good opportunities for biomass estimation, the viewing angle must be chosen taking into account the developmental stage of the crop and the desired parameters. The results of this study indicate that Kinect is a promising tool for rapid canopy characterization, i.e., for estimating crop biomass production, with several important advantages: low cost, low power needs, and a high frame rate when dynamic measurements are required. PMID:26053748

  3. Lunar Reconnaissance Orbiter Camera (LROC) instrument overview

    USGS Publications Warehouse

    Robinson, M.S.; Brylow, S.M.; Tschimmel, M.; Humm, D.; Lawrence, S.J.; Thomas, P.C.; Denevi, B.W.; Bowman-Cisneros, E.; Zerr, J.; Ravine, M.A.; Caplinger, M.A.; Ghaemi, F.T.; Schaffner, J.A.; Malin, M.C.; Mahanti, P.; Bartels, A.; Anderson, J.; Tran, T.N.; Eliason, E.M.; McEwen, A.S.; Turtle, E.; Jolliff, B.L.; Hiesinger, H.

    2010-01-01

    The Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) and Narrow Angle Cameras (NACs) are on the NASA Lunar Reconnaissance Orbiter (LRO). The WAC is a 7-color push-frame camera (100 and 400 m/pixel visible and UV, respectively), while the two NACs are monochrome narrow-angle linescan imagers (0.5 m/pixel). The primary mission of LRO is to obtain measurements of the Moon that will enable future human lunar exploration. The overarching goals of the LROC investigation include landing-site identification and certification, mapping of permanently shadowed and sunlit polar regions, meter-scale mapping of polar regions, global multispectral imaging, a global morphology base map, characterization of regolith properties, and determination of current impact hazards.

  4. Mars Observer camera

    NASA Technical Reports Server (NTRS)

    Malin, M. C.; Danielson, G. E.; Ingersoll, A. P.; Masursky, H.; Veverka, J.; Ravine, M. A.; Soulanille, T. A.

    1992-01-01

    The Mars Observer camera (MOC) is a three-component system (one narrow-angle and two wide-angle cameras) designed to take high spatial resolution pictures of the surface of Mars and to obtain lower spatial resolution, synoptic coverage of the planet's surface and atmosphere. The cameras are based on the 'push broom' technique; that is, they do not take 'frames' but rather build pictures, one line at a time, as the spacecraft moves around the planet in its orbit. MOC is primarily a telescope for taking extremely high resolution pictures of selected locations on Mars. Using the narrow-angle camera, areas ranging from 2.8 km x 2.8 km to 2.8 km x 25.2 km (depending on available internal digital buffer memory) can be photographed at about 1.4 m/pixel. Additionally, lower-resolution pictures (to a lowest resolution of about 11 m/pixel) can be acquired by pixel averaging; these images can be much longer, ranging up to 2.8 x 500 km at 11 m/pixel. High-resolution data will be used to study sediments and sedimentary processes, polar processes and deposits, volcanism, and other geologic/geomorphic processes.

  5. Erratum: First Results from the Wide Angle Camera of the ROSETTA Mission [Mem.SAIt Suppl. 6, 28-33 (2005)]

    NASA Astrophysics Data System (ADS)

    Barbieri, C.; Fornasier, S.; Bertini, I.; Angrilli, F.; Bianchini, G. A.; Debei, S.; de Cecco, M.; Parzianello, G.; Zaccariotto, M.; da Deppo, V.; Naletto, G.

    The authors acknowledge that the paper fails to convey the correct information about the respective contributions and roles of the partners of the OSIRIS consortium. In particular, the hardware contributions to the Wide Angle Camera of the Max-Planck-Institut für Sonnensystemforschung, MPS (Katlenburg-Lindau, Germany, formerly MPAe), of the Instituto de Astrofisica de Andalucia (Granada, Spain), of the Department of Astronomy and Space Physics of Uppsala University (DASP), and of the ESA Research and Scientific Support Department (ESA/RSSD) were either not mentioned or incorrectly presented. The overall responsibility (PI-ship) of MPS (MPAe) for OSIRIS, and hence for the Wide Angle Camera, is not correctly mentioned either. The correct information is given in the paper by Keller et al. (2006, Space Science Review, in press). The authors take this opportunity to acknowledge that the activity of the Italian team has been partly supported by the Italian Space Agency ASI through a contract to CISAS.

  6. The rate and causes of lunar space weathering: Insights from Lunar Reconnaissance Orbiter Wide Angle Camera ultraviolet observations

    NASA Astrophysics Data System (ADS)

    Denevi, B. W.; Robinson, M. S.; Sato, H.; Hapke, B. W.; McEwen, A. S.; Hawke, B. R.

    2011-12-01

    Lunar Reconnaissance Orbiter Wide Angle Camera global ultraviolet and visible imaging provides a unique opportunity to examine the rate and causes of space weathering on the Moon. Silicates typically have a strong decrease in reflectance toward UV wavelengths (<~450 nm) due to strong bands at 250 nm and in the far UV. Metallic iron is relatively spectrally neutral, and laboratory spectra suggest that its addition to mature soils in the form of submicroscopic iron (also known as nanophase iron) flattens silicate spectra, significantly reducing spectral slope in the ultraviolet. Reflectance at ultraviolet wavelengths may be especially sensitive to the surface coatings that form due to exposure to space weathering because scattering from the surfaces of grains contributes a larger fraction to the reflectance spectrum at short wavelengths. We find that the UV slope (as measured by the 320/415 nm ratio) is a more sensitive measure of maturity than indexes based on visible and near-infrared wavelengths. Only the youngest features (less than ~100 Ma) retain a UV slope that is distinct from mature soils of the same composition. No craters >20 km have UV slopes that approach those observed in laboratory spectra of fresh lunar materials (powdered lunar rocks). While the 320/415 nm ratio increases by ~18% from powdered rocks to mature soils in laboratory samples, Giordano Bruno, the freshest large crater, only shows a 3% difference between fresh and mature materials. At the resolution of our UV data (400 m/pixel), we observe some small (<5 km) craters that show a ~14% difference in 320/415 nm ratio from their mature surroundings. UV observations show that Reiner Gamma has had significantly lower levels of space weathering than any of the Copernican craters we examined, and was the only region we found with a UV slope that approached laboratory values for fresh powdered rock samples. 
This is consistent with the hypothesis that its high albedo is due to magnetic shielding from solar wind sputtering effects. Furthermore, the observation that all Copernican craters we examined show some degree of space weathering, together with the extreme immaturity of Reiner Gamma materials, shows that space weathering of the surface and the resultant modification of UV spectra proceed at a fast rate and are dominated by solar wind sputtering. Comparisons of the UV trends on other airless bodies (i.e., asteroids and Mercury) may prove fruitful for understanding the relative rates and causes of space weathering across the inner solar system.
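
The 320/415 nm maturity metric discussed above can be sketched in a few lines. The reflectance values below are hypothetical, chosen only to reproduce the ~18% laboratory fresh-to-mature difference quoted in the abstract.

```python
def uv_slope_ratio(r320, r415):
    """320/415 nm reflectance ratio; a flatter (more mature) UV spectrum
    gives a higher ratio."""
    return r320 / r415

# Hypothetical reflectances (not from the paper), scaled so the mature
# soil shows the ~18% higher ratio reported for laboratory samples.
fresh = uv_slope_ratio(0.55, 1.00)   # powdered fresh rock
mature = uv_slope_ratio(0.65, 1.00)  # mature soil
change_pct = (mature - fresh) / fresh * 100
print(f"320/415 ratio increase, fresh -> mature: {change_pct:.1f}%")  # 18.2%
```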

  7. 6. VAL CAMERA CAR, DETAIL OF COMMUNICATION EQUIPMENT INSIDE CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    6. VAL CAMERA CAR, DETAIL OF COMMUNICATION EQUIPMENT INSIDE CAMERA CAR WITH CAMERA MOUNT IN FOREGROUND. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  8. 2. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH CAMERA STATION ABOVE LOOKING WEST TAKEN FROM RESERVOIR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  9. 7. VAL CAMERA CAR, DETAIL OF 'FLARE' OR TRAJECTORY CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. VAL CAMERA CAR, DETAIL OF 'FLARE' OR TRAJECTORY CAMERA INSIDE CAMERA CAR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  10. Seismic waveform variations seen via a vector narrow-angle one-way propagator for three-dimensional anisotropic inhomogeneous media.

    NASA Astrophysics Data System (ADS)

    Angus, D.; Thomson, C.; Pratt, R.

    2003-04-01

    Improvements in data quantity and quality from engineering to global scales need to be matched by improved waveform modelling tools based on physically-motivated approximations. Such approximations should relate directly to the local material properties and yet describe the frequency-dependent effects of wave propagation. The narrow-angle one-way seismic wave equation for three-dimensional anisotropic inhomogeneous media derived by Thomson (1999) is one such approach that produces finite-frequency waveforms. Although backscattering is neglected, the finite-difference implementation of this propagator should provide a sufficiently accurate, efficient and robust simulation of primary wave(s) passing through regions of smoothly-varying weak and/or strong anisotropy. We present characteristic waveform effects associated with conical points (acoustic axes) for rock elasticities representative of mantle, crustal and basin-scale applications. The effects of frequency-dependent wavetype coupling and rapidly-rotating polarization eigenvectors are expressed by: merging/splitting pulses; wavefront `tearing'; gaps/polarity-reversals; and incipient Hilbert-transform like first-motions. These examples have been computed only for homogeneous media to facilitate comparison with a separation-of-variables `exact' reference solution. The particular form of the one-way equation implemented for these homogeneous examples is part of a hierarchy of approximations which are easily implemented in Cartesian coordinates. For heterogeneous media where steeply-dipping and turning waves occur a curvilinear formulation is more appropriate and has been implemented for the narrow-angle limit. In this approach, the computational grid (i.e. curvilinear reference frame) attempts to track the true wavefronts and phases via ray tracing in a suitably-chosen reference medium. 
Our starting models have simple gradual transitions representing curved `interfaces' so we can explore the effects of wavetype coupling and focussing. In the longer term we hope to explore models as complicated as those from refraction/wide-angle-reflection profiling of the lithosphere. All the calculations are carried out on a single 1.5 GHz desktop PC with 1 GB RAM, and a typical run takes 10 minutes.

  11. X-ray camera for high- and small-angle x-ray diffraction studies of the drawing and annealing of polymers at Daresbury Synchrotron Radiation Source

    NASA Astrophysics Data System (ADS)

    Mahendrasingam, A.; Fuller, W.; Forsyth, V. T.; Oldman, R. J.; MacKerron, D.; Blundell, D. J.

    1992-01-01

A purpose-designed x-ray fiber diffraction camera has been constructed in the Keele University Physics Department for use at the SERC Daresbury Laboratory Synchrotron Radiation Source. The camera allows time-resolved studies of the changes in both the high- and low-angle diffraction patterns during drawing and annealing of polymer films. Drawing of the films is achieved by two opposed stepper motors, which allow films to be drawn uniaxially in both directions. The temperature of the sample environment can be controlled to within 1 °C by an RS (Radio Spares) proportional-integral-derivative (PID) controller. Diffraction patterns can be recorded on the Enraf-Nonius TV FAST detector or on photographic film. Exposure times using the FAST detector are typically 5 s, representing a gain of approximately a factor of 5 over photographic film. The FAST detector has a further advantage over photographic film in that an essentially unlimited number of diffraction patterns can be recorded end-to-end while a structural transition is being followed. A video camera is incorporated to allow the variation in the gross appearance of the specimen to be recorded during drawing and annealing and to be related to the variation in the diffraction pattern. The application of the camera to the study of drawing and annealing of poly(aryl ether ether ketone) (PEEK) is described.

  12. Reliability of sagittal plane hip, knee, and ankle joint angles from a single frame of video data using the GAITRite camera system.

    PubMed

    Ross, Sandy A; Rice, Clinton; Von Behren, Kristyn; Meyer, April; Alexander, Rachel; Murfin, Scott

    2015-01-01

The purpose of this study was to establish the intra-rater, intra-session, and inter-rater reliability of sagittal plane hip, knee, and ankle angles, with and without reflective markers, using the GAITRite walkway and a single video camera, comparing student physical therapists with an experienced physical therapist. This study included thirty-two healthy participants aged 20-59, stratified by age and gender. Participants performed three successful walks with and without markers applied to anatomical landmarks. GAITRite software was used to digitize sagittal hip, knee, and ankle angles at two phases of gait: (1) initial contact; and (2) mid-stance. Intra-rater reliability was more consistent for the experienced physical therapist, regardless of joint or phase of gait. Intra-session reliability was variable: the experienced physical therapist showed moderate to high reliability (intra-class correlation coefficient (ICC) = 0.50-0.89) and the student physical therapist showed very poor to high reliability (ICC = 0.07-0.85). Inter-rater reliability was highest during mid-stance at the knee with markers (ICC = 0.86) and lowest during mid-stance at the hip without markers (ICC = 0.25). The reliability of a single camera system, especially at the knee joint, shows promise. Depending on the specific type of reliability, error can be attributed to the testers (e.g., lack of digitization practice and marker placement), the participants (e.g., loose-fitting clothing), and the camera system (e.g., frame rate and resolution). However, until the camera technology can be upgraded to a higher frame rate and resolution, and the software can be linked to the GAITRite walkway, the clinical utility for pre/post measures is limited. PMID:25230893
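
The reliability values above are intra-class correlation coefficients. As an illustration only (the abstract does not say which ICC form the authors used), here is a compact ICC(2,1) (two-way random effects, absolute agreement, single measure) computed from a subjects-by-raters table:

```python
def icc_2_1(data):
    """ICC(2,1): two-way random effects, absolute agreement, single measure.
    data: one list per subject, each containing k ratings (one per rater)."""
    n, k = len(data), len(data[0])
    grand = sum(sum(row) for row in data) / (n * k)
    row_means = [sum(row) / k for row in data]
    col_means = [sum(data[i][j] for i in range(n)) / n for j in range(k)]
    msr = k * sum((m - grand) ** 2 for m in row_means) / (n - 1)   # subjects
    msc = n * sum((m - grand) ** 2 for m in col_means) / (k - 1)   # raters
    sse = sum((data[i][j] - row_means[i] - col_means[j] + grand) ** 2
              for i in range(n) for j in range(k))
    mse = sse / ((n - 1) * (k - 1))                                # residual
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Two raters in perfect agreement across three participants:
print(icc_2_1([[10.0, 10.0], [15.0, 15.0], [20.0, 20.0]]))  # 1.0
```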

  13. The measurement of in vivo joint angles during a squat using a single camera markerless motion capture system as compared to a marker based system.

    PubMed

    Schmitz, Anne; Ye, Mao; Boggess, Grant; Shapiro, Robert; Yang, Ruigang; Noehren, Brian

    2015-02-01

Markerless motion capture may have the potential to make motion capture technology widely practical in the clinic. However, the ability of a single-camera markerless system to quantify clinically relevant lower extremity joint angles has not been studied in vivo. Therefore, the goal of this study was to compare in vivo joint angles calculated using a marker-based motion capture system and a Microsoft Kinect during a squat. Fifteen individuals participated in the study: 8 male, 7 female; height 1.702 ± 0.089 m, mass 67.9 ± 10.4 kg, age 24 ± 4 years, BMI 23.4 ± 2.2 kg/m². Marker trajectories and Kinect depth map data of the leg were collected while each subject performed a slow squat motion. Custom code was used to export virtual marker trajectories for the Kinect data. Each set of marker trajectories was used to calculate Cardan knee and hip angles. The patterns of motion were similar between systems, with average absolute differences of <5 deg. Peak joint angles showed high between-trial reliability, with ICC > 0.9 for both systems. The peak angles calculated by the marker-based and Kinect systems were largely correlated (r > 0.55). These results suggest the Kinect data can be post-processed in a way that may make it a feasible markerless motion capture system for use in the clinic. PMID:25708833
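
Cardan angles like those compared above come from decomposing a segment-to-segment rotation matrix. A minimal round-trip sketch for a generic X-Y-Z sequence (the study's actual joint coordinate system conventions may differ):

```python
import math

def rx(a):
    c, s = math.cos(a), math.sin(a)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def ry(b):
    c, s = math.cos(b), math.sin(b)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def rz(g):
    c, s = math.cos(g), math.sin(g)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def cardan_xyz(R):
    """Extract X-Y-Z Cardan angles (radians) from R = Rx(a) @ Ry(b) @ Rz(g)."""
    beta = math.asin(R[0][2])
    alpha = math.atan2(-R[1][2], R[2][2])
    gamma = math.atan2(-R[0][1], R[0][0])
    return alpha, beta, gamma

# Round trip with knee-flexion-like angles (degrees): 30, 5, -10.
R = matmul(rx(math.radians(30)), matmul(ry(math.radians(5)), rz(math.radians(-10))))
print([round(math.degrees(x), 3) for x in cardan_xyz(R)])  # [30.0, 5.0, -10.0]
```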

  14. 7. VAL CAMERA STATION, INTERIOR VIEW OF CAMERA MOUNT, COMMUNICATION ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. VAL CAMERA STATION, INTERIOR VIEW OF CAMERA MOUNT, COMMUNICATION EQUIPMENT AND STORAGE CABINET. - Variable Angle Launcher Complex, Camera Stations, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  15. 3. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH THE VAL TO THE RIGHT, LOOKING NORTHEAST. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

16. 1. VARIABLE-ANGLE LAUNCHER CAMERA CAR, VIEW OF CAMERA CAR AND ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    1. VARIABLE-ANGLE LAUNCHER CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH CAMERA STATION ABOVE LOOKING NORTH TAKEN FROM RESERVOIR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  17. Easily Accessible Camera Mount

    NASA Technical Reports Server (NTRS)

    Chalson, H. E.

    1986-01-01

Modified mount enables fast alignment of movie cameras in explosion-proof housings. A screw on the side is readily reached through the side door of the housing. Mount includes right-angle drive mechanism containing two miter gears that turn a threaded shaft. Shaft drives movable dovetail clamping jaw that engages fixed dovetail plate on camera. Mechanism aligns camera in housing and secures it. Reduces installation time by 80 percent.

  18. 9. VIEW OF CAMERA STATIONS UNDER CONSTRUCTION INCLUDING CAMERA CAR ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    9. VIEW OF CAMERA STATIONS UNDER CONSTRUCTION INCLUDING CAMERA CAR ON RAILROAD TRACK AND FIXED CAMERA STATION 1400 (BUILDING NO. 42021) ABOVE, ADJACENT TO STATE HIGHWAY 39, LOOKING WEST, March 23, 1948. (Original photograph in possession of Dave Willis, San Diego, California.) - Variable Angle Launcher Complex, Camera Stations, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  19. Exploring the Moon with the Lunar Reconnaissance Orbiter Camera

    NASA Astrophysics Data System (ADS)

    Robinson, M. S.; Speyerer, E. J.; Boyd, A.; Waller, D.; Wagner, R. V.; Burns, K. N.

    2012-08-01

    The Lunar Reconnaissance Orbiter Camera (LROC) consists of three imaging systems: a Wide Angle Camera (WAC) and two Narrow Angle Cameras (NACs). Since entering lunar orbit in June of 2009, LROC has collected over 700,000 images. A subset of WAC images were reduced into a global morphologic basemap, a near-global digital elevation model, and multitemporal movie sequences that characterize illumination conditions of the polar regions. In addition, NAC observations were reduced to meter scale maps and digital elevation models of select regions of interest. These Reduced Data Record (RDR) products were publicly released through NASA's Planetary Data System to aid scientists and engineers in planning future lunar missions and addressing key science questions.

  20. LROC - Lunar Reconnaissance Orbiter Camera

    NASA Astrophysics Data System (ADS)

    Robinson, M. S.; Bowman-Cisneros, E.; Brylow, S. M.; Eliason, E.; Hiesinger, H.; Jolliff, B. L.; McEwen, A. S.; Malin, M. C.; Roberts, D.; Thomas, P. C.; Turtle, E.

    2006-12-01

The Lunar Reconnaissance Orbiter Camera (LROC) is designed to address two of the prime LRO measurement requirements. 1) Assess meter and smaller-scale features to facilitate safety analysis for potential lunar landing sites near polar resources, and elsewhere on the Moon. 2) Acquire multi-temporal synoptic imaging of the poles every orbit to characterize the polar illumination environment (100 m scale), identifying regions of permanent shadow and permanent or near-permanent illumination over a full lunar year. The LROC consists of two narrow-angle camera components (NACs) to provide 0.5-m scale panchromatic images over a 5-km swath, a wide-angle camera component (WAC) to provide images at a scale of 100 and 400 m in seven color bands over a 100-km swath, and a common Sequence and Compressor System (SCS). In addition to acquiring the two LRO prime measurement sets, LROC will return six other high-value datasets that support LRO goals, the Robotic Lunar Exploration Program (RLEP), and basic lunar science. These additional datasets include: 3) meter-scale mapping of regions of permanent or near-permanent illumination of polar massifs; 4) multiple co-registered observations of portions of potential landing sites and elsewhere for derivation of high-resolution topography through stereogrammetric and photometric stereo analyses; 5) a global multispectral map in 7 wavelengths (300-680 nm) to characterize lunar resources, in particular ilmenite; 6) a global 100-m/pixel basemap with incidence angles (60°-80°) favorable for morphologic interpretations; 7) sub-meter imaging of a variety of geologic units to characterize physical properties, variability of the regolith, and key science questions; and 8) meter-scale coverage overlapping with Apollo-era Panoramic images (1-2 m/pixel) to document the number of small impacts since 1971-1972, to ascertain hazards for future surface operations and interplanetary travel.

  1. Wide Angle Movie

    NASA Technical Reports Server (NTRS)

    1999-01-01

This brief movie illustrates the passage of the Moon through the Saturn-bound Cassini spacecraft's wide-angle camera field of view as the spacecraft passed by the Moon on the way to its closest approach with Earth on August 17, 1999. From beginning to end of the sequence, 25 wide-angle images (with a spatial image scale of about 14 miles (about 23 kilometers) per pixel) were taken over the course of 7 1/2 minutes through a series of narrow and broadband spectral filters and polarizers, ranging from the violet to the near-infrared regions of the spectrum, to calibrate the spectral response of the wide-angle camera. The exposure times range from 5 milliseconds to 1.5 seconds. Two of the exposures were smeared and have been discarded and replaced with nearby images to make a smooth movie sequence. All images were scaled so that the brightness of Crisium basin, the dark circular region in the upper right, is approximately the same in every image. The imaging data were processed and released by the Cassini Imaging Central Laboratory for Operations (CICLOPS) at the University of Arizona's Lunar and Planetary Laboratory, Tucson, AZ.

    Photo Credit: NASA/JPL/Cassini Imaging Team/University of Arizona

    Cassini, launched in 1997, is a joint mission of NASA, the European Space Agency and Italian Space Agency. The mission is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Space Science, Washington DC. JPL is a division of the California Institute of Technology, Pasadena, CA.
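
The per-frame scaling described in the abstract above (holding the apparent brightness of the Crisium region constant across the sequence) can be sketched as follows; the frame data and region indices are purely illustrative.

```python
def normalize_frames(frames, region):
    """Scale each frame so the mean brightness of a fixed reference region
    (standing in for the Crisium basin) matches that of the first frame.
    frames: list of 2-D lists of brightness values;
    region: (row_start, row_stop, col_start, col_stop)."""
    r0, r1, c0, c1 = region

    def region_mean(img):
        vals = [img[r][c] for r in range(r0, r1) for c in range(c0, c1)]
        return sum(vals) / len(vals)

    target = region_mean(frames[0])
    return [[[px * target / region_mean(img) for px in row] for row in img]
            for img in frames]

# Two tiny 2x2 "frames"; the second is twice as bright and is scaled down.
frames = [[[2.0, 2.0], [2.0, 2.0]], [[4.0, 4.0], [4.0, 4.0]]]
out = normalize_frames(frames, (0, 2, 0, 2))
print(out[1])  # [[2.0, 2.0], [2.0, 2.0]]
```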

  2. Is Perceptual Narrowing Too Narrow?

    ERIC Educational Resources Information Center

    Cashon, Cara H.; Denicola, Christopher A.

    2011-01-01

    There is a growing list of examples illustrating that infants are transitioning from having earlier abilities that appear more "universal," "broadly tuned," or "unconstrained" to having later abilities that appear more "specialized," "narrowly tuned," or "constrained." Perceptual narrowing, a well-known phenomenon related to face, speech, and…

  3. Determining iconometric parameters of imaging devices using a wide-angle collimator. [calibration of satellite-borne television and photographic cameras

    NASA Technical Reports Server (NTRS)

    Ziman, Y. L.

    1974-01-01

    The problem of determining the iconometric parameters of the imaging device can be solved if the camera being calibrated is used to obtain the image of a group of reference points, the directions to which are known. In order to specify the imaging device coordinate system, it is sufficient in principle to obtain on the picture the images of three reference points which do not lie on a single straight line. Many more such points are required in order to determine the distortion corrections, and they must be distributed uniformly over the entire field of view of the camera being calibrated. Experimental studies were made using this technique to calibrate photographic and phototelevision systems. Evaluation of the results of these experiments permits recommending collimators for calibrating television and phototelevision imaging systems, and also short-focus small-format photographic cameras.

  4. How are the nine MISR cameras arranged?

    Atmospheric Science Data Center

    2014-12-08

    ... and the D cameras the longest. With the exception of the A design, which is used for the nadir view as well as the near-nadir views, each ... denotes the camera angle, that is, the zenith angle of the optical axis of the camera. The small letters (f, n, a) denote whether the ...

  5. The DSLR Camera

    NASA Astrophysics Data System (ADS)

    Berkó, Ernő; Argyle, R. W.

Cameras have developed significantly in the past decade; in particular, digital Single-Lens Reflex Cameras (DSLR) have appeared. As a consequence we can buy cameras of higher and higher pixel number, and mass production has resulted in a great reduction of prices. CMOS sensors used for imaging are increasingly sensitive, and the electronics in the cameras allows images to be taken with much less noise. The supporting software is developing in a similar way: intelligent programs are being created for post-processing and other supplementary tasks. Nowadays we can find a digital camera in almost every household, and most of these cameras are DSLRs. These can be used very well for astronomical imaging, which is nicely demonstrated by the amount and quality of the spectacular astrophotos appearing in different publications. These examples also show how much post-processing software contributes to the rise in the standard of the pictures. To sum up, the DSLR camera serves as a cheap alternative to the CCD camera, with somewhat weaker technical characteristics. In the following, I will introduce how we can measure the main parameters (position angle and separation) of double stars, based on the methods, software and equipment I use. Others can easily apply these for their own circumstances.
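
A sketch of the position angle and separation computation the author describes. It assumes a calibrated frame with north up and east to the left, and a known plate scale; the function name and parameters are illustrative, not the author's software.

```python
import math

def pa_and_separation(dx_px, dy_px, scale_arcsec_per_px):
    """Position angle (degrees, measured from north through east) and
    separation (arcsec) of a companion offset (dx, dy) in pixels from the
    primary. Assumes north up (+y) and east to the left (-x), the usual
    sky convention for a calibrated frame."""
    north, east = dy_px, -dx_px
    sep = math.hypot(dx_px, dy_px) * scale_arcsec_per_px
    pa = math.degrees(math.atan2(east, north)) % 360.0
    return pa, sep

# Companion 10 px due east of the primary, at an assumed 0.5 arcsec/px:
pa, sep = pa_and_separation(-10.0, 0.0, 0.5)
print(round(pa, 6), round(sep, 6))  # 90.0 5.0
```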

  6. 6. VAL CAMERA STATION, VIEW FROM INTERIOR OUT OF WINDOW ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    6. VAL CAMERA STATION, VIEW FROM INTERIOR OUT OF WINDOW OPENING TOWARD VAL FIRING RANGE LOOKING SOUTHEAST WITH CAMERA MOUNT IN FOREGROUND. - Variable Angle Launcher Complex, Camera Stations, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  7. 8. VAL CAMERA CAR, CLOSEUP VIEW OF 'FLARE' OR TRAJECTORY ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    8. VAL CAMERA CAR, CLOSE-UP VIEW OF 'FLARE' OR TRAJECTORY CAMERA ON SLIDING MOUNT. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  8. 84. VIEW FROM CAMERA TOWER LOOKING SOUTHWEST SHOWING VAL FIRING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    84. VIEW FROM CAMERA TOWER LOOKING SOUTHWEST SHOWING VAL FIRING RANGE WITH OVERHEAD CAMERA AND CABLES, Date unknown, circa 1949. - Variable Angle Launcher Complex, Variable Angle Launcher, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  9. Ceres Photometry and Albedo from Dawn Framing Camera Images

    NASA Astrophysics Data System (ADS)

Schröder, S. E.; Mottola, S.; Keller, H. U.; Li, J.-Y.; Matz, K.-D.; Otto, K.; Roatsch, T.; Stephan, K.; Raymond, C. A.; Russell, C. T.

    2015-10-01

    The Dawn spacecraft is in orbit around dwarf planet Ceres. The onboard Framing Camera (FC) [1] is mapping the surface through a clear filter and 7 narrow-band filters at various observational geometries. Generally, Ceres' appearance in these images is affected by shadows and shading, effects which become stronger for larger solar phase angles, obscuring the intrinsic reflective properties of the surface. By means of photometric modeling we attempt to remove these effects and reconstruct the surface albedo over the full visible wavelength range. Knowledge of the albedo distribution will contribute to our understanding of the physical nature and composition of the surface.
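
As a hedged illustration of the kind of photometric modeling described above (the abstract does not specify the model the authors fit), the simplest disk-function correction for a dark airless surface divides out the Lommel-Seeliger term:

```python
import math

def remove_ls_shading(radiance_factor, i_deg, e_deg):
    """Divide a measured radiance factor (I/F) by the Lommel-Seeliger disk
    function D = 2*mu0/(mu0 + mu), a common first-order shading model for
    dark airless surfaces. A generic sketch, not the authors' model."""
    mu0 = math.cos(math.radians(i_deg))  # cosine of incidence angle
    mu = math.cos(math.radians(e_deg))   # cosine of emission angle
    disk = 2.0 * mu0 / (mu0 + mu)
    return radiance_factor / disk

# At normal incidence and emission the correction is the identity:
print(remove_ls_shading(0.09, 0.0, 0.0))  # 0.09
```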

  10. 9. WEST SIDE OF EAST TOWER CAMERA POINTED STRAIGHT DOWN ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    9. WEST SIDE OF EAST TOWER CAMERA POINTED STRAIGHT DOWN FOR THE BENEFIT OF ACROPHOBICS; NOTE EXTREMELY TURBULENT WATER FROM CURRENT - Tacoma Narrows Bridge, Spanning Narrows at State Route 16, Tacoma, Pierce County, WA

  11. The PAU Camera

    NASA Astrophysics Data System (ADS)

Casas, R.; Ballester, O.; Cardiel-Sas, L.; Carretero, J.; Castander, F. J.; Castilla, J.; Crocce, M.; de Vicente, J.; Delfino, M.; Fernández, E.; Fosalba, P.; García-Bellido, J.; Gaztañaga, E.; Grañena, F.; Jiménez, J.; Madrid, F.; Maiorino, M.; Martí, P.; Miquel, R.; Neissner, C.; Ponce, R.; Sánchez, E.; Serrano, S.; Sevilla, I.; Tonello, N.; Troyano, I.

    2011-11-01

The PAU Camera (PAUCam) is a wide-field camera designed to be mounted at the William Herschel Telescope (WHT) prime focus, located at the Observatorio del Roque de los Muchachos on the island of La Palma (Canary Islands). Its primary function is to carry out a cosmological survey, the PAU Survey, covering an area of several hundred square degrees of sky. Its purpose is to determine positions and distances using photometric redshift techniques. To achieve accurate photo-z's, PAUCam will be equipped with 40 narrow-band filters covering the range from 450 to 850 nm, and six broad-band filters, those of the SDSS system plus the Y band. To fully cover the focal plane delivered by the telescope optics, 18 CCDs of 2k x 4k pixels are needed. The pixels are square, 15 μm on a side. The optical characteristics of the prime focus corrector deliver a field of view in which eight of these CCDs will have an illumination of more than 95%, covering a field of 40 arcminutes. The rest of the CCDs will occupy the vignetted region, extending the field diameter to one degree. Two of the CCDs will be devoted to auto-guiding. This camera has some innovative features. Firstly, both the broad-band and the narrow-band filters will be placed in mobile trays, each hosting at most 16 filters. These are located inside the cryostat, a few millimeters in front of the CCDs when observing. Secondly, a pressurized liquid nitrogen tank outside the camera will feed a boiler inside the cryostat with a controlled mass flow. The read-out electronics will use the Monsoon architecture, originally developed by NOAO, modified and manufactured by our team in the frame of the DECam project (the camera used in the DES Survey). PAUCam will also be available to the astronomical community of the WHT.

  12. Automatic inference of geometric camera parameters and inter-camera topology in uncalibrated disjoint surveillance cameras

    NASA Astrophysics Data System (ADS)

    den Hollander, Richard J. M.; Bouma, Henri; Baan, Jan; Eendebak, Pieter T.; van Rest, Jeroen H. C.

    2015-10-01

    Person tracking across non-overlapping cameras and other types of video analytics benefit from spatial calibration information that allows an estimation of the distance between cameras and a relation between pixel coordinates and world coordinates within a camera. In a large environment with many cameras, or for frequent ad-hoc deployments of cameras, the cost of this calibration is high. This creates a barrier for the use of video analytics. Automating the calibration allows for a short configuration time, and the use of video analytics in a wider range of scenarios, including ad-hoc crisis situations and large scale surveillance systems. We show an autocalibration method entirely based on pedestrian detections in surveillance video in multiple non-overlapping cameras. In this paper, we show the two main components of automatic calibration. The first shows the intra-camera geometry estimation that leads to an estimate of the tilt angle, focal length and camera height, which is important for the conversion from pixels to meters and vice versa. The second component shows the inter-camera topology inference that leads to an estimate of the distance between cameras, which is important for spatio-temporal analysis of multi-camera tracking. This paper describes each of these methods and provides results on realistic video data.
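
The pixels-to-meters conversion enabled by the estimated tilt angle, focal length, and camera height can be sketched with a flat-ground pinhole model. This is an illustration of the geometry, not the paper's algorithm, and the parameter names are assumptions.

```python
import math

def ground_distance_m(pixel_row, cy, focal_px, tilt_deg, cam_height_m):
    """Horizontal distance to the ground point seen at a given image row,
    for a camera at height cam_height_m tilted down by tilt_deg.
    Flat-ground pinhole sketch of the pixels-to-meters conversion."""
    # Ray angle below the optical axis for this row (cy = principal point row).
    phi = math.atan2(pixel_row - cy, focal_px)
    # Total depression angle of the ray below the horizontal.
    depression = math.radians(tilt_deg) + phi
    return cam_height_m / math.tan(depression)

# A point on the optical axis of a camera 4 m up, tilted down 30 degrees:
print(round(ground_distance_m(540, 540, 1000.0, 30.0, 4.0), 2))  # 6.93
```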

13. 1. VARIABLE-ANGLE LAUNCHER (VAL) CONCRETE 'A' FRAME STRUCTURE SHOWING CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    1. VARIABLE-ANGLE LAUNCHER (VAL) CONCRETE 'A' FRAME STRUCTURE SHOWING CAMERA TOWER STRUCTURE LOOKING SOUTH AND ARCHED OPENING FOR ROADWAY. - Variable Angle Launcher Complex, Variable Angle Launcher, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

14. 1. VARIABLE-ANGLE LAUNCHER CAMERA STATION 1400 (BUILDING NO. 42021), VIEW ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    1. VARIABLE-ANGLE LAUNCHER CAMERA STATION 1400 (BUILDING NO. 42021), VIEW OF EXTERIOR LOOKING NORTHEAST WITH CAMERA STATION IN 1100 (BUILDING NO. 42020) BACKGROUND. - Variable Angle Launcher Complex, Camera Stations, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  15. Infrared Camera

    NASA Technical Reports Server (NTRS)

    1997-01-01

    A sensitive infrared camera that observes the blazing plumes from the Space Shuttle or expendable rocket lift-offs is capable of scanning for fires, monitoring the environment and providing medical imaging. The hand-held camera uses highly sensitive arrays in infrared photodetectors known as quantum well infrared photo detectors (QWIPS). QWIPS were developed by the Jet Propulsion Laboratory's Center for Space Microelectronics Technology in partnership with Amber, a Raytheon company. In October 1996, QWIP detectors pointed out hot spots of the destructive fires speeding through Malibu, California. Night vision, early warning systems, navigation, flight control systems, weather monitoring, security and surveillance are among the duties for which the camera is suited. Medical applications are also expected.

  16. Electronic Still Camera

    NASA Technical Reports Server (NTRS)

    Holland, S. Douglas (Inventor)

    1992-01-01

    A handheld, programmable, digital camera is disclosed that supports a variety of sensors and has program control over the system components to provide versatility. The camera uses a high performance design which produces near film quality images from an electronic system. The optical system of the camera incorporates a conventional camera body that was slightly modified, thus permitting the use of conventional camera accessories, such as telephoto lenses, wide-angle lenses, auto-focusing circuitry, auto-exposure circuitry, flash units, and the like. An image sensor, such as a charge coupled device ('CCD') collects the photons that pass through the camera aperture when the shutter is opened, and produces an analog electrical signal indicative of the image. The analog image signal is read out of the CCD and is processed by preamplifier circuitry, a correlated double sampler, and a sample and hold circuit before it is converted to a digital signal. The analog-to-digital converter has an accuracy of eight bits to insure accuracy during the conversion. Two types of data ports are included for two different data transfer needs. One data port comprises a general purpose industrial standard port and the other a high speed/high performance application specific port. The system uses removable hard disks as its permanent storage media. The hard disk receives the digital image signal from the memory buffer and correlates the image signal with other sensed parameters, such as longitudinal or other information. When the storage capacity of the hard disk has been filled, the disk can be replaced with a new disk.

  17. 3. VAL CAMERA STATION, DETAIL OF ROOF OVERHANG AND EXPOSED ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. VAL CAMERA STATION, DETAIL OF ROOF OVERHANG AND EXPOSED CONCRETE SURFACES. - Variable Angle Launcher Complex, Camera Stations, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  18. 5. VAL CAMERA CAR, DETAIL OF HOIST AT SIDE OF ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    5. VAL CAMERA CAR, DETAIL OF HOIST AT SIDE OF BRIDGE AND ENGINE CAR ON TRACKS, LOOKING NORTHEAST. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  19. NASA's Lunar Reconnaissance Orbiter Cameras (LROC)

    NASA Astrophysics Data System (ADS)

Robinson, M.; McEwen, A.; Eliason, E.; Jolliff, B.; Hiesinger, H.; Malin, M.; Thomas, P.; Turtle, E.; Brylow, S.

The Lunar Reconnaissance Orbiter (LRO) mission is scheduled to launch in the fall of 2008 as part of NASA's Robotic Lunar Exploration Program and is the first spacecraft to be built as part of NASA's Vision for Space Exploration. The orbiter will be equipped with seven scientific instrument packages, one of which is LROC. The Lunar Reconnaissance Orbiter Camera (LROC) has been designed to address two of LRO's primary measurement objectives: landing site certification and monitoring of polar illumination. In order to examine potential landing sites, high-resolution (0.5 m/pixel) images will be used to assess meter-scale features near the pole and other regions on the lunar surface. The LROC will also acquire 100 m/pixel images of the polar regions of the Moon during each orbit for a year to identify areas of permanent shadow and permanent or near-permanent illumination. In addition to these two main objectives, the LROC team also plans to conduct meter-scale monitoring of polar regions under varying illumination angles; acquire overlapping observations to enable derivation of meter-scale topography; acquire global multispectral imaging to map ilmenite and other minerals; derive a global morphology base map; characterize regolith properties; and determine current impact hazards by re-imaging areas covered by Apollo images to search for newly-formed impact craters. The LROC is a modified version of the Mars Reconnaissance Orbiter's Context Camera and Mars Color Imager. The LROC will be made up of four optical elements: two identical narrow-angle telescopes

  20. CCD Camera

    DOEpatents

    Roth, Roger R. (Minnetonka, MN)

    1983-01-01

    A CCD camera capable of observing a moving object which has varying intensities of radiation emanating therefrom and which may move at varying speeds is shown, wherein there is substantially no overlapping of successive images and wherein the exposure times and scan times may be varied independently of each other.

  2. Cassini Camera Contamination Anomaly: Experiences and Lessons Learned

    NASA Technical Reports Server (NTRS)

    Haemmerle, Vance R.; Gerhard, James H.

    2006-01-01

    We discuss the contamination 'Haze' anomaly for the Cassini Narrow Angle Camera (NAC), one of two optical telescopes that comprise the Imaging Science Subsystem (ISS). Cassini is a Saturn orbiter with a 4-year nominal mission. The incident occurred in 2001, five months after the Jupiter encounter during the cruise phase, and ironically at the resumption of planned maintenance decontamination cycles. The degraded optical performance was first identified by the Instrument Operations Team with the first ISS Saturn imaging six weeks later. A distinct haze of varying size from image to image marred the images of Saturn. A photometric star calibration of the Pleiades, 4 days after the incident, showed stars with halos. Analysis showed that while the halo's intensity was only 1-2% of the intensity of the central peak of a star, the halo contained 30-70% of its integrated flux. This condition would impact science return. In a review of our experiences, we examine the contamination control plan, discuss the analysis of the limited data available, and describe the one-year campaign to remove the haze from the camera. After several long, conservative heating activities and interim analysis of their results, the contamination problem as measured by the camera's point spread function was essentially back to pre-anomaly size and at a point where further heating would pose more risk. We stress the importance of flexibility in operations and instrument design, the need for early in-flight instrument calibration, and continual monitoring of instrument performance.

  3. 7. VAL CONCRETE 'A' FRAME STRUCTURE SHOWING CAMERA TOWER ARCHED ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. VAL CONCRETE 'A' FRAME STRUCTURE SHOWING CAMERA TOWER ARCHED OPENING FOR ROADWAY AND COUNTERWEIGHT SLOPE TAKEN FROM RESERVOIR LOOKING WEST. - Variable Angle Launcher Complex, Variable Angle Launcher, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  4. 11. VAL, DETAIL OF CAMERA TOWER AND THE TOP OF ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    11. VAL, DETAIL OF CAMERA TOWER AND THE TOP OF CONCRETE 'A' FRAME STRUCTURE LOOKING NORTH. - Variable Angle Launcher Complex, Variable Angle Launcher, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  5. 2. VAL CONCRETE 'A' FRAME STRUCTURE SHOWING CAMERA TOWER, PROJECTILE ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. VAL CONCRETE 'A' FRAME STRUCTURE SHOWING CAMERA TOWER, PROJECTILE LOADING DECK AND BREECH END OF LAUNCHER BRIDGE LOOKING SOUTH. - Variable Angle Launcher Complex, Variable Angle Launcher, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  6. Calibration and Epipolar Geometry of Generic Heterogeneous Camera Systems

    NASA Astrophysics Data System (ADS)

    Luber, A.; Rue, D.; Manthey, K.; Reulke, R.

    2012-07-01

    The application of perspective camera systems in photogrammetry and computer vision is state of the art. In recent years, non-perspective and especially omnidirectional camera systems have increasingly been used in close-range photogrammetry tasks. In general, the perspective camera model, i.e. the pinhole model, cannot be applied when using non-perspective camera systems. However, several camera models for different omnidirectional camera systems are proposed in the literature. Using different types of cameras in a heterogeneous camera system may lead to an advantageous combination. The advantages of different camera systems, e.g. field of view and resolution, result in a new enhanced camera system. If these different kinds of cameras can be modeled using a unified camera model, the total calibration process can be simplified. Sometimes it is not possible to give the specific camera model in advance; in these cases a generic approach is helpful. Furthermore, a simple stereo reconstruction becomes possible using, for example, a fisheye and a perspective camera. In this paper, camera models for perspective, wide-angle and omnidirectional camera systems are evaluated. The crucial initialization of the model's parameters is conducted using a generic method that is independent of the particular camera system. The accuracy of this generic camera calibration approach is evaluated by calibration of a dozen real camera systems. It is shown that a unified method of modeling, parameter approximation and calibration of interior and exterior orientation can be applied to derive 3D object data.
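
The perspective baseline that the generic approach above generalizes from is the pinhole model. A minimal sketch of that model follows; the intrinsic parameters used here are illustrative values, not taken from the paper.

```python
import numpy as np

def project_pinhole(points, fx, fy, cx, cy):
    """Project camera-frame 3D points (N, 3) to pixel coordinates (N, 2)
    with the ideal pinhole model: u = fx*X/Z + cx, v = fy*Y/Z + cy."""
    P = np.asarray(points, dtype=float)
    u = fx * P[:, 0] / P[:, 2] + cx
    v = fy * P[:, 1] / P[:, 2] + cy
    return np.stack([u, v], axis=1)

# A point on the optical axis projects to the principal point (cx, cy).
pts = project_pinhole([[0.0, 0.0, 2.0], [0.1, 0.0, 2.0]],
                      fx=1000.0, fy=1000.0, cx=640.0, cy=480.0)
```

For omnidirectional or fisheye optics, the linear division by Z above no longer describes the ray geometry, which is why a unified non-perspective model is needed.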

  7. Calibration of the Lunar Reconnaissance Orbiter Camera

    NASA Astrophysics Data System (ADS)

    Tschimmel, M.; Robinson, M. S.; Humm, D. C.; Denevi, B. W.; Lawrence, S. J.; Brylow, S.; Ravine, M.; Ghaemi, T.

    2008-12-01

    The Lunar Reconnaissance Orbiter Camera (LROC) onboard the NASA Lunar Reconnaissance Orbiter (LRO) spacecraft consists of three cameras: the Wide-Angle Camera (WAC) and two identical Narrow Angle Cameras (NAC-L, NAC-R). The WAC is a push-frame imager with 5 visible wavelength filters (415 to 680 nm) at a spatial resolution of 100 m/pixel and 2 UV filters (315 and 360 nm) with a resolution of 400 m/pixel. In addition to the multicolor imaging, the WAC can operate in monochrome mode to provide a global large-incidence-angle basemap and a time-lapse movie of the illumination conditions at both poles. The WAC has a highly linear response, a read noise of 72 e- and a full well capacity of 47,200 e-. The signal-to-noise ratio in each band is 140 in the worst case. There are no out-of-band leaks and the spectral response of each filter is well characterized. Each NAC is a monochrome pushbroom scanner, providing images with a resolution of 50 cm/pixel from a 50-km orbit. A single NAC image has a swath width of 2.5 km and a length of up to 26 km. The NACs are mounted to acquire side-by-side imaging for a combined swath width of 5 km. The NAC is designed to fully characterize future human and robotic landing sites in terms of topography and hazard risks. The North and South poles will be mapped at a 1-meter scale poleward of 85.5° latitude. Stereo coverage can be provided by pointing the NACs off-nadir. The NACs are also highly linear. Read noise is 71 e- for NAC-L and 74 e- for NAC-R, and the full well capacity is 248,500 e- for NAC-L and 262,500 e- for NAC-R. The focal lengths are 699.6 mm for NAC-L and 701.6 mm for NAC-R; the system MTF is 28% for NAC-L and 26% for NAC-R. The signal-to-noise ratio is at least 46 (terminator scene) and can be higher than 200 (high-sun scene). Both NACs exhibit a stray-light feature, which is caused by out-of-field sources and has a magnitude of 1-3%. However, as this feature is well understood, it can be greatly reduced during ground processing. All three cameras were calibrated in the laboratory under ambient conditions. Future thermal vacuum tests will characterize critical behaviors across the full range of lunar operating temperatures. In-flight tests will check for changes in response after launch and provide key data for meeting the requirements of 1% relative and 10% absolute radiometric calibration.
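
The quoted signal-to-noise figures are consistent with a simple shot-noise-plus-read-noise model. The sketch below uses the NAC-L read noise stated in the abstract; the signal levels are back-solved illustrations, not values from the paper.

```python
import math

def snr(signal_e, read_noise_e):
    """Signal-to-noise ratio for a mean signal (in electrons) limited by
    photon shot noise plus detector read noise."""
    return signal_e / math.sqrt(signal_e + read_noise_e ** 2)

# With the NAC-L read noise of 71 e-, the quoted SNR floor of ~46 and
# ceiling of >200 correspond roughly to these (assumed) signal levels:
snr_low = snr(4500, 71)    # faint terminator scene, ~46
snr_high = snr(45000, 71)  # bright high-sun scene, ~200
```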

  8. Pinhole Cameras: For Science, Art, and Fun!

    ERIC Educational Resources Information Center

    Button, Clare

    2007-01-01

    A pinhole camera is a camera without a lens. A tiny hole replaces the lens, and light is allowed to come in for a short amount of time by means of a hand-operated shutter. The pinhole allows only a very narrow beam of light to enter, which reduces confusion due to scattered light on the film. This results in an image that is focused, reversed, and

  9. Bacterial motion in narrow capillaries.

    PubMed

    Ping, Liyan; Wasnik, Vaibhav; Emberly, Eldon

    2015-02-01

    Motile bacteria often have to pass through small tortuous pores in soil or tissue of higher organisms. However, their motion in this prevalent type of niche is not fully understood. Here, we modeled it with narrow glass capillaries and identified a critical radius (Rc) for bacterial motion. Near the surface of capillaries narrower than that, the swimming trajectories are helices. In larger capillaries, they swim in distorted circles. Under non-slip condition, the peritrichous Escherichia coli swam in left-handed helices with an Rc of ~10 μm near the glass surface. However, slipping could occur in the fast monotrichous Pseudomonas fluorescens, when a speed threshold was exceeded, and thus both left-handed and right-handed helices were executed in glass capillaries. In the natural non-cylindrical pores, the near-surface trajectories would be spirals and twisted loops. Engaging in such motions reduces the bacterial migration rate. With a given pore size, the run length and the tumbling angle of the bacterium determine the probability and duration of their near-surface motion. Shear flow and chemotaxis potentially enhance it. Based on this observation, the puzzling previous observations on bacterial migration in porous environments can be interpreted. PMID:25764548

  11. 11. COMPLETED FIXED CAMERA STATION 1700 (BUILDING NO. 42022) LOOKING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    11. COMPLETED FIXED CAMERA STATION 1700 (BUILDING NO. 42022) LOOKING WEST SHOWING WINDOW OPENING FOR CAMERA, March 31, 1948. (Original photograph in possession of Dave Willis, San Diego, California.) - Variable Angle Launcher Complex, Camera Stations, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  12. Omnidirectional Underwater Camera Design and Calibration

    PubMed Central

    Bosch, Josep; Gracias, Nuno; Ridao, Pere; Ribas, David

    2015-01-01

    This paper presents the development of an underwater omnidirectional multi-camera system (OMS) based on a commercially available six-camera system, originally designed for land applications. A full calibration method is presented for the estimation of both the intrinsic and extrinsic parameters, which is able to cope with wide-angle lenses and non-overlapping cameras simultaneously. This method is valid for any OMS in both land and water applications. For underwater use, a customized housing is required, which often leads to strong image distortion due to refraction among the different media. This phenomenon makes the basic pinhole camera model invalid for underwater cameras, especially when using wide-angle lenses, and requires the explicit modeling of the individual optical rays. To address this problem, a ray tracing approach has been adopted to create a field-of-view (FOV) simulator for underwater cameras. The simulator allows for the testing of different housing geometries and optics for the cameras to ensure a complete hemisphere coverage in underwater operation. This paper describes the design and testing of a compact custom housing for a commercial off-the-shelf OMS camera (Ladybug 3) and presents the first results of its use. A proposed three-stage calibration process allows for the estimation of all of the relevant camera parameters. Experimental results are presented, which illustrate the performance of the calibration method and validate the approach. PMID:25774707
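
The ray-tracing FOV simulator described above hinges on refracting each optical ray at the housing interfaces. A minimal sketch of Snell's law in vector form, the core operation such a simulator needs, is shown below; the flat air-to-water port is purely illustrative and not the paper's housing geometry.

```python
import numpy as np

def refract(d, n, n1, n2):
    """Refract unit direction d at an interface with unit normal n
    (n oriented against the incoming ray), going from index n1 to n2.
    Returns None on total internal reflection."""
    d = np.asarray(d, dtype=float)
    n = np.asarray(n, dtype=float)
    eta = n1 / n2
    cos_i = -float(np.dot(n, d))
    k = 1.0 - eta ** 2 * (1.0 - cos_i ** 2)
    if k < 0.0:
        return None  # total internal reflection
    return eta * d + (eta * cos_i - np.sqrt(k)) * n

# A ray striking a flat air-to-water port at 30 degrees bends toward the
# normal (Snell: sin(t) = sin(30)/1.33, i.e. about 22 degrees).
d_in = np.array([np.sin(np.radians(30.0)), 0.0, np.cos(np.radians(30.0))])
d_out = refract(d_in, np.array([0.0, 0.0, -1.0]), n1=1.0, n2=1.33)
```

Tracing every pixel's ray through such refractions, rather than assuming a single pinhole center, is what makes the simulated underwater FOV accurate.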

  14. 5. VAL CAMERA STATION, VIEW FROM INTERIOR OUT OF WINDOW ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    5. VAL CAMERA STATION, VIEW FROM INTERIOR OUT OF WINDOW OPENING TOWARD VAL FIRING RANGE LOOKING EAST WITH VARIABLE ANGLE LAUNCHER IN BACKGROUND. - Variable Angle Launcher Complex, Camera Stations, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  15. New two-dimensional photon camera

    NASA Technical Reports Server (NTRS)

    Papaliolios, C.; Mertz, L.

    1982-01-01

    A photon-sensitive camera, applicable to speckle imaging of astronomical sources, high-resolution spectroscopy of faint galaxies in a crossed-dispersion spectrograph, or narrow-band direct imaging of galaxies, is presented. The camera is shown to supply 8-bit by 8-bit photon positions (256 x 256 pixels) for as many as 10^6 photons/s with a maximum linear resolution of approximately 10 microns. The sequence of photon positions is recorded digitally with a VHS-format video tape recorder or formed into an immediate image via a microcomputer. The four basic elements of the camera are described in detail: a high-gain image intensifier with a fast-decay output phosphor, a glass-prism optical beam splitter, a set of Gray-coded masks, and a photomultiplier tube for each mask. The characteristics of the camera are compared to those of other photon cameras.
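
The Gray-coded masks encode each photon's position so that adjacent positions differ in a single bit, which limits the error from a photon straddling two mask columns to one position. A minimal sketch of the standard reflected binary Gray code follows; the camera's exact mask layout is not given in the abstract.

```python
def to_gray(n):
    """Binary index -> reflected binary Gray code."""
    return n ^ (n >> 1)

def from_gray(g):
    """Reflected binary Gray code -> binary index."""
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

# Adjacent codes differ in exactly one bit, so an event that straddles
# two neighboring mask columns decodes at most one position off rather
# than to an arbitrary pixel.
codes = [to_gray(i) for i in range(256)]
```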

  16. Caught on Camera.

    ERIC Educational Resources Information Center

    Milshtein, Amy

    2002-01-01

    Describes the benefits of and rules to be followed when using surveillance cameras for school security. Discusses various camera models, including indoor and outdoor fixed position cameras, pan-tilt zoom cameras, and pinhole-lens cameras for covert surveillance. (EV)

  17. Automatic commanding of the Mars Observer Camera

    NASA Technical Reports Server (NTRS)

    Caplinger, Michael

    1994-01-01

    Mars Observer, launched in September 1992, was intended to be a 'survey-type' mission that acquired global coverage of Mars from a low, circular, near-polar orbit during an entire Martian year. As such, most of its instruments had fixed data rates, wide fields of view, and relatively low resolution, with fairly limited requirements for commanding. An exception is the Mars Observer Camera, or MOC. The MOC consists of a two-color Wide Angle (WA) system that can acquire both global images at low resolution (7.5 km/pixel) and regional images at commandable resolutions up to 250 m/pixel. Complementing the WA is the Narrow Angle (NA) system, which can acquire images at 8 resolutions from 12 m/pixel to 1.5 m/pixel, with a maximum crosstrack dimension of 3 km. The MOC also provides various forms of data compression (both lossless and lossy), and is designed to work at data rates from 700 bits per second (bps) to over 80k bps. Because of this flexibility, developing MOC command sequences is much more difficult than the routine mode-changing that characterizes other instrument operations. Although the MOC cannot be pointed (the spacecraft is fixed nadir-pointing and has no scan platform), the timing, downlink stream allocation, compression type and parameters, and image dimensions of each image must be commanded from the ground, subject to the constraints inherent in the MOC and the spacecraft. To minimize the need for a large operations staff, the entire command generation process has been automated within the MOC Ground Data System. Following the loss of the Mars Observer spacecraft in August 1993, NASA intends to launch a new spacecraft, Mars Global Surveyor (MGS), in late 1996. This spacecraft will carry the MOC flight spare (MOC 2). The MOC 2 operations plan will be largely identical to that developed for MOC, and all of the algorithms described here are applicable to it.

  18. Readout electronics of physics of accelerating universe camera

    NASA Astrophysics Data System (ADS)

    de Vicente, Juan; Castilla, Javier; Jiménez, Jorge; Cardiel-Sas, L.; Illa, José M.

    2014-08-01

    The Physics of Accelerating Universe Camera (PAUCam) is a new camera for dark energy studies that will be installed in the William Herschel Telescope. The main characteristic of the camera is its capacity for high-precision photometric redshift measurements. The camera is composed of eighteen Hamamatsu Photonics CCDs providing a wide field of view covering a diameter of one degree. Unlike the common five optical filters of other similar surveys, PAUCam has forty optical narrow-band filters, which will provide higher resolution in photometric redshifts. In this paper, a general description of the electronics of the camera and its status is presented.

  19. Experience with duplex bearings in narrow angle oscillating applications

    NASA Technical Reports Server (NTRS)

    Phinney, D. D.; Pollard, C. L.; Hinricks, J. T.

    1988-01-01

    Duplex ball bearings are matched pairs on which the abutting faces of the rings have been accurately ground so that when the rings are clamped together, a controlled amount of interference (preload) exists across the balls. These bearings are vulnerable to radial temperature gradients, blocking in oscillation and increased sensitivity to contamination. These conditions decrease the service life of these bearings. It was decided that an accelerated thermal vacuum life test should be conducted. The test apparatus and results are described and the rationale is presented for reducing a multiyear life test on oil lubricated bearings to less than a year.

  20. System Synchronizes Recordings from Separated Video Cameras

    NASA Technical Reports Server (NTRS)

    Nail, William; Nail, William L.; Nail, Jasper M.; Le, Doung T.

    2009-01-01

    A system of electronic hardware and software for synchronizing recordings from multiple, physically separated video cameras is being developed, primarily for use in multiple-look-angle video production. The system, the time code used in the system, and the underlying method of synchronization upon which the design of the system is based are denoted generally by the term "Geo-TimeCode(TradeMark)." The system is embodied mostly in compact, lightweight, portable units (see figure) denoted video time-code units (VTUs) - one VTU for each video camera. The system is scalable in that any number of camera recordings can be synchronized. The estimated retail price per unit would be about $350 (in 2006 dollars). The need for this or another synchronization system external to video cameras arises because most video cameras do not include internal means for maintaining synchronization with other video cameras. Unlike prior video-camera-synchronization systems, this system does not depend on continuous cable or radio links between cameras (however, it does depend on occasional cable links lasting a few seconds). Also, whereas the time codes used in prior video-camera-synchronization systems typically repeat after 24 hours, the time code used in this system does not repeat for slightly more than 136 years; hence, this system is much better suited for long-term deployment of multiple cameras.
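
The quoted non-repeat interval of slightly more than 136 years is what a 32-bit count of seconds provides, in contrast to a conventional 24-hour SMPTE-style code that wraps daily. The arithmetic below is a quick check of that consistency; the actual Geo-TimeCode format is proprietary and not described in the abstract, so the 32-bit counter is an assumption.

```python
SECONDS_PER_YEAR = 365.25 * 24 * 3600.0  # Julian year, in seconds

# A 24-hour time code wraps every day; a 32-bit seconds counter wraps
# only after 2**32 seconds, i.e. slightly more than 136 years.
wrap_years = 2 ** 32 / SECONDS_PER_YEAR
```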

  1. Characterization of previously unidentified lunar pyroclastic deposits using Lunar Reconnaissance Orbiter Camera (LROC) data

    USGS Publications Warehouse

    Gustafson, J. Olaf; Bell, James F.; Gaddis, Lisa R.; Hawke, B. Ray; Giguere, Thomas A.

    2012-01-01

    We used a Lunar Reconnaissance Orbiter Camera (LROC) global monochrome Wide-angle Camera (WAC) mosaic to conduct a survey of the Moon to search for previously unidentified pyroclastic deposits. Promising locations were examined in detail using LROC multispectral WAC mosaics, high-resolution LROC Narrow Angle Camera (NAC) images, and Clementine multispectral (ultraviolet-visible or UVVIS) data. Out of 47 potential deposits chosen for closer examination, 12 were selected as probable newly identified pyroclastic deposits. Potential pyroclastic deposits were generally found in settings similar to previously identified deposits, including areas within or near mare deposits adjacent to highlands, within floor-fractured craters, and along fissures in mare deposits. However, a significant new finding is the discovery of localized pyroclastic deposits within floor-fractured craters Anderson E and F on the lunar farside, isolated from other known similar deposits. Our search confirms that most major regional and localized low-albedo pyroclastic deposits have been identified on the Moon down to ~100 m/pix resolution, and that additional newly identified deposits are likely to be either isolated small deposits or additional portions of discontinuous, patchy deposits.

  2. Pre-hibernation performances of the OSIRIS cameras onboard the Rosetta spacecraft

    NASA Astrophysics Data System (ADS)

    Magrin, S.; La Forgia, F.; Da Deppo, V.; Lazzarin, M.; Bertini, I.; Ferri, F.; Pajola, M.; Barbieri, M.; Naletto, G.; Barbieri, C.; Tubiana, C.; Küppers, M.; Fornasier, S.; Jorda, L.; Sierks, H.

    2015-02-01

    Context. The ESA cometary mission Rosetta was launched in 2004. In the past years and until the spacecraft hibernation in June 2011, the two cameras of the OSIRIS imaging system (Narrow Angle and Wide Angle Camera, NAC and WAC) observed many different sources. On 20 January 2014 the spacecraft successfully exited hibernation to start observing the primary scientific target of the mission, comet 67P/Churyumov-Gerasimenko. Aims: A study of the past performances of the cameras is now mandatory to be able to determine whether the system has been stable through the time and to derive, if necessary, additional analysis methods for the future precise calibration of the cometary data. Methods: The instrumental responses and filter passbands were used to estimate the efficiency of the system. A comparison with acquired images of specific calibration stars was made, and a refined photometric calibration was computed, both for the absolute flux and for the reflectivity of small bodies of the solar system. Results: We found a stability of the instrumental performances within 1.5% from 2007 to 2010, with no evidence of an aging effect on the optics or detectors. The efficiency of the instrumentation is found to be as expected in the visible range, but lower than expected in the UV and IR range. A photometric calibration implementation was discussed for the two cameras. Conclusions: The calibration derived from pre-hibernation phases of the mission will be checked as soon as possible after the awakening of OSIRIS and will be continuously monitored until the end of the mission in December 2015. A list of additional calibration sources has been determined that are to be observed during the forthcoming phases of the mission to ensure a better coverage across the wavelength range of the cameras and to study the possible dust contamination of the optics.

  3. Angle performance on optima MDxt

    SciTech Connect

    David, Jonathan; Kamenitsa, Dennis

    2012-11-06

    Angle control on medium current implanters is important due to the high angle-sensitivity of typical medium current implants, such as halo implants. On the Optima MDxt, beam-to-wafer angles are controlled in both the horizontal and vertical directions. In the horizontal direction, the beam angle is measured through six narrow slits, and any angle adjustment is made by electrostatically steering the beam, while cross-wafer beam parallelism is adjusted by changing the focus of the electrostatic parallelizing lens (P-lens). In the vertical direction, the beam angle is measured through a high aspect ratio mask, and any angle adjustment is made by slightly tilting the wafer platen prior to implant. A variety of tests were run to measure the accuracy and repeatability of Optima MDxt's angle control. SIMS profiles of a high-energy, channeling-sensitive condition show both the cross-wafer angle uniformity and the small-angle resolution of the system. Angle repeatability was quantified by running a channeling-sensitive implant as a regular monitor over a seven-month period and measuring the sheet resistance-to-angle sensitivity. Even though crystal cut error was not controlled for in this case, when attributing all Rs variation to angle changes, the overall angle repeatability was measured as 0.16° (1σ). A separate angle repeatability test involved running a series of V-curve tests over a four-month period using low crystal cut wafers selected from the same boule. The results of this test showed the angle repeatability to be <0.1° (1σ).

  4. Three-dimensional camera

    NASA Astrophysics Data System (ADS)

    Bothe, Thorsten; Gesierich, Achim; Legarda-Saenz, Ricardo; Jueptner, Werner P. O.

    2003-05-01

    Industrial and multimedia applications need cost-effective, compact and flexible 3D profiling instruments. In the talk we will show the principle of, applications for and results from a new miniaturized 3D profiling system for macroscopic scenes. The system uses a compact housing and is usable like a camera, requiring only minimal stabilization such as a tripod. The system is based on the common fringe projection technique. Camera and projector are assembled with parallel optical axes, having coplanar projection and imaging planes. Their axis separation is comparable to the distance between the human eyes, giving a complete system of 21 x 20 x 11 cm size and allowing high-gradient objects such as the interior of tubes to be measured. The fringe projector uses an LCD, which enables fast and flexible pattern projection. Camera and projector have a short focal length and a high system aperture as well as a large depth of focus. Thus, objects can be measured from a shorter distance than with common systems (e.g. 1 m sized objects at 80 cm distance). Objects with diameters up to 4 m can be profiled because the set-up allows working with a completely opened aperture combined with bright lamps, giving a large amount of available light and a high signal-to-noise ratio. Normally a small basis has the disadvantage of reduced sensitivity; we investigated methods to compensate for the reduced sensitivity via the setup and enhanced evaluation methods. For measurement we use synthetic wavelengths. The developed algorithms are completely adaptable to the user's needs for speed and accuracy. The 3D camera is built from low-cost components, is robust and nearly handheld, and delivers insights also into difficult technical objects such as tubes and interior volumes. Besides the realized high-resolution phase measurement, system calibration is an important task for usability. When calibrating with common photogrammetric models (which are typically used for fringe projection systems), problems were found that originate from the short focal length and the extreme opening angle of the system as well as the large depth of focus. The current calibration method is outlined and open problems are shown. An improved calibration of the system is discussed for better results in the future.
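
The synthetic-wavelength technique mentioned above combines two fringe periods into a much coarser effective period, extending the unambiguous measurement range despite the small camera-projector basis. A sketch of that relation follows; the millimeter periods are illustrative, not values from the talk.

```python
def synthetic_wavelength(p1, p2):
    """Effective beat period of two fringe patterns with periods p1 and p2:
    the phase difference of the two patterns repeats with period
    p1 * p2 / |p1 - p2|."""
    return p1 * p2 / abs(p1 - p2)

# Two fine patterns (10 mm and 11 mm periods) combine into a much coarser
# synthetic period, extending the unambiguous range while keeping the
# phase resolution of the fine fringes.
coarse = synthetic_wavelength(10.0, 11.0)  # -> 110.0 mm
```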

  5. Tunable compound eye cameras

    NASA Astrophysics Data System (ADS)

    Pätz, Daniel; Leopold, Steffen; Knöbber, Fabian; Sinzinger, Stefan; Hoffmann, Martin; Ambacher, Oliver

    2010-05-01

    We present design and realization concepts for thin compound eye cameras with enhanced optical functionality. The systems are based on facets with individually tunable focal lengths and viewing angles for scanning of the object space. The active lens elements are made of aluminum nitride (AlN)/nanocrystalline diamond (NCD) membranes. This material system allows slow, thermally actuated elements with a large deformation range as well as fast piezoelectric elements with a smaller deformation range. Due to the extreme mechanical stability of these materials, we are able to realize microoptical components with optimum surface qualities as well as excellent long-term stability. We use facets of microlenses 1 mm in diameter with tunable focusing power to compensate for the focus shift at different viewing angles during the scanning procedure. The beam deflection for scanning is realized either by laterally shifting spherical elements or by a tunable microprism with reduced aberrations. For both actuators we present a design, a fabrication concept and first experimental results.

  6. Narrow QRS complex tachycardias.

    PubMed

    Jacobson, Carol

    2007-01-01

    Narrow QRS complex tachycardias are either atrioventricular (AV) nodal passive or AV nodal active. AV nodal passive tachycardias do not require the participation of the AV node in maintenance of the tachycardia. Examples are atrial tachycardia, atrial flutter, and atrial fibrillation. Treatment is directed at ventricular rate control with calcium channel blockers or beta-blockers. AV nodal active tachycardias require active participation of the AV node in maintaining the tachycardia. Examples include AV nodal reentry tachycardia and circus movement tachycardia using an accessory pathway. Treatment with a vagal maneuver or adenosine usually terminates the tachycardia. Recognition of these tachycardias is reviewed. PMID:18019517

  7. Determining Camera Gain in Room Temperature Cameras

    SciTech Connect

    Joshua Cogliati

    2010-12-01

James R. Janesick provides a method for determining the amplification of a CCD or CMOS camera when only access to the raw images is provided. However, the equation that is provided ignores the contribution of dark current. For CCD or CMOS cameras that are cooled well below room temperature this is not a problem; however, the technique needs adjustment for use with room-temperature cameras. This article describes the adjustment made to the equation and a test of this method.
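The adjustment described can be sketched with the standard photon-transfer relation, gain ≈ mean signal / shot-noise variance, extended with a dark-frame correction. This is an illustrative reconstruction under stated assumptions, not the article's actual code; the function name and frame conventions are hypothetical.

```python
import numpy as np

def camera_gain(flat1, flat2, dark1, dark2):
    """Estimate gain (e-/DN) via the photon-transfer method, with a
    dark-frame correction for room-temperature sensors.
    flat1/flat2: two flat-field exposures at identical settings;
    dark1/dark2: two dark frames at the same exposure time."""
    # Dark-corrected mean signal level (DN)
    signal = 0.5 * (flat1.mean() + flat2.mean()) - 0.5 * (dark1.mean() + dark2.mean())
    # Differencing two frames cancels fixed-pattern noise;
    # var(f1 - f2) = 2 * per-frame temporal variance
    var_flat = np.var(flat1.astype(float) - flat2.astype(float)) / 2.0
    var_dark = np.var(dark1.astype(float) - dark2.astype(float)) / 2.0
    # Subtract the dark-current contribution to isolate photo-signal shot noise
    shot_var = var_flat - var_dark
    return signal / shot_var  # e-/DN
```

Without the `var_dark` subtraction this reduces to the uncorrected mean/variance estimate, which overstates the noise (and so understates the gain) when dark current is significant.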

  8. Two-Camera Acquisition and Tracking of a Flying Target

    NASA Technical Reports Server (NTRS)

    Biswas, Abhijit; Assad, Christopher; Kovalik, Joseph M.; Pain, Bedabrata; Wrigley, Chris J.; Twiss, Peter

    2008-01-01

A method and apparatus have been developed to solve the problem of automated acquisition and tracking, from a location on the ground, of a luminous moving target in the sky. The method involves the use of two electronic cameras: (1) a stationary camera having a wide field of view, positioned and oriented to image the entire sky; and (2) a camera that has a much narrower field of view (a few degrees wide) and is mounted on a two-axis gimbal. The wide-field-of-view stationary camera is used to initially identify the target against the background sky. So that the approximate position of the target can be determined, pixel locations on the image-detector plane in the stationary camera are calibrated with respect to azimuth and elevation. The approximate target position is used to initially aim the gimballed narrow-field-of-view camera in the approximate direction of the target. Next, the narrow-field-of-view camera locks onto the target image, and thereafter the gimbals are actuated as needed to maintain lock and thereby track the target with precision greater than that attainable by use of the stationary camera.
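The pixel-to-azimuth/elevation calibration step can be approximated, for a small patch of the wide-field image, by a least-squares affine fit from calibration sources with known directions. A real all-sky fisheye mapping is nonlinear, so this is only a local sketch with hypothetical names, not the system's actual calibration:

```python
import numpy as np

def fit_pixel_to_azel(pix, azel):
    # pix: (N, 2) pixel coords of calibration sources (e.g. known stars);
    # azel: (N, 2) corresponding azimuth/elevation in degrees.
    # Least-squares affine model: [az, el] = [x, y, 1] @ A
    X = np.hstack([pix, np.ones((len(pix), 1))])
    A, *_ = np.linalg.lstsq(X, azel, rcond=None)
    return A  # shape (3, 2)

def target_azel(image, A):
    # Crude target detection: brightest pixel; a real system would
    # centroid a thresholded blob against the background sky.
    y, x = np.unravel_index(np.argmax(image), image.shape)
    return np.array([x, y, 1.0]) @ A
```

The returned az/el pair is what would be handed to the gimbal controller as the initial aim point for the narrow-field-of-view camera.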

  9. Angle Performance on Optima XE

    SciTech Connect

    David, Jonathan; Satoh, Shu

    2011-01-07

Angle control on high energy implanters is important due to shrinking device dimensions, and sensitivity to channeling at high beam energies. On Optima XE, beam-to-wafer angles are controlled in both the horizontal and vertical directions. In the horizontal direction, the beam angle is measured through a series of narrow slits, and any angle adjustment is made by steering the beam with the corrector magnet. In the vertical direction, the beam angle is measured through a high aspect ratio mask, and any angle adjustment is made by slightly tilting the wafer platen during implant. Using a sensitive channeling condition, we were able to quantify the angle repeatability of Optima XE. By quantifying the sheet resistance sensitivity to both horizontal and vertical angle variation, the total angle variation was calculated as 0.04° (1σ). Implants were run over a five week period, with all of the wafers selected from a single boule, in order to control for any crystal cut variation.

  10. Evaluating intensified camera systems

    SciTech Connect

    S. A. Baker

    2000-06-30

This paper describes image evaluation techniques used to standardize camera system characterizations. The author's group is involved with building and fielding several types of camera systems. Camera types include gated intensified cameras, multi-frame cameras, and streak cameras. Applications range from X-ray radiography to visible and infrared imaging. Key areas of performance include sensitivity, noise, and resolution. This team has developed an analysis tool, in the form of image processing software, to aid an experimenter in measuring a set of performance metrics for their camera system. These performance parameters are used to identify a camera system's capabilities and limitations while establishing a means for camera system comparisons. The analysis tool is used to evaluate digital images normally recorded with CCD cameras. Electro-optical components provide fast shuttering and/or optical gain to camera systems. Camera systems incorporate a variety of electro-optical components such as microchannel plate (MCP) or proximity focused diode (PFD) image intensifiers; electro-static image tubes; or electron-bombarded (EB) CCDs. It is often valuable to evaluate the performance of an intensified camera in order to determine if a particular system meets experimental requirements.

  11. Vacuum Camera Cooler

    NASA Technical Reports Server (NTRS)

    Laugen, Geoffrey A.

    2011-01-01

    Acquiring cheap, moving video was impossible in a vacuum environment, due to camera overheating. This overheating is brought on by the lack of cooling media in vacuum. A water-jacketed camera cooler enclosure machined and assembled from copper plate and tube has been developed. The camera cooler (see figure) is cup-shaped and cooled by circulating water or nitrogen gas through copper tubing. The camera, a store-bought "spy type," is not designed to work in a vacuum. With some modifications the unit can be thermally connected when mounted in the cup portion of the camera cooler. The thermal conductivity is provided by copper tape between parts of the camera and the cooled enclosure. During initial testing of the demonstration unit, the camera cooler kept the CPU (central processing unit) of this video camera at operating temperature. This development allowed video recording of an in-progress test, within a vacuum environment.

  12. Novel fundus camera design

    NASA Astrophysics Data System (ADS)

    Dehoog, Edward A.

A fundus camera is a complex optical system that makes use of the principle of reflex-free indirect ophthalmoscopy to image the retina. Despite being in existence since the early 1900s, little has changed in the design of the fundus camera, and there is minimal information about the design principles utilized. Parameters and specifications involved in the design of a fundus camera are determined and their effect on system performance is discussed. Fundus cameras incorporating different design methods are modeled and a performance evaluation based on design parameters is used to determine the effectiveness of each design strategy. By determining the design principles involved in the fundus camera, new cameras can be designed to include specific imaging modalities such as optical coherence tomography, imaging spectroscopy and imaging polarimetry to gather additional information about the properties and structure of the retina. Design principles utilized to incorporate such modalities into fundus camera systems are discussed. Design, implementation and testing of a snapshot polarimeter fundus camera are demonstrated.

  13. Constrained space camera assembly

    DOEpatents

    Heckendorn, Frank M.; Anderson, Erin K.; Robinson, Casandra W.; Haynes, Harriet B.

    1999-01-01

    A constrained space camera assembly which is intended to be lowered through a hole into a tank, a borehole or another cavity. The assembly includes a generally cylindrical chamber comprising a head and a body and a wiring-carrying conduit extending from the chamber. Means are included in the chamber for rotating the body about the head without breaking an airtight seal formed therebetween. The assembly may be pressurized and accompanied with a pressure sensing means for sensing if a breach has occurred in the assembly. In one embodiment, two cameras, separated from their respective lenses, are installed on a mounting apparatus disposed in the chamber. The mounting apparatus includes means allowing both longitudinal and lateral movement of the cameras. Moving the cameras longitudinally focuses the cameras, and moving the cameras laterally away from one another effectively converges the cameras so that close objects can be viewed. The assembly further includes means for moving lenses of different magnification forward of the cameras.

  14. Making Ceramic Cameras

    ERIC Educational Resources Information Center

    Squibb, Matt

    2009-01-01

    This article describes how to make a clay camera. This idea of creating functional cameras from clay allows students to experience ceramics, photography, and painting all in one unit. (Contains 1 resource and 3 online resources.)

  15. Nanosecond frame cameras

    SciTech Connect

    Frank, A M; Wilkins, P R

    2001-01-05

The advent of CCD cameras and computerized data recording has spurred the development of several new cameras and techniques for recording nanosecond images. We have made a side-by-side comparison of three nanosecond frame cameras, examining them for both performance and operational characteristics. The cameras include Micro-Channel Plate/CCD, Image Diode/CCD, and Image Diode/Film combinations of gating and data recording. The advantages and disadvantages of each device will be discussed.

  16. Ultraviolet Spectroscopy of Narrow Coronal Mass Ejections

    NASA Astrophysics Data System (ADS)

    Dobrzycka, D.; Raymond, J. C.; Biesecker, D. A.; Li, J.; Ciaravella, A.

    2003-05-01

We present Ultraviolet Coronagraph Spectrometer (UVCS) observations of five narrow coronal mass ejections (CMEs) that were among 15 narrow CMEs originally selected by Gilbert and coworkers. Two events (1999 March 27, April 15) were "structured," i.e., in white-light data they exhibited well-defined interior features, and three (1999 May 9, May 21, June 3) were "unstructured," i.e., appeared featureless. In UVCS data the events were seen as 4°-13° wide enhancements of the strongest coronal lines H I Lyα and O VI λλ1032, 1037. We derived electron densities for several of the events from the Large Angle and Spectrometric Coronagraph Experiment (LASCO) C2 white-light observations. They are comparable to or smaller than densities inferred for other CMEs. We modeled the observable properties of examples of the structured (1999 April 15) and unstructured (1999 May 9) narrow CMEs at different heights in the corona between 1.5 and 2 R⊙. The derived electron temperatures, densities, and outflow speeds are similar for those two types of ejections. They were compared with properties of polar coronal jets and other CMEs. We discuss different scenarios of narrow CME formation as either a jet formed by reconnection onto open field lines or a CME ejected by expansion of closed field structures. Overall, we conclude that the existing observations do not definitively place the narrow CMEs into the jet or the CME picture, but the acceleration of the 1999 April 15 event resembles acceleration seen in many CMEs, rather than constant speeds or deceleration observed in jets.

  17. Digital Pinhole Camera

    ERIC Educational Resources Information Center

    Lancor, Rachael; Lancor, Brian

    2014-01-01

In this article we describe how the classic pinhole camera demonstration can be adapted for use with digital cameras. Students can easily explore the effects of the size of the pinhole and its distance from the sensor on exposure time, magnification, and image quality. Instructions for constructing a digital pinhole camera and our method for…

  19. Prediction of Viking lander camera image quality

    NASA Technical Reports Server (NTRS)

    Huck, F. O.; Burcher, E. E.; Jobson, D. J.; Wall, S. D.

    1976-01-01

    Formulations are presented that permit prediction of image quality as a function of camera performance, surface radiance properties, and lighting and viewing geometry. Predictions made for a wide range of surface radiance properties reveal that image quality depends strongly on proper camera dynamic range command and on favorable lighting and viewing geometry. Proper camera dynamic range commands depend mostly on the surface albedo that will be encountered. Favorable lighting and viewing geometries depend mostly on lander orientation with respect to the diurnal sun path over the landing site, and tend to be independent of surface albedo and illumination scattering function. Side lighting with low sun elevation angles (10 to 30 deg) is generally favorable for imaging spatial details and slopes, whereas high sun elevation angles are favorable for measuring spectral reflectances.

  20. 10. CONSTRUCTION OF FIXED CAMERA STATION 1100 (BUILDING NO. 42020) ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

10. CONSTRUCTION OF FIXED CAMERA STATION 1100 (BUILDING NO. 42020) LOOKING NORTHEAST SHOWING CONCRETE FOUNDATION, WOOD FORMWORK AND STEEL REINFORCING, March 26, 1948. (Original photograph in possession of Dave Willis, San Diego, California.) - Variable Angle Launcher Complex, Camera Stations, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  1. 9. COMPLETED ROLLING CAMERA CAR ON RAILROAD TRACK AND BRIDGE ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

9. COMPLETED ROLLING CAMERA CAR ON RAILROAD TRACK AND BRIDGE LOOKING WEST, APRIL 26, 1948. (ORIGINAL PHOTOGRAPH IN POSSESSION OF DAVE WILLIS, SAN DIEGO, CALIFORNIA.) - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  2. The MMT all-sky camera

    NASA Astrophysics Data System (ADS)

    Pickering, T. E.

    2006-06-01

The MMT all-sky camera is a low-cost, wide-angle camera system that takes images of the sky every 10 seconds, day and night. It is based on an Adirondack Video Astronomy StellaCam II video camera and utilizes an auto-iris fish-eye lens to allow safe operation under all lighting conditions, even direct sunlight. This, combined with the anti-blooming characteristics of the StellaCam's detector, allows useful images to be obtained during sunny days as well as on brightly moonlit nights. Under dark skies the system can detect stars as faint as 6th magnitude as well as very thin cirrus and low surface brightness zodiacal features such as gegenschein. The total hardware cost of the system was less than $3500 including computer and framegrabber card, a fraction of the cost of comparable systems utilizing traditional CCD cameras.

  3. Classroom multispectral imaging using inexpensive digital cameras.

    NASA Astrophysics Data System (ADS)

    Fortes, A. D.

    2007-12-01

The proliferation of increasingly cheap digital cameras in recent years means that it has become easier to exploit the broad wavelength sensitivity of their CCDs (360 - 1100 nm) for classroom-based teaching. With the right tools, it is possible to open children's eyes to the invisible world of UVA and near-IR radiation either side of our narrow visual band. The camera-filter combinations I describe can be used to explore the world of animal vision, looking for invisible markings on flowers, or in bird plumage, for example. In combination with a basic spectroscope (such as the Project-STAR handheld plastic spectrometer, $25), it is possible to investigate the range of human vision and camera sensitivity, and to explore the atomic and molecular absorption lines from the solar and terrestrial atmospheres. My principal use of the cameras has been to teach multispectral imaging of the kind used to determine remotely the composition of planetary surfaces. A range of camera options, from $50 circuit-board mounted CCDs up to $900 semi-pro infrared camera kits (including mobile phones along the way), and various UV-vis-IR filter options will be presented. Examples of multispectral images taken with these systems are used to illustrate the range of classroom topics that can be covered. Particular attention is given to learning about spectral reflectance curves and comparing images from Earth and Mars taken using the same filter combination that is used on the Mars Rovers.
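The spectral-reflectance comparisons described can be illustrated with a normalized band-ratio image computed from two filtered exposures of the same scene; the function below is a classroom-style sketch (NDVI-like index), not taken from the abstract:

```python
import numpy as np

def band_ratio(nir, red, eps=1e-6):
    # Normalized-difference index from two co-registered filtered
    # exposures (e.g. near-IR and red); eps guards against divide-by-zero
    # in dark regions. Values near +1 indicate strong NIR reflectance,
    # as for vegetation; values near 0 indicate similar reflectance.
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red + eps)
```

The same two-filter arithmetic underlies the planetary-surface comparisons mentioned above: materials separate in the ratio image even when they look similar in either band alone.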

  4. Tower Camera Handbook

    SciTech Connect

    Moudry, D

    2005-01-01

The tower camera in Barrow provides hourly images of the ground surrounding the tower. These images may be used to determine fractional snow cover as winter arrives, for comparison with the albedo that can be calculated from downward-looking radiometers, as well as some indication of present weather. Similarly, during springtime, the camera images show the changes in the ground albedo as the snow melts. The tower images are saved at hourly intervals. In addition, two other cameras, the skydeck camera in Barrow and the piling camera in Atqasuk, show the current conditions at those sites.

  5. Rethinking camera user interfaces

    NASA Astrophysics Data System (ADS)

    Brewster, Stephen; McAdam, Christopher; McDonald, James; Maciver, James

    2012-01-01

    Digital cameras and camera phones are now very widely used but there are some issues that affect their use and the quality of the images captured. Many of these issues are due to problem of interaction or feedback from the camera. Modern smartphones have a wide range of sensors, rich feedback mechanisms and lots of processing power. We have developed and evaluated a range of new interaction techniques for cameras and camera phones that improve the picture taking process and allow people to take better pictures first time.

  6. The nucleus of comet 67P through the eyes of the OSIRIS cameras

    NASA Astrophysics Data System (ADS)

    Guettler, Carsten; Sierks, Holger; Barbieri, Cesare; Lamy, Philippe; Rodrigo, Rafael; Koschny, Detlef; Rickman, Hans; OSIRIS Team; Capaccioni, Fabrizio; Filacchione, Gianrico; Ciarniello, Mauro; Erard, Stephane; Rinaldi, Giovanna; Tosi, Federico

    2015-11-01

The Rosetta spacecraft has been studying comet 67P/Churyumov-Gerasimenko from a close distance since August 2014. Onboard the spacecraft, the two scientific cameras, the OSIRIS narrow- and wide-angle cameras, are observing the cometary nucleus, its activity, as well as the dust and gas environment. This overview paper will cover OSIRIS science from the early arrival and mapping phase, the PHILAE landing, and the escort phase including the two close fly-bys. With a first characterization of global physical parameters of the nucleus, the OSIRIS cameras also provided the data to reconstruct a 3D shape model of the comet and a division into morphologic sub-units. From observations of near-surface activity, jet-like features can be projected onto the surface and active sources can be correlated with surface features like cliffs, pits, or flat planes. The increase of activity during and after perihelion in August 2015 showed several outbursts, which were seen as strong, collimated jets originating from the southern hemisphere. A comparison of results between different Rosetta instruments will give further insight into the physics of the comet's nucleus and its coma. The OSIRIS and VIRTIS instruments are particularly well suited to support and complement each other. With an overlap in spectral range, one instrument can provide the best spatial resolution while the other is strong in the spectral resolution. A summary of collaborative efforts will be given.

7. Narrow band 3×3 Mueller polarimetric endoscopy.

    PubMed Central

    Qi, Ji; Ye, Menglong; Singh, Mohan; Clancy, Neil T.; Elson, Daniel S.

    2013-01-01

Mueller matrix polarimetric imaging has shown potential in tissue diagnosis but is challenging to implement endoscopically. In this work, a narrow band 3×3 Mueller matrix polarimetric endoscope was designed by rotating the endoscope to generate 0°, 45° and 90° linearly polarized illumination and positioning a rotating filter wheel in front of the camera containing three polarisers to permit polarization state analysis for backscattered light. The system was validated with a rotating linear polarizer and a diffuse reflection target. Initial measurements of 3×3 Mueller matrices on a rat are demonstrated, followed by matrix decomposition into the depolarization and retardance matrices for further analysis. Our work shows the feasibility of implementing polarimetric imaging in a rigid endoscope conveniently and economically in order to reveal diagnostic information. PMID:24298405
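With three linear illumination states and three analyzer states, each pixel yields nine intensities from which the 3×3 (linear-only) Mueller matrix follows by inverting the two state matrices. This is the standard reduced-Mueller reconstruction, sketched here as an illustration rather than the paper's exact processing:

```python
import numpy as np

def linear_stokes(theta_deg):
    # Reduced Stokes vector [S0, S1, S2] of light behind an ideal
    # linear polarizer at angle theta (circular component dropped)
    t = np.deg2rad(2.0 * theta_deg)
    return np.array([1.0, np.cos(t), np.sin(t)])

ANGLES = [0.0, 45.0, 90.0]
W = np.column_stack([linear_stokes(a) for a in ANGLES])   # illumination states
A = 0.5 * np.vstack([linear_stokes(a) for a in ANGLES])   # analyzer rows

def mueller_3x3(intensities):
    # intensities[i, j]: measured intensity with analyzer angle i
    # and illumination angle j. From I = A @ M @ W, invert both
    # state matrices to recover the sample Mueller matrix M.
    return np.linalg.inv(A) @ intensities @ np.linalg.inv(W)
```

Both state matrices are well-conditioned for the 0°/45°/90° choice, which is one reason that angle set is convenient for a rotating-endoscope implementation.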

  8. Automated Camera Calibration

    NASA Technical Reports Server (NTRS)

    Chen, Siqi; Cheng, Yang; Willson, Reg

    2006-01-01

    Automated Camera Calibration (ACAL) is a computer program that automates the generation of calibration data for camera models used in machine vision systems. Machine vision camera models describe the mapping between points in three-dimensional (3D) space in front of the camera and the corresponding points in two-dimensional (2D) space in the camera s image. Calibrating a camera model requires a set of calibration data containing known 3D-to-2D point correspondences for the given camera system. Generating calibration data typically involves taking images of a calibration target where the 3D locations of the target s fiducial marks are known, and then measuring the 2D locations of the fiducial marks in the images. ACAL automates the analysis of calibration target images and greatly speeds the overall calibration process.
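The 3D-to-2D correspondence fitting that such calibration data feeds can be illustrated with a direct linear transform (DLT) solve for a projection matrix. A production camera model like ACAL's includes more terms (e.g. lens distortion), so treat this as a minimal sketch:

```python
import numpy as np

def dlt_projection(X3d, x2d):
    """Solve a 3x4 projection matrix P from known 3D-to-2D point
    correspondences via the direct linear transform (DLT), the core
    computation behind fitting a pinhole-style camera model.
    Needs at least 6 well-distributed correspondences."""
    rows = []
    for (Xw, Yw, Zw), (u, v) in zip(X3d, x2d):
        Xh = np.array([Xw, Yw, Zw, 1.0])
        # Each correspondence gives two linear constraints on P's entries
        rows.append(np.concatenate([Xh, np.zeros(4), -u * Xh]))
        rows.append(np.concatenate([np.zeros(4), Xh, -v * Xh]))
    # The null vector of the stacked system is the flattened P, up to scale
    _, _, Vt = np.linalg.svd(np.array(rows))
    return Vt[-1].reshape(3, 4)
```

Given fiducial-mark 3D locations and their measured 2D image positions, the recovered P maps any scene point to its pixel via homogeneous projection.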

  9. Automatic camera tracking for remote manipulators

    SciTech Connect

    Stoughton, R.S.; Martin, H.L.; Bentz, R.R.

    1984-07-01

The problem of automatic camera tracking of mobile objects is addressed with specific reference to remote manipulators and using either fixed or mobile cameras. The technique uses a kinematic approach employing 4 x 4 coordinate transformation matrices to solve for the needed camera PAN and TILT angles. No vision feedback systems are used, as the required input data are obtained entirely from position sensors from the manipulator and the camera-positioning system. All hardware requirements are generally satisfied by currently available remote manipulator systems with a supervisory computer. The system discussed here implements linear plus on/off (bang-bang) closed-loop control with a ±2° deadband. The deadband area is desirable to avoid operator seasickness caused by continuous camera movement. Programming considerations for camera control, including operator interface options, are discussed. The example problem presented is based on an actual implementation using a PDP 11/34 computer, a TeleOperator Systems SM-229 manipulator, and an Oak Ridge National Laboratory (ORNL) camera-positioning system. 3 references, 6 figures, 2 tables.
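The kinematic pan/tilt solution with a deadband can be sketched as below; the transform and frame conventions are assumptions for illustration, not the ORNL implementation:

```python
import numpy as np

def pan_tilt(T_cam_world, p_world, deadband_deg=2.0, cur_pan=0.0, cur_tilt=0.0):
    """Solve camera PAN/TILT toward a target point, with a +/-2 deg
    deadband so the camera does not chase every small motion.
    T_cam_world: 4x4 homogeneous transform, world -> camera-base frame;
    p_world: target position [x, y, z] from manipulator position sensors."""
    p = T_cam_world @ np.append(p_world, 1.0)          # to camera-base frame
    pan = np.degrees(np.arctan2(p[1], p[0]))           # azimuth about base z-axis
    tilt = np.degrees(np.arctan2(p[2], np.hypot(p[0], p[1])))
    # Deadband: command a move only when the error exceeds the threshold
    new_pan = pan if abs(pan - cur_pan) > deadband_deg else cur_pan
    new_tilt = tilt if abs(tilt - cur_tilt) > deadband_deg else cur_tilt
    return new_pan, new_tilt
```

Because the target position comes from joint sensors rather than vision, this loop needs only the chain of 4 x 4 transforms to stay current as the manipulator moves.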

  10. Automatic camera tracking for remote manipulators

    SciTech Connect

    Stoughton, R.S.; Martin, H.L.; Bentz, R.R.

    1984-04-01

The problem of automatic camera tracking of mobile objects is addressed with specific reference to remote manipulators and using either fixed or mobile cameras. The technique uses a kinematic approach employing 4 x 4 coordinate transformation matrices to solve for the needed camera PAN and TILT angles. No vision feedback systems are used, as the required input data are obtained entirely from position sensors from the manipulator and the camera-positioning system. All hardware requirements are generally satisfied by currently available remote manipulator systems with a supervisory computer. The system discussed here implements linear plus on/off (bang-bang) closed-loop control with a ±2° deadband. The deadband area is desirable to avoid operator seasickness caused by continuous camera movement. Programming considerations for camera control, including operator interface options, are discussed. The example problem presented is based on an actual implementation using a PDP 11/34 computer, a TeleOperator Systems SM-229 manipulator, and an Oak Ridge National Laboratory (ORNL) camera-positioning system. 3 references, 6 figures, 2 tables.

  11. Ultra-fast framing camera tube

    DOEpatents

    Kalibjian, Ralph (1051 Batavia Ave., Livermore, CA 94550)

    1981-01-01

    An electronic framing camera tube features focal plane image dissection and synchronized restoration of the dissected electron line images to form two-dimensional framed images. Ultra-fast framing is performed by first streaking a two-dimensional electron image across a narrow slit, thereby dissecting the two-dimensional electron image into sequential electron line images. The dissected electron line images are then restored into a framed image by a restorer deflector operated synchronously with the dissector deflector. The number of framed images on the tube's viewing screen is equal to the number of dissecting slits in the tube. The distinguishing features of this ultra-fast framing camera tube are the focal plane dissecting slits, and the synchronously-operated restorer deflector which restores the dissected electron line images into a two-dimensional framed image. The framing camera tube can produce image frames having high spatial resolution of optical events in the sub-100 picosecond range.

  12. Single-Camera Panoramic-Imaging Systems

    NASA Technical Reports Server (NTRS)

    Lindner, Jeffrey L.; Gilbert, John

    2007-01-01

    Panoramic detection systems (PDSs) are developmental video monitoring and image-data processing systems that, as their name indicates, acquire panoramic views. More specifically, a PDS acquires images from an approximately cylindrical field of view that surrounds an observation platform. The main subsystems and components of a basic PDS are a charge-coupled- device (CCD) video camera and lens, transfer optics, a panoramic imaging optic, a mounting cylinder, and an image-data-processing computer. The panoramic imaging optic is what makes it possible for the single video camera to image the complete cylindrical field of view; in order to image the same scene without the benefit of the panoramic imaging optic, it would be necessary to use multiple conventional video cameras, which have relatively narrow fields of view.
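The role of the panoramic imaging optic can be illustrated by the unwrapping step that turns the annular image it forms into a conventional rectangular panorama. This nearest-neighbour sketch assumes a centered annulus and hypothetical radii, not the PDS's actual processing:

```python
import numpy as np

def unwrap_panorama(annular, r_inner, r_outer, out_w=720, out_h=100):
    """Unwrap an annular image (as formed by a panoramic optic) into a
    rectangular panorama by sampling along radial lines; azimuth maps
    to the output x-axis, radius to the output y-axis."""
    cy = (annular.shape[0] - 1) / 2.0
    cx = (annular.shape[1] - 1) / 2.0
    theta = np.linspace(0.0, 2.0 * np.pi, out_w, endpoint=False)
    r = np.linspace(r_inner, r_outer, out_h)
    rr, tt = np.meshgrid(r, theta, indexing="ij")
    # Nearest-neighbour lookup for brevity; real pipelines interpolate
    ys = np.clip(np.round(cy + rr * np.sin(tt)).astype(int), 0, annular.shape[0] - 1)
    xs = np.clip(np.round(cx + rr * np.cos(tt)).astype(int), 0, annular.shape[1] - 1)
    return annular[ys, xs]
```

This resampling is what lets a single fixed camera cover the cylindrical field that would otherwise require several narrow-field cameras.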

  13. 800 x 800 charge-coupled device /CCD/ camera for the Galileo Jupiter Orbiter mission

    NASA Technical Reports Server (NTRS)

    Clary, M. C.; Klaasen, K. P.; Snyder, L. M.; Wang, P. K.

    1979-01-01

    During January 1982 the NASA space transportation system will launch a Galileo spacecraft composed of an orbiting bus and an atmospheric entry probe to arrive at the planet Jupiter in July 1985. A prime element of the orbiter's scientific instrument payload will be a new generation slow-scan planetary imaging system based on a newly developed 800 x 800 charge-coupled device (CCD) image sensor. Following Jupiter orbit insertion, the single, narrow-angle, CCD camera, designated the Solid State Imaging (SSI) Subsystem, will operate for 20 months as the orbiter makes repeated encounters with Jupiter and its Galilean Satellites. During this period the SSI will acquire 40,000 images of Jupiter's atmosphere and the surfaces of the Galilean Satellites. This paper describes the SSI, its operational modes, and science objectives.

  14. Microchannel plate streak camera

    DOEpatents

    Wang, C.L.

    1989-03-21

An improved streak camera in which a microchannel plate electron multiplier is used in place of or in combination with the photocathode used in prior streak cameras is disclosed. The improved streak camera is far more sensitive to photons (UV to gamma-rays) than the conventional x-ray streak camera which uses a photocathode. The improved streak camera offers gamma-ray detection with high temporal resolution. It also offers low-energy x-ray detection without attenuation inside the cathode. Using the microchannel plate in the improved camera has resulted in a time resolution of about 150 ps, and has provided a sensitivity sufficient for 1000 keV x-rays. 3 figs.

  15. Microchannel plate streak camera

    DOEpatents

    Wang, Ching L. (Livermore, CA)

    1989-01-01

An improved streak camera in which a microchannel plate electron multiplier is used in place of or in combination with the photocathode used in prior streak cameras. The improved streak camera is far more sensitive to photons (UV to gamma-rays) than the conventional x-ray streak camera which uses a photocathode. The improved streak camera offers gamma-ray detection with high temporal resolution. It also offers low-energy x-ray detection without attenuation inside the cathode. Using the microchannel plate in the improved camera has resulted in a time resolution of about 150 ps, and has provided a sensitivity sufficient for 1000 keV x-rays.

  16. Microchannel plate streak camera

    DOEpatents

    Wang, C.L.

    1984-09-28

    An improved streak camera in which a microchannel plate electron multiplier is used in place of or in combination with the photocathode used in prior streak cameras. The improved streak camera is far more sensitive to photons (uv to gamma-rays) than the conventional x-ray streak camera which uses a photocathode. The improved streak camera offers gamma-ray detection with high temporal resolution. It also offers low-energy x-ray detection without attenuation inside the cathode. Using the microchannel plate in the improved camera has resulted in a time resolution of about 150 ps, and has provided a sensitivity sufficient for 1000 keV x-rays.

  17. Analytical multicollimator camera calibration

    USGS Publications Warehouse

    Tayman, W.P.

    1978-01-01

Calibration with the U.S. Geological Survey multicollimator determines the calibrated focal length, the point of symmetry, the radial distortion referred to the point of symmetry, and the asymmetric characteristics of the camera lens. For this project, two cameras were calibrated, a Zeiss RMK A 15/23 and a Wild RC 8. Four test exposures were made with each camera. Results are tabulated for each exposure and averaged for each set. Copies of the standard USGS calibration reports are included. © 1978.
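Radial distortion referred to the point of symmetry is conventionally modeled with an odd-order polynomial in the radial distance. The sketch below evaluates such a model and fits its coefficients to measured displacements by least squares; the coefficient values and names are illustrative, not from the USGS reports:

```python
import numpy as np

def radial_distortion(r_mm, k):
    """Evaluate a Seidel-style odd-order radial distortion polynomial,
    dr = k1*r^3 + k2*r^5 + k3*r^7, about the point of symmetry.
    r_mm: radial distances from the point of symmetry (mm);
    k: coefficients (k1, k2, k3)."""
    r = np.asarray(r_mm, float)
    return k[0] * r**3 + k[1] * r**5 + k[2] * r**7

def fit_distortion(r_mm, dr_mm):
    # Least-squares fit of the coefficients to measured residuals,
    # e.g. collimator image displacements from their ideal positions
    r = np.asarray(r_mm, float)
    B = np.column_stack([r**3, r**5, r**7])
    k, *_ = np.linalg.lstsq(B, np.asarray(dr_mm, float), rcond=None)
    return k
```

A calibration report then tabulates dr at standard radii; averaging the fitted curves over the test exposures gives the per-camera distortion profile.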

  18. Ultraviolet Spectroscopy of Narrow CMEs

    NASA Astrophysics Data System (ADS)

    Dobrzycka, D.; Raymond, J. C.; Biesecker, D. A.; Li, J.; Ciaravella, A.

    2002-12-01

Coronal mass ejections (CMEs) are commonly described as new, discrete, bright features appearing in the field of view of a white light coronagraph and moving outward over a period of minutes to hours. Apparent angular widths of the CMEs cover a wide range, from a few degrees to 360°. The very narrow structures (narrower than ~15°-20°) form only a small subset of all the observed CMEs and are usually referred to as rays, spikes, fans, etc. Recently, Gilbert et al. (2001, ApJ, 550, 1093) reported LASCO white light observations of 15 selected narrow CMEs. We extended the study and analyzed ultraviolet spectroscopy of narrow ejections, including several events listed by Gilbert et al. The data were obtained by the Ultraviolet Coronagraph Spectrometer (UVCS/SOHO). We present a comparison of narrow and large CMEs and discuss the relation of the narrow CMEs to coronal jets and/or other narrow transient events. This work is supported by NASA under Grant NAG5-11420 to the Smithsonian Astrophysical Observatory, by the Italian Space Agency and by PRODEX (Swiss contribution).

  19. LSST camera optics design

    NASA Astrophysics Data System (ADS)

    Olivier, Scot S.; Riot, Vincent J.; Gilmore, David K.; Bauman, Brian; Pratuch, Steve; Seppala, Lynn; Ku, John; Nordby, Martin; Foss, Mike; Antilogus, Pierre; Morgado, Nazario; Sassolas, Benoit; Flaminio, Raffaele; Michel, Christophe

    2012-09-01

The Large Synoptic Survey Telescope (LSST) uses a novel, three-mirror, telescope design feeding a camera system that includes a set of broad-band filters and three refractive corrector lenses to produce a flat field at the focal plane with a wide field of view. Optical design of the camera lenses and filters is integrated with the optical design of the telescope mirrors to optimize performance. We discuss the rationale for the LSST camera optics design, describe the methodology for fabricating, coating, mounting and testing the lenses and filters, and present the results of detailed analyses demonstrating that the camera optics will meet their performance goals.

  20. LSST Camera Optics Design

    SciTech Connect

    Riot, V J; Olivier, S; Bauman, B; Pratuch, S; Seppala, L; Gilmore, D; Ku, J; Nordby, M; Foss, M; Antilogus, P; Morgado, N

    2012-05-24

    The Large Synoptic Survey Telescope (LSST) uses a novel three-mirror telescope design feeding a camera system that includes a set of broad-band filters and three refractive corrector lenses to produce a flat field at the focal plane with a wide field of view. The optical design of the camera lenses and filters is integrated with the optical design of the telescope mirrors to optimize performance. We discuss the rationale for the LSST camera optics design, describe the methodology for fabricating, coating, mounting and testing the lenses and filters, and present the results of detailed analyses demonstrating that the camera optics will meet their performance goals.

  1. Digital Electronic Still Camera

    NASA Technical Reports Server (NTRS)

    Holland, Samuel D.; Yeates, Herbert D.

    1993-01-01

    Digital electronic still camera part of electronic recording, processing, transmitting, and displaying system. Removable hard-disk drive in camera serves as digital electronic equivalent of photographic film. Images viewed, analyzed, or transmitted quickly. Camera takes images of nearly photographic quality and stores them in digital form. Portable, hand-held, battery-powered unit designed for scientific use. Camera used in conjunction with playback unit also serving as transmitting unit if images sent to remote station. Remote station equipped to store, process, and display images. Digital image data encoded with error-correcting code at playback/transmitting unit for error-free transmission to remote station.

  2. Ringfield lithographic camera

    DOEpatents

    Sweatt, William C. (Albuquerque, NM)

    1998-01-01

    A projection lithography camera is presented with a wide ringfield optimized so as to make efficient use of extreme ultraviolet radiation from a large area radiation source (e.g., D_source ≈ 0.5 mm). The camera comprises four aspheric mirrors optically arranged on a common axis of symmetry with an increased etendue for the camera system. The camera includes an aperture stop that is accessible through a plurality of partial aperture stops to synthesize the theoretical aperture stop. Radiation from a mask is focused to form a reduced image on a wafer, relative to the mask, by reflection from the four aspheric mirrors.

  3. Intrinsic camera calibration equipped with Scheimpflug optical device

    NASA Astrophysics Data System (ADS)

    Fasogbon, Peter; Duvieubourg, Luc; Lacaze, Pierre-Antoine; Macaire, Ludovic

    2015-04-01

    We present the problem of setting up an intrinsic camera calibration under the Scheimpflug condition for an industrial application. We aim to calibrate the Scheimpflug camera using a roughly hand-positioned calibration pattern with a bundle adjustment technique. The assumptions used by classical calibration methodologies are no longer valid for cameras under the Scheimpflug condition. Therefore, we slightly modify the pin-hole model to estimate the Scheimpflug angles. The results are tested on real data sets captured from cameras limited by various industrial constraints, and in the presence of large distortions.

  4. 71. COMPLETED 'A' FRAME STRUCTURE LOOKING SOUTH SHOWING CAMERA TOWER, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    71. COMPLETED 'A' FRAME STRUCTURE LOOKING SOUTH SHOWING CAMERA TOWER, DRIVE GEARS, COUNTERWEIGHT CAR AND CANTILEVERED WALKWAYS, July 28, 1948. (Original photograph in possession of Dave Willis, San Diego, California.) - Variable Angle Launcher Complex, Variable Angle Launcher, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  5. New developments to improve SO2 cameras

    NASA Astrophysics Data System (ADS)

    Luebcke, P.; Bobrowski, N.; Hoermann, C.; Kern, C.; Klein, A.; Kuhn, J.; Vogel, L.; Platt, U.

    2012-12-01

    The SO2 camera is a remote sensing instrument that measures the two-dimensional distribution of SO2 column densities in volcanic plumes using scattered solar radiation as a light source. From these data SO2 fluxes can be derived. The high time resolution, of the order of 1 Hz, allows SO2 flux measurements to be correlated with other traditional volcanological measurement techniques, e.g., seismology. The application of SO2 cameras has increased in recent years; however, there is still potential to improve the instrumentation. First, the influence of aerosols and ash in the volcanic plume can lead to large errors in the calculated SO2 flux if not accounted for. We present two different concepts to deal with the influence of ash and aerosols. The first approach uses a co-axial DOAS system added to a two-filter SO2 camera. The camera uses Filter A (peak transmission centred around 315 nm) to measure the optical density of SO2 and Filter B (centred around 330 nm) to correct for the influence of ash and aerosol. The DOAS system simultaneously performs spectroscopic measurements in a small area of the camera's field of view and gives additional information to correct for these effects. Comparing the optical densities for the two filters with the SO2 column density from the DOAS allows not only a much more precise calibration, but also conclusions about the influence of ash and aerosol scattering. Measurement examples from Popocatépetl, Mexico in 2011 are shown and interpreted. Another approach combines the SO2 camera measurement principle with the extremely narrow and periodic transmission of a Fabry-Pérot interferometer. The narrow transmission window allows individual SO2 absorption bands (or series of bands) to be selected as a substitute for Filter A, making the measurements more selective to SO2. Instead of Filter B, as in classical SO2 cameras, the correction for aerosol can be performed by shifting the transmission window of the Fabry-Pérot interferometer towards the minima of the SO2 absorption cross section. A correction of ash and aerosol influences with this technique can decrease the deviation from the true column by more than 60%, since the wavelength difference between the two measurement channels is much smaller than in classical SO2 cameras. While the implementation of this approach for a 2D camera encompasses many challenges, it offers the possibility of building a relatively simple and robust scanning instrument for volcanic SO2 distributions. A second problem of the SO2 camera technique is its relatively high price, which prevents its use in many volcano observatories in developing countries. Most SO2 cameras use CCDs that were originally designed for astronomical purposes. The large pixel size and low noise of these detectors compensate for the low intensity of solar radiation in the UV and the low quantum efficiency of the detector in this spectral range. However, the detectors used cost several thousand US dollars. We present results from test measurements using a consumer DSLR camera as the detector of an SO2 camera. Since the camera is not sensitive in the UV, the incoming radiation is first imaged onto a screen covered with a suitable fluorescent dye that converts the UV radiation to visible light.
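    The two-filter correction described above is, at its core, an application of the Beer-Lambert law. The following minimal sketch shows how an SO2 column density follows from on-band and off-band intensity ratios; all intensity values and the cross-section magnitude are hypothetical, chosen only for illustration.

```python
import math

def so2_column_density(i_a, i0_a, i_b, i0_b, sigma_so2):
    """Estimate an SO2 column density (molecules/cm^2) from a two-filter
    SO2-camera measurement using the Beer-Lambert law.

    i_a, i0_a : plume / background intensity through Filter A (~315 nm, SO2-sensitive)
    i_b, i0_b : plume / background intensity through Filter B (~330 nm, aerosol correction)
    sigma_so2 : effective SO2 absorption cross section in Filter A (cm^2/molecule)
    """
    tau_a = -math.log(i_a / i0_a)   # apparent optical density on-band
    tau_b = -math.log(i_b / i0_b)   # broadband (ash/aerosol) optical density off-band
    # Subtracting tau_b removes the roughly wavelength-independent part
    # of the extinction caused by ash and aerosol scattering.
    return (tau_a - tau_b) / sigma_so2

# Hypothetical numbers: 20% on-band dimming, 5% off-band dimming,
# sigma ~ 1e-19 cm^2/molecule (order of magnitude for SO2 near 315 nm).
scd = so2_column_density(0.80, 1.00, 0.95, 1.00, 1e-19)
```

Without the Filter B term, the aerosol extinction would be attributed entirely to SO2, which is exactly the overestimation the abstract warns about.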

  6. Improved Tracking of Targets by Cameras on a Mars Rover

    NASA Technical Reports Server (NTRS)

    Kim, Won; Ansar, Adnan; Steele, Robert

    2007-01-01

    A paper describes a method devised to increase the robustness and accuracy of tracking of targets by means of three stereoscopic pairs of video cameras on a Mars-rover-type exploratory robotic vehicle. Two of the camera pairs are mounted on a mast that can be adjusted in pan and tilt; the third camera pair is mounted on the main vehicle body. Elements of the method include a mast calibration, a camera-pointing algorithm, and a purely geometric technique for handing off tracking between different camera pairs at critical distances as the rover approaches a target of interest. The mast calibration is an extension of camera calibration in which the camera images of calibration targets at known positions are collected at various pan and tilt angles. In the camera-pointing algorithm, pan and tilt angles are computed by a closed-form, non-iterative solution of inverse kinematics of the mast combined with mathematical models of the cameras. The purely geometric camera-handoff technique involves the use of stereoscopic views of a target of interest in conjunction with the mast calibration.

  7. Camera Operator and Videographer

    ERIC Educational Resources Information Center

    Moore, Pam

    2007-01-01

    Television, video, and motion picture camera operators produce images that tell a story, inform or entertain an audience, or record an event. They use various cameras to shoot a wide range of material, including television series, news and sporting events, music videos, motion pictures, documentaries, and training sessions. Those who film or…

  9. Compact Solar Camera.

    ERIC Educational Resources Information Center

    Juergens, Albert

    1980-01-01

    Describes a compact solar camera built as a one-semester student project. This camera is used for taking pictures of the sun and moon and for direct observation of the image of the sun on a screen. (Author/HM)

  10. Constrained space camera assembly

    DOEpatents

    Heckendorn, F.M.; Anderson, E.K.; Robinson, C.W.; Haynes, H.B.

    1999-05-11

    A constrained space camera assembly which is intended to be lowered through a hole into a tank, a borehole or another cavity is disclosed. The assembly includes a generally cylindrical chamber comprising a head and a body and a wiring-carrying conduit extending from the chamber. Means are included in the chamber for rotating the body about the head without breaking an airtight seal formed therebetween. The assembly may be pressurized and accompanied with a pressure sensing means for sensing if a breach has occurred in the assembly. In one embodiment, two cameras, separated from their respective lenses, are installed on a mounting apparatus disposed in the chamber. The mounting apparatus includes means allowing both longitudinal and lateral movement of the cameras. Moving the cameras longitudinally focuses the cameras, and moving the cameras laterally away from one another effectively converges the cameras so that close objects can be viewed. The assembly further includes means for moving lenses of different magnification forward of the cameras. 17 figs.

  11. The Camera Cook Book.

    ERIC Educational Resources Information Center

    Education Development Center, Inc., Newton, MA.

    Intended for use with the photographic materials available from the Workshop for Learning Things, Inc., this "camera cookbook" describes procedures that have been tried in classrooms and workshops and proven to be the most functional and inexpensive. Explicit starting off instructions--directions for exploring and loading the camera and for taking…

  12. What are the benefits of having multiple camera angles?

    Atmospheric Science Data Center

    2014-12-08

    ... the properties of the target, just as the more familiar spectral differences are exploited to document its chemical composition. Most imaging space-borne instruments acquire measurements for each location on Earth ...

  13. Dry imaging cameras

    PubMed Central

    Indrajit, IK; Alam, Aftab; Sahni, Hirdesh; Bhatia, Mukul; Sahu, Samaresh

    2011-01-01

    Dry imaging cameras are important hard copy devices in radiology. Using a dry imaging camera, multiformat images of digital modalities in radiology are created from a sealed unit of unexposed films. The functioning of a modern dry camera involves a blend of concurrent processes drawn from diverse sciences: computing, mechanics, thermal physics, optics, electricity and radiography. Broadly, hard copy devices are classified as laser-based and non-laser-based technology. Compared with the working knowledge and technical awareness of other modalities in radiology, the understanding of a dry imaging camera is often superficial and neglected. To fill this void, this article outlines the key features of a modern dry camera and the important issues that impact radiology workflow. PMID:21799589

  14. Replacing 16 mm film cameras with high definition digital cameras

    SciTech Connect

    Balch, K.S.

    1995-12-31

    For many years 16 mm film cameras have been used in severe environments. These film cameras are used on Hy-G automotive sleds, airborne gun cameras, range tracking and other hazardous environments. The companies and government agencies using these cameras need to replace them with a more cost-effective solution. Film-based cameras still provide the best resolving capability; however, film development time, chemical disposal, recurring media cost, and faster digital analysis are factors driving the desire for a 16 mm film camera replacement. This paper describes a new camera from Kodak that has been designed to replace 16 mm high speed film cameras.

  15. Six-year operation of the Venus Monitoring Camera (Venus Express): spatial and temporal variations of the properties of particles in upper clouds of Venus from the phase dependence of the near-IR brightness

    NASA Astrophysics Data System (ADS)

    Shalygina, O. S.; Petrova, E. V.; Markiewicz, W. J.

    2015-10-01

    Since May 2006, the Venus Monitoring Camera (VMC) [1] has been imaging Venus in four narrow spectral channels centered at the wavelengths of 0.365 μm (UV), 0.513 μm (VIS), 0.965 μm (NIR1), and 1.010 μm (NIR2). It has taken around 300,000 images in the four channels, covering almost all latitudes, including the night and day sides. We analyze the whole set of the VMC data processed to October 2012, i.e. the data from orbits 60-2352 obtained in the phase angle range

  16. Ringfield lithographic camera

    DOEpatents

    Sweatt, W.C.

    1998-09-08

    A projection lithography camera is presented with a wide ringfield optimized so as to make efficient use of extreme ultraviolet radiation from a large area radiation source (e.g., D_source ≈ 0.5 mm). The camera comprises four aspheric mirrors optically arranged on a common axis of symmetry. The camera includes an aperture stop that is accessible through a plurality of partial aperture stops to synthesize the theoretical aperture stop. Radiation from a mask is focused to form a reduced image on a wafer, relative to the mask, by reflection from the four aspheric mirrors. 11 figs.

  17. Kitt Peak speckle camera

    NASA Technical Reports Server (NTRS)

    Breckinridge, J. B.; Mcalister, H. A.; Robinson, W. G.

    1979-01-01

    The speckle camera in regular use at Kitt Peak National Observatory since 1974 is described in detail. The design of the atmospheric dispersion compensation prisms, the use of film as a recording medium, the accuracy of double star measurements, and the next generation speckle camera are discussed. Photographs of double star speckle patterns with separations from 1.4 sec of arc to 4.7 sec of arc are shown to illustrate the quality of image formation with this camera, the effects of seeing on the patterns, and to illustrate the isoplanatic patch of the atmosphere.

  18. Structured light camera calibration

    NASA Astrophysics Data System (ADS)

    Garbat, P.; Skarbek, W.; Tomaszewski, M.

    2013-03-01

    The structured light camera, which is being designed with the joint effort of the Institute of Radioelectronics and the Institute of Optoelectronics (both large units of the Warsaw University of Technology within the Faculty of Electronics and Information Technology), combines various contemporary hardware and software technologies. In hardware, it integrates a high speed stripe projector and a stripe camera together with a standard high definition video camera. In software, it is supported by sophisticated calibration techniques which enable the development of advanced applications such as a real time 3D viewer of moving objects with a free viewpoint, or a 3D modeller for still objects.

  19. IRNG-camera family

    NASA Astrophysics Data System (ADS)

    Dupiech, Michael; Marche, Pierre M.

    1996-06-01

    Further to the development of the SYNERGI set of modules dedicated to new generation multipurpose high performance thermal cameras based on the SOFRADIR 288 by 4 element IRCCD detector, THOMSON-CSF OPTRONIQUE have decided to extend the family of second generation cameras with the development of SOPHIE. SOPHIE is a handheld infra-red camera, also organized around the 288 by 4 element detector, corresponding to a different cost/performance trade-off. It is an ultralow-cost, ultralight, medium range imager designed for passive observation and surveillance. It exhibits growth potential such as low cost infra-red sights for light armored vehicles firing posts.

  20. Streak camera time calibration procedures

    NASA Technical Reports Server (NTRS)

    Long, J.; Jackson, I.

    1978-01-01

    Time calibration procedures for streak cameras utilizing a modulated laser beam are described. The time calibration determines a writing rate accuracy of 0.15% with a rotating mirror camera and 0.3% with an image converter camera.

  1. Educational Applications for Digital Cameras.

    ERIC Educational Resources Information Center

    Cavanaugh, Terence; Cavanaugh, Catherine

    1997-01-01

    Discusses uses of digital cameras in education. Highlights include advantages and disadvantages, digital photography assignments and activities, camera features and operation, applications for digital images, accessory equipment, and comparisons between digital cameras and other digitizers. (AEF)

  2. Wide-Angle Quasar Feedback

    NASA Astrophysics Data System (ADS)

    Chartas, George

    2015-08-01

    I will present results from the detection of near-relativistic winds launched near the innermost stable circular orbits of SMBHs. A recent detection of a powerful wind in the X-ray bright narrow absorption line (NAL) quasar HS 0810 strengthens the case that quasars play a significant role in feedback. In both deep Chandra and XMM-Newton observations of HS 0810 we detected blueshifted absorption lines implying outflowing velocities ranging between 0.1c and 0.5c. The presence of both an emission line at 6.8 keV and an absorption line at 7.8 keV in the spectral line profile of HS 0810 is a characteristic feature of a P-Cygni profile, supporting the presence of an expanding, outflowing, highly ionized Fe absorber in this object. A hard excess component is detected in the XMM-Newton observation of HS 0810, possibly originating from reflection off the disk. Modeling of the XMM-Newton spectrum constrains the inclination angle to be about 30 degrees. The presence of relativistic winds in low-inclination-angle NAL quasars as well as in high-inclination-angle BAL quasars implies that the solid angle of quasar winds may be quite large. The larger solid angle of quasar winds would also indicate that their contribution to the regulation of the host galaxy may be even more important than previously thought.
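    The outflow velocities quoted above follow from the blueshift of the absorption features. A sketch of the conversion, assuming the absorption near 7.8 keV is identified with Fe XXVI Ly-alpha (rest energy ~6.97 keV) and neglecting the cosmological redshift for simplicity:

```python
import math

def outflow_beta(e_obs_kev, e_rest_kev):
    """Outflow velocity (in units of c) from a blueshifted absorption line,
    using the relativistic Doppler formula for a source moving toward us:
        E_obs = E_rest * sqrt((1 + beta) / (1 - beta))
    solved for beta."""
    r2 = (e_obs_kev / e_rest_kev) ** 2
    return (r2 - 1.0) / (r2 + 1.0)

# Hypothetical reading of the line profile: absorption at 7.8 keV,
# identified with Fe XXVI Ly-alpha at ~6.97 keV rest energy.
beta = outflow_beta(7.8, 6.97)   # roughly 0.1c, the low end of the quoted range
```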

  3. Seasonal and vertical changes in leaf angle distribution for selected deciduous broadleaf tree species common to Europe

    NASA Astrophysics Data System (ADS)

    Raabe, Kairi; Pisek, Jan; Sonnentag, Oliver; Annuk, Kalju

    2014-05-01

    Leaf inclination angle distribution is a key parameter in determining the transmission and reflection of radiation by vegetation canopies. It has been previously observed that leaf inclination angle might change gradually from more vertical in the upper canopy and in high light habitats to more horizontal in the lower canopy and in low light habitats [1]. Despite its importance, relatively few measurements of actual leaf angle distributions have been reported for different tree species. An even smaller number of studies have dealt with possible seasonal changes in leaf angle distribution [2]. In this study the variation of leaf inclination angle distributions was examined both temporally throughout the growing season and vertically at different heights of trees. We report on leaf inclination angle distributions for five deciduous broadleaf species found commonly in several parts of Europe: grey alder (Alnus incana), silver birch (Betula pendula Roth), chestnut (Castanea), Norway maple (Acer platanoides), and aspen (Populus tremula). The angles were measured using the leveled camera method [3], with the data collected at several separate heights and four times during the period of May-September 2013. The results generally indicate the greatest change in leaf inclination angles in spring, with the changes usually most pronounced at the top of the canopy. It should also be noted, however, that whereas the temporal variation proved rather consistent across species, the vertical variation differed more between species. The leveled camera method was additionally tested in terms of sensitivity to different users. Ten people were asked to measure the leaf angles for four different species. The results indicate the method is quite robust, providing coinciding distributions irrespective of the user and level of previous experience with the method. However, certain caution must be exercised when measuring long narrow leaves. References [1] G.G. McMillen and J.H. McClendon, "Leaf angle: an adaptive feature of sun and shade leaves," Botanical Gazette, vol. 140, pp. 437-442, 1979. [2] J. Pisek, O. Sonnentag, A.D. Richardson, and M. Mõttus, "Is the spherical leaf inclination angle distribution a valid assumption for temperate and boreal broadleaf tree species?" Agricultural and Forest Meteorology, vol. 169, pp. 186-194, 2013. [3] Y. Ryu, O. Sonnentag, T. Nilson, R. Vargas, H. Kobayashi, R. Wenk, and D. Baldocchi, "How to quantify tree leaf area index in a heterogeneous savanna ecosystem: a multi-instrument and multimodel approach," Agricultural and Forest Meteorology, vol. 150, pp. 63-76, 2010.
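    The spherical leaf angle distribution questioned in reference [2] has a well-defined mean inclination against which a measured sample can be compared. A small sketch (the sample angles below are invented, not data from the study):

```python
import math

def mean_leaf_angle(angles_deg):
    """Mean leaf inclination angle (degrees from horizontal) from a sample
    of leveled-camera measurements."""
    return sum(angles_deg) / len(angles_deg)

# The spherical distribution g(t) = sin(t) on [0, pi/2] has mean inclination
# integral of t*sin(t) dt = 1 radian, i.e. about 57.3 degrees.
SPHERICAL_MEAN_DEG = math.degrees(1.0)

# A hypothetical sample: a mean well below 57.3 degrees would suggest a
# more planophile (horizontally oriented) canopy than the spherical model.
sample = [25.0, 40.0, 32.0, 18.0, 45.0]
m = mean_leaf_angle(sample)
```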

  4. Early Experience & Multisensory Perceptual Narrowing

    PubMed Central

    Lewkowicz, David J.

    2014-01-01

    Perceptual narrowing is a reflection of early experience and contributes in key ways to perceptual and cognitive development. In general, findings have shown that unisensory perceptual sensitivity in early infancy is broadly tuned such that young infants respond to, and discriminate, native as well as non-native sensory inputs, whereas older infants only respond to native inputs. Recently, my colleagues and I discovered that perceptual narrowing occurs at the multisensory processing level as well. The present article reviews this new evidence and puts it in the larger context of multisensory perceptual development and the role that perceptual experience plays in it. Together, the evidence on unisensory and multisensory narrowing shows that early experience shapes the emergence of perceptual specialization and expertise. PMID:24435505

  5. Advanced CCD camera developments

    SciTech Connect

    Condor, A.

    1994-11-15

    Two charge coupled device (CCD) camera systems are introduced and discussed, describing briefly the hardware involved, and the data obtained in their various applications. The Advanced Development Group Defense Sciences Engineering Division has been actively designing, manufacturing, fielding state-of-the-art CCD camera systems for over a decade. These systems were originally developed for the nuclear test program to record data from underground nuclear tests. Today, new and interesting application for these systems have surfaced and development is continuing in the area of advanced CCD camera systems, with the new CCD camera that will allow experimenters to replace film for x-ray imaging at the JANUS, USP, and NOVA laser facilities.

  6. Miniature TV Camera

    NASA Technical Reports Server (NTRS)

    2004-01-01

    Originally devised to observe Saturn stage separation during Apollo flights, Marshall Space Flight Center's Miniature Television Camera, measuring only 4 x 3 x 1 1/2 inches, quickly made its way to the commercial telecommunications market.

  7. Calibration of Action Cameras for Photogrammetric Purposes

    PubMed Central

    Balletti, Caterina; Guerra, Francesco; Tsioukas, Vassilios; Vernier, Paolo

    2014-01-01

    The use of action cameras for photogrammetry purposes is not widespread due to the fact that until recently the images provided by the sensors, using either still or video capture mode, were not big enough to perform and provide the appropriate analysis with the necessary photogrammetric accuracy. However, several manufacturers have recently produced and released new lightweight devices which are: (a) easy to handle, (b) capable of performing under extreme conditions and more importantly (c) able to provide both still images and video sequences of high resolution. In order to be able to use the sensor of action cameras we must apply a careful and reliable self-calibration prior to the use of any photogrammetric procedure, a relatively difficult scenario because of the short focal length of the camera and its wide angle lens that is used to obtain the maximum possible resolution of images. Special software, using functions of the OpenCV library, has been created to perform both the calibration and the production of undistorted scenes for each one of the still and video image capturing modes of a novel action camera, the GoPro Hero 3 camera, which can provide still images up to 12 Mp and video up to 8 Mp resolution. PMID:25237898
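    The self-calibration described here estimates the parameters of a pinhole-plus-distortion forward model. A pure-Python sketch of that model follows (the radial-distortion parameterization matches the convention used by OpenCV's `calibrateCamera`; all numeric values are hypothetical, not calibration results from the paper):

```python
def project_point(xc, yc, zc, fx, fy, cx, cy, k1, k2):
    """Project a 3D point in camera coordinates to pixel coordinates using
    the pinhole model with two radial distortion coefficients (k1, k2)."""
    # Normalized image coordinates
    xn, yn = xc / zc, yc / zc
    # Radial distortion: strong for short-focal-length, wide-angle lenses
    r2 = xn * xn + yn * yn
    d = 1.0 + k1 * r2 + k2 * r2 * r2
    xd, yd = xn * d, yn * d
    # Apply intrinsics (focal lengths and principal point, in pixels)
    return fx * xd + cx, fy * yd + cy

# Hypothetical wide-angle intrinsics with barrel distortion (k1 < 0):
# a point on the optical axis maps to the principal point regardless
# of the distortion coefficients.
u, v = project_point(0.0, 0.0, 2.0, 900.0, 900.0, 2000.0, 1500.0, -0.3, 0.1)
```

Calibration runs this model in reverse: given many observed projections of a known pattern, it solves for fx, fy, cx, cy and the distortion coefficients.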

  9. Spacecraft camera image registration

    NASA Technical Reports Server (NTRS)

    Kamel, Ahmed A. (Inventor); Graul, Donald W. (Inventor); Chan, Fred N. T. (Inventor); Gamble, Donald W. (Inventor)

    1987-01-01

    A system for achieving spacecraft camera (1, 2) image registration comprises a portion external to the spacecraft and an image motion compensation system (IMCS) portion onboard the spacecraft. Within the IMCS, a computer (38) calculates an image registration compensation signal (60) which is sent to the scan control loops (84, 88, 94, 98) of the onboard cameras (1, 2). At the location external to the spacecraft, the long-term orbital and attitude perturbations on the spacecraft are modeled. Coefficients (K, A) from this model are periodically sent to the onboard computer (38) by means of a command unit (39). The coefficients (K, A) take into account observations of stars and landmarks made by the spacecraft cameras (1, 2) themselves. The computer (38) takes as inputs the updated coefficients (K, A) plus synchronization information indicating the mirror position (AZ, EL) of each of the spacecraft cameras (1, 2), operating mode, and starting and stopping status of the scan lines generated by these cameras (1, 2), and generates in response thereto the image registration compensation signal (60). The sources of periodic thermal errors on the spacecraft are discussed. The system is checked by calculating measurement residuals, the difference between the landmark and star locations predicted at the external location and the landmark and star locations as measured by the spacecraft cameras (1, 2).

  10. Pulse stretcher for narrow pulses

    NASA Technical Reports Server (NTRS)

    Lindsey, R. S., Jr. (inventor)

    1974-01-01

    A pulse stretcher for narrow pulses is presented. The stretcher is composed of an analog section for processing each arriving analog pulse and a digital section with logic for providing command signals to the gates and switches in the analog section.

  11. Spectral narrowing via quantum coherence

    SciTech Connect

    Mikhailov, Eugeniy E.; Rostovtsev, Yuri V.; Zhang Aihua; Welch, George R.; Sautenkov, Vladimir A.; Zubairy, M. Suhail; Scully, Marlan O.

    2006-07-15

    We have studied the transmission through an optically thick ⁸⁷Rb vapor that is illuminated by monochromatic and noise-broadened laser fields in a Λ configuration. The spectral width of the beat signal between the two fields after transmission through the atomic medium is more than 1000 times narrower than the spectral width of this signal before the medium.

  12. Aircraft Altitude Estimation Using Un-calibrated Onboard Cameras

    NASA Astrophysics Data System (ADS)

    Naidu, V. P. S.; Mukherjee, J.

    2012-10-01

    In the present study, aircraft altitude estimation using an un-calibrated onboard camera is implemented and studied. A camera model has been implemented to simulate the test data. From the results, it was observed that the rounding of pixel coordinates to integer values creates fluctuations around the true vanishing point (VP) angle and in the height computations. These fluctuations were smoothed using a Kalman filter based state estimator. The effects of camera tilt and focal length on the VP angle and height computations were also studied. It is concluded that the camera should be perpendicular to the runway for the focal length to have no effect on the height computation. It is planned to apply this algorithm to real time imaging data along with Integrated Enhanced Synthetic Vision (IESVS) on the HANSA aircraft.
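    The Kalman-filter smoothing step can be illustrated with a minimal one-state filter. The process noise q and measurement noise r below are illustrative tuning choices, not values from the paper:

```python
def kalman_smooth(measurements, q=1e-4, r=0.5):
    """Smooth a noisy scalar sequence (e.g. vanishing-point angle estimates
    that fluctuate because pixel coordinates are rounded) with a 1-state
    Kalman filter assuming a roughly constant underlying value."""
    x, p = measurements[0], 1.0      # initial state estimate and covariance
    out = [x]
    for z in measurements[1:]:
        p += q                       # predict: state assumed ~constant
        k = p / (p + r)              # Kalman gain
        x += k * (z - x)             # update with the new measurement
        p *= (1.0 - k)
        out.append(x)
    return out

# Invented "measurements" oscillating around a true angle of 5.0 degrees
angles = [5.4, 4.7, 5.2, 4.8, 5.3, 4.9, 5.1]
smoothed = kalman_smooth(angles)
```

As the covariance p shrinks, the gain k falls and each new noisy measurement perturbs the estimate less, which is what damps the rounding-induced fluctuations.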

  13. Calibration Procedures in Mid Format Camera Setups

    NASA Astrophysics Data System (ADS)

    Pivnicka, F.; Kemper, G.; Geissler, S.

    2012-07-01

    A growing number of mid-format cameras are used for aerial surveying projects. To achieve a reliable and geometrically precise result in the photogrammetric workflow as well, awareness of the sensitive parts is important. The use of direct referencing systems (GPS/IMU), the mounting on a stabilizing camera platform, and the specific values of the mid-format camera make a professional setup with various calibration and misalignment operations necessary. An important part is a proper camera calibration. Using aerial images over a well-designed test field with 3D structures and/or different flight altitudes enables the determination of calibration values in the Bingo software. It will be demonstrated how such a calibration can be performed. The direct referencing device must be mounted to the camera in a solid and reliable way. Besides the mechanical work, especially in mounting the camera next to the IMU, two lever arms have to be measured with mm accuracy: the lever arm from the GPS antenna to the IMU's calibrated centre, and the lever arm from the IMU centre to the camera projection centre. In fact, the measurement with a total station is not a difficult task, but the definition of the right centres and the need to use rotation matrices can cause serious accuracy problems. The benefit of small and medium format cameras is that smaller aircraft can also be used. For such aircraft, a gyro-based stabilized platform is recommended, which means the IMU must be mounted next to the camera on the stabilizer. The advantage is that the IMU can be used to control the platform; the problem is that the IMU-to-GPS-antenna lever arm is then floating. In effect, an additional data stream, the movements of the stabilizer, must be processed to correct the floating lever arm distances.
If the post-processing of the GPS/IMU data, taking the floating lever arms into account, delivers the expected result, the lever arms between IMU and camera can be applied. However, there is a misalignment (boresight angle) that must be evaluated by a photogrammetric process using advanced tools, e.g. in Bingo. Once all these parameters have been determined, the system is capable of projects without, or with only a few, ground control points. But what effect does directly applying the achieved orientation values have on the photogrammetric process, compared with an aerial triangulation (AT) based on proper tiepoint matching? The paper aims to show the steps to be done by potential users and gives a quality estimation of the importance and influence of the various calibration and adjustment steps.
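
    The rotation-matrix bookkeeping for the floating lever arm can be sketched briefly: the stabiliser attitude rotates the body-frame IMU-to-antenna lever arm into the navigation frame at each epoch. The ZYX angle convention and the lever-arm values here are illustrative assumptions, not the paper's actual configuration.

```python
import numpy as np

def rot_body_to_nav(roll, pitch, yaw):
    """ZYX (yaw-pitch-roll) rotation matrix from body to navigation
    frame; all angles in radians."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rz = np.array([[cy, -sy, 0.0], [sy, cy, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])
    return Rz @ Ry @ Rx

# Hypothetical lever arm from the IMU centre to the GPS antenna,
# expressed in the body frame (metres), measured once on the ground.
lever_body = np.array([0.12, -0.45, 1.30])

# At each epoch, the stabiliser attitude re-rotates the lever arm into
# the navigation frame before it is applied to the GPS position:
R = rot_body_to_nav(np.radians(2.0), np.radians(-1.5), np.radians(90.0))
lever_nav = R @ lever_body
```
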

  14. Image dissector camera system study

    NASA Technical Reports Server (NTRS)

    Howell, L.

    1984-01-01

    Various aspects of a rendezvous and docking system using an image dissector detector as compared to a GaAs detector were discussed. Investigation into a gimbaled scanning system is also covered and the measured video response curves from the image dissector camera are presented. Rendezvous will occur at ranges greater than 100 meters. The maximum range considered was 1000 meters. During docking, the range, range-rate, angle, and angle-rate to each reflector on the satellite must be measured. Docking range will be from 3 to 100 meters. The system consists of a CW laser diode transmitter and an image dissector receiver. The transmitter beam is amplitude modulated with three sine wave tones for ranging. The beam is coaxially combined with the receiver beam. Mechanical deflection of the transmitter beam, + or - 10 degrees in both X and Y, can be accomplished before or after it is combined with the receiver beam. The receiver will have a field-of-view (FOV) of 20 degrees and an instantaneous field-of-view (IFOV) of two milliradians (mrad) and will be electronically scanned in the image dissector. The increase in performance obtained from the GaAs photocathode is not needed to meet the present performance requirements.

  15. Uncooled radiometric camera performance

    NASA Astrophysics Data System (ADS)

    Meyer, Bill; Hoelter, T.

    1998-07-01

    Thermal imaging equipment utilizing microbolometer detectors operating at room temperature has found widespread acceptance in both military and commercial applications. Uncooled camera products are becoming effective solutions in applications currently using traditional, photonic infrared sensors. The reduced power consumption and decreased mechanical complexity offered by uncooled cameras have enabled highly reliable, low-cost, hand-held instruments. Initially these instruments displayed only relative temperature differences, which limited their usefulness in applications such as thermography. Radiometrically calibrated microbolometer instruments are now available. The ExplorIR thermography camera leverages the technology developed for Raytheon Systems Company's first production microbolometer imaging camera, the Sentinel. The ExplorIR camera has a demonstrated temperature measurement accuracy of 4 degrees Celsius or 4% of the measured value (whichever is greater) over scene temperature ranges of minus 20 degrees Celsius to 300 degrees Celsius (minus 20 degrees Celsius to 900 degrees Celsius for extended range models) and camera environmental temperatures of minus 10 degrees Celsius to 40 degrees Celsius. Direct temperature measurement with high resolution video imaging creates some unique challenges when using uncooled detectors. A temperature controlled, field-of-view limiting aperture (cold shield) is not typically included in the small volume dewars used for uncooled detector packages. The lack of a field-of-view shield allows a significant amount of extraneous radiation from the dewar walls and lens body to affect the sensor operation. In addition, the transmission of the germanium lens elements is a function of ambient temperature. The ExplorIR camera design compensates for these environmental effects while maintaining the accuracy and dynamic range required by today's predictive maintenance and condition monitoring markets.

  16. What convention is used for the illumination and view angles?

    Atmospheric Science Data Center

    2014-12-08

    ... Azimuth angles are measured clockwise from the direction of travel to local north. For both the Sun and cameras, azimuth describes the ... to the equator, because of its morning equator crossing time. Additionally, the difference in view and solar azimuth angle will be near ...

  17. Sensing driver awareness by combining fisheye camera and Kinect

    NASA Astrophysics Data System (ADS)

    Wuhe, Z.; Lei, Z.; Ning, D.

    2014-11-01

    In this paper, we propose a Driver's Awareness Catching System to sense the driver's awareness. The system consists of a fisheye camera and a Kinect. The Kinect, mounted inside the vehicle, is used to recognize and locate the 3D face of the driver. The fisheye camera, mounted outside the vehicle, is used to monitor the road. The relative pose between the two cameras is calibrated via a state-of-the-art method for calibrating cameras with non-overlapping fields of view. The camera system works as follows: first, the Kinect SDK released by Microsoft is used to track the driver's face and capture the eye location together with the sight direction. Second, the eye location and sight direction are transformed into the coordinate system of the fisheye camera. Third, the corresponding view field is extracted from the fisheye image. As there is a small displacement between the driver's eyes and the optical center of the fisheye camera, a view-angle deviation results. Finally, we performed a systematic analysis of the error distribution by numerical simulation and demonstrated the feasibility of our camera system. We also built the camera system and achieved the desired effect in a real-world experiment.
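
    The second step above, mapping the eye location into the fisheye camera's frame, is a rigid transform. In this sketch the extrinsics R and t are hypothetical stand-ins for the values the non-overlapping-FOV calibration would produce.

```python
import numpy as np

# Hypothetical Kinect-to-fisheye extrinsics (rotation R, translation t
# in metres); a real system would obtain these from calibration.
R = np.array([[0.0,  0.0, 1.0],
              [-1.0, 0.0, 0.0],
              [0.0, -1.0, 0.0]])
t = np.array([0.30, -0.10, 1.20])

def kinect_to_fisheye(p_kinect):
    """Rigid transform of a 3D point from the Kinect frame to the
    fisheye camera frame: p_f = R p_k + t."""
    return R @ np.asarray(p_kinect, dtype=float) + t

eye_kinect = np.array([0.05, 0.02, 0.60])   # driver's eye, Kinect frame
eye_fisheye = kinect_to_fisheye(eye_kinect)
```
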

  18. The Dark Energy Camera

    SciTech Connect

    Flaugher, B.

    2015-04-11

    The Dark Energy Camera is a new imager with a 2.2-degree diameter field of view mounted at the prime focus of the Victor M. Blanco 4-meter telescope on Cerro Tololo near La Serena, Chile. The camera was designed and constructed by the Dark Energy Survey Collaboration, and meets or exceeds the stringent requirements designed for the wide-field and supernova surveys for which the collaboration uses it. The camera consists of a five element optical corrector, seven filters, a shutter with a 60 cm aperture, and a CCD focal plane of 250-μm thick fully depleted CCDs cooled inside a vacuum Dewar. The 570 Mpixel focal plane comprises 62 2k x 4k CCDs for imaging and 12 2k x 2k CCDs for guiding and focus. The CCDs have 15μm x 15μm pixels with a plate scale of 0.263" per pixel. A hexapod system provides state-of-the-art focus and alignment capability. The camera is read out in 20 seconds with 6-9 electrons readout noise. This paper provides a technical description of the camera's engineering, construction, installation, and current status.

  19. The Dark Energy Camera

    NASA Astrophysics Data System (ADS)

    Flaugher, B.; Diehl, H. T.; Honscheid, K.; Abbott, T. M. C.; Alvarez, O.; Angstadt, R.; Annis, J. T.; Antonik, M.; Ballester, O.; Beaufore, L.; Bernstein, G. M.; Bernstein, R. A.; Bigelow, B.; Bonati, M.; Boprie, D.; Brooks, D.; Buckley-Geer, E. J.; Campa, J.; Cardiel-Sas, L.; Castander, F. J.; Castilla, J.; Cease, H.; Cela-Ruiz, J. M.; Chappa, S.; Chi, E.; Cooper, C.; da Costa, L. N.; Dede, E.; Derylo, G.; DePoy, D. L.; de Vicente, J.; Doel, P.; Drlica-Wagner, A.; Eiting, J.; Elliott, A. E.; Emes, J.; Estrada, J.; Fausti Neto, A.; Finley, D. A.; Flores, R.; Frieman, J.; Gerdes, D.; Gladders, M. D.; Gregory, B.; Gutierrez, G. R.; Hao, J.; Holland, S. E.; Holm, S.; Huffman, D.; Jackson, C.; James, D. J.; Jonas, M.; Karcher, A.; Karliner, I.; Kent, S.; Kessler, R.; Kozlovsky, M.; Kron, R. G.; Kubik, D.; Kuehn, K.; Kuhlmann, S.; Kuk, K.; Lahav, O.; Lathrop, A.; Lee, J.; Levi, M. E.; Lewis, P.; Li, T. S.; Mandrichenko, I.; Marshall, J. L.; Martinez, G.; Merritt, K. W.; Miquel, R.; Muñoz, F.; Neilsen, E. H.; Nichol, R. C.; Nord, B.; Ogando, R.; Olsen, J.; Palaio, N.; Patton, K.; Peoples, J.; Plazas, A. A.; Rauch, J.; Reil, K.; Rheault, J.-P.; Roe, N. A.; Rogers, H.; Roodman, A.; Sanchez, E.; Scarpine, V.; Schindler, R. H.; Schmidt, R.; Schmitt, R.; Schubnell, M.; Schultz, K.; Schurter, P.; Scott, L.; Serrano, S.; Shaw, T. M.; Smith, R. C.; Soares-Santos, M.; Stefanik, A.; Stuermer, W.; Suchyta, E.; Sypniewski, A.; Tarle, G.; Thaler, J.; Tighe, R.; Tran, C.; Tucker, D.; Walker, A. R.; Wang, G.; Watson, M.; Weaverdyck, C.; Wester, W.; Woods, R.; Yanny, B.; DES Collaboration

    2015-11-01

    The Dark Energy Camera is a new imager with a 2.°2 diameter field of view mounted at the prime focus of the Victor M. Blanco 4 m telescope on Cerro Tololo near La Serena, Chile. The camera was designed and constructed by the Dark Energy Survey Collaboration and meets or exceeds the stringent requirements designed for the wide-field and supernova surveys for which the collaboration uses it. The camera consists of a five-element optical corrector, seven filters, a shutter with a 60 cm aperture, and a charge-coupled device (CCD) focal plane of 250 μm thick fully depleted CCDs cooled inside a vacuum Dewar. The 570 megapixel focal plane comprises 62 2k × 4k CCDs for imaging and 12 2k × 2k CCDs for guiding and focus. The CCDs have 15 μm × 15 μm pixels with a plate scale of 0.″263 pixel-1. A hexapod system provides state-of-the-art focus and alignment capability. The camera is read out in 20 s with 6-9 electron readout noise. This paper provides a technical description of the camera's engineering, construction, installation, and current status.

  20. Satellite camera image navigation

    NASA Technical Reports Server (NTRS)

    Kamel, Ahmed A. (Inventor); Graul, Donald W. (Inventor); Savides, John (Inventor); Hanson, Charles W. (Inventor)

    1987-01-01

    Pixels within a satellite camera (1, 2) image are precisely located in terms of latitude and longitude on a celestial body, such as the earth, being imaged. A computer (60) on the earth generates models (40, 50) of the satellite's orbit and attitude, respectively. The orbit model (40) is generated from measurements of stars and landmarks taken by the camera (1, 2), and by range data. The orbit model (40) is an expression of the satellite's latitude and longitude at the subsatellite point, and of the altitude of the satellite, as a function of time, using as coefficients (K) the six Keplerian elements at epoch. The attitude model (50) is based upon star measurements taken by each camera (1, 2). The attitude model (50) is a set of expressions for the deviations in a set of mutually orthogonal reference optical axes (x, y, z) as a function of time, for each camera (1, 2). Measured data is fitted to the models (40, 50) using a walking least squares fit algorithm. A transformation computer (66) transforms pixel coordinates as telemetered by the camera (1, 2) into earth latitude and longitude coordinates, using the orbit and attitude models (40, 50).

  1. Clementine longwave infrared camera

    NASA Astrophysics Data System (ADS)

    Priest, Robert E.; Lewis, Isabella T.; Sewall, Noel R.; Park, Hye-Sook; Shannon, Michael J.; Ledebuhr, Arno G.; Pleasance, Lyn D.; Massie, Mark A.; Metschuleit, Karen

    1995-06-01

    The Clementine mission provided the first ever complete, systematic surface mapping of the moon from the ultra-violet to the near-infrared regions. More than 1.7 million images of the moon, earth, and space were returned from this mission. The long-wave-infrared (LWIR) camera supplemented the UV/visible and near-infrared mapping cameras providing limited strip coverage of the moon, giving insight to the thermal properties of the soils. This camera provided approximately 100 m spatial resolution at 400 km periselene, and a 7 km across- track swath. This 2.1 kg camera using a 128 X 128 mercury-cadmium-telluride (MCT) FPA viewed thermal emission of the lunar surface and lunar horizon in the 8.0 to 9.5 micrometers wavelength region. A description of this lightweight, low power LWIR camera along with a summary of lessons learned is presented. Design goals and preliminary on-orbit performance estimates are addressed in terms of meeting the mission's primary objective for flight qualifying the sensors for future Department of Defense flights.

  2. Narrow dip inside a natural linewidth absorption profile in a system of two atoms

    NASA Astrophysics Data System (ADS)

    Makarov, A. A.

    2015-11-01

    Absorption spectrum of a system of two closely spaced identical atoms displays, at certain preparation, a dip that can be much narrower than the natural linewidth. This preparation includes (i) application of a strong magnetic field at an angle θ, very close to the magic angle θ0 = arccos(1/√3) ≈ 54.7°, with respect to the direction from one atom to another, and (ii) in-plane illumination by a laser light in the form of a nonresonant standing wave polarized at the same angle θ. Both qualitative and quantitative arguments for the narrow dip effect are presented.
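
    The quoted magic-angle value is where the dipolar angular factor 3cos²θ − 1 vanishes; a quick numerical check (illustration only):

```python
import numpy as np

# theta0 = arccos(1/sqrt(3)) is the angle at which the dipole-dipole
# angular factor 3*cos^2(theta) - 1 vanishes.
theta0_deg = np.degrees(np.arccos(1.0 / np.sqrt(3.0)))
dipolar_factor = 3.0 * np.cos(np.radians(theta0_deg)) ** 2 - 1.0
```
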

  3. Binocular Camera for cockpit visibility of general aviation aircraft

    NASA Astrophysics Data System (ADS)

    Barile, A. J.

    1981-04-01

    A history of cockpit visibility studies and requirements with regard to aircraft safety, human factors, collision avoidance, and accident investigations is presented. The Federal Aviation Administration's development of the Binocular Camera is reviewed, and the technical details of a new and improved camera are discussed. The Binocular Camera uses two 65 mm wide-angle f/6.8 lenses and covers an 88 1/2 deg field of vision. The camera produces images representative of what the human eyes see before the brain integrates them into one, thus making it possible to analyze the effect of obstructions to vision. The improvements, applications, and uses of the camera in the research, development, and operations of general aviation aircraft are discussed.

  4. Selective-imaging camera

    NASA Astrophysics Data System (ADS)

    Szu, Harold; Hsu, Charles; Landa, Joseph; Cha, Jae H.; Krapels, Keith A.

    2015-05-01

    How can we design cameras that image selectively in Full Electro-Magnetic (FEM) spectra? Without selective imaging, we cannot use, for example, ordinary tourist cameras to see through fire, smoke, or other obscurants contributing to creating a Visually Degraded Environment (VDE). This paper addresses a possible new design of selective-imaging cameras at firmware level. The design is consistent with physics of the irreversible thermodynamics of Boltzmann's molecular entropy. It enables imaging in appropriate FEM spectra for sensing through the VDE, and displaying in color spectra for Human Visual System (HVS). We sense within the spectra the largest entropy value of obscurants such as fire, smoke, etc. Then we apply a smart firmware implementation of Blind Sources Separation (BSS) to separate all entropy sources associated with specific Kelvin temperatures. Finally, we recompose the scene using specific RGB colors constrained by the HVS, by up/down shifting Planck spectra at each pixel and time.

  5. SOPHIE: portable infrared camera

    NASA Astrophysics Data System (ADS)

    Dupiech, Michael

    1999-07-01

    After 20 years of experience in land-based IR systems, THOMSON-CSF is able to offer its customers a new product: SOPHIE. It is the fruit of an assortment of skills mustered at the heart of THOMSON-CSF OPTRONIQUE and of the Integrated Cooler Assembly developed by SOFRADIR and CRYOTECHNOLOGIES. SOPHIE, the world's first handheld IR camera/binocular, weighs only 2 kg and offers a reconnaissance range performance well beyond 2 km, comparable to the cameras of the first generation. It can run from its own self-contained power supply or from the mains, giving it further flexibility in use. It is a genuine night-or-day instrument, operating in the 8-12 micrometer waveband. Its leading-edge technologies, together with its light weight, make it a dual-purpose product, functioning either as a camera that can be linked to a monitor, or as a conventional pair of binoculars.

  6. Solid state television camera

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The design, fabrication, and tests of a solid state television camera using a new charge-coupled imaging device are reported. An RCA charge-coupled device arranged in a 512 by 320 format and directly compatible with EIA format standards was the sensor selected. This is a three-phase, sealed surface-channel array that has 163,840 sensor elements, which employs a vertical frame transfer system for image readout. Included are test results of the complete camera system, circuit description and changes to such circuits as a result of integration and test, maintenance and operation section, recommendations to improve the camera system, and a complete set of electrical and mechanical drawing sketches.

  7. Study on the diagnostic system of scoliosis by using infrared camera.

    PubMed

    Jeong, Jin-hyoung; Park, Eun-jeong; Cho, Chang-ok; Kim, Yoon-jeong; Lee, Sang-sik

    2015-01-01

    In this study, to eliminate the radiation exposure involved in the conventional diagnosis of scoliosis, a diagnostic system using an infrared camera and optical markers was developed. In the developed system, the infrared camera recognizes optical markers attached along the spinal curvature, and the angle between two optical markers is measured. For the angle measurement, we used the Cobb's Angle method employed in the diagnosis of spinal scoliosis. We developed software to diagnose spinal scoliosis using the infrared camera and output the result to the screen. The software is composed of a camera output unit (implemented in LabVIEW), an angle measurement unit, and a Cobb's Angle measurement unit. In the future, the system is expected to be applied to the diagnosis of other orthopedic disorders, such as kyphosis and hallux valgus. PMID:26405878
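
    The Cobb angle obtained from the optical markers is simply the angle between the two marker-line directions. A minimal sketch, with hypothetical marker slopes (not clinical data):

```python
import numpy as np

def cobb_angle(v1, v2):
    """Cobb angle in degrees between two endplate direction vectors,
    e.g. lines through marker pairs at the upper and lower end
    vertebrae of the curve."""
    v1 = np.asarray(v1, dtype=float)
    v2 = np.asarray(v2, dtype=float)
    cosang = abs(np.dot(v1, v2)) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.degrees(np.arccos(np.clip(cosang, 0.0, 1.0)))

# Hypothetical marker-line directions in the image plane:
upper = (1.0, 0.18)   # upper end-vertebra markers, tilted ~ +10 deg
lower = (1.0, -0.36)  # lower end-vertebra markers, tilted ~ -20 deg
angle = cobb_angle(upper, lower)
```
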

  8. Autonomous pedestrian localization technique using CMOS camera sensors

    NASA Astrophysics Data System (ADS)

    Chun, Chanwoo

    2014-09-01

    We present a pedestrian localization technique that does not need infrastructure. The proposed angle-only measurement method needs specially manufactured shoes. Each shoe has two CMOS cameras and two markers, such as LEDs, attached on the inward side. The line of sight (LOS) angles towards the two markers on the forward shoe are measured using the two cameras on the rear shoe. Our simulation results show that a pedestrian walking in a shopping mall wearing this device can be accurately guided to the front of a destination store located 100 m away, if the floor plan of the mall is available.
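
    In the ground plane, the angle-only measurement reduces to intersecting two bearing rays from the rear-shoe cameras. The baseline and marker position below are illustrative assumptions, not the device's actual geometry.

```python
import numpy as np

def triangulate_2d(p1, a1, p2, a2):
    """Intersect two bearing rays in the plane: camera positions p1, p2
    and line-of-sight bearings a1, a2 (radians). Solves
    p1 + s*d1 = p2 + u*d2 for the ray parameters."""
    d1 = np.array([np.cos(a1), np.sin(a1)])
    d2 = np.array([np.cos(a2), np.sin(a2)])
    A = np.column_stack([d1, -d2])
    s, _ = np.linalg.solve(A, np.asarray(p2, float) - np.asarray(p1, float))
    return np.asarray(p1, float) + s * d1

# Hypothetical geometry: two cameras 8 cm apart on the rear shoe,
# LED marker on the forward shoe ~0.7 m ahead.
cam1 = np.array([0.00, 0.0])
cam2 = np.array([0.08, 0.0])
marker_true = np.array([0.03, 0.70])
a1 = np.arctan2(marker_true[1] - cam1[1], marker_true[0] - cam1[0])
a2 = np.arctan2(marker_true[1] - cam2[1], marker_true[0] - cam2[0])
est = triangulate_2d(cam1, a1, cam2, a2)
```

With exact bearings the intersection recovers the marker position; in practice the angular measurement noise sets the localization accuracy.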

  9. 3-D Flow Visualization with a Light-field Camera

    NASA Astrophysics Data System (ADS)

    Thurow, B.

    2012-12-01

    Light-field cameras have received attention recently due to their ability to acquire photographs that can be computationally refocused after they have been acquired. In this work, we describe the development of a light-field camera system for 3D visualization of turbulent flows. The camera developed in our lab, also known as a plenoptic camera, uses an array of microlenses mounted next to an image sensor to resolve both the position and angle of light rays incident upon the camera. For flow visualization, the flow field is seeded with small particles that follow the fluid's motion and are imaged using the camera and a pulsed light source. The tomographic MART algorithm is then applied to the light-field data in order to reconstruct a 3D volume of the instantaneous particle field. 3D, 3C velocity vectors are then determined from a pair of 3D particle fields using conventional cross-correlation algorithms. As an illustration of the concept, 3D/3C velocity measurements of a turbulent boundary layer produced on the wall of a conventional wind tunnel are presented. Future experiments are planned to use the camera to study the influence of wall permeability on the 3-D structure of the turbulent boundary layer. [Figure captions: schematic illustrating the concept of a plenoptic camera, where each pixel represents both the position and angle of light rays entering the camera; instantaneous 3D velocity field of a turbulent boundary layer determined using light-field data captured by a plenoptic camera.]
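
    The final cross-correlation step can be illustrated in one dimension: the displacement between two exposures is the lag that maximizes their correlation. The actual processing correlates 3D interrogation volumes; this 1D sketch with synthetic data is illustrative only.

```python
import numpy as np

def displacement_1d(f1, f2):
    """Integer shift between two intensity signals, found as the lag
    of the cross-correlation peak. Mean removal makes the peak robust
    for random particle-like signals."""
    f1 = np.asarray(f1, float); f1 = f1 - f1.mean()
    f2 = np.asarray(f2, float); f2 = f2 - f2.mean()
    corr = np.correlate(f2, f1, mode="full")
    return int(np.argmax(corr)) - (len(f1) - 1)

rng = np.random.default_rng(1)
field = rng.random(128)        # synthetic "first exposure" intensities
shifted = np.roll(field, 5)    # second exposure, displaced by 5 cells
dx = displacement_1d(field, shifted)
```

Dividing the recovered shift by the interexposure time then yields one velocity component.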

  10. Artificial human vision camera

    NASA Astrophysics Data System (ADS)

    Goudou, J.-F.; Maggio, S.; Fagno, M.

    2014-10-01

    In this paper we present a real-time vision system modeling the human vision system. Our purpose is to draw inspiration from human vision bio-mechanics to improve robotic capabilities for tasks such as object detection and tracking. This work first describes the bio-mechanical discrepancies between human vision and classic cameras, and the retinal processing stage that takes place in the eye, before the optic nerve. The second part describes our implementation of these principles in a 3-camera optical, mechanical and software model of the human eyes and an associated bio-inspired attention model.

  11. Laser Range Camera Modeling

    SciTech Connect

    Storjohann, K.

    1990-01-01

    This paper describes an imaging model that was derived for use with a laser range camera (LRC) developed by the Advanced Intelligent Machines Division of Odetics. However, this model could be applied to any comparable imaging system. Both the derivation of the model and the determination of the LRC's intrinsic parameters are explained. For the purpose of evaluating the LRC's extrinsic parameters, i.e., its external orientation, a transformation of the LRC's imaging model into a standard camera's (SC) pinhole model is derived. By virtue of this transformation, the evaluation of the LRC's external orientation can be found by applying any SC calibration technique.

  12. Camera Calibration Based on Perspective Geometry and Its Application in LDWS

    NASA Astrophysics Data System (ADS)

    Xu, Huarong; Wang, Xiaodong

    In this paper, we present a novel algorithm to calibrate cameras for lane departure warning systems (LDWS). The algorithm only needs a set of parallel lane markings and parallel lines perpendicular to the ground plane to determine camera parameters such as the roll angle, the tilt angle, the pan angle, and the focal length. Then, given the camera height, the positions of objects in world space can easily be obtained from the image. We apply the proposed method to our lane departure warning system, which monitors the distance between the car and the road boundaries. Experiments show that the proposed method is easy to operate and achieves accurate results.
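
    Under the common assumptions of zero roll and a principal point at the image centre, tilt and pan follow in closed form from the vanishing point of the lane markings. This sketch uses those assumptions with hypothetical values; it is not necessarily the paper's exact formulation.

```python
import numpy as np

def tilt_pan_from_vp(u, v, f):
    """Camera tilt and pan (radians) from the vanishing point (u, v)
    of road-parallel lines, measured relative to the principal point,
    assuming zero roll. A road-parallel direction projects to
    u = f*tan(pan)/cos(tilt) and v = -f*tan(tilt)."""
    tilt = np.arctan2(-v, f)
    pan = np.arctan2(u * np.cos(tilt), f)
    return tilt, pan

# Round-trip check with hypothetical values: f = 800 px,
# tilt = -5 deg (camera pitched down), pan = 3 deg.
f = 800.0
tilt_true, pan_true = np.radians(-5.0), np.radians(3.0)
u = f * np.tan(pan_true) / np.cos(tilt_true)
v = -f * np.tan(tilt_true)
tilt, pan = tilt_pan_from_vp(u, v, f)
```
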

  13. Anger Camera Firmware

    Energy Science and Technology Software Center (ESTSC)

    2010-11-19

    The firmware is responsible for the operation of Anger Camera Electronics, calculation of position, time of flight and digital communications. It provides a first stage analysis of 48 signals from 48 analog signals that have been converted to digital values using A/D convertors.

  14. Photogrammetric camera calibration

    USGS Publications Warehouse

    Tayman, W.P.; Ziemann, H.

    1984-01-01

    Section 2 (Calibration) of the document "Recommended Procedures for Calibrating Photogrammetric Cameras and Related Optical Tests" from the International Archives of Photogrammetry, Vol. XIII, Part 4, is reviewed in the light of recent practical work, and suggestions for changes are made. These suggestions are intended as a basis for further discussion. © 1984.

  15. Behind the Camera.

    ERIC Educational Resources Information Center

    Kuhns, William; Giardino, Thomas F.

    Intended for the beginning filmmaker, this book presents basic information on major aspects of shooting a film. It covers characteristics of various cameras, films, lenses, and lighting equipment and tells how to use them. The importance of a shooting script is stressed. The mechanics of sound systems, editing, and titles, animations, and special

  16. Jack & the Video Camera

    ERIC Educational Resources Information Center

    Charlan, Nathan

    2010-01-01

    This article narrates how the use of video camera has transformed the life of Jack Williams, a 10-year-old boy from Colorado Springs, Colorado, who has autism. The way autism affected Jack was unique. For the first nine years of his life, Jack remained in his world, alone. Functionally non-verbal and with motor skill problems that affected his

  17. Communities, Cameras, and Conservation

    ERIC Educational Resources Information Center

    Patterson, Barbara

    2012-01-01

    Communities, Cameras, and Conservation (CCC) is the most exciting and valuable program the author has seen in her 30 years of teaching field science courses. In this citizen science project, students and community volunteers collect data on mountain lions ("Puma concolor") at four natural areas and public parks along the Front Range of Colorado.…

  19. Make a Pinhole Camera

    ERIC Educational Resources Information Center

    Fisher, Diane K.; Novati, Alexander

    2009-01-01

    On Earth, using ordinary visible light, one can create a single image of light recorded over time. Of course a movie or video is light recorded over time, but it is a series of instantaneous snapshots, rather than light and time both recorded on the same medium. A pinhole camera, which is simple to make out of ordinary materials and using ordinary…

  20. Ultraminiature television camera

    NASA Technical Reports Server (NTRS)

    Deterville, R. J.; Drago, N.

    1967-01-01

    Ultraminiature television camera with a total volume of 20.25 cubic inches, requires 28 vdc power, operates on UHF and accommodates standard 8-mm optics. It uses microelectronic assembly packaging techniques and contains a magnetically deflected and electrostatically focused vidicon, automatic gain control circuit, power supply, and transmitter.

  1. Bluetooth digital still camera

    NASA Astrophysics Data System (ADS)

    Santoro, Gaetano; Cucchi, Andrea; Guarnera, Mirko; Binachessi, Marco

    2003-05-01

    The Bluetooth specifications currently include many application profiles that define the requirements for some usage cases. In this paper the attention is focused on the Basic Imaging Profile (BIP). BIP could add important features to classic digital still cameras, such as the ability to send images to a Bluetooth printer or over the Internet via a Bluetooth cellular phone.

  2. Make a Pinhole Camera

    ERIC Educational Resources Information Center

    Fisher, Diane K.; Novati, Alexander

    2009-01-01

    On Earth, using ordinary visible light, one can create a single image of light recorded over time. Of course a movie or video is light recorded over time, but it is a series of instantaneous snapshots, rather than light and time both recorded on the same medium. A pinhole camera, which is simple to make out of ordinary materials and using ordinary

  3. Jack & the Video Camera

    ERIC Educational Resources Information Center

    Charlan, Nathan

    2010-01-01

    This article narrates how the use of video camera has transformed the life of Jack Williams, a 10-year-old boy from Colorado Springs, Colorado, who has autism. The way autism affected Jack was unique. For the first nine years of his life, Jack remained in his world, alone. Functionally non-verbal and with motor skill problems that affected his…

  4. The LSST Camera Overview

    SciTech Connect

    Gilmore, Kirk; Kahn, Steven A.; Nordby, Martin; Burke, David; O'Connor, Paul; Oliver, John; Radeka, Veljko; Schalk, Terry; Schindler, Rafe; /SLAC

    2007-01-10

    The LSST camera is a wide-field optical (0.35-1 μm) imager designed to provide a 3.5 degree FOV with better than 0.2 arcsecond sampling. The detector format will be a circular mosaic providing approximately 3.2 Gigapixels per image. The camera includes a filter mechanism and shuttering capability. It is positioned in the middle of the telescope, where cross-sectional area is constrained by optical vignetting and heat dissipation must be controlled to limit thermal gradients in the optical beam. The fast f/1.2 beam will require tight tolerances on the focal plane mechanical assembly. The focal plane array operates at a temperature of approximately -100 C to achieve the desired detector performance. The focal plane array is contained within an evacuated cryostat, which incorporates detector front-end electronics and thermal control. The cryostat lens serves as an entrance window and vacuum seal for the cryostat. Similarly, the camera body lens serves as an entrance window and gas seal for the camera housing, which is filled with a suitable gas to provide the operating environment for the shutter and filter change mechanisms. The filter carousel can accommodate 5 filters, each 75 cm in diameter, for rapid exchange without external intervention.

  5. The Martian Atmosphere as seen by the OSIRIS camera

    NASA Astrophysics Data System (ADS)

    Moissl, R.; Pajola, M.; Määttänen, A.; Küppers, M.

    2013-09-01

    Despite the long time that has passed since the observations, only few studies based on the data from the wide- (WAC) and narrow- (NAC) angle camera systems of OSIRIS have been published to date. In this paper we will present the results of the observations of the Martian limbs acquired by the OSIRIS [1] instrument on board the ESA mission Rosetta during its swing-by maneuver around February 25th, 2007 on the way to Comet 67P/Churyumov-Gerasimenko, during the onset of the very active dust storm season of Mars year 28 (at Ls ~190). Although OSIRIS captured the planet only during a relatively short time interval of several hours, the obtained global view and the spectral coverage, from the UV (245 nm) over the full visible range to the near IR (1000 nm), allow for a valuable global overview of the state of the Martian atmosphere. The image acquisition started on February 24 around 18:00 UTC from a distance of about 260,000 km and continued until 04:51 UTC on February 25, to a distance of 105,000 km; the closest approach to the planet occurred at 01:54 UTC on February 25 at a distance of 250 km. All images have been manually co-registered with the help of SPICE data, and vertical profiles have been extracted over the limb in intervals of ~0.5 degrees (see Figures 1 and 2). In this work we will focus on our findings about the vertical structure of the atmosphere over the Martian limbs, report on the observed altitudes and optical densities of dust and (partially detached) clouds, and put the findings in context with data from other satellites in orbit around Mars at the same time (e.g. Mars Express). Based on previous datasets (MGS/TES, MOd/THEMIS, MRO/MCS; see, e.g., [2], [3] and [4]) we can expect to observe the waning of the South polar hood and the development of the Northern one. Some remains of the aphelion cloud belt might still be visible near the equator. Detached layers have been recently observed at this season by MEx/SPICAM [5] and MRO/MCS [6].

  6. Image Sensors Enhance Camera Technologies

    NASA Technical Reports Server (NTRS)

    2010-01-01

    In the 1990s, a Jet Propulsion Laboratory team led by Eric Fossum researched ways of improving complementary metal-oxide semiconductor (CMOS) image sensors in order to miniaturize cameras on spacecraft while maintaining scientific image quality. Fossum's team founded a company to commercialize the resulting CMOS active pixel sensor. Now called the Aptina Imaging Corporation, based in San Jose, California, the company has shipped over 1 billion sensors for use in applications such as digital cameras, camera phones, Web cameras, and automotive cameras. Today, one of every three cell phone cameras on the planet features Aptina's sensor technology.

  7. Direct imaging of exoplanetary systems with a monolithic multispectral camera

    NASA Astrophysics Data System (ADS)

    Hicks, Brian; Oram, Kathleen; Lewis, Nikole; Mendillo, Christopher; Bierden, Paul; Cook, Timothy; Chakrabarti, Supriya

    2013-09-01

    We present a monolithic multispectral camera (MMC) for high contrast direct imaging of inner exoplanetary environments. The primary scientific goal of the camera is to enable eight color characterization of jovian exoplanets and interplanetary dust and debris distributions around nearby stars. Technological highlights of the design include: 1. Diffraction limited resolution at 350 nm through active optical aberration correction; 2. Greater than million-to-one contrast at narrow star separation using interferometry and post-processing techniques; 3. Demonstration of deep broadband interferometric nulling and interband image stability through the use of monolithic optical assemblies; 4. Optimization of multispectral throughput while minimizing components.

  8. 15. ELEVATED CAMERA STAND, SHOWING LINE OF CAMERA STANDS PARALLEL ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    15. ELEVATED CAMERA STAND, SHOWING LINE OF CAMERA STANDS PARALLEL TO SLED TRACK. Looking west southwest down Camera Road. - Edwards Air Force Base, South Base Sled Track, Edwards Air Force Base, North of Avenue B, between 100th & 140th Streets East, Lancaster, Los Angeles County, CA

  9. Dual cameras acquisition and display system of retina-like sensor camera and rectangular sensor camera

    NASA Astrophysics Data System (ADS)

    Cao, Nan; Cao, Fengmei; Lin, Yabin; Bai, Tingzhu; Song, Shengyu

    2015-04-01

    For a new kind of retina-like sensor camera and a traditional rectangular sensor camera, a dual-camera acquisition and display system needs to be built. We introduce the principle and the development of the retina-like sensor. Image coordinate transformation and sub-pixel interpolation are required for the retina-like sensor's special pixel distribution. The hardware platform is composed of the retina-like sensor camera, the rectangular sensor camera, an image grabber, and a PC. Combining the MIL and OpenCV libraries, the software is written in VC++ on Visual Studio 2010. Experimental results show that the system realizes acquisition and display for both cameras.

  10. Wide-angle imaging system with fiberoptic components providing angle-dependent virtual material stops

    NASA Technical Reports Server (NTRS)

    Vaughan, Arthur H. (Inventor)

    1993-01-01

    A strip imaging wide angle optical system is provided. The optical system is provided with a 'virtual' material stop to avoid aberrational effects inherent in wide angle optical systems. The optical system includes a spherical mirror section for receiving light from a 180 deg strip or arc of a target image. Light received by the spherical mirror section is reflected to a frustoconical mirror section for subsequent rereflection to a row of optical fibers. Each optical fiber transmits a portion of the received light to a detector. The optical system exploits the narrow cone of acceptance associated with optical fibers to substantially eliminate vignetting effects inherent in wide angle systems. Further, the optical system exploits the narrow cone of acceptance of the optical fibers to substantially limit spherical aberration. The optical system is ideally suited for any application wherein a 180 deg strip image need be detected, and is particularly well adapted for use in hostile environments such as in planetary exploration.

  11. LSST Camera Optics

    SciTech Connect

    Olivier, S S; Seppala, L; Gilmore, K; Hale, L; Whistler, W

    2006-06-05

    The Large Synoptic Survey Telescope (LSST) is a unique, three-mirror, modified Paul-Baker design with an 8.4m primary, a 3.4m secondary, and a 5.0m tertiary feeding a camera system that includes corrector optics to produce a 3.5 degree field of view with excellent image quality (<0.3 arcsecond 80% encircled diffracted energy) over the entire field from blue to near infra-red wavelengths. We describe the design of the LSST camera optics, consisting of three refractive lenses with diameters of 1.6m, 1.0m and 0.7m, along with a set of interchangeable, broad-band, interference filters with diameters of 0.75m. We also describe current plans for fabricating, coating, mounting and testing these lenses and filters.

  12. NSTX Tangential Divertor Camera

    SciTech Connect

    A.L. Roquemore; Ted Biewer; D. Johnson; S.J. Zweben; Nobuhiro Nishino; V.A. Soukhanovskii

    2004-07-16

    Strong magnetic field shear around the divertor x-point is numerically predicted to lead to strong spatial asymmetries in turbulence-driven particle fluxes. To visualize the turbulence and associated impurity line emission near the lower x-point region, a new tangential observation port has recently been installed on NSTX. A reentrant sapphire window with a moveable in-vessel mirror images the divertor region from the center stack out to R ~ 80 cm and views the x-point for most plasma configurations. A coherent fiber optic bundle transmits the image through a remotely selected filter to a fast camera, for example a 40,500 frames/sec Photron CCD camera. A gas puffer located in the lower inboard divertor will localize the turbulence in the region near the x-point. The edge fluid and turbulence codes UEDGE and BOUT will be used to interpret impurity and deuterium emission fluctuation measurements in the divertor.

  13. Combustion pinhole camera system

    DOEpatents

    Witte, A.B.

    1984-02-21

    A pinhole camera system is described utilizing a sealed optical-purge assembly which provides optical access into a coal combustor or other energy conversion reactors. The camera system basically consists of a focused-purge pinhole optical port assembly, a conventional TV vidicon receiver, an external, variable density light filter which is coupled electronically to the vidicon automatic gain control (agc). The key component of this system is the focused-purge pinhole optical port assembly which utilizes a purging inert gas to keep debris from entering the port and a lens arrangement which transfers the pinhole to the outside of the port assembly. One additional feature of the port assembly is that it is not flush with the interior of the combustor. 2 figs.

  14. Combustion pinhole camera system

    DOEpatents

    Witte, Arvel B.

    1984-02-21

    A pinhole camera system utilizing a sealed optical-purge assembly which provides optical access into a coal combustor or other energy conversion reactors. The camera system basically consists of a focused-purge pinhole optical port assembly, a conventional TV vidicon receiver, an external, variable density light filter which is coupled electronically to the vidicon automatic gain control (agc). The key component of this system is the focused-purge pinhole optical port assembly which utilizes a purging inert gas to keep debris from entering the port and a lens arrangement which transfers the pinhole to the outside of the port assembly. One additional feature of the port assembly is that it is not flush with the interior of the combustor.

  15. Streak camera receiver definition study

    NASA Technical Reports Server (NTRS)

    Johnson, C. B.; Hunkler, L. T., Sr.; Letzring, S. A.; Jaanimagi, P.

    1990-01-01

    Detailed streak camera definition studies were made as a first step toward full flight qualification of a dual channel picosecond resolution streak camera receiver for the Geoscience Laser Altimeter and Ranging System (GLRS). The streak camera receiver requirements are discussed as they pertain specifically to the GLRS system, and estimates of the characteristics of the streak camera are given, based upon existing and near-term technological capabilities. Important problem areas are highlighted, and possible corresponding solutions are discussed.

  16. Automated Camera Array Fine Calibration

    NASA Technical Reports Server (NTRS)

    Clouse, Daniel; Padgett, Curtis; Ansar, Adnan; Cheng, Yang

    2008-01-01

    Using aerial imagery, the JPL FineCalibration (JPL FineCal) software automatically tunes a set of existing CAHVOR camera models for an array of cameras. The software finds matching features in the overlap region between images from adjacent cameras, and uses these features to refine the camera models. It is not necessary to take special imagery of a known target and no surveying is required. JPL FineCal was developed for use with an aerial, persistent surveillance platform.

  17. Orbiter Camera Payload System

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Components for an orbiting camera payload system (OCPS) include the large format camera (LFC), a gas supply assembly, and ground test, handling, and calibration hardware. The LFC, a high resolution large format photogrammetric camera for use in the cargo bay of the space transport system, is also adaptable to use on an RB-57 aircraft or on a free flyer satellite. Carrying 4000 feet of film, the LFC is usable over the visible to near IR, at V/h rates of from 11 to 41 milliradians per second, overlap of 10, 60, 70 or 80 percent and exposure times of from 4 to 32 milliseconds. With a 12 inch focal length it produces a 9 by 18 inch format (long dimension in line of flight) with full format low contrast resolution of 88 lines per millimeter (AWAR), full format distortion of less than 14 microns and a complement of 45 Reseau marks and 12 fiducial marks. Weight of the OCPS as supplied, fully loaded is 944 pounds and power dissipation is 273 watts average when in operation, 95 watts in standby. The LFC contains an internal exposure sensor, or will respond to external command. It is able to photograph starfields for inflight calibration upon command.
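    The V/h rates, focal length, and exposure times quoted above fix the uncompensated image motion in the film plane (smear = (V/h) × f × t_exp). A minimal sketch of that arithmetic using the worst-case numbers from the abstract; whether and how the LFC compensates this motion is not stated here.

```python
def image_smear_um(v_over_h_mrad_s, focal_length_mm, exposure_ms):
    """Uncompensated focal-plane image motion in micrometers.

    smear = (V/h) * f * t_exp, with V/h converted to rad/s and the
    exposure to seconds; the result is converted from mm to um.
    """
    return (v_over_h_mrad_s * 1e-3) * focal_length_mm * (exposure_ms * 1e-3) * 1e3

# Worst case from the abstract: 41 mrad/s, 12 in (304.8 mm) focal length, 32 ms
print(image_smear_um(41, 304.8, 32))  # ~400 um, far larger than the 14 um distortion spec
```

The size of this worst-case figure illustrates why V/h rate and exposure time must be chosen together for film cameras of this class.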

  18. Multi-Camera Saliency.

    PubMed

    Luo, Yan; Jiang, Ming; Wong, Yongkang; Zhao, Qi

    2015-10-01

    A significant body of literature on saliency modeling predicts where humans look in a single image or video. Beyond the scientific goal of understanding how information is fused from multiple visual sources to identify regions of interest in a holistic manner, there are tremendous engineering applications of multi-camera saliency due to the widespread deployment of cameras. This paper proposes a principled framework to smoothly integrate visual information from multiple views into a global scene map, and to employ a saliency algorithm incorporating high-level features to identify the most important regions by fusing visual information. The proposed method has the following key distinguishing features compared with its counterparts: (1) the proposed saliency detection is global (salient regions from one local view may not be important in a global context), (2) it does not require special camera deployment or overlapping fields of view, and (3) the key saliency algorithm is effective in highlighting interesting object regions even though no dedicated object detector is used. Experiments on several data sets confirm the effectiveness of the proposed principled framework. PMID:26340257

  19. Orbiter Camera Payload System

    NASA Astrophysics Data System (ADS)

    1980-12-01

    Components for an orbiting camera payload system (OCPS) include the large format camera (LFC), a gas supply assembly, and ground test, handling, and calibration hardware. The LFC, a high resolution large format photogrammetric camera for use in the cargo bay of the space transport system, is also adaptable to use on an RB-57 aircraft or on a free flyer satellite. Carrying 4000 feet of film, the LFC is usable over the visible to near IR, at V/h rates of from 11 to 41 milliradians per second, overlap of 10, 60, 70 or 80 percent and exposure times of from 4 to 32 milliseconds. With a 12 inch focal length it produces a 9 by 18 inch format (long dimension in line of flight) with full format low contrast resolution of 88 lines per millimeter (AWAR), full format distortion of less than 14 microns and a complement of 45 Reseau marks and 12 fiducial marks. Weight of the OCPS as supplied, fully loaded is 944 pounds and power dissipation is 273 watts average when in operation, 95 watts in standby. The LFC contains an internal exposure sensor, or will respond to external command. It is able to photograph starfields for inflight calibration upon command.

  20. Gamma ray camera

    DOEpatents

    Perez-Mendez, Victor (Berkeley, CA)

    1997-01-01

    A gamma ray camera for detecting rays emanating from a radiation source such as an isotope. The gamma ray camera includes a sensor array formed of a visible light crystal for converting incident gamma rays to a plurality of corresponding visible light photons, and a photosensor array responsive to the visible light photons in order to form an electronic image of the radiation therefrom. The photosensor array is adapted to record an integrated amount of charge proportional to the incident gamma rays closest to it, and includes a transparent metallic layer and a photodiode consisting of a p-i-n structure formed on one side of the transparent metallic layer, comprising an upper p-type layer, an intermediate layer, and a lower n-type layer. In the preferred mode, the scintillator crystal is composed essentially of a cesium iodide (CsI) crystal, preferably doped with a predetermined amount of impurity, and the p-type upper and intermediate layers and said n-type layer are essentially composed of hydrogenated amorphous silicon (a-Si:H). The gamma ray camera further includes a collimator interposed between the radiation source and the sensor array, and a readout circuit formed on one side of the photosensor array.

  1. Gamma ray camera

    DOEpatents

    Perez-Mendez, V.

    1997-01-21

    A gamma ray camera is disclosed for detecting rays emanating from a radiation source such as an isotope. The gamma ray camera includes a sensor array formed of a visible light crystal for converting incident gamma rays to a plurality of corresponding visible light photons, and a photosensor array responsive to the visible light photons in order to form an electronic image of the radiation therefrom. The photosensor array is adapted to record an integrated amount of charge proportional to the incident gamma rays closest to it, and includes a transparent metallic layer and a photodiode consisting of a p-i-n structure formed on one side of the transparent metallic layer, comprising an upper p-type layer, an intermediate layer, and a lower n-type layer. In the preferred mode, the scintillator crystal is composed essentially of a cesium iodide (CsI) crystal, preferably doped with a predetermined amount of impurity, and the p-type upper and intermediate layers and said n-type layer are essentially composed of hydrogenated amorphous silicon (a-Si:H). The gamma ray camera further includes a collimator interposed between the radiation source and the sensor array, and a readout circuit formed on one side of the photosensor array. 6 figs.

  2. Hemispherical Laue camera

    DOEpatents

    Li, James C. M.; Chu, Sungnee G.

    1980-01-01

    A hemispherical Laue camera comprises a crystal sample mount for positioning a sample to be analyzed at the center of sphere of a hemispherical, X-radiation sensitive film cassette, a collimator, a stationary or rotating sample mount and a set of standard spherical projection spheres. X-radiation generated from an external source is directed through the collimator to impinge onto the single crystal sample on the stationary mount. The diffracted beam is recorded on the hemispherical X-radiation sensitive film mounted inside the hemispherical film cassette in either transmission or back-reflection geometry. The distances travelled by X-radiation diffracted from the crystal to the hemispherical film are the same for all crystal planes which satisfy Bragg's Law. The recorded diffraction spots or Laue spots on the film thereby preserve both the symmetry information of the crystal structure and the relative intensities which are directly related to the relative structure factors of the crystal orientations. The diffraction pattern on the exposed film is compared with the known diffraction pattern on one of the standard spherical projection spheres for a specific crystal structure to determine the orientation of the crystal sample. By replacing the stationary sample support with a rotating sample mount, the hemispherical Laue camera can be used for crystal structure determination in a manner previously provided in conventional Debye-Scherrer cameras.

  3. Conceptual design of airborne daytime infrared star cameras

    NASA Astrophysics Data System (ADS)

    Zhou, Yuying; Shen, Weimin; Wu, Feng; Chen, Yuheng

    2010-11-01

    Star cameras are sensitive attitude sensors used for navigation of space vehicles. In order to use them on aircraft in daytime, the conceptual design and the principle of airborne daytime infrared star cameras are introduced in this paper, as there are enough stars in the near-infrared band to serve as references for a star camera calculating attitude. By analyzing the atmospheric scattering background light intensity for different altitudes, observing angles, and solar angles with the Modtran software, and considering IR FPA (infrared focal plane array) performance, shot noise, and the star magnitude required for daytime star trackers and sensors, the optical system parameters, i.e. FOV (field of view), clear aperture diameter, and effective focal length, are determined according to the required SNR (signal-to-noise ratio).
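    The SNR-driven sizing described above rests on the standard CCD noise equation. A minimal sketch follows; all inputs are hypothetical electron counts, not values from the paper.

```python
import math

def star_snr(star_e, background_e, read_noise_e, dark_e=0.0):
    """Point-source SNR for a single exposure (simplified CCD equation).

    star_e       -- integrated star signal over the PSF, in electrons
    background_e -- sky/scatter background summed over the same pixels
    read_noise_e -- effective read noise over those pixels, in electrons
    dark_e       -- accumulated dark-current electrons (optional)
    """
    noise = math.sqrt(star_e + background_e + dark_e + read_noise_e ** 2)
    return star_e / noise

# Daytime operation raises background_e sharply, which is why the paper
# sizes aperture and FOV against the Modtran-derived scattering background.
print(star_snr(10_000, 0, 0))        # photon-limited case: sqrt(star_e)
print(star_snr(10_000, 90_000, 20))  # same star against a bright sky
```

The same function, evaluated over candidate apertures and focal lengths, is the kind of trade loop the abstract alludes to.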

  4. Narrow-Line Seyfert Galaxies

    NASA Astrophysics Data System (ADS)

    Crenshaw, D. Michael

    We propose to obtain simultaneous SWP and optical spectra of three unusual Seyfert galaxies that have most of the properties of normal Seyfert I galaxies (high-ionization lines, strong nonstellar continua), but have permitted lines that are much narrower (< 1000 km sec^-1 FWHM). We have obtained test SWP exposures of these objects, and find that total exposure times of 12 - 14 hours should be sufficient to detect many of the weak lines that are blended together in Seyferts with broad (1000 - 6000 km sec^-1 FWHM) lines. We have chosen these three high-ionization narrow-line (HINL) Seyferts because their [OIII] 5007/H-beta ratios are small, which indicates that the emission lines from the low-density region do not severely contaminate those from the high-density region that we wish to study. We wish to accomplish the following scientific objectives: 1. We will study the relation between the HINL Seyferts and Seyfert 1 and 2 galaxies by determining the L-alpha/H-beta ratios to see if there is a partially ionized zone in the high-density clouds of HINL Seyferts. We will also compare the strengths of the UV continua in these objects with those in Seyfert I and 2 galaxies. 2. We will investigate the possibility that the emission feature at 1909 A is not entirely due to C III], and may receive a contribution from Fe III. 3. We will determine the reddening of the emission lines from the He II 1640/He II 4686 ratio. 4. We will identify as many weak features as possible and measure their relative intensities. In particular, we will attempt to determine the C:N:O abundances from various emission lines.

  5. Angled Layers in Super Resolution

    NASA Technical Reports Server (NTRS)

    2004-01-01

    Researchers used a special imaging technique with the panoramic camera on NASA's Mars Exploration Rover Opportunity to get as detailed a look as possible at a target region near the eastern foot of 'Burns Cliff.' The intervening terrain was too difficult for driving the rover closer. The target is the boundary between two sections of layered rock. The layers in the lower section (left) run at a marked angle to the layers in the next higher section (right).

    This view is the product of a technique called super resolution. It was generated from data acquired on sol 288 of Opportunity's mission (Nov. 14, 2004) from a position along the southeast wall of 'Endurance Crater.' Resolution slightly higher than normal for the panoramic camera was synthesized for this view by combining 17 separate images of this scene, each one 'dithered' or pointed slightly differently from the previous one. Computer manipulation of the individual images was then used to generate a new synthetic view of the scene in a process known mathematically as iterative deconvolution, but referred to informally as super resolution. Similar methods have been used to enhance the resolution of images from the Mars Pathfinder mission and the Hubble Space Telescope.
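    Shift-and-add is the simplest member of the family of dithered-frame techniques described above. The toy sketch below assumes the subpixel dithers are known exactly; it is not the iterative deconvolution actually used for the Pancam product.

```python
import numpy as np

def shift_and_add(frames, offsets, factor=2):
    """Naive shift-and-add super resolution.

    frames  -- list of (H, W) low-resolution images of the same scene
    offsets -- per-frame (dy, dx) subpixel dithers, in low-res pixels
    factor  -- upsampling factor of the output grid
    """
    h, w = frames[0].shape
    acc = np.zeros((h * factor, w * factor))
    cnt = np.zeros_like(acc)
    yy, xx = np.mgrid[0:h, 0:w]
    for img, (dy, dx) in zip(frames, offsets):
        # place each low-res sample at its dithered position on the fine grid
        r = np.clip(np.round((yy + dy) * factor).astype(int), 0, h * factor - 1)
        c = np.clip(np.round((xx + dx) * factor).astype(int), 0, w * factor - 1)
        np.add.at(acc, (r, c), img)
        np.add.at(cnt, (r, c), 1)
    cnt[cnt == 0] = 1  # avoid dividing empty fine-grid cells
    return acc / cnt
```

Iterative deconvolution goes further by also modeling the camera's point-spread function, but the core idea, that dithered frames sample the scene on a finer grid, is the same.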

  6. ETR BUILDING, TRA642, INTERIOR. BASEMENT. CAMERA IS IN SAME POSITION ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ETR BUILDING, TRA-642, INTERIOR. BASEMENT. CAMERA IS IN SAME POSITION AS ID-33-G-98 BUT ANGLED TO SHOW FAR END OF CORRIDOR AND OTHER EXPERIMENTAL GEAR. CAMERA FACES WEST. INL NEGATIVE NO. HD46-30-3. Mike Crane, Photographer, 2/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  7. Camera for recording solar access to a site

    SciTech Connect

    Vezie, R.L.

    1981-11-24

    A camera is disclosed as being of the pin hole type having a transparent member therewithin on which is printed sun paths for certain days of the year and local time sun positions for the purpose of superimposing same on the photograph taken. Obstructions to direct solar radiation may then be calculated from an examination of the resulting photograph to determine the suitability of a site for future use of solar energy. A compass and level on the camera body facilitate southerly orientation of the camera in a level position. Film rollers guide the film over the curved indicia bearing member. Film transfer means retains a film supply roll with a take-up spindle for exposed film. A shutter assembly permits light entry through a pin hole aperture with the camera having a short focal length for a wide angle field. The indicia bearing member is detachably mounted to enable substitution of appropriate like members for use at different latitudes.

  8. Visual Feedback Stabilization of Balancing Tasks with Camera Misalignment

    NASA Astrophysics Data System (ADS)

    Hirata, Kentaro; Mizuno, Takashi

    In this paper, we consider visual feedback stabilization which tolerates small camera misalignment. Specifically, a balancing task with a cart-pendulum system using camera image is examined. Such a task is known to rely heavily on the detection of the vertical direction and the angle measurement error due to the camera misalignment could be fatal for stabilization. From a mathematical model of the measurement error, the effect of the misalignment is naturally represented by affine perturbation to the coefficient matrix of the output equation. Motivated by this fact, a special type of robust dynamic output feedback stabilization against polytopic uncertainty is investigated. By solving the related BMI, one can design a controller which tolerates the camera misalignment to some extent. The result is verified via experiments.

  9. Narrow-Band Thermal Radiation Based on Microcavity Resonant Effect

    NASA Astrophysics Data System (ADS)

    Huang, Jin-Guo; Xuan, Yi-Min; Li, Qiang

    2014-09-01

    The microcavity resonant effect is used to realize narrow-band thermal radiation. Periodic circular aperture arrays with square lattice are patterned on Si substrates by using standard photolithographic techniques and reactive ion etching techniques. Ag films are deposited on the surface of Si substrates with aperture arrays to improve the infrared reflectance. On the basis of the micromachining process, an Ag/Si structured surface exhibiting narrow-band radiation and directivity insensitivity is presented. The emittance spectra exhibit several selective emittance bands attributed to the microcavity resonance effect. The dependence of emittance spectra on sizes and direction is also experimentally examined. The results indicate that the emittance peak of the Ag/Si structured surface can be modulated by tailoring the structural sizes. Moreover, the emittance peak is independent of the radiant angle, which is very important for designing high-performance thermal emitters.

  10. Phoenix Robotic Arm Camera

    NASA Astrophysics Data System (ADS)

    Keller, H. U.; Goetz, W.; Hartwig, H.; Hviid, S. F.; Kramm, R.; Markiewicz, W. J.; Reynolds, R.; Shinohara, C.; Smith, P.; Tanner, R.; Woida, P.; Woida, R.; Bos, B. J.; Lemmon, M. T.

    2008-10-01

    The Phoenix Robotic Arm Camera (RAC) is a variable-focus color camera mounted to the Robotic Arm (RA) of the Phoenix Mars Lander. It is designed to acquire both close-up images of the Martian surface and microscopic images (down to a scale of 23 μm/pixel) of material collected in the RA scoop. The mounting position at the end of the Robotic Arm allows the RAC to be actively positioned for imaging of targets not easily seen by the Stereo Surface Imager (SSI), such as excavated trench walls and targets under the Lander structure. Color information is acquired by illuminating the target with red, green, and blue light-emitting diodes. Digital terrain models (DTM) can be generated from RAC images acquired from different view points. This can provide high-resolution stereo information about fine details of the trench walls. The large stereo baseline possible with the arm can also provide a far-field DTM. The primary science objectives of the RAC are the search for subsurface soil/ice layering at the landing site and the characterization of scoop samples prior to delivery to other instruments on board Phoenix. The RAC shall also provide low-resolution panoramas in support of SSI activities and acquire images of the Lander deck for instrument and Lander check out. The camera design was inherited from the unsuccessful Mars Polar Lander mission (1999) and further developed for the (canceled) Mars Surveyor 2001 Lander (MSL01). Extensive testing and partial recalibration qualified the MSL01 RAC flight model for integration into the Phoenix science payload.

  11. DEVICE CONTROLLER, CAMERA CONTROL

    SciTech Connect

    1998-07-20

    This is a C++ application that is the server for the camera control system. Devserv drives serial devices, such as cameras and videoswitchers used in a videoconference, upon request from a client such as the ccint program. Devserv listens on UDP ports for clients to make network connections. After a client connects and sends a request to control a device (such as to pan, tilt, or zoom a camera, or do picture-in-picture with a videoswitcher), devserv formats the request into an RS-232 message appropriate for the device and sends this message over the serial port to which the device is connected. Devserv then reads the reply from the device over the serial port, and formats and sends a status message via multicast. In addition, devserv periodically multicasts status or description messages so that all clients connected to the multicast channel know what devices are supported, their ranges of motion, and their current positions. The software design employs a class hierarchy such that an abstract base class for devices can be subclassed into classes for various device categories (e.g. sonyevid30, canonvcc4, panasonicwjmx50, etc.). The devices currently supported are the Sony EVI-D30, Canon VCC1, Canon VCC3, and Canon VCC4 cameras and the Panasonic WJ-MX50 videoswitcher. However, developers can extend the class hierarchy to support other devices.

  12. DEVICE CONTROLLER, CAMERA CONTROL

    Energy Science and Technology Software Center (ESTSC)

    1998-07-20

    This is a C++ application that is the server for the camera control system. Devserv drives serial devices, such as cameras and videoswitchers used in a videoconference, upon request from a client such as the ccint program. Devserv listens on UDP ports for clients to make network connections. After a client connects and sends a request to control a device (such as to pan, tilt, or zoom a camera, or do picture-in-picture with a videoswitcher), devserv formats the request into an RS-232 message appropriate for the device and sends this message over the serial port to which the device is connected. Devserv then reads the reply from the device over the serial port, and formats and sends a status message via multicast. In addition, devserv periodically multicasts status or description messages so that all clients connected to the multicast channel know what devices are supported, their ranges of motion, and their current positions. The software design employs a class hierarchy such that an abstract base class for devices can be subclassed into classes for various device categories (e.g. sonyevid30, canonvcc4, panasonicwjmx50, etc.). The devices currently supported are the Sony EVI-D30, Canon VCC1, Canon VCC3, and Canon VCC4 cameras and the Panasonic WJ-MX50 videoswitcher. However, developers can extend the class hierarchy to support other devices.
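    The request flow the two records above describe (UDP command in, device-specific serial message out, status reply back) can be sketched as below. Both the text command format and the serial byte framing here are invented for illustration; they are not devserv's actual protocol (real devices like the Sony EVI-D30 speak vendor protocols such as VISCA).

```python
import socket

def to_serial(cmd: str) -> bytes:
    """Translate a plain-text command like 'pan 10' into a serial message.

    The framing [0x81, verb-id, value & 0xFF, 0xFF] is hypothetical and
    stands in for a real per-device RS-232 protocol.
    """
    verb, _, arg = cmd.strip().partition(" ")
    verb_ids = {"pan": 1, "tilt": 2, "zoom": 3}
    if verb not in verb_ids:
        raise ValueError(f"unsupported command: {verb!r}")
    return bytes([0x81, verb_ids[verb], int(arg) & 0xFF, 0xFF])

def serve_once(port: int) -> None:
    """Receive one UDP datagram, translate it, and echo a status reply."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.bind(("127.0.0.1", port))
        data, addr = sock.recvfrom(1024)
        msg = to_serial(data.decode())  # would be written to the serial port
        sock.sendto(b"OK " + msg.hex().encode(), addr)
```

A real server would also keep per-device state and periodically multicast the description messages the abstract mentions.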

  13. Adaptive compressive sensing camera

    NASA Astrophysics Data System (ADS)

    Hsu, Charles; Hsu, Ming K.; Cha, Jae; Iwamura, Tomo; Landa, Joseph; Nguyen, Charles; Szu, Harold

    2013-05-01

    We have embedded an Adaptive Compressive Sensing (ACS) algorithm on a Charge-Coupled-Device (CCD) camera, based on the simple concept that each pixel is a charge bucket whose charge comes from the Einstein photoelectric conversion effect. Applying the manufacturing design principle, we only allow altering each working component by a minimum of one step. We then simulated what such a camera could do for real-world persistent surveillance, taking into account diurnal, all-weather, and seasonal variations. The data storage savings are immense, and the order of magnitude of saving is inversely proportional to target angular speed. We designed two new CCD camera components. Due to mature CMOS (complementary metal-oxide-semiconductor) technology, the on-chip Sample and Hold (SAH) circuitry can be designed as a dual Photon Detector (PD) analog circuit for change detection that predicts skipping or going forward at a sufficient sampling frame rate. For an admitted frame, there is a purely random sparse matrix [Φ], implemented at each bucket-pixel level, that biases the charge transport voltage toward neighboring buckets or not; if not, the charge goes to the ground drainage. Since a snapshot image is not a video, we could not apply the usual MPEG video compression and Huffman entropy codec, nor a powerful WaveNet wrapper, at the sensor level. We shall compare (i) pre-processing: FFT, thresholding of significant Fourier mode components, and inverse FFT to check PSNR; (ii) post-processing image recovery, done selectively by a CDT&D adaptive version of linear programming with L1 minimization and L2 similarity. For (ii) we need to determine, in new-frame selection by the SAH circuitry, the degree of information (d.o.i.) K(t), which dictates the purely random linear sparse combination of measurement data à la [Φ]M,N: M(t) = K(t) log N(t).
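    The closing relation M(t) = K(t) log N(t) is the usual compressive-sensing measurement budget: a K-sparse signal of length N needs on the order of K log N random measurements. A toy sketch follows, with a random sparse ±1 matrix standing in for the per-pixel charge routing; the sizes and the constant factor are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def measurement_count(k, n, c=1.0):
    """M ≈ c * K * log(N), the abstract's measurement budget (c illustrative)."""
    return int(np.ceil(c * k * np.log(n)))

# Toy example: a K-sparse signal of length N measured through a sparse
# random +/-1 matrix Phi (most entries zero, mimicking charge drainage).
N, K = 1024, 10
M = measurement_count(K, N)
x = np.zeros(N)
x[rng.choice(N, K, replace=False)] = rng.standard_normal(K)
Phi = rng.choice([-1.0, 0.0, 1.0], size=(M, N), p=[0.05, 0.9, 0.05])
y = Phi @ x  # M measurements captured instead of N raw samples
print(M, N)  # far fewer measurements than samples
```

Recovery of x from y would then be the L1-minimization step the abstract calls post-processing; that solver is omitted here.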

  14. Neutron Imaging Camera

    NASA Technical Reports Server (NTRS)

    Hunter, Stanley; deNolfo, G. A.; Barbier, L. M.; Link, J. T.; Son, S.; Floyd, S. R.; Guardala, N.; Skopec, M.; Stark, B.

    2008-01-01

    The Neutron Imaging Camera (NIC) is based on the Three-dimensional Track Imager (3-DTI) technology developed at GSFC for gamma-ray astrophysics applications. The 3-DTI, a large-volume time-projection chamber, provides accurate (approximately 0.4 mm resolution) 3-D tracking of charged particles. The incident direction of fast neutrons, En > 0.5 MeV, is reconstructed from the momenta and energies of the proton and triton fragments resulting from ³He(n,p)³H interactions in the 3-DTI volume. The performance of the NIC in laboratory and accelerator tests is presented.

  15. Subsurface "radar" camera

    NASA Technical Reports Server (NTRS)

    Jain, A.

    1977-01-01

    Long-wavelength multiple-frequency radar is used for imaging and determining the depth of subsurface stratified layers. Very-low-frequency radar signals pinpoint below-ground strata via direct imagery techniques. Varying the frequency and scanning angle adjusts image depth and width.

  16. Optical performance analysis of plenoptic camera systems

    NASA Astrophysics Data System (ADS)

    Langguth, Christin; Oberdörster, Alexander; Brückner, Andreas; Wippermann, Frank; Bräuer, Andreas

    2014-09-01

    Adding an array of microlenses in front of the sensor transforms a conventional camera into one that captures both spatial and angular information within a single shot. This plenoptic camera is capable of obtaining depth information and providing it for a multitude of applications, e.g. artificial re-focusing of photographs. Without the need for active illumination, it represents a compact and fast optical 3D acquisition technique with reduced effort in system alignment. Since the extent of the aperture limits the range of detected angles, the observed parallax is reduced compared to common stereo imaging systems, which results in decreased depth resolution. Moreover, the gain of angular information implies a degraded spatial resolution. This trade-off requires a careful choice of the optical system parameters. We present a comprehensive assessment of the possible degrees of freedom in the design of plenoptic systems. Utilizing a custom-built simulation tool, the optical performance is quantified with respect to particular starting conditions. Furthermore, a plenoptic camera prototype is demonstrated to verify the predicted optical characteristics.
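
    The spatial-angular trade-off described above can be made concrete with a back-of-envelope sketch. The sensor width and the number of pixels per microlens below are assumed numbers, not values from the paper: with N pixels under each microlens in one dimension, spatial sampling drops by N while N² discrete views are gained.

```python
# Illustrative plenoptic resolution budget (all numbers are assumptions).
sensor_px = 4096          # sensor width in pixels
angular_samples = 8       # pixels per microlens in one dimension

spatial_res = sensor_px // angular_samples   # microlenses across the image
views = angular_samples ** 2                 # distinct angular samples per lenslet

print(spatial_res, views)
```

    Doubling the angular sampling halves the effective spatial resolution in each dimension, which is why the paper stresses a careful choice of system parameters.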

  17. Optimal Number of Angle Images for Calculating Anterior Angle Volume and Iris Volume Measurements

    PubMed Central

    Blieden, Lauren S.; Chuang, Alice Z.; Baker, Laura A.; Bell, Nicholas P.; Fuller, Timothy S.; Mankiewicz, Kimberly A.; Feldman, Robert M.

    2015-01-01

    Purpose. We determined the optimal number of angle images required to obtain reliable measurements of trabecular-iris circumferential volume (TICV) and iris volume (IV) using swept-source Fourier domain anterior segment optical coherence tomography (SSFD-ASOCT) scans in narrow angle eyes. Methods. Scleral spur landmarks (SSL) were manually identified on ASOCT angle images from 128 meridians from each of 24 eyes with chronic primary angle closure (PAC) spectrum of disease. The anterior and posterior corneal curves, and the anterior and posterior iris surfaces, were identified automatically by the anterior chamber analysis and interpretation (ACAI) software, then manually examined and edited by the reader if required. Trabecular-iris circumferential volume at 750 μm from the SSL (TICV750) and IV were subsequently calculated using varying numbers of angle images. The threshold error was set below the lower 95% confidence limit of the mean absolute percent error (MAPE) of the change in TICV or IV resulting from laser peripheral iridotomy: 17% for TICV and 5% for IV, based on previous studies. The optimal number of angle images was the smallest number of images for which the MAPE was less than this threshold for TICV and IV. Results. A total of 32 equally spaced angle images (16 meridians) was required to estimate TICV750 and 16 angle images (8 meridians) to estimate IV, with MAPEs of 4.6% and 1.6%, respectively. Conclusions. It is possible to determine TICV and IV parameters reliably in narrow angles without evaluating all 128 meridians obtained with SSFD-ASOCT. PMID:25829412
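
    The selection criterion described above can be sketched numerically. Only the MAPE formula and the 17% TICV threshold come from the abstract; the volume values below are made up for illustration.

```python
import numpy as np

def mape(estimates, reference):
    """Mean absolute percent error of subset-based estimates vs. the full-scan reference."""
    return float(np.mean(np.abs((estimates - reference) / reference)) * 100)

reference_ticv = np.array([4.10, 3.85, 4.40])   # hypothetical full 128-meridian values
subset_ticv    = np.array([4.25, 3.80, 4.52])   # hypothetical 32-image estimates

threshold = 17.0  # percent, TICV threshold from the study design
print(mape(subset_ticv, reference_ticv) < threshold)
```

    A subset size is accepted once its MAPE against the full-scan reference stays under the clinically motivated threshold; the study found 32 images sufficed for TICV750.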

  18. Mars Science Laboratory Engineering Cameras

    NASA Technical Reports Server (NTRS)

    Maki, Justin N.; Thiessen, David L.; Pourangi, Ali M.; Kobzeff, Peter A.; Lee, Steven W.; Dingizian, Arsham; Schwochert, Mark A.

    2012-01-01

    NASA's Mars Science Laboratory (MSL) Rover, which launched to Mars in 2011, is equipped with a set of 12 engineering cameras. These cameras are build-to-print copies of the Mars Exploration Rover (MER) cameras, which were sent to Mars in 2003. The engineering cameras weigh less than 300 grams each and use less than 3 W of power. Images returned from the engineering cameras are used to navigate the rover on the Martian surface, deploy the rover robotic arm, and ingest samples into the rover sample processing system. The navigation cameras (Navcams) are mounted to a pan/tilt mast and have a 45-degree square field of view (FOV) with a pixel scale of 0.82 mrad/pixel. The hazard avoidance cameras (Hazcams) are body-mounted to the rover chassis in the front and rear of the vehicle and have a 124-degree square FOV with a pixel scale of 2.1 mrad/pixel. All of the cameras utilize a frame-transfer CCD (charge-coupled device) with a 1024x1024 imaging region and red/near IR bandpass filters centered at 650 nm. The MSL engineering cameras are grouped into two sets of six: one set of cameras is connected to rover computer A and the other set is connected to rover computer B. The MSL rover carries 8 Hazcams and 4 Navcams.

  19. Stereoscopic camera design

    NASA Astrophysics Data System (ADS)

    Montgomery, David J.; Jones, Christopher K.; Stewart, James N.; Smith, Alan

    2002-05-01

    It is clear from the literature that the majority of work in stereoscopic imaging is directed towards the development of modern stereoscopic displays. As costs come down, wider public interest in this technology is expected to increase. This new technology would require new methods of image formation. Advances in stereo computer graphics will of course lead to the creation of new stereo computer games, graphics in films, etc. However, the consumer would also like to see real-world stereoscopic images: pictures of family, holiday snaps, etc. Such scenery would have wide ranges of depth to accommodate and would also need to cope with moving objects, such as cars, and in particular other people. Thus, consumer acceptance of auto/stereoscopic displays and 3D in general would be greatly enhanced by the existence of a quality stereoscopic camera. This paper covers an analysis of existing stereoscopic camera designs and shows that they can be categorized into four different types, with inherent advantages and disadvantages. A recommendation is then made with regard to 3D consumer still and video photography. The paper goes on to discuss this recommendation and describe its advantages and how it can be realized in practice.

  20. PAU camera: detectors characterization

    NASA Astrophysics Data System (ADS)

    Casas, Ricard; Ballester, Otger; Cardiel-Sas, Laia; Castilla, Javier; Jiménez, Jorge; Maiorino, Marino; Pío, Cristóbal; Sevilla, Ignacio; de Vicente, Juan

    2012-07-01

    The PAU Camera (PAUCam) [1,2] is a wide-field camera that will be mounted at the corrected prime focus of the William Herschel Telescope (Observatorio del Roque de los Muchachos, Canary Islands, Spain) in the coming months. The focal plane of PAUCam is composed of a mosaic of 18 CCD detectors of 2,048 x 4,176 pixels each, with a pixel size of 15 microns, manufactured by Hamamatsu Photonics K.K. This mosaic covers a field of view (FoV) of 60 arcmin, of which 40 arcmin are unvignetted. The behaviour of these 18 devices, plus four spares, and their electronic response must be characterized and optimized for use in PAUCam. This work is being carried out in the laboratories of ICE/IFAE and CIEMAT. The electronic optimization of the CCD detectors is performed by means of an Output Gate (OG) scan, maximizing the Charge Transfer Efficiency (CTE) while minimizing the read-out noise. The devices themselves are characterized with several tests: the photon transfer curve (PTC), which yields the electronic gain, the linearity vs. light stimulus, the full-well capacity, and the cosmetic defects; plus the read-out noise, the dark current, the stability vs. temperature, and the light remanence.

  1. Object recognition through turbulence with a modified plenoptic camera

    NASA Astrophysics Data System (ADS)

    Wu, Chensheng; Ko, Jonathan; Davis, Christopher

    2015-03-01

    Atmospheric turbulence adds accumulated distortion to images obtained by cameras and surveillance systems. When the turbulence grows stronger or when the object is further away from the observer, increasing the recording device resolution helps little to improve the quality of the image. Many sophisticated methods to correct the distorted images have been invented, such as using a known feature on or near the target object to perform a deconvolution process, or use of adaptive optics. However, most of these methods depend heavily on the object's location, and optical ray propagation through the turbulence is not directly considered. Alternatively, selecting a lucky image over many frames provides a feasible solution, but at the cost of time. In our work, we propose an innovative approach to improving image quality through turbulence by making use of a modified plenoptic camera. This type of camera adds a micro-lens array to a traditional high-resolution camera to form a semi-camera array that records duplicate copies of the object as well as "superimposed" turbulence at slightly different angles. By performing several steps of image reconstruction, turbulence effects are suppressed to reveal more details of the object independently (without finding references near the object). Meanwhile, the redundant information obtained by the plenoptic camera raises the possibility of performing lucky-image algorithmic analysis with fewer frames, which is more efficient. We introduce the details of our modified plenoptic camera and image processing algorithms. The proposed method can be applied to coherently as well as incoherently illuminated objects. Our results show that the turbulence effect can be effectively suppressed by the plenoptic camera in the hardware layer, and a reconstructed "lucky image" can help the viewer identify the object even when a "lucky image" from ordinary cameras is not achievable.

  2. Transmission electron microscope CCD camera

    DOEpatents

    Downing, Kenneth H. (Lafayette, CA)

    1999-01-01

    In order to improve the performance of a CCD camera on a high voltage electron microscope, an electron decelerator is inserted between the microscope column and the CCD. This arrangement optimizes the interaction of the electron beam with the scintillator of the CCD camera while retaining optimization of the microscope optics and of the interaction of the beam with the specimen. Changing the electron beam energy between the specimen and camera allows both to be optimized.

  3. Particle friction angles in steep mountain channels

    NASA Astrophysics Data System (ADS)

    Prancevic, Jeff P.; Lamb, Michael P.

    2015-02-01

    Sediment transport rates in steep mountain channels are typically an order of magnitude lower than predicted by models developed for lowland rivers. One hypothesis for this observation is that particles are more stable in mountain channels due to particle-particle interlocking or bridging across the channel width. This hypothesis has yet to be tested, however, because we lack direct measurements of particle friction angles in steep mountain channels. Here we address this data gap by directly measuring the minimum force required to dislodge sediment (pebbles to boulders) and the sediment weight in mountain channels using a handheld force gauge. At eight sites in California, with reach-averaged bed angles ranging from 0.5° to 23° and channel widths ranging from 2 m to 16 m, we show that friction angles in natural streams average 68° and are 16° larger than those typically measured in laboratory experiments, which is likely due to particle interlocking and burial. Results also show that larger grains are disproportionately more stable than predicted by existing models and that grains organized into steps are twice as stable as grains outside of steps. However, the mean particle friction angle does not vary systematically with bed slope. These results do not support systematic increases in friction angle in steeper and narrower channels to explain the observed low sediment transport rates in mountain channels. Instead, the spatial pattern and grain-size dependence of particle friction angles may indirectly lower transport rates in steep, narrow channels by stabilizing large clasts and channel-spanning steps, which act as momentum sinks due to form drag.
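
    The field measurement described above can be turned into a friction angle with a simple Coulomb-style estimate. This is an illustrative sketch, not the study's actual data-reduction procedure, and the force-to-weight ratio used below is an assumed number.

```python
import math

def friction_angle_deg(dislodge_force_N, weight_N):
    """Coulomb-style friction angle: phi = atan(F_dislodge / W)."""
    return math.degrees(math.atan2(dislodge_force_N, weight_N))

# A grain needing 2.5x its weight to dislodge yields a friction angle
# of about 68 degrees, comparable to the reach-averaged values reported above.
print(round(friction_angle_deg(2.5, 1.0), 1))
```

    Interlocked or buried grains require a larger dislodging force relative to their weight, which is how the high field values relative to laboratory measurements would show up in this estimate.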

  4. LRO Camera Imaging of Potential Landing Sites in the South Pole-Aitken Basin

    NASA Astrophysics Data System (ADS)

    Jolliff, B. L.; Wiseman, S. M.; Gibson, K. E.; Lauber, C.; Robinson, M.; Gaddis, L. R.; Scholten, F.; Oberst, J.; LROC Science and Operations Team

    2010-12-01

    We show results of WAC (Wide Angle Camera) and NAC (Narrow Angle Camera) imaging of candidate landing sites within the South Pole-Aitken (SPA) basin of the Moon obtained by the Lunar Reconnaissance Orbiter during the first full year of operation. These images enable a greatly improved delineation of geologic units, determination of unit thicknesses and stratigraphy, and detailed surface characterization that has not been possible with previous data. WAC imaging encompasses the entire SPA basin, located within an area ranging from ~130-250 degrees east longitude and ~15 degrees south latitude to the South Pole, at different incidence angles, with the specific range of incidence dependent on latitude. The WAC images show morphology and surface detail at better than 100 m per pixel, with spatial coverage and quality unmatched by previous data sets. NAC images reveal details at the sub-meter pixel scale that enable new ways to evaluate the origins and stratigraphy of deposits. Key among new results is the capability to discern extents of ancient volcanic deposits that are covered by later crater ejecta (cryptomare) [see Petro et al., this conference] using new, complementary color data from Kaguya and Chandrayaan-1. Digital topographic models derived from WAC and NAC geometric stereo coverage show broad intercrater-plains areas where slopes are acceptably low for high-probability safe landing [see Archinal et al., this conference]. NAC images allow mapping and measurement of small, fresh craters that excavated boulders and thus provide information on surface roughness and depth to bedrock beneath regolith and plains deposits. We use these data to estimate deposit thickness in areas of interest for landing and potential sample collection to better understand the possible provenance of samples. Also, small regions marked by fresh impact craters and their associated boulder fields are readily identified by their bright ejecta patterns and marked as lander keep-out zones. We will show examples of LROC data including those for Constellation sites on the SPA rim and interior, a site between Bose and Alder Craters, sites east of Bhabha Crater, and sites on and near the “Mafic Mound” [see Pieters et al., this conference]. Together the LROC data and complementary products provide essential information for ensuring identification of safe landing and sampling sites within SPA basin that has never before been available for a planetary mission.

  5. A Motionless Camera

    NASA Technical Reports Server (NTRS)

    1994-01-01

    Omniview, a motionless, noiseless, exceptionally versatile camera was developed for NASA as a receiving device for guiding space robots. The system can see in one direction and provide as many as four views simultaneously. Developed by Omniview, Inc. (formerly TRI) under a NASA Small Business Innovation Research (SBIR) grant, the system's image transformation electronics produce a real-time image from anywhere within a hemispherical field. Lens distortion is removed, and a corrected "flat" view appears on a monitor. Key elements are a high resolution charge coupled device (CCD), image correction circuitry and a microcomputer for image processing. The system can be adapted to existing installations. Applications include security and surveillance, teleconferencing, imaging, virtual reality, broadcast video and military operations. Omniview technology is now called IPIX. The company was founded in 1986 as TeleRobotics International, became Omniview in 1995, and changed its name to Interactive Pictures Corporation in 1997.

  6. Neutron Imaging Camera

    NASA Technical Reports Server (NTRS)

    Hunter, Stanley D.; DeNolfo, Georgia; Floyd, Sam; Krizmanic, John; Link, Jason; Son, Seunghee; Guardala, Noel; Skopec, Marlene; Stark, Robert

    2008-01-01

    We describe the Neutron Imaging Camera (NIC) being developed for DTRA applications by NASA/GSFC and NSWC/Carderock. The NIC is based on the Three-dimensional Track Imager (3-DTI) technology developed at GSFC for gamma-ray astrophysics applications. The 3-DTI, a large-volume time-projection chamber, provides accurate (approximately 0.4 mm resolution) 3-D tracking of charged particles. The incident direction of fast neutrons, E(sub N) > 0.5 MeV, is reconstructed from the momenta and energies of the proton and triton fragments resulting from ³He(n,p)³H interactions in the 3-DTI volume. We present the angular and energy resolution performance of the NIC derived from accelerator tests.

  7. LSST Camera Electronics

    NASA Astrophysics Data System (ADS)

    Newcomer, F. Mitchell; Bailey, S.; Britton, C. L.; Felt, N.; Geary, J.; Hashimi, K.; Lebbolo, H.; Ning, Z.; O'Connor, P.; Oliver, J.; Radeka, V.; Sefri, R.; Tocut, V.; Van Berg, R.

    2009-01-01

    The 3.2 Gpixel LSST camera will be read out by means of 189 highly segmented 4K x 4K CCDs. A total of 3024 video channels will be processed by a modular, in-cryostat electronics package based on two custom multichannel analog ASICs now in development. Performance goals of 5 electrons noise, 0.01% electronic crosstalk, and 80 mW power dissipation per channel are targeted. The focal plane is organized as a set of 12K x 12K sub-mosaics ("rafts") with front-end electronics housed in an enclosure falling within the footprint of the CCDs making up the raft. The assembly of CCDs, baseplate, electronics boards, and cooling components constitutes a self-contained and testable 144 Mpix imager ("raft tower"), and 21 identical raft towers make up the LSST science focal plane. Electronic, mechanical, and thermal prototypes are now undergoing testing, and results will be presented at the meeting.

  8. Reflectance characteristics of the Viking lander camera reference test charts

    NASA Technical Reports Server (NTRS)

    Wall, S. D.; Burcher, E. E.; Jabson, D. J.

    1975-01-01

    Reference test charts provide radiometric, colorimetric, and spatial resolution references for the Viking lander cameras on Mars. Reflectance measurements of these references are described, including the absolute bidirectional reflectance of the radiometric references and the relative spectral reflectance of both radiometric and colorimetric references. Results show that the bidirectional reflectance of the radiometric references is Lambertian to within ±7% for incidence angles between 20 deg and 60 deg, and that their spectral reflectance is constant with wavelength to within ±5% over the spectral range of the cameras. The estimated accuracy of the measurements is ±0.05 in relative spectral reflectance.

  9. 30. TACOMA NARROWS BRIDGE, LOOKING EAST THROUGH TOLL LANES, 29 ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    30. TACOMA NARROWS BRIDGE, LOOKING EAST THROUGH TOLL LANES, 29 AUGUST 1940. (ELDRIDGE, CLARK H. TACOMA NARROWS BRIDGE, TACOMA, WASHINGTON, FINAL REPORT ON DESIGN AND CONSTRUCTION, 1941) - Tacoma Narrows Bridge, Spanning Narrows at State Route 16, Tacoma, Pierce County, WA

  10. 31. TACOMA NARROWS BRIDGE, LOOKING WEST ACROSS TOLL PLAZA, 29 ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    31. TACOMA NARROWS BRIDGE, LOOKING WEST ACROSS TOLL PLAZA, 29 AUGUST 1940. (ELDRIDGE, CLARK M. TACOMA NARROWS BRIDGE, TACOMA, WASHINGTON, FINAL REPORT ON DESIGN AND CONSTRUCTION, 1941) - Tacoma Narrows Bridge, Spanning Narrows at State Route 16, Tacoma, Pierce County, WA

  11. Narrow-band radiation wavelength measurement by processing digital photographs in RAW format

    SciTech Connect

    Kraiskii, A V; Mironova, T V; Sultanov, T T

    2012-12-31

    A technique for measuring the mean wavelength of narrow-band radiation in the 455-625 nm range using an image of the emitting surface is presented. Data from the camera array unprocessed by the built-in processor (RAW format) are used. The method is applied to determining the response parameters of holographic sensors. Depending on the wavelength and brightness of the image fragment, the mean square deviation of the wavelength amounts to 0.3-3 nm.
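
    One way such a measurement can work is to invert a calibration curve relating raw Bayer-channel response ratios to wavelength. This is an illustrative sketch only, not the paper's method: the calibration values below are made up, and a real curve would come from imaging known narrow-band sources.

```python
import numpy as np

# Hypothetical monotonic calibration: red/green response ratio vs. wavelength.
calib_wavelength_nm = np.array([455, 500, 550, 600, 625])
calib_rg_ratio      = np.array([0.10, 0.25, 0.60, 1.80, 3.00])

def wavelength_from_ratio(rg_ratio):
    """Invert the calibration curve by linear interpolation."""
    return float(np.interp(rg_ratio, calib_rg_ratio, calib_wavelength_nm))

print(wavelength_from_ratio(0.60))   # falls exactly on the 550 nm calibration point
```

    Working on RAW data matters here because the in-camera processor's white balance and tone curves would otherwise distort the channel ratios the inversion relies on.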

  12. Camera Calibration for UAV Application Using Sensor of Mobile Camera

    NASA Astrophysics Data System (ADS)

    Takahashi, Y.; Chikatsu, H.

    2015-05-01

    Recently, 3D measurement using small unmanned aerial vehicles (UAVs) has increased in Japan, because small UAVs are easily available at low cost and analysis software can easily create 3D models. However, small UAVs have a problem: very short flight times and a small payload. In particular, as the payload of a small UAV increases, its flight time decreases. Therefore, it is advantageous to use lightweight sensors in small UAVs. A mobile camera is lightweight and has many sensors, such as an accelerometer, a magnetic field sensor, and a gyroscope; moreover, these sensors can be used simultaneously. Therefore, the authors think the problems of small UAVs can be solved using a mobile camera. The authors performed camera calibration using a test target to evaluate sensor values measured with a mobile camera. Consequently, the authors confirmed the same accuracy as normal camera calibration.

  13. Critical Heat Flux In Inclined Rectangular Narrow Long Channel

    SciTech Connect

    J. L. Rempe; S. W. Noh; Y. H. Kim; K. Y. Suh; F.B.Cheung; S. B. Kim

    2005-05-01

    In the TMI-2 accident, the lower part of the reactor pressure vessel was overheated and then rather rapidly cooled down, as was later identified in a vessel investigation project. This suggested the feasibility of gap cooling. For this reason, several investigations were performed to determine the critical heat flux (CHF) from the standpoint of in-vessel retention. Experiments were conducted to investigate the general boiling phenomena and the triggering mechanism for the CHF in a narrow gap, using a 5 x 105 mm² crevice-type heater assembly and demineralized water. The test parameters include a gap size of 5 mm and surface orientation angles from the downward-facing position (180°) to the vertical position (90°). The orientation angle affects the bubble layer and escape from the narrow gap. The CHF is less than that in a shorter channel, compared with previous experiments having a heated length of 35 mm in the copper test section.

  14. New optical receiving system design for portable camera lidar

    NASA Astrophysics Data System (ADS)

    Qin, Laian; He, Feng; Jing, Xu; Tan, Fengfu

    2015-10-01

    Because of their better spectral response characteristics, higher quantum efficiency, and signal-to-noise ratio, cameras are increasingly used in lidars for atmospheric parameter measurement. Camera lidars retrieve atmospheric parameters by analyzing the light-column images acquired by the cameras and objectives, which gather the backscattered light of the laser beam. Lidars of this kind usually have higher spatial resolution and better real-time performance. However, because of the limited depth of field (DOF), the measurement accuracy outside the DOF is degraded to varying degrees by optical defocus. Moreover, the small relative aperture of the receiving objective makes such systems unsuitable for portable equipment. Building on an improved receiving-objective design, this paper proposes a new scheme for the optical receiving system of a camera lidar. The scheme improves the measurement accuracy outside the DOF of the traditional structure by using a large-DOF, large-relative-aperture off-axis objective and a special usage mode of the camera. The optical receiving system designed according to this scheme is more compact and especially suitable for portable instruments. Furthermore, the relations among the focal length, the distance between laser and objective, and the installation angle are analyzed, and the corresponding formula is given. The scheme has been implemented in a camera lidar system in the laboratory, and the results are satisfactory.

  15. A view of the ET camera on STS-112

    NASA Technical Reports Server (NTRS)

    2002-01-01

    KENNEDY SPACE CENTER, FLA. - A closeup view of the camera mounted on the external tank of Space Shuttle Atlantis. The color video camera mounted to the top of Atlantis' external tank will provide a view of the front and belly of the orbiter and a portion of the solid rocket boosters (SRBs) and external tank during the launch of Atlantis on mission STS-112. It will offer the STS-112 team an opportunity to monitor the shuttle's performance from a new angle. The camera will be turned on fifteen minutes prior to launch and will show the orbiter and solid rocket boosters on the launch pad. The video will be downlinked from the external tank during flight to several NASA data-receiving sites and then relayed to the live television broadcast. The camera is expected to operate for about 15 minutes following liftoff. At liftoff, viewers will see the shuttle clearing the launch tower and, at two minutes after liftoff, see the right SRB separate from the external tank. When the external tank separates from Atlantis about eight minutes into the flight, the camera is expected to continue its live feed for about six more minutes although NASA may be unable to pick up the camera's signal because the tank may have moved out of range.

  16. A view of the ET camera on STS-112

    NASA Technical Reports Server (NTRS)

    2002-01-01

    KENNEDY SPACE CENTER, FLA. - A view of the camera mounted on the external tank of Space Shuttle Atlantis. The color video camera mounted to the top of Atlantis' external tank will provide a view of the front and belly of the orbiter and a portion of the solid rocket boosters (SRBs) and external tank during the launch of Atlantis on mission STS-112. It will offer the STS-112 team an opportunity to monitor the shuttle's performance from a new angle. The camera will be turned on fifteen minutes prior to launch and will show the orbiter and solid rocket boosters on the launch pad. The video will be downlinked from the external tank during flight to several NASA data-receiving sites and then relayed to the live television broadcast. The camera is expected to operate for about 15 minutes following liftoff. At liftoff, viewers will see the shuttle clearing the launch tower and, at two minutes after liftoff, see the right SRB separate from the external tank. When the external tank separates from Atlantis about eight minutes into the flight, the camera is expected to continue its live feed for about six more minutes although NASA may be unable to pick up the camera's signal because the tank may have moved out of range.

  17. Bundle Adjustment for Multi-Camera Systems with Points at Infinity

    NASA Astrophysics Data System (ADS)

    Schneider, J.; Schindler, F.; Läbe, T.; Förstner, W.

    2012-07-01

    We present a novel approach to rigorous bundle adjustment for omnidirectional and multi-view cameras, which enables an efficient maximum-likelihood estimation with image and scene points at infinity. Multi-camera systems are used to increase the resolution, to combine cameras with different spectral sensitivities (Z/I DMC, Vexcel Ultracam), or, like omnidirectional cameras, to augment the effective aperture angle (Blom Pictometry, Rollei Panoscan Mark III). Additionally, multi-camera systems are gaining importance for the acquisition of complex 3D structures. For stabilizing camera orientations, especially rotations, one should generally use points at the horizon over long periods of time within the bundle adjustment, which classical bundle adjustment programs are not capable of. We use a minimal representation of homogeneous coordinates for image and scene points. Instead of eliminating the scale factor of the homogeneous vectors by Euclidean normalization, we normalize the homogeneous coordinates spherically. This way we can use images of omnidirectional cameras with a single viewpoint, such as fisheye cameras, and scene points which are far away or at infinity. We demonstrate the feasibility and the potential of our approach on real data taken with a single camera, the stereo camera FinePix Real 3D W3 from Fujifilm, and the multi-camera system Ladybug 3 from Point Grey.
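
    The key idea, spherical rather than Euclidean normalization of homogeneous coordinates, can be sketched in a few lines. This is a minimal illustration of the representation, not the authors' bundle-adjustment code.

```python
import numpy as np

def normalize_spherical(x):
    """Normalize a homogeneous point vector to unit length instead of
    dividing by its last coordinate. Points at infinity (last coordinate 0)
    remain representable, which Euclidean normalization cannot handle."""
    x = np.asarray(x, dtype=float)
    return x / np.linalg.norm(x)

finite_pt   = normalize_spherical([3.0, 4.0, 1.0])   # ordinary scene point
infinite_pt = normalize_spherical([1.0, 1.0, 0.0])   # direction at infinity

# Euclidean normalization would divide by zero for infinite_pt;
# spherical normalization keeps both vectors finite with unit norm.
print(np.linalg.norm(finite_pt), np.linalg.norm(infinite_pt))
```

    Because horizon points map to well-behaved unit vectors, they can participate in the adjustment and stabilize the rotation estimates, as the abstract argues.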

  18. Do narrow Σ-hypernuclear states exist?

    SciTech Connect

    Chrien, R.E.

    1995-12-31

    Reports of narrow states in Σ-hypernucleus production have appeared from time to time. The present experiment is a repeat of the first and seemingly most definitive such experiment, that on a target of ⁹Be, but with much better statistics. No narrow states were observed.

  19. Infants Experience Perceptual Narrowing for Nonprimate Faces

    ERIC Educational Resources Information Center

    Simpson, Elizabeth A.; Varga, Krisztina; Frick, Janet E.; Fragaszy, Dorothy

    2011-01-01

    Perceptual narrowing--a phenomenon in which perception is broad from birth, but narrows as a function of experience--has previously been tested with primate faces. In the first 6 months of life, infants can discriminate among individual human and monkey faces. Though the ability to discriminate monkey faces is lost after about 9 months, infants…


  1. Digital synchroballistic schlieren camera for high-speed photography of bullets and rocket sleds

    NASA Astrophysics Data System (ADS)

    Buckner, Benjamin D.; L'Esperance, Drew

    2013-08-01

    A high-speed digital streak camera designed for simultaneous high-resolution color photography and focusing schlieren imaging is described. The camera uses a computer-controlled galvanometer scanner to achieve synchroballistic imaging through a narrow slit. Full color 20 megapixel images of a rocket sled moving at 480 m/s and of projectiles fired at around 400 m/s were captured, with high-resolution schlieren imaging in the latter cases, using conventional photographic flash illumination. The streak camera can achieve a line rate for streak imaging of up to 2.4 million lines/s.
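
    As a back-of-the-envelope check on the figures quoted above (up to 2.4 million lines/s, a 480 m/s rocket sled), the per-line recording time and the subject motion during one line follow directly:

```python
# Figures taken from the abstract; the arithmetic is illustrative.
line_rate = 2.4e6                # streak lines per second
line_period = 1.0 / line_rate    # ~417 ns to record one line
sled_speed = 480.0               # m/s
motion_per_line_mm = sled_speed * line_period * 1e3  # sled travel per line
```

    At the full line rate the sled advances only about 0.2 mm per recorded line, which is what makes sharp synchroballistic imaging of such fast subjects possible.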

  2. CPAPIR: a wide-field infrared camera for the Observatoire du Mont Megantic

    NASA Astrophysics Data System (ADS)

    Artigau, Etienne; Doyon, Rene; Vallee, Philippe; Riopel, Martin; Nadeau, Daniel

    2004-09-01

    CPAPIR is a wide-field infrared camera for use at the Observatoire du mont Megantic and CTIO 1.5 m telescopes. The camera will be primarily a survey instrument with a half-degree field of view, making it one of the most efficient of its kind. CPAPIR will provide broad and narrow band filters within its 0.8 to 2.5 µm bandpass. The camera is based on a Hawaii-2 2048x2048 HgCdTe detector.

  3. Narrow band gap amorphous silicon semiconductors

    DOEpatents

    Madan, A.; Mahan, A.H.

    1985-01-10

    Disclosed is a narrow band gap amorphous silicon semiconductor comprising an alloy of amorphous silicon and a band gap narrowing element selected from the group consisting of Sn, Ge, and Pb, with an electron donor dopant selected from the group consisting of P, As, Sb, Bi and N. The process for producing the narrow band gap amorphous silicon semiconductor comprises the steps of forming an alloy comprising amorphous silicon and at least one of the aforesaid band gap narrowing elements in amount sufficient to narrow the band gap of the silicon semiconductor alloy below that of amorphous silicon, and also utilizing sufficient amounts of the aforesaid electron donor dopant to maintain the amorphous silicon alloy as an n-type semiconductor.

  4. Mars Exploration Rover engineering cameras

    USGS Publications Warehouse

    Maki, J.N.; Bell, J.F., III; Herkenhoff, K. E.; Squyres, S. W.; Kiely, A.; Klimesh, M.; Schwochert, M.; Litwin, T.; Willson, R.; Johnson, Aaron H.; Maimone, M.; Baumgartner, E.; Collins, A.; Wadsworth, M.; Elliot, S.T.; Dingizian, A.; Brown, D.; Hagerott, E.C.; Scherr, L.; Deen, R.; Alexander, D.; Lorre, J.

    2003-01-01

    NASA's Mars Exploration Rover (MER) Mission will place a total of 20 cameras (10 per rover) onto the surface of Mars in early 2004. Fourteen of the 20 cameras are designated as engineering cameras and will support the operation of the vehicles on the Martian surface. Images returned from the engineering cameras will also be of significant importance to the scientific community for investigative studies of rock and soil morphology. The Navigation cameras (Navcams, two per rover) are a mast-mounted stereo pair each with a 45° square field of view (FOV) and an angular resolution of 0.82 milliradians per pixel (mrad/pixel). The Hazard Avoidance cameras (Hazcams, four per rover) are a body-mounted, front- and rear-facing set of stereo pairs, each with a 124° square FOV and an angular resolution of 2.1 mrad/pixel. The Descent camera (one per rover), mounted to the lander, has a 45° square FOV and will return images with spatial resolutions of ~4 m/pixel. All of the engineering cameras utilize broadband visible filters and 1024 x 1024 pixel detectors. Copyright 2003 by the American Geophysical Union.
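
    The quoted angular resolutions follow, to small-angle accuracy, from dividing each FOV evenly across the 1024-pixel detector. The sketch below reproduces the Hazcam figure and comes close to the Navcam one (the small discrepancy presumably reflects lens distortion, which this naive estimate ignores):

```python
import math

def mrad_per_pixel(fov_deg, n_pixels):
    """Small-angle estimate of angular resolution: the field of view
    spread evenly across the detector (ignores lens distortion)."""
    return math.radians(fov_deg) / n_pixels * 1000.0

navcam = mrad_per_pixel(45, 1024)   # ~0.77 mrad/pixel (abstract quotes 0.82)
hazcam = mrad_per_pixel(124, 1024)  # ~2.1 mrad/pixel, matching the abstract
```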

  5. Airborne ballistic camera tracking systems

    NASA Technical Reports Server (NTRS)

    Redish, W. L.

    1976-01-01

    An operational airborne ballistic camera tracking system was tested for operational and data reduction feasibility. The acquisition and data processing requirements of the system are discussed. Suggestions for future improvements are also noted. A description of the data reduction mathematics is outlined. Results from a successful reentry test mission are tabulated. The test mission indicated that airborne ballistic camera tracking systems are feasible.

  6. Digital Cameras for Student Use.

    ERIC Educational Resources Information Center

    Simpson, Carol

    1997-01-01

    Describes the features, equipment and operations of digital cameras and compares three different digital cameras for use in education. Price, technology requirements, features, transfer software, and accessories for the Kodak DC25, Olympus D-200L and Casio QV-100 are presented in a comparison table. (AEF)

  7. The "All Sky Camera Network"

    ERIC Educational Resources Information Center

    Caldwell, Andy

    2005-01-01

    In 2001, the "All Sky Camera Network" came to life as an outreach program to connect the Denver Museum of Nature and Science (DMNS) exhibit "Space Odyssey" with Colorado schools. The network is comprised of cameras placed strategically at schools throughout Colorado to capture fireballs--rare events that produce meteorites. Meteorites have great…

  8. An Educational PET Camera Model

    ERIC Educational Resources Information Center

    Johansson, K. E.; Nilsson, Ch.; Tegner, P. E.

    2006-01-01

    Positron emission tomography (PET) cameras are now in widespread use in hospitals. A model of a PET camera has been installed in Stockholm House of Science and is used to explain the principles of PET to school pupils as described here.

  10. SEOS frame camera applications study

    NASA Technical Reports Server (NTRS)

    1974-01-01

    A research and development satellite is discussed which will provide opportunities for observation of transient phenomena that fall within the fixed viewing circle of the spacecraft. The evaluation of possible applications for frame cameras, for SEOS, are studied. The computed lens characteristics for each camera are listed.

  11. Camera artifacts in IUE spectra

    NASA Technical Reports Server (NTRS)

    Bruegman, O. W.; Crenshaw, D. M.

    1994-01-01

    This study of emission line mimicking features in the IUE cameras has produced an atlas of artifacts in high-dispersion images, with an accompanying table of prominent artifacts in the raw images, along with a median image of the sky background for each IUE camera.

  12. 3. Elevation view of entire midsection using ultrawide angle lens. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. Elevation view of entire midsection using ultrawide angle lens. Note opened south doors and closed north doors. The following photo WA-203-C-4 is similar except the camera position was moved right to include the slope of the south end. - Puget Sound Naval Shipyard, Munitions Storage Bunker, Naval Ammunitions Depot, South of Campbell Trail, Bremerton, Kitsap County, WA

  13. Coherent infrared imaging camera (CIRIC)

    SciTech Connect

    Hutchinson, D.P.; Simpson, M.L.; Bennett, C.A.; Richards, R.K.; Emery, M.S.; Crutcher, R.I.; Sitter, D.N. Jr.; Wachter, E.A.; Huston, M.A.

    1995-07-01

    New developments in 2-D, wide-bandwidth HgCdTe (MCT) and GaAs quantum-well infrared photodetectors (QWIP) coupled with Monolithic Microwave Integrated Circuit (MMIC) technology are now making focal plane array coherent infrared (IR) cameras viable. Unlike conventional IR cameras which provide only thermal data about a scene or target, a coherent camera based on optical heterodyne interferometry will also provide spectral and range information. Each pixel of the camera, consisting of a single photo-sensitive heterodyne mixer followed by an intermediate frequency amplifier and illuminated by a separate local oscillator beam, constitutes a complete optical heterodyne receiver. Applications of coherent IR cameras are numerous and include target surveillance, range detection, chemical plume evolution, monitoring stack plume emissions, and wind shear detection.

  14. IMAX camera (12-IML-1)

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The IMAX camera system is used to record on-orbit activities of interest to the public. Because of the extremely high resolution of the IMAX camera, projector, and audio systems, the audience is afforded a motion picture experience unlike any other. IMAX and OMNIMAX motion picture systems were designed to create motion picture images of superior quality and audience impact. The IMAX camera is a 65 mm, single lens, reflex viewing design with a 15 perforation per frame horizontal pull across. The frame size is 2.06 x 2.77 inches. Film travels through the camera at a rate of 336 feet per minute when the camera is running at the standard 24 frames/sec.

  15. Reading Angles in Maps

    ERIC Educational Resources Information Center

    Izard, Véronique; O'Donnell, Evan; Spelke, Elizabeth S.

    2014-01-01

    Preschool children can navigate by simple geometric maps of the environment, but the nature of the geometric relations they use in map reading remains unclear. Here, children were tested specifically on their sensitivity to angle. Forty-eight children (age 47:15-53:30 months) were presented with fragments of geometric maps, in which angle sections…

  16. Camera sensitivity study

    NASA Astrophysics Data System (ADS)

    Schlueter, Jonathan; Murphey, Yi L.; Miller, John W. V.; Shridhar, Malayappan; Luo, Yun; Khairallah, Farid

    2004-12-01

    As the cost/performance ratio of vision systems improves with time, new classes of applications become feasible. One such area, automotive applications, is currently being investigated. Applications include occupant detection, collision avoidance and lane tracking. Interest in occupant detection has been spurred by federal automotive safety rules in response to injuries and fatalities caused by deployment of occupant-side air bags. In principle, a vision system could control airbag deployment to prevent this type of mishap. Employing vision technology here, however, presents a variety of challenges, which include controlling costs, the inability to control illumination, developing and training a reliable classification system, and loss of performance due to production variations arising from manufacturing tolerances and customer options. This paper describes the measures that have been developed to evaluate the sensitivity of an occupant detection system to these types of variations. Two procedures are described for evaluating how sensitive the classifier is to camera variations. The first procedure is based on classification accuracy while the second evaluates feature differences.

  17. Omnidirectional narrow bandpass filters based on one-dimensional superconductor-dielectric photonic crystal heterostructures

    NASA Astrophysics Data System (ADS)

    Barvestani, Jamal

    2015-01-01

    Using the transfer matrix method, narrow passbands of TE waves in one-dimensional superconductor-dielectric photonic crystal heterostructures are presented. Various superconductors are considered within the two-fluid model. Results show that by selecting proper widths for the superconductor and dielectric layers and suitable materials, a single narrow passband in the visible region can be obtained. The behavior of these passbands versus the temperature of the superconductors, the external magnetic field, and the incident angle is examined. We show that omnidirectional passbands can be obtained by tuning the temperature, the dilation factor of the half part of a heterostructure, and the other parameters of the heterostructures. These tunable narrow passbands may be useful in the design of narrow-band or multichannel filters.
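
    A minimal transfer-matrix calculation for a lossless dielectric multilayer at normal incidence illustrates the method. The superconductor layers of the paper (two-fluid model, temperature- and field-dependent indices) are not modeled here; the quarter-wave stack, indices, and substrate are illustrative assumptions:

```python
import numpy as np

def transmittance(layers, wavelength, n_in=1.0, n_sub=1.52):
    """Transmittance of a 1D multilayer at normal incidence via the
    characteristic (transfer) matrix method.

    layers: list of (refractive_index, thickness) tuples, in the order
    the light traverses them.
    """
    M = np.eye(2, dtype=complex)
    for n, d in layers:
        delta = 2 * np.pi * n * d / wavelength  # phase thickness
        M = M @ np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                          [1j * n * np.sin(delta), np.cos(delta)]])
    denom = n_in * M[0, 0] + n_in * n_sub * M[0, 1] + M[1, 0] + n_sub * M[1, 1]
    t = 2 * n_in / denom
    return (n_sub / n_in) * abs(t) ** 2

# Quarter-wave high/low-index stack designed for 550 nm (illustrative):
lam0 = 550e-9
nH, nL = 2.30, 1.38
stack = [(nH, lam0 / (4 * nH)), (nL, lam0 / (4 * nL))] * 5
T_center = transmittance(stack, lam0)     # deep in the stopband -> small
T_detuned = transmittance(stack, 700e-9)  # away from the stopband center
```

    Inserting a defect layer (or, as in the paper, a superconductor half-stack) into such a stopband is what opens the narrow passbands.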

  18. Star Identification Algorithm for Uncalibrated, Wide FOV Cameras

    NASA Astrophysics Data System (ADS)

    Ajdadi, Mohamad Javad; Ghafarzadeh, Mahdi; Taheri, Mojtaba; Mosadeq, Ehsan; Khakian Ghomi, Mahdi

    2015-06-01

    A novel method is proposed for star identification via uncalibrated cameras with wide fields of view (FOVs). In this approach some of the triangles created by the stars in the FOV are selected for pattern recognition. The triangles are selected considering the sensitivity of their interior angles to the calibration error. The algorithm is based on the intersection between sets of triangles that are found in the database for each selected triangle of the image. By this method, most of the image stars contribute to pattern recognition, making it very robust against noise and calibration error. The algorithm was performed on 150 night sky images, taken by an uncalibrated camera with a FOV of 114 12, with a success rate of 94% and no false positives. Based on the identification approach, an adaptive method is also developed for calibrating and obtaining the projection function of an uncalibrated camera.
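
    One rotation-invariant triangle feature of the kind described can be sketched as follows; the planar law-of-cosines construction and the example star directions are illustrative assumptions, not the authors' exact formulation:

```python
import numpy as np

def angular_sep(u, v):
    """Angle in radians between two unit star-direction vectors."""
    return np.arccos(np.clip(np.dot(u, v), -1.0, 1.0))

def interior_angles(u1, u2, u3):
    """Interior angles of the triangle whose side lengths are the
    pairwise angular separations of three stars (small-FOV planar
    approximation). These angles are invariant to camera rotation,
    which is what makes triangle patterns useful for identification."""
    a = angular_sep(u2, u3)
    b = angular_sep(u1, u3)
    c = angular_sep(u1, u2)
    A = np.arccos(np.clip((b**2 + c**2 - a**2) / (2*b*c), -1.0, 1.0))
    B = np.arccos(np.clip((a**2 + c**2 - b**2) / (2*a*c), -1.0, 1.0))
    return A, B, np.pi - A - B

# Three nearly equilateral star directions near the optical axis,
# purely for illustration:
stars = [np.array([0.0, 0.1, 1.0]),
         np.array([0.0866, -0.05, 1.0]),
         np.array([-0.0866, -0.05, 1.0])]
u1, u2, u3 = [s / np.linalg.norm(s) for s in stars]
```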

  19. Overview of Neutrino Mixing Models and Their Mixing Angle Predictions

    SciTech Connect

    Albright, Carl H.

    2009-11-01

    An overview of neutrino-mixing models is presented with emphasis on the types of horizontal flavor and vertical family symmetries that have been invoked. Distributions for the mixing angles of many models are displayed. Ways to differentiate among the models and to narrow the list of viable models are discussed.

  20. Vertical ocean reflectance at low altitudes for narrow laser beams

    NASA Astrophysics Data System (ADS)

    Crittenden, Eugene C., Jr.; Rodeback, G. W.; Milne, Edmund A.; Cooper, Alfred W.

    1991-09-01

    A narrow-beam laser altimeter was used to measure the reflected signal from the ocean surface as represented by the waters beneath the Golden Gate Bridge. This site allowed precise measurements as a function of angle from the vertical not possible from flying platforms. For short-wavelength water waves superimposed on swell, the signal amplitude probability distribution for the reflected signals showed periods of zero reflection, even for vertical incidence, apparently due to tipping of the water surface. The nonzero signals showed a distribution that could be fitted with an antilog-normal distribution. This is skewed toward higher signals than a normal (Gaussian) distribution. With incidence angle displaced from the vertical, the distribution shape was retained but with more frequent zero reflections. The decrease with angle of the average signal, including the zeroes, is well fitted with a Gram-Charlier distribution, as seen by earlier observers using photographic techniques which masked these details of the structure. For the simpler wave pattern due to a long sustained wind direction, the probability distribution is log-normal with no zero signal periods. At large angles from the vertical the log-normal distribution shifts toward exponential. For surface states intermediate between the above two extremes the distribution is often normal. The larger return signals resulting from the log-normal skew toward higher amplitudes are more favorable for disposable laser altimeters than previously believed. Also, for an altimeter which may be swinging from a parachute or balloon, the return remains high at angles other than vertical. The presence of occasional zero return signal does somewhat degrade the accuracy of altitude measurement for a descending altimeter, but the signal available assures performance at larger altitudes than previously expected.

  1. Vision Sensors and Cameras

    NASA Astrophysics Data System (ADS)

    Hoefflinger, Bernd

    Silicon charge-coupled-device (CCD) imagers have been and are a specialty market ruled by a few companies for decades. Based on CMOS technologies, active-pixel sensors (APS) began to appear in 1990 at the 1 µm technology node. These pixels allow random access, global shutters, and they are compatible with focal-plane imaging systems combining sensing and first-level image processing. The progress towards smaller features and towards ultra-low leakage currents has provided reduced dark currents and µm-size pixels. All chips offer Mega-pixel resolution, and many have very high sensitivities equivalent to ASA 12,800. As a result, HDTV video cameras will become a commodity. Because charge-integration sensors suffer from a limited dynamic range, significant processing effort is spent on multiple exposure and piece-wise analog-digital conversion to reach ranges >10,000:1. The fundamental alternative is log-converting pixels with an eye-like response. This offers a range of almost a million to 1, constant contrast sensitivity and constant colors, important features in professional, technical and medical applications. 3D retino-morphic stacking of sensing and processing on top of each other is being revisited with sub-100 nm CMOS circuits and with TSV technology. With sensor outputs directly on top of neurons, neural focal-plane processing will regain momentum, and new levels of intelligent vision will be achieved. The industry push towards thinned wafers and TSV enables backside-illuminated and other pixels with a 100% fill-factor. 3D vision, which relies on stereo or on time-of-flight, high-speed circuitry, will also benefit from scaled-down CMOS technologies both because of their size as well as their higher speed.
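
    The eye-like response of a log-converting pixel is easy to sketch: the output grows by a fixed voltage per decade of irradiance, so six decades (a million to 1) fit in a small swing, and a fixed contrast step produces the same output step anywhere in the range. The 60 mV/decade constant below is a typical subthreshold-MOS figure used purely for illustration:

```python
import math

def log_pixel_response(irradiance, k=0.06, v0=0.0):
    """Idealized logarithmic pixel: output rises by k volts per decade
    of irradiance (k = 60 mV/decade is illustrative, not a datasheet
    value)."""
    return v0 + k * math.log10(irradiance)

# Six decades of irradiance compress into a 0.36 V output swing:
swing = log_pixel_response(1e6) - log_pixel_response(1.0)

# Constant contrast sensitivity: a 2:1 intensity step gives the same
# voltage step at the dark and bright ends of the range.
step_dark = log_pixel_response(2.0) - log_pixel_response(1.0)
step_bright = log_pixel_response(2e5) - log_pixel_response(1e5)
```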

  2. Mass movement slope streaks imaged by the Mars Orbiter Camera

    NASA Astrophysics Data System (ADS)

    Sullivan, Robert; Thomas, Peter; Veverka, Joseph; Malin, Michael; Edgett, Kenneth S.

    2001-10-01

    Narrow, fan-shaped dark streaks on steep Martian slopes were originally observed in Viking Orbiter images, but a definitive explanation was not possible because of resolution limitations. Pictures acquired by the Mars Orbiter Camera (MOC) aboard the Mars Global Surveyor (MGS) spacecraft show innumerable examples of dark slope streaks distributed widely, but not uniformly, across the brighter equatorial regions, as well as individual details of these features that were not visible in Viking Orbiter data. Dark slope streaks (as well as much rarer bright slope streaks) represent one of the most widespread and easily recognized styles of mass movement currently affecting the Martian surface. New dark streaks have formed since Viking and even during the MGS mission, confirming earlier suppositions that higher contrast dark streaks are younger, and fade (brighten) with time. The darkest slope streaks represent ~10% contrast with surrounding slope materials. No small outcrops supplying dark material (or bright material, for bright streaks) have been found at streak apexes. Digitate downslope ends indicate slope streak formation involves a ground-hugging flow subject to deflection by minor topographic obstacles. The model we favor explains most dark slope streaks as scars from dust avalanches following oversteepening of air fall deposits. This process is analogous to terrestrial avalanches of oversteepened dry, loose snow which produce shallow avalanche scars with similar morphologies. Low angles of internal friction, typically 10-30° for terrestrial loess and clay materials, suggest that mass movement of (low-cohesion) Martian dusty air fall is possible on a wide range of gradients. Martian gravity, presumed low density of the air fall deposits, and thin (unresolved by MOC) failed layer depths imply extremely low cohesive strength at time of failure, consistent with expectations for an air fall deposit of dust particles.
As speed increases during a dust avalanche, a growing fraction of the avalanching dust particles acquires sufficient kinetic energy to be lost to the atmosphere in suspension, limiting the momentum of the descending avalanche front. The equilibrium speed, where rate of mass lost to the atmosphere is balanced by mass continually entrained as the avalanche front descends, decreases with decreasing gradient. This mechanism explains observations from MOC images indicating slope streaks formed with little reserve kinetic energy for run-outs on to valley floors and explains why large distal deposits of displaced material are not found at downslope streak ends. The mass movement process of dark (and bright) slope streak formation through dust avalanches involves renewable sources of dust only, leaving underlying slope materials unaffected. Areas where dark and bright slope streaks currently form and fade in cycles are closely correlated with low thermal inertia and probably represent regions where dust currently is accumulating, not just residing.

  3. Accuracy in fixing ship's positions by camera survey of bearings

    NASA Astrophysics Data System (ADS)

    Naus, Krzysztof; Wąż, Mariusz

    2011-01-01

    The paper presents the results of research on the possibilities of fixing ship position coordinates based on results of surveying bearings on navigational marks with the use of a CCD camera. Accuracy of the determination of ship position coordinates, expressed in terms of the mean error, was assumed to be the basic criterion of this estimation. The first part of the paper describes the method of determining the resolution and the mean error of an angle measurement taken with a camera, and also the method of determining the mean error of position coordinates when two or more bearings are measured. Three software applications were developed for producing navigational sea charts with accuracy areas mapped onto them. The second part contains the results of studying accuracy in fixing ship position coordinates, carried out in the Gulf of Gdansk, with the use of bearings obtained with the Rolleiflex and Sony cameras. The results are presented in the form of diagrams of the mean error of angle measurement and in the form of navigational charts with accuracy fields mapped onto them. In the final part, based on the results obtained, the applicability of CCD cameras in the automation of coastal navigation is discussed.
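
    A bearing fix of the kind studied here reduces to intersecting lines of position through the observed marks; a minimal least-squares sketch (illustrative, not the authors' software) is:

```python
import numpy as np

def fix_from_bearings(landmarks, bearings_deg):
    """Least-squares ship position from bearings to known landmarks.

    landmarks: (N, 2) array of (east, north) landmark positions.
    bearings_deg: true bearings ship -> landmark, clockwise from north.
    Each bearing defines a line of position through its landmark; the
    fix is their least-squares intersection (N >= 2 required).
    """
    L = np.asarray(landmarks, dtype=float)
    th = np.radians(bearings_deg)
    # Normal to a line of position with direction (sin th, cos th):
    n = np.column_stack([np.cos(th), -np.sin(th)])
    b = np.sum(n * L, axis=1)        # n . landmark for each line
    p, *_ = np.linalg.lstsq(n, b, rcond=None)
    return p
```

    With more than two bearings the same call yields the least-squares fix, and the residuals give a handle on the mean position error the paper analyzes.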

  4. Laser angle sensor development

    NASA Technical Reports Server (NTRS)

    Pond, C. R.; Texeira, P. D.

    1980-01-01

    Electrical and optical parameters were developed for a two axis (pitch/roll) laser angle sensor. The laser source and detector were mounted in the plenum above the model. Two axis optical distortion measurements of flow characteristics in a 0.3 transonic cryogenic tunnel were made with a shearing interferometer. The measurement results provide a basis for estimating the optical parameters of the laser angle sensor. Experimental and analytical information was generated on model windows to cover the reflector. A two axis breadboard was assembled to evaluate different measurement concepts. The measurement results were used to develop a preliminary design of a laser angle sensor. Schematics and expected performance specifications are included.

  5. Observation of Planetary Motion Using a Digital Camera

    ERIC Educational Resources Information Center

    Meyn, Jan-Peter

    2008-01-01

    A digital SLR camera with a standard lens (50 mm focal length, f/1.4) on a fixed tripod is used to obtain photographs of the sky which contain stars up to 8ᵐ apparent magnitude. The angle of view is large enough to ensure visual identification of the photograph with a large sky region in a stellar map. The resolution is sufficient to…

  7. Performance of new low-cost 1/3" security cameras for meteor surveillance

    NASA Astrophysics Data System (ADS)

    Samuels, Dave; Wray, James; Gural, Peter S.; Jenniskens, Peter

    2014-02-01

    It has been almost 5 years since the CAMS (Cameras for All-sky Meteor Surveillance) system specifications were designed for video meteor surveillance. CAMS has been based on a relatively expensive black-and-white Watec WAT-902H2 Ultimate camera, which uses a 1/2" sensor. In this paper, we investigate whether new, lower cost color cameras based on smaller 1/3" sensors can perform adequately for CAMS. We did not expect them to equal or outperform the sensitivity of the Watec 1/2" camera for the same field of view, but the goal was to see if they could perform within the tolerances of the sensitivity requirements of the CAMS project. Their lower cost brings deployment of meteor surveillance cameras within reach of amateur astronomers and makes it possible to deploy many more cameras to increase yield. The lens focal length is matched to the elevation angle of the camera to maintain an image scale and spatial resolution close to that of the standard CAMS camera and lens combination, crucial for obtaining sufficiently accurate orbital elements. An all-sky array based on 16 such cameras, to be operated from a single computer, was built and the performance of individual cameras was tested.
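
    The focal-length matching described above can be sketched with a flat-Earth range model: meteors ablate near a roughly fixed altitude, so the range to the meteor layer grows as 1/sin(elevation), and the focal length must grow with it to keep the linear scale at the layer constant. The 90 km layer altitude and 12 mm zenith lens below are illustrative assumptions, not CAMS project values:

```python
import math

def required_focal_length(elevation_deg, f_zenith_mm=12.0, layer_km=90.0):
    """Focal length keeping the m/pixel scale at the meteor layer
    roughly constant as the camera tilts away from the zenith.
    Flat-Earth range model: R = layer altitude / sin(elevation)."""
    R_zenith = layer_km
    R = layer_km / math.sin(math.radians(elevation_deg))
    return f_zenith_mm * R / R_zenith

# A camera aimed 30 degrees above the horizon sees the layer at twice
# the zenith range, so it needs twice the focal length:
f30 = required_focal_length(30.0)  # -> 24.0 mm
```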

  8. Narrow Vertical Caves: Mapping Volcanic Fissure Geometries

    NASA Astrophysics Data System (ADS)

    Parcheta, C.; Nash, J.; Parness, A.; Mitchell, K. L.; Pavlov, C. A.

    2015-10-01

    Volcanic conduits are difficult to quantify, but their geometry fundamentally influences how eruptions occur. We robotically map old fissure conduits - elongated narrow cracks in the ground that transported magma to the surface during an eruption.

  9. X-ray Pinhole Camera Measurements

    SciTech Connect

    Nelson, D. S.; Berninger, M. J.; Flores, P. A.; Good, D. E.; Henderson, D. J.; Hogge, K. W.; Huber, S. R.; Lutz, S. S.; Mitchell, S. E.; Howe, R. A.; Mitton, C. V.; Molina, I.; Bozman, D. R.; Cordova, S. R.; Mitchell, D. R.; Oliver, B. V.; Ormond, E. C.

    2013-07-01

    The development of the rod pinch diode [1] has led to high-resolution radiography for dynamic events such as explosive tests. Rod pinch diodes use a small diameter anode rod, which extends through the aperture of a cathode plate. Electrons borne off the aperture surface can self-insulate and pinch onto the tip of the rod, creating an intense, small x-ray source (Primary Pinch). This source has been utilized as the main diagnostic on numerous experiments that include high-value, single-shot events. In such applications there is an emphasis on machine reliability, x-ray reproducibility, and x-ray quality [2]. In tests with the baseline rod pinch diode, we have observed that an additional pinch (Secondary Pinch) occurs at the interface near the anode rod and the rod holder. This suggests that stray electrons exist that are not associated with the Primary Pinch. In this paper we present measurements on both pinches using an x-ray pinhole camera. The camera is placed downstream of the Primary Pinch at an angle of 60° with respect to the diode centerline. This diagnostic will be employed to diagnose x-ray reproducibility and quality. In addition, we will investigate the performance of hybrid diodes relating to the formation of the Primary and Secondary Pinches.

  10. Focal Plane Metrology for the LSST Camera

    SciTech Connect

    Rasmussen, Andrew P.; Hale, Layton; Kim, Peter; Lee, Eric; Perl, Martin; Schindler, Rafe; Takacs, Peter; Thurston, Timothy (SLAC)

    2007-01-10

    Meeting the science goals for the Large Synoptic Survey Telescope (LSST) translates into a demanding set of imaging performance requirements for the optical system over a wide (3.5°) field of view. In turn, meeting those imaging requirements necessitates maintaining precise control of the focal plane surface (10 µm P-V) over the entire field of view (640 mm diameter) at the operating temperature (T ≈ -100 °C) and over the operational elevation angle range. We briefly describe the hierarchical design approach for the LSST Camera focal plane and the baseline design for assembling the flat focal plane at room temperature. Preliminary results of gravity load and thermal distortion calculations are provided, and early metrological verification of candidate materials under cold thermal conditions is presented. A detailed, generalized method for stitching together sparse metrology data originating from differential, non-contact metrological data acquisition spanning multiple (non-continuous) sensor surfaces making up the focal plane is described and demonstrated. Finally, we describe some in situ alignment verification alternatives, some of which may be integrated into the camera's focal plane.
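
    A basic flatness metric of the kind quoted (peak-to-valley over the focal plane) is the residual P-V after removing a best-fit plane from metrology points; this sketch is illustrative and unrelated to the paper's stitching method:

```python
import numpy as np

def peak_to_valley(points):
    """Peak-to-valley height of surface metrology points after
    removing the best-fit plane (a simple flatness metric).

    points: (N, 3) array of (x, y, z) measurements.
    """
    P = np.asarray(points, dtype=float)
    # Fit z = a*x + b*y + c by least squares.
    A = np.column_stack([P[:, 0], P[:, 1], np.ones(len(P))])
    coeffs, *_ = np.linalg.lstsq(A, P[:, 2], rcond=None)
    residual = P[:, 2] - A @ coeffs
    return residual.max() - residual.min()
```

    A perfectly flat but tilted surface gives essentially zero P-V; a localized 10 µm bump shows up at close to its full height.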

  11. Hot Wax Sweeps Debris From Narrow Passages

    NASA Technical Reports Server (NTRS)

    Ricklefs, Steven K.

    1990-01-01

    Safe and effective technique for removal of debris and contaminants from narrow passages involves entrainment of undesired material in thermoplastic casting material. Semisolid wax slightly below melting temperature pushed along passage by pressurized nitrogen to remove debris. Devised to clean out fuel passages in main combustion chamber of Space Shuttle main engine. Also applied to narrow, intricate passages in internal-combustion-engine blocks, carburetors, injection molds, and other complicated parts.

  12. Reading Angles in Maps

    PubMed Central

    Izard, Véronique; O'Donnell, Evan; Spelke, Elizabeth S.

    2013-01-01

    Preschool children can navigate by simple geometric maps of the environment, but the nature of the geometric relations they use in map reading remains unclear. Here, children were tested specifically on their sensitivity to angle. Forty-eight children (age 47:15-53:30 months) were presented with fragments of geometric maps, in which angle sections appeared without any relevant length or distance information. Children were able to read these map fragments and compare 2D to 3D angles. However, this ability appeared both variable and fragile among the youngest children of the sample. These findings suggest that 4-year-old children begin to form an abstract concept of angle that applies both to 2D and 3D displays and that serves to interpret novel spatial symbols. PMID:23647223

  13. Angles, Time, and Proportion

    ERIC Educational Resources Information Center

    Pagni, David L.

    2005-01-01

    This article describes an investigation making connections between the time on an analog clock and the angle between the minute hand and the hour hand. It was posed by a middle school mathematics teacher. (Contains 8 tables and 6 figures.)
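
    The clock-hands investigation rests on a small formula: the minute hand sweeps 6°/min and the hour hand 0.5°/min (30°/hour), so the hands are separated by |30*H - 5.5*M| degrees, taking the smaller of the two arcs. A minimal sketch:

```python
def hand_angle(hour, minute):
    """Smaller angle in degrees between the hour and minute hands.

    Minute hand: 6 deg/min. Hour hand: 30 deg/hour + 0.5 deg/min,
    giving a separation of |30*H - 5.5*M| before wrapping.
    """
    angle = abs(30 * (hour % 12) - 5.5 * minute)
    return min(angle, 360 - angle)

hand_angle(3, 0)    # -> 90.0
hand_angle(3, 30)   # |90 - 165| -> 75.0
```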

  14. An Inexpensive Digital Infrared Camera

    ERIC Educational Resources Information Center

    Mills, Allan

    2012-01-01

    Details are given for the conversion of an inexpensive webcam to a camera specifically sensitive to the near infrared (700-1000 nm). Some experiments and practical applications are suggested and illustrated. (Contains 9 figures.)

  15. 'Magic Angle Precession'

    SciTech Connect

    Binder, Bernd

    2008-01-21

    An advanced and exact geometric description of nonlinear precession dynamics is derived, modeling very accurately natural and artificial couplings that show Lorentz symmetry. In the linear description it is usually ignored that the geometric phase of relativistic motion couples back to the orbital motion, providing for a non-linear recursive precession dynamics. The high coupling strength in the nonlinear case is found to be a gravitomagnetic charge proportional to the precession angle and angular velocity generated by geometric phases, which are induced by high-speed relativistic rotations and are relevant to propulsion technologies but also to basic interactions. In the quantum range some magic precession angles indicating strong coupling in a phase-locked chaotic system are identified, emerging from a discrete time dynamical system known as the cosine map showing bifurcations at special precession angles relevant to heavy nuclei stability. The 'Magic Angle Precession' (MAP) dynamics can be simulated and visualized by cones rolling in or on each other, where the apex and precession angles are indexed by spin, charge or precession quantum numbers, and corresponding magic angles. The most extreme relativistic warping and twisting effect is given by the Dirac spinor half spin constellation with 'Hyperdiamond' MAP, which resembles quark confinement.

  16. Solid State Television Camera (CID)

    NASA Technical Reports Server (NTRS)

    Steele, D. W.; Green, W. T.

    1976-01-01

    The design, development and test are described of a charge injection device (CID) camera using a 244 x 248 element array. A number of video signal processing functions are included which maximize the output video dynamic range while retaining the inherently good resolution response of the CID. Some of the unique features of the camera are: low light level performance, high S/N ratio, antiblooming, low geometric distortion, sequential scanning and AGC.

  17. The future of consumer cameras

    NASA Astrophysics Data System (ADS)

    Battiato, Sebastiano; Moltisanti, Marco

    2015-03-01

    In the last two decades multimedia devices, and in particular imaging devices (camcorders, tablets, mobile phones, etc.), have diffused dramatically. Moreover, the increase in their computational performance, combined with higher storage capability, allows them to process large amounts of data. In this paper an overview of the current trends of the consumer camera market and technology is given, providing also some details about the recent past (from the Digital Still Camera up to today) and forthcoming key issues.

  18. Optimization of sharp and viewing-angle-independent structural color.

    PubMed

    Hsu, Chia Wei; Miller, Owen D; Johnson, Steven G; Soljačić, Marin

    2015-04-01

    Structural coloration produces some of the most brilliant colors in nature and has many applications. Motivated by recently proposed transparent displays based on wavelength-selective scattering, here we consider the new problem of transparent structural color, where objects are transparent under omnidirectional broad-band illumination but scatter strongly under a directional narrow-band light source. Transparent structural color requires two competing properties, narrow bandwidth and broad viewing angle, that have not previously been demonstrated simultaneously. Here, we use numerical optimization to discover geometries where a sharp 7% bandwidth in scattering is achieved, yet the peak wavelength varies less than 1%, and the peak height and peak width vary less than 6% over broad viewing angles (0°-90°) under directional illumination. Our model system consists of dipole scatterers arranged into several rings; interference among the scattered waves is optimized to yield the wavelength-selective and angle-insensitive response. PMID:25968779

  19. Fast camera objective designs for spectrograph of Mont Megantique telescope

    NASA Astrophysics Data System (ADS)

    Thibault, Simon; Wang, Min

    2004-02-01

    All-reflective optics is conventionally required for extended spectral coverage in astronomical spectrographs, but its spatial resolution is usually insufficient when a large-size CCD is used for observation. In this paper, all-refractive optics has been investigated to design a fast (F/1.55), wide-angle camera objective with large spectral coverage, from the UV through the VIS and up to the NIR, for a large-size CCD on the focal plane of the spectrograph of the Mont Megantique telescope. Achromatic and apochromatic conditions have been investigated for axial and lateral color control. The new proposed solutions have been optimized with two to three different glass combinations in order to achieve higher throughput over the large spectral coverage, especially in the UV region. The number of components has been minimized to reduce inherent light loss. The monochromatic aberrations have been corrected and controlled by optimizing lens bendings and shapes so that the camera reaches the CCD pixel resolution. Ray-tracing results show good optical performance of the camera over the 350 nm to 1000 nm spectral region with high resolution. A broadband AR coating, enhanced in the UV region, will be used on each surface of the lenses in the camera. The final throughput of the designed camera has been estimated and is given in the paper.

  20. Teleconferencing system using virtual camera

    NASA Astrophysics Data System (ADS)

    Shibuichi, Daisuke; Tanaka, Tsukasa; Terashima, Nobuyoshi; Tominaga, Hideyoshi

    2000-05-01

    Teleconferencing systems are becoming more popular because of advances in image processing and broadband networks. Nevertheless, communicating with someone at a remote location through a teleconferencing system still presents problems because of the difficulty of establishing and maintaining eye contact. Eye contact is essential to a natural dialog. The purpose of our study is to make eye contact possible during dialog by using image processing alone, without special devices such as color markers or sensors worn by the users, or IR cameras. The proposed teleconferencing system is composed of a computer, a display attached to the computer, and four cameras. We define a virtual camera as a camera that exists virtually in 3D space. With the proposed method, we can acquire a front view of a person as if it were taken with the virtual camera. The image taken with the virtual camera is generated by extracting the same feature points among the four face images. Feature point sets among the four face images are automatically matched by using Epipolar Plane Images (EPIs). Users can establish eye contact by acquiring the front face view, and they can also obtain various other views because the 3D points of the object can be extracted from the EPIs. Through these facilities, the proposed system provides users with better communication than previous systems. In this paper, we describe the concept, implementation, and evaluation of the system from various perspectives.

  1. The virtual gamma camera room.

    PubMed

    Penrose, J M; Trowbridge, E A; Tindale, W B

    1996-05-01

    The installation of a gamma camera is time-consuming and costly and, once installed, the camera position is unlikely to be altered during its working life. Poor choice of camera position therefore has long-term consequences. Additional equipment such as collimators and carts, the operator's workstation and wall-mounted display monitors must also be situated to maximize access and ease of use. The layout of a gamma camera room can be optimized prior to installation by creating a virtual environment. Super-Scape VRT software running on an upgraded 486 PC microprocessor was used to create a 'virtual camera room'. The simulation included an operator's viewpoint and a controlled tour of the room. Equipment could be repositioned as required, allowing potential problems to be identified at the design stage. Access for bed-ridden patients, operator ergonomics, operator and patient visibility were addressed. The display can also be used for patient education. Creation of a virtual environment is a valuable tool which allows different camera systems to be compared interactively in terms of dimensions, extent of movement and use of a defined space. Such a system also has applications in radiopharmacy design and simulation. PMID:8736511

  2. Streak camera dynamic range optimization

    SciTech Connect

    Wiedwald, J.D.; Lerche, R.A.

    1987-09-01

    The LLNL optical streak camera is used by the Laser Fusion Program in a wide range of applications. Many of these applications require a large recorded dynamic range. Recent work has focused on maximizing the dynamic range of the streak camera recording system. For our streak cameras, image intensifier saturation limits the upper end of the dynamic range. We have developed procedures to set the image intensifier gain such that the system dynamic range is maximized. Specifically, the gain is set such that a single streak tube photoelectron is recorded with an exposure of about five times the recording system noise. This ensures detection of single photoelectrons, while not consuming intensifier or recording system dynamic range through excessive intensifier gain. The optimum intensifier gain has been determined for two types of film and for a lens-coupled CCD camera. We have determined that by recording the streak camera image with a CCD camera, the system is shot-noise limited up to the onset of image intensifier nonlinearity. When recording on film, the film determines the noise at high exposure levels. There is discussion of the effects of slit width and image intensifier saturation on dynamic range. 8 refs.

  3. Person re-identification over camera networks using multi-task distance metric learning.

    PubMed

    Ma, Lianyang; Yang, Xiaokang; Tao, Dacheng

    2014-08-01

    Person reidentification in a camera network is a valuable yet challenging problem to solve. Existing methods learn a common Mahalanobis distance metric using the data collected from different cameras and then exploit the learned metric for identifying people in the images. However, the cameras in a camera network have different settings and the recorded images are seriously affected by variability in illumination conditions, camera viewing angles, and background clutter. Using a common metric to conduct person reidentification tasks on different camera pairs overlooks the differences in camera settings, but it is very time-consuming to label people manually in images from surveillance videos. For example, in most existing person reidentification data sets, only one image of a person is collected from each of only two cameras; therefore, directly learning a unique Mahalanobis distance metric for each camera pair is susceptible to over-fitting given the insufficiently labeled data. In this paper, we reformulate person reidentification in a camera network as a multitask distance metric learning problem. The proposed method designs multiple Mahalanobis distance metrics to cope with the complicated conditions that exist in typical camera networks. These Mahalanobis distance metrics are different but related, and are learned with a joint regularization that alleviates over-fitting. Furthermore, by extending maximally collapsing metric learning to this setting, we present a novel multitask maximally collapsing metric learning (MtMCML) model for person reidentification in a camera network. Experimental results demonstrate that formulating person reidentification over camera networks as a multitask distance metric learning problem can improve performance, and our proposed MtMCML works substantially better than other current state-of-the-art person reidentification methods. PMID:24956368
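    The Mahalanobis metrics at the heart of such methods score the similarity of image feature vectors. A minimal, illustrative sketch of the distance computation (pure Python with toy 2-D features; the matrix M here is hypothetical, not a metric from the paper):

```python
def mahalanobis_sq(x, y, M):
    """Squared Mahalanobis distance (x-y)^T M (x-y) under a learned
    positive semi-definite metric M, given as a nested list."""
    d = [xi - yi for xi, yi in zip(x, y)]
    # quadratic form over the difference vector
    return sum(d[i] * M[i][j] * d[j]
               for i in range(len(d)) for j in range(len(d)))
```

    With M equal to the identity this reduces to the squared Euclidean distance; metric learning replaces M with a matrix fitted per camera pair (or, in MtMCML, jointly across pairs).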

  4. A Robotic Wide-Angle H-alpha Survey of the Southern Sky

    NASA Astrophysics Data System (ADS)

    Rosing, W.; Gaustad, J. E.; McCullough, P. R.; Van Buren, D.

    2000-12-01

    We have completed a wide-angle imaging survey of the entire southern sky (δ < +10 degrees) at 656.3 nm wavelength, the Hα emission line of hydrogen. Each image of the survey covers an area of the sky 13 degrees square at an angular resolution of approximately one arcminute, and reaches a sensitivity level of 0.5 Rayleigh (3 x 10^-18 erg cm^-2 s^-1 arcsec^-2), corresponding to an emission measure of 1 cm^-6 pc, and to a brightness temperature for microwave free-free emission of 3 μK at 30 GHz. The survey was done with a CCD detector, a commercial camera lens, and narrow-band interference filters, mounted on a robotic telescope platform located at Cerro Tololo Inter-American Observatory in Chile. This work was supported by grants from Las Cumbres Observatory, NSF, NASA, JPL, Dudley Observatory, the Fund for Astrophysical Research, Research Corporation, the University of Illinois at Urbana-Champaign, and Swarthmore College.

  5. Design, modeling and performance of a hybrid portable gamma camera

    NASA Astrophysics Data System (ADS)

    Smith, Leon Eric

    The combination of a mechanically-collimated gamma-ray camera with an electronically-collimated gamma camera offers both the high efficiency and good angular resolution typical of a mechanically-collimated camera at lower photon energies and the uncoupling of spatial resolution and efficiency provided by an electronically-collimated camera at higher energies. The design, construction, performance modeling and measured performance of the Hybrid Portable Gamma Camera (HPGC) are presented here. Intended for industrial use, the HPGC offers good angular resolution and efficiency over a broad energy range (50 keV to 2 MeV) by combining a MURA coded aperture camera with a Compton scatter camera in a single system. The HPGC consists of two detector modules: (1) a NaI(Tl) scintillator with Anger logic readout and (2) a CsI(Na) pixellated crystal viewed by a position-sensitive photomultiplier tube. Analytical calculations of angular resolution components and efficiency for the HPGC were compared to Monte Carlo calculations of the same quantities. The predicted angular resolution for on-axis point sources, a central scattering angle of 45° and a detector separation distance of 35 cm ranges from 3.5°-6° FWHM over the sensitive energy range. The mechanical collimation intrinsic efficiency for energies up to 800 keV varies from 0.50 to 0.05, while the electronic collimation intrinsic efficiency for energies above 400 keV is 7.0 x 10^-4 to 5 x 10^-5. The experimentally measured angular resolution and efficiency values show good agreement with the modeling predictions for incident energies of 412 keV and 662 keV. Although work has been done on mechanical collimation cameras and electronic collimation cameras operating independently, no truly hybrid imaging system has been constructed that uses the same gamma ray for both mechanical collimation and electronic collimation information.
This dissertation compares the relative information per photon for three imaging modalities: mechanical collimation, electronic collimation and hybrid collimation. The analysis is done for point sources at two incident energies (412 keV and 662 keV) in the medium energy range of operation for the HPGC (400 keV to 800 keV) where neither mechanical collimation nor electronic collimation performs particularly well acting independently. A tool from estimation theory called resolution-variance analysis is used to compare the three modalities. Results show that hybrid collimation is superior to mechanical and electronic collimation at both 412 keV and 662 keV over the resolution range likely to be used for such a camera.

  6. The MC and LFC cameras. [metric camera (MC); large format camera (LFC)

    NASA Technical Reports Server (NTRS)

    Norton, Clarice L.; Schroeder, Manfried; Mollberg, Bernard

    1986-01-01

    The characteristics of the shuttle-borne Large Format Camera are listed. The LFC focal plane format was 23 by 46 cm, double the usual size, thereby acquiring approximately double the ground area. Forward motion compensation was employed. With the stable platform (shuttle) it was possible to use the slow exposure, high resolution, Kodak aerial films; 3414 and 3412 black and white, SO-242 color, and SO-131 aerochrome infrared. The camera was designed to maintain stability during varying temperature extremes of space.

  7. Surface nanobubble contact angles

    NASA Astrophysics Data System (ADS)

    Borkent, Bram; de Beer, Sissi; Mugele, Frieder; Lohse, Detlef

    2009-11-01

    Previous AFM experiments on surface nanobubbles have suggested an anomalously large contact angle θ of the bubbles (typically 160°, measured through the water) and a possible size dependence θ(R). Here we determine θ(R) for nanobubbles on smooth highly orientated pyrolytic graphite (HOPG) with a variety of different cantilevers. It is found that θ(R) is constant within the experimental error, down to bubbles as small as R = 20 nm, and its value is equal to 119 ± 4°. This result, which is the lowest contact angle for surface nanobubbles found so far, is very reproducible and independent of the cantilever type used, provided that the cantilever is clean and the HOPG surface is smooth. In contrast we find that, for a particular set of cantilevers, the surface can become relatively rough due to matter precipitated from the cantilever onto the substrate, in which case larger nanoscopic contact angles (~150°) show up.

  8. Photometric Calibration of Consumer Video Cameras

    NASA Technical Reports Server (NTRS)

    Suggs, Robert; Swift, Wesley, Jr.

    2007-01-01

    Equipment and techniques have been developed to implement a method of photometric calibration of consumer video cameras for imaging of objects that are sufficiently narrow or sufficiently distant to be optically equivalent to point or line sources. Heretofore, it has been difficult to calibrate consumer video cameras, especially in cases of image saturation, because they exhibit nonlinear responses with dynamic ranges much smaller than those of scientific-grade video cameras. The present method not only takes this difficulty in stride but also makes it possible to extend effective dynamic ranges to several powers of ten beyond saturation levels. The method will likely be primarily useful in astronomical photometry. There are also potential commercial applications in medical and industrial imaging of point or line sources in the presence of saturation. This development was prompted by the need to measure brightnesses of debris in amateur video images of the breakup of the Space Shuttle Columbia. The purpose of these measurements is to use the brightness values to estimate relative masses of debris objects. In most of the images, the brightness of the main body of Columbia was found to exceed the dynamic ranges of the cameras. A similar problem arose a few years ago in the analysis of video images of Leonid meteors. The present method is a refined version of the calibration method developed to solve the Leonid calibration problem. In this method, one performs an end-to-end calibration of the entire imaging system, including not only the imaging optics and imaging photodetector array but also analog tape recording and playback equipment (if used) and any frame grabber or other analog-to-digital converter (if used).
To automatically incorporate the effects of nonlinearity and any other distortions into the calibration, the calibration images are processed in precisely the same manner as are the images of meteors, space-shuttle debris, or other objects that one seeks to analyze. The light source used to generate the calibration images is an artificial variable star comprising a Newtonian collimator illuminated by a light source modulated by a rotating variable neutral- density filter. This source acts as a point source, the brightness of which varies at a known rate. A video camera to be calibrated is aimed at this source. Fixed neutral-density filters are inserted in or removed from the light path as needed to make the video image of the source appear to fluctuate between dark and saturated bright. The resulting video-image data are analyzed by use of custom software that determines the integrated signal in each video frame and determines the system response curve (measured output signal versus input brightness). These determinations constitute the calibration, which is thereafter used in automatic, frame-by-frame processing of the data from the video images to be analyzed.
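    The calibration described above yields a response curve of measured signal versus input brightness; photometry then requires inverting that curve for each analyzed frame. A hedged sketch of the inversion step, assuming a monotonic curve stored as (brightness, signal) pairs (the data and names are illustrative, not from the article):

```python
import bisect

def invert_response(curve, signal):
    """Map a measured signal back to input brightness by piecewise-linear
    interpolation of a monotonic response curve [(brightness, signal), ...]."""
    b, s = zip(*curve)
    i = bisect.bisect_left(s, signal)
    if i == 0:
        return b[0]            # at or below the darkest calibration point
    if i == len(s):
        return b[-1]           # at or above the brightest calibration point
    # linear interpolation between the bracketing calibration points
    frac = (signal - s[i - 1]) / (s[i] - s[i - 1])
    return b[i - 1] + frac * (b[i] - b[i - 1])
```

    A nonlinear (e.g. saturating) response is handled naturally: the curve simply flattens at high brightness, and the interpolation follows it.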

  9. Vibration detection and calibration method used to remote sensing optical camera

    NASA Astrophysics Data System (ADS)

    Li, Qi; Dong, Wende; Xu, Zhihai; Feng, Huajun

    2013-09-01

    In order to obtain sharp remote sensing images, image stabilization technology for space cameras and remote sensing image restoration technology are commonly used. Vibration detection is the key to realizing these technologies: an image stabilization system needs the displacement vector derived from vibration detection to drive the compensation mechanism, and remote sensing image restoration needs the vibration displacement vector to construct the point spread function (PSF). Vibration detection can be used not only to improve the image quality of panchromatic cameras, infrared cameras and other optical cameras, but is also the basis of motion compensation for satellite radar equipment. In this paper we have constructed a vibration measuring method based on a fiber optic gyro (FOG). A FOG is a device sensitive to angular velocity or angular displacement. A high-precision FOG can be used to measure the jitter angle of the optical axis of a space camera fixed on a satellite platform. From the measured data, the vibration displacement vector of the imaging plane can be calculated. Consequently the vibration data provide a basis for image stabilization of the space camera and restoration of remote sensing images. We simulated the vibration of a space camera by using a piezoelectric ceramic deflection platform, and calibrated the vibration measurement by using a laser beam and a high-speed linear array camera. We compared the feedback output of the deflection platform, the FOG measured data and the calibrated data of the linear array camera, and obtained a calibration accuracy better than 1.5 μrad.
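    The conversion from a measured jitter angle to an image-plane displacement follows the small-angle geometry d = f·tan(θ) ≈ f·θ. A minimal sketch of that step (the 2 m focal length below is a hypothetical value, not one given in the paper):

```python
import math

def image_shift_um(jitter_rad, focal_length_m):
    """Image-plane displacement, in micrometres, produced by an optic-axis
    jitter angle, via d = f * tan(theta) ~ f * theta for small angles."""
    return focal_length_m * math.tan(jitter_rad) * 1e6

# At the quoted 1.5 urad calibration accuracy, a hypothetical 2 m focal
# length corresponds to an image-plane uncertainty of about 3 um.
```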

  10. Cross-ratio-based line scan camera calibration using a planar pattern

    NASA Astrophysics Data System (ADS)

    Li, Dongdong; Wen, Gongjian; Qiu, Shaohua

    2016-01-01

    A flexible new technique is proposed to calibrate the geometric model of line scan cameras. In this technique, the line scan camera is rigidly coupled to a calibrated frame camera to establish a pair of stereo cameras. The linear displacements and rotation angles between the two cameras are fixed but unknown. This technique only requires the pair of stereo cameras to observe a specially designed planar pattern shown at a few (at least two) different orientations. At each orientation, a stereo pair is obtained including a linear array image and a frame image. Radial distortion of the line scan camera is modeled. The calibration scheme includes two stages. First, point correspondences are established from the pattern geometry and the projective invariance of cross-ratio. Second, with a two-step calibration procedure, the intrinsic parameters of the line scan camera are recovered from several stereo pairs together with the rigid transform parameters between the pair of stereo cameras. Both computer simulation and real data experiments are conducted to test the precision and robustness of the calibration algorithm, and very good calibration results have been obtained. Compared with classical techniques which use three-dimensional calibration objects or controllable moving platforms, our technique is affordable and flexible in close-range photogrammetric applications.
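    The correspondence step above relies on the projective invariance of the cross-ratio of four collinear points. A minimal sketch, with the points represented by scalar positions along their line (an illustration of the invariant, not the paper's implementation):

```python
def cross_ratio(a, b, c, d):
    """Cross-ratio ((c-a)(d-b)) / ((c-b)(d-a)) of four collinear points,
    preserved by any projective transformation of the line."""
    return ((c - a) * (d - b)) / ((c - b) * (d - a))
```

    Because the value survives perspective projection, the cross-ratio measured in the linear array image can be matched against the known pattern geometry to establish point correspondences.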

  11. Narrowing of intersensory speech perception in infancy

    PubMed Central

    Pons, Ferran; Lewkowicz, David J.; Soto-Faraco, Salvador; Sebastián-Gallés, Núria

    2009-01-01

    The conventional view is that perceptual/cognitive development is an incremental process of acquisition. Several striking findings have revealed, however, that the sensitivity to non-native languages, faces, vocalizations, and music that is present early in life declines as infants acquire experience with native perceptual inputs. In the language domain, the decline in sensitivity is reflected in a process of perceptual narrowing that is thought to play a critical role during the acquisition of a native-language phonological system. Here, we provide evidence that such a decline also occurs in infant response to multisensory speech. We found that infant intersensory response to a non-native phonetic contrast narrows between 6 and 11 months of age, suggesting that the perceptual system becomes increasingly more tuned to key native-language audiovisual correspondences. Our findings lend support to the notion that perceptual narrowing is a domain-general as well as a pan-sensory developmental process. PMID:19541648

  12. Digitization of Metric Camera and Large Format Camera space photographs

    NASA Astrophysics Data System (ADS)

    Lohmann, Peter; Luhmann, Thomas

    1986-11-01

    Black and white images from the space shuttle experiments Metric Camera and Large Format Camera were digitized. Investigations show that it is possible to digitize space photographs at an accuracy which corresponds to the resolution of the film. It is demonstrated that choosing too small a pixel size leads to a loss in the detectability of details within the images because of the apparent film grain. The reseau scanner Rollei RS1 can cope with the high requirements in terms of geometric accuracy and resolution necessary for photogrammetric applications.

  13. The Clementine longwave infrared camera

    SciTech Connect

    Priest, R.E.; Lewis, I.T.; Sewall, N.R.; Park, H.S.; Shannon, M.J.; Ledebuhr, A.G.; Pleasance, L.D.; Massie, M.A.; Metschuleit, K.

    1995-04-01

    The Clementine mission provided the first ever complete, systematic surface mapping of the moon from the ultra-violet to the near-infrared regions. More than 1.7 million images of the moon, earth and space were returned from this mission. The longwave-infrared (LWIR) camera supplemented the UV/Visible and near-infrared mapping cameras, providing limited strip coverage of the moon and giving insight into the thermal properties of the soils. This camera provided ~100 m spatial resolution at 400 km periselene, and a 7 km across-track swath. This 2.1 kg camera using a 128 x 128 Mercury-Cadmium-Telluride (MCT) FPA viewed thermal emission of the lunar surface and lunar horizon in the 8.0 to 9.5 μm wavelength region. A description of this light-weight, low power LWIR camera along with a summary of lessons learned is presented. Design goals and preliminary on-orbit performance estimates are addressed in terms of meeting the mission's primary objective of flight qualifying the sensors for future Department of Defense flights.

  14. Video camera use at nuclear power plants

    SciTech Connect

    Estabrook, M.L.; Langan, M.O.; Owen, D.E.

    1990-08-01

    A survey of US nuclear power plants was conducted to evaluate video camera use in plant operations, and determine equipment used and the benefits realized. Basic closed circuit television camera (CCTV) systems are described and video camera operation principles are reviewed. Plant approaches for implementing video camera use are discussed, as are equipment selection issues such as setting task objectives, radiation effects on cameras, and the use of disposal cameras. Specific plant applications are presented and the video equipment used is described. The benefits of video camera use --- mainly reduced radiation exposure and increased productivity --- are discussed and quantified. 15 refs., 6 figs.

  15. Efficient, Narrow-Pass-Band Optical Filters

    NASA Technical Reports Server (NTRS)

    Sandford, Stephen P.

    1996-01-01

    Optical filters with both narrow pass bands and high efficiencies can be fabricated to design specifications. They offer tremendous improvements in performance for a number of optical (including infrared) systems. In fiber-optic and free-space communication systems, the precise frequency discrimination afforded by the narrow pass bands of these filters provides higher channel capacities. In active and passive remote sensors like lidar and gas-filter-correlation radiometers, the increased efficiencies afforded by the filters enhance detection of small signals against large background noise. In addition, the sizes, weights, and power requirements of many optical and infrared systems are reduced by taking advantage of the gains in signal-to-noise ratio delivered by the filters.

  16. WIDE-FIELD ASTRONOMICAL MULTISCALE CAMERAS

    SciTech Connect

    Marks, Daniel L.; Brady, David J.

    2013-05-15

    In order to produce sufficiently low aberrations with a large aperture, telescopes have a limited field of view. Because of this narrow field, large areas of the sky at a given time are unobserved. We propose several telescopes based on monocentric reflective, catadioptric, and refractive objectives that may be scaled to wide fields of view and achieve 1.1 arcsec resolution, which in most locations is the practical seeing limit of the atmosphere. The reflective and Schmidt catadioptric objectives have relatively simple configurations and enable large fields to be captured at the expense of the obscuration of the mirror by secondary optics, a defect that may be managed by image plane design. The refractive telescope design does not have an obscuration but the objective has substantial bulk. The refractive design is a 38 gigapixel camera which consists of a single monocentric objective and 4272 microcameras. Monocentric multiscale telescopes, with their wide fields of view, may observe phenomena that might otherwise be unnoticed, such as supernovae, glint from orbital space debris, and near-earth objects.

  17. Wide field camera observations of Baade's Window

    NASA Technical Reports Server (NTRS)

    Holtzman, Jon A.; Light, R. M.; Baum, William A.; Worthey, Guy; Faber, S. M.; Hunter, Deidre A.; O'Neil, Earl J., Jr.; Kreidl, Tobias J.; Groth, E. J.; Westphal, James A.

    1993-01-01

    We have observed a field in Baade's Window using the Wide Field Camera (WFC) of the Hubble Space Telescope (HST) and obtain V- and I-band photometry down to V approximately 22.5. These data go several magnitudes fainter than previously obtained from the ground. The location of the break in the luminosity function suggests that there are a significant number of intermediate age (less than 10 Gyr) stars in the Galactic bulge. This conclusion rests on the assumptions that the extinction towards our field is similar to that seen in other parts of Baade's Window, that the distance to the bulge is approximately 8 kpc, and that we can determine fairly accurate zero points for the HST photometry. Changes in any one of these assumptions could increase the inferred age, but a conspiracy of lower reddening, a shorter distance to the bulge, and/or photometric zero-point errors would be needed to imply a population entirely older than 10 Gyr. We infer an initial mass function slope for the main-sequence stars, and find that it is consistent with that measured in the solar neighborhood; unfortunately, the slope is poorly constrained because we sample only a narrow range of stellar mass and because of uncertainties in the observed luminosity function at the faint end.

  18. The GISMO-2 Bolometer Camera

    NASA Technical Reports Server (NTRS)

    Staguhn, Johannes G.; Benford, Dominic J.; Fixsen, Dale J.; Hilton, Gene; Irwin, Kent D.; Jhabvala, Christine A.; Kovacs, Attila; Leclercq, Samuel; Maher, Stephen F.; Miller, Timothy M.; Moseley, Samuel H.; Sharp, Elemer H.; Wollack, Edward J.

    2012-01-01

    We present the concept for the GISMO-2 bolometer camera, which we are building for background-limited operation at the IRAM 30 m telescope on Pico Veleta, Spain. GISMO-2 will operate simultaneously in the 1 mm and 2 mm atmospheric windows. The 1 mm channel uses a 32 x 40 TES-based Backshort Under Grid (BUG) bolometer array; the 2 mm channel operates with a 16 x 16 BUG array. The camera utilizes almost the entire field of view provided by the telescope. The optical design of GISMO-2 was strongly influenced by our experience with the GISMO 2 mm bolometer camera, which is successfully operating at the 30 m telescope. GISMO is accessible to the astronomical community through the regular IRAM call for proposals.

  19. Dark Energy Camera for Blanco

    SciTech Connect

    Binder, Gary A.; /Caltech /SLAC

    2010-08-25

    In order to make accurate measurements of dark energy, a system is needed to monitor the focus and alignment of the Dark Energy Camera (DECam) to be located on the Blanco 4m Telescope for the upcoming Dark Energy Survey. One new approach under development is to fit out-of-focus star images to a point spread function from which information about the focus and tilt of the camera can be obtained. As a first test of a new algorithm using this idea, simulated star images produced from a model of DECam in the optics software Zemax were fitted. Then, real images from the Mosaic II imager currently installed on the Blanco telescope were used to investigate the algorithm's capabilities. A number of problems with the algorithm were found, and more work is needed to understand its limitations and improve its capabilities so it can reliably predict camera alignment and focus.

  20. Cameras for semiconductor process control

    NASA Technical Reports Server (NTRS)

    Porter, W. A.; Parker, D. L.

    1977-01-01

    The application of X-ray topography to semiconductor process control is described, considering the novel features of the high speed camera and the difficulties associated with this technique. The most significant results on the effects of material defects on device performance are presented, including results obtained using wafers processed entirely within this institute. Defects were identified using the X-ray camera and correlations made with probe data. Also included are temperature dependent effects of material defects. Recent applications and improvements of X-ray topographs of silicon-on-sapphire and gallium arsenide are presented with a description of a real time TV system prototype and of the most recent vacuum chuck design. Discussion is included of our promotion of the use of the camera by various semiconductor manufacturers.

  1. Perceptual Color Characterization of Cameras

    PubMed Central

    Vazquez-Corral, Javier; Connah, David; Bertalmío, Marcelo

    2014-01-01

Color camera characterization, mapping outputs from the camera sensors to an independent color space, such as XYZ, is an important step in the camera processing pipeline. Until now, this procedure has been primarily solved by using a 3 × 3 matrix obtained via a least-squares optimization. In this paper, we propose to use the spherical sampling method, recently published by Finlayson et al., to perform a perceptual color characterization. In particular, we search for the 3 × 3 matrix that minimizes three different perceptual errors, one pixel-based and two spatially based. For the pixel-based case, we minimize the CIE ΔE error, while for the spatial-based case, we minimize both the S-CIELAB error and the CID error measure. Our results demonstrate an improvement of approximately 3% for the ΔE error, 7% for the S-CIELAB error and 13% for the CID error measures. PMID:25490586
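The conventional least-squares baseline that the paper improves on can be sketched as follows; the sensor responses and target XYZ values here are synthetic stand-ins, not the paper's data:

```python
import numpy as np

# Hypothetical ground-truth 3x3 mapping and synthetic sensor responses
# (e.g. readings from a 24-patch chart), standing in for real calibration data.
rng = np.random.default_rng(0)
true_M = np.array([[0.6, 0.3, 0.1],
                   [0.2, 0.7, 0.1],
                   [0.0, 0.1, 0.9]])
sensor = rng.uniform(0.0, 1.0, size=(24, 3))  # 24 patches x 3 channels
xyz = sensor @ true_M.T                        # noiseless synthetic XYZ targets

# Baseline characterization: solve min_M || sensor @ M.T - xyz ||^2.
X, *_ = np.linalg.lstsq(sensor, xyz, rcond=None)
M = X.T  # 3x3 matrix mapping sensor responses -> XYZ

print(np.allclose(M, true_M))  # -> True: recovers the generating matrix
```

The paper's contribution is to replace this purely numerical objective with perceptual error measures (ΔE, S-CIELAB, CID) while searching over candidate matrices via spherical sampling.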

  2. Camera-on-a-Chip

    NASA Technical Reports Server (NTRS)

    1999-01-01

Jet Propulsion Laboratory's research on a second generation, solid-state image sensor technology has resulted in the Complementary Metal-Oxide Semiconductor Active Pixel Sensor (CMOS APS), establishing an alternative to the Charged Coupled Device (CCD). Photobit Corporation, the leading supplier of CMOS image sensors, has commercialized two products of their own based on this technology: the PB-100 and PB-300. These devices are cameras on a chip, combining all camera functions. CMOS "active-pixel" digital image sensors offer several advantages over CCDs, a technology used in video and still-camera applications for 30 years. The CMOS sensors draw less energy, they use the same manufacturing platform as most microprocessors and memory chips, and they allow on-chip programming of frame size, exposure, and other parameters.

  3. The Rainbow Angle.

    ERIC Educational Resources Information Center

    Sims, B.

    1978-01-01

    Two articles in the "Scientific American" form the background of this note. The rainbow angle for the primary bow of a monochromatic Cartesian rainbow is calculated. Special projects for senior high school students could be patterned after this quantitative study. (MP)

  4. Casting and Angling.

    ERIC Educational Resources Information Center

    Little, Mildred J.; Bunting, Camille

    The self-contained packet contains background information, lesson plans, 15 transparency and student handout masters, drills and games, 2 objective examinations, and references for teaching a 15-day unit on casting and angling to junior high and senior high school students, either as part of a regular physical education program or as a club

  5. Casting and Angling.

    ERIC Educational Resources Information Center

    Smith, Julian W.

    As part of a series of books and pamphlets on outdoor education, this manual consists of easy-to-follow instructions for fishing activities dealing with casting and angling. The manual may be used as a part of the regular physical education program in schools and colleges or as a club activity for the accomplished weekend fisherman or the

  6. A testbed for wide-field, high-resolution, gigapixel-class cameras.

    PubMed

    Kittle, David S; Marks, Daniel L; Son, Hui S; Kim, Jungsang; Brady, David J

    2013-05-01

    The high resolution and wide field of view (FOV) of the AWARE (Advanced Wide FOV Architectures for Image Reconstruction and Exploitation) gigapixel class cameras present new challenges in calibration, mechanical testing, and optical performance evaluation. The AWARE system integrates an array of micro-cameras in a multiscale design to achieve gigapixel sampling at video rates. Alignment and optical testing of the micro-cameras is vital in compositing engines, which require pixel-level accurate mappings over the entire array of cameras. A testbed has been developed to automatically calibrate and measure the optical performance of the entire camera array. This testbed utilizes translation and rotation stages to project a ray into any micro-camera of the AWARE system. A spatial light modulator is projected through a telescope to form an arbitrary object space pattern at infinity. This collimated source is then reflected by an elevation stage mirror for pointing through the aperture of the objective into the micro-optics and eventually the detector of the micro-camera. Different targets can be projected with the spatial light modulator for measuring the modulation transfer function (MTF) of the system, fiducials in the overlap regions for registration and compositing, distortion mapping, illumination profiles, thermal stability, and focus calibration. The mathematics of the testbed mechanics are derived for finding the positions of the stages to achieve a particular incident angle into the camera, along with calibration steps for alignment of the camera and testbed coordinate axes. Measurement results for the AWARE-2 gigapixel camera are presented for MTF, focus calibration, illumination profile, fiducial mapping across the micro-camera for registration and distortion correction, thermal stability, and alignment of the camera on the testbed. PMID:23742532

  7. Narrow-Band Applications of Communications Satellites.

    ERIC Educational Resources Information Center

    Cowlan, Bert; Horowitz, Andrew

    This paper attempts to describe the advantages of "narrow-band" applications of communications satellites for education. It begins by discussing the general controversy surrounding the use of satellites in education, by placing the concern within the larger context of the general debate over the uses of new technologies in education, and by

  8. Narrow Feshbach Dance of Two Trapped Atoms

    NASA Astrophysics Data System (ADS)

    Lopez Valdez, Nicolas; Timmermans, Eddy; Tsai, Shan-Wen

    2012-06-01

Near a narrow Feshbach resonance (with magnetic field width 10 mG or smaller) the ultra-cold atom interactions acquire an effective range that can be comparable to the average inter-particle distance. Although requiring more accurate magnetic field control than their broad counterparts, narrow Feshbach resonances can free cold atom physics from the straitjacket of the contact interaction paradigm. The finite-range effects can give rise to roton features in the phonon dispersion of dilute Bose-Einstein condensates (BECs), and BECs can support a ground state with modulated density patterns that breaks translational symmetry. We show that the finite range interaction is the consequence of the time-delay in atom-atom collisions. The narrow regime is also the parameter region in which the interacting atoms can spend a significant fraction of their time in the spin-rearranged (also called ``closed'') channel. To study the interaction physics we describe two atoms in a harmonic trap, interacting near a narrow resonance. We find the fraction of time that the atoms spend in the closed channel at fixed magnetic field and we study the time evolution of this system under conditions of a time-varying magnetic field.

  9. Tunable, narrow bandwidth 83 nm light source

    SciTech Connect

    Egger, H.; Hawkins, R. T.; Bokor, J.; Pummer, H.; Rothschild, M.; Rhodes, C. K.

    1980-01-01

Up to 40 MW (peak power) of coherent, tunable xuv light, with a bandwidth < 10⁻² cm⁻¹ and absolute frequency control to within 4 × 10⁻² cm⁻¹, has been generated by frequency tripling of a narrow bandwidth KrF laser in xenon.
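As a quick consistency check on the title wavelength: frequency tripling divides the wavelength by three, and the KrF fundamental lies near 248.4 nm (the exact line position is an assumption here):

```python
# Frequency tripling triples the optical frequency, i.e. divides the
# wavelength by 3. Nominal KrF wavelength assumed to be 248.4 nm.
krf_nm = 248.4
xuv_nm = krf_nm / 3.0
print(round(xuv_nm, 1))  # -> 82.8, i.e. the ~83 nm light of the title
```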

  10. Narrow vision after view-broadening travel.

    PubMed

Melo, Mariana de Mendonça; Ciriano, Jose P Martinez; van Genderen, Perry J J

    2008-01-01

    Loss of vision is a threatening presentation of disease. We describe a case of acute idiopathic blind spot enlargement in a 26-year-old male traveler who presented with narrow vision after a journey to Indonesia. Although the patient used mefloquine at time of presentation, we were unable to retrieve sound data incriminating mefloquine in this rare eye disorder. PMID:18666929

  11. Method for shaping and aiming narrow beams. [sonar mapping and target identification

    NASA Technical Reports Server (NTRS)

    Heyser, R. C. (inventor)

    1981-01-01

A sonar method and apparatus is described which utilizes a linear frequency chirp in a transmitter/receiver having a correlator to synthesize a narrow beamwidth pattern from otherwise broadbeam transducers when there is relative velocity between the transmitter/receiver and the target. The chirp is generated with bandwidth, B, and time, T, chosen so that the time-bandwidth product, TB, increases for a narrower angle. A replica of the chirp is time delayed and Doppler shifted for use as a reference in the receiver for correlation of received chirps from targets. This reference is Doppler shifted to select targets preferentially, thereby not only synthesizing a narrow beam but also aiming the beam in azimuth and elevation.
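The matched-filter core of this scheme can be sketched with a linear chirp and a delayed echo; all parameters below are illustrative, not taken from the patent:

```python
import numpy as np

# Linear FM chirp (illustrative parameters).
fs = 8000.0               # sample rate, Hz
T = 0.1                   # chirp duration, s
B = 2000.0                # swept bandwidth, Hz -> time-bandwidth product TB = 200
t = np.arange(0, T, 1 / fs)
chirp = np.cos(2 * np.pi * (1000 * t + 0.5 * (B / T) * t**2))  # 1-3 kHz sweep

# Received echo: the same chirp delayed by 200 samples; correlate against
# the stored replica to locate the target in time.
delay = 200
echo = np.concatenate([np.zeros(delay), chirp])
corr = np.correlate(echo, chirp, mode="valid")
print(int(np.argmax(corr)))  # -> 200: the correlation peak marks the delay
```

A real implementation would additionally Doppler-shift the replica, as the abstract describes, so that only echoes with a chosen relative velocity correlate strongly, which is what steers and narrows the synthesized beam.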

  12. Exploring the Moon at High-Resolution: First Results From the Lunar Reconnaissance Orbiter Camera (LROC)

    NASA Astrophysics Data System (ADS)

    Robinson, Mark; Hiesinger, Harald; McEwen, Alfred; Jolliff, Brad; Thomas, Peter C.; Turtle, Elizabeth; Eliason, Eric; Malin, Mike; Ravine, A.; Bowman-Cisneros, Ernest

The Lunar Reconnaissance Orbiter (LRO) spacecraft was launched on an Atlas V 401 rocket from the Cape Canaveral Air Force Station Launch Complex 41 on June 18, 2009. After spending four days in Earth-Moon transit, the spacecraft entered a three month commissioning phase in an elliptical 30×200 km orbit. On September 15, 2009, LRO began its planned one-year nominal mapping mission in a quasi-circular 50 km orbit. A multi-year extended mission in a fixed 30×200 km orbit is optional. The Lunar Reconnaissance Orbiter Camera (LROC) consists of a Wide Angle Camera (WAC) and two Narrow Angle Cameras (NACs). The WAC is a 7-color push-frame camera, which images the Moon at 100 and 400 m/pixel in the visible and UV, respectively, while the two NACs are monochrome narrow-angle linescan imagers with 0.5 m/pixel spatial resolution. LROC was specifically designed to address two of the primary LRO mission requirements and six other key science objectives, including 1) assessment of meter- and smaller-scale features in order to select safe sites for potential lunar landings near polar resources and elsewhere on the Moon; 2) acquisition of multi-temporal synoptic 100 m/pixel images of the poles during every orbit to unambiguously identify regions of permanent shadow and permanent or near permanent illumination; 3) meter-scale mapping of regions with permanent or near-permanent illumination of polar massifs; 4) repeat observations of potential landing sites and other regions to derive high resolution topography; 5) global multispectral observations in seven wavelengths to characterize lunar resources, particularly ilmenite; 6) a global 100-m/pixel basemap with incidence angles (60°-80°) favorable for morphological interpretations; 7) sub-meter imaging of a variety of geologic units to characterize their physical properties, the variability of the regolith, and other key science questions; 8) meter-scale coverage overlapping with Apollo-era panoramic images (1-2 m/pixel) to document 
the number of small impacts since 1971-1972. LROC allows us to determine the recent impact rate of bolides in the size range of 0.5 to 10 meters, which is currently not well known. Determining the impact rate at these sizes enables engineering remediation measures for future surface operations and interplanetary travel. The WAC has imaged nearly the entire Moon in seven wavelengths. A preliminary global WAC stereo-based topographic model is in preparation [1] and global color processing is underway [2]. As the mission progresses, repeat global coverage will be obtained as lighting conditions change, providing a robust photometric dataset. The NACs are revealing a wealth of morphologic features at the meter scale, providing the engineering and science constraints needed to support future lunar exploration. All of the Apollo landing sites have been imaged, as well as the majority of robotic landing and impact sites. Through the use of off-nadir slews, a collection of stereo pairs is being acquired that enables 5-m scale topographic mapping [3-7]. Impact morphologies (terraces, impact melt, rays, etc.) are preserved in exquisite detail at all Copernican craters and are enabling new studies of impact mechanics and crater size-frequency distribution measurements [8-12]. Other topical studies including, for example, lunar pyroclastics, domes, and tectonics are underway [e.g., 10-17]. The first PDS data release of LROC data will be in March 2010, and will include all images from the commissioning phase and the first 3 months of the mapping phase. [1] Scholten et al. (2010) 41st LPSC, #2111; [2] Denevi et al. (2010a) 41st LPSC, #2263; [3] Beyer et al. (2010) 41st LPSC, #2678; [4] Archinal et al. (2010) 41st LPSC, #2609; [5] Mattson et al. (2010) 41st LPSC, #1871; [6] Tran et al. (2010) 41st LPSC, #2515; [7] Oberst et al. (2010) 41st LPSC, #2051; [8] Bray et al. (2010) 41st LPSC, #2371; [9] Denevi et al. (2010b) 41st LPSC, #2582; [10] Hiesinger et al. 
(2010a) 41st LPSC, #2278; [11] Hiesinger et al. (2010b) 41st LPSC, #2304; [12] van der Bogert et al. (2010) 41st LPSC, #2165; [13] Plescia et al. (2010) 41st LPSC, #2160; [14] Lawrence et al. (2010) 41st LPSC, #1906; [15] Gaddis et al. (2010) 41st LPSC, #2059; [16] Watters et al. (2010) 41st LPSC, #1863; [17] Garry et al. (2010) 41st LPSC, #2278.

  13. A Different Angle on Perspective

    ERIC Educational Resources Information Center

    Frantz, Marc

    2012-01-01

    When a plane figure is photographed from different viewpoints, lengths and angles appear distorted. Hence it is often assumed that lengths, angles, protractors, and compasses have no place in projective geometry. Here we describe a sense in which certain angles are preserved by projective transformations. These angles can be constructed with…

  15. Boiling Visualization and Critical Heat Flux Phenomena In Narrow Rectangular Gap

    SciTech Connect

    J. J. Kim; Y. H. Kim; S. J. Kim; S. W. Noh; K. Y. Suh; J. Rempe; F. B. Cheung; S. B. Kim

    2004-12-01

An experimental study was performed to investigate the pool boiling critical heat flux (CHF) on one-dimensional inclined rectangular channels with narrow gaps by changing the orientation of a copper test heater assembly. In a pool of saturated water at atmospheric pressure, the test parameters include gap sizes of 1, 2, 5, and 10 mm, and surface orientation angles from the downward-facing position (180 degrees) to the vertical position (90 degrees).

  16. Gamma-ray camera flyby

    SciTech Connect

    2010-01-01

    Animation based on an actual classroom demonstration of the prototype CCI-2 gamma-ray camera's ability to image a hidden radioactive source, a cesium-137 line source, in three dimensions. For more information see http://newscenter.lbl.gov/feature-stories/2010/06/02/applied-nuclear-physics/.

  17. The Camera Comes to Court.

    ERIC Educational Resources Information Center

    Floren, Leola

    After the Lindbergh kidnapping trial in 1935, the American Bar Association sought to eliminate electronic equipment from courtroom proceedings. Eventually, all but two states adopted regulations applying that ban to some extent, and a 1965 Supreme Court decision encouraged the banning of television cameras at trials as well. Currently, some states

  18. High speed multiwire photon camera

    NASA Technical Reports Server (NTRS)

    Lacy, Jeffrey L. (Inventor)

    1991-01-01

An improved multiwire proportional counter camera having particular utility in the field of clinical nuclear medicine imaging. The detector utilizes direct coupled, low impedance, high speed delay lines, the segments of which are capacitor-inductor networks. A pile-up rejection test is provided to reject confused events otherwise caused by multiple ionization events occurring during the readout window.

  19. High speed multiwire photon camera

    NASA Technical Reports Server (NTRS)

    Lacy, Jeffrey L. (Inventor)

    1989-01-01

    An improved multiwire proportional counter camera having particular utility in the field of clinical nuclear medicine imaging. The detector utilizes direct coupled, low impedance, high speed delay lines, the segments of which are capacitor-inductor networks. A pile-up rejection test is provided to reject confused events otherwise caused by multiple ionization events occurring during the readout window.

  20. OSIRIS camera barrel optomechanical design

    NASA Astrophysics Data System (ADS)

    Farah, Alejandro; Tejada, Carlos; Gonzalez, Jesus; Cobos, Francisco J.; Sanchez, Beatriz; Fuentes, Javier; Ruiz, Elfego

    2004-09-01

A Camera Barrel, located in the OSIRIS imager/spectrograph for the Gran Telescopio Canarias (GTC), is described in this article. The barrel design has been developed by the Institute for Astronomy of the University of Mexico (IA-UNAM), in collaboration with the Institute for Astrophysics of Canarias (IAC), Spain. The barrel is being manufactured by the Engineering Center for Industrial Development (CIDESI) at Queretaro, Mexico. The Camera Barrel includes a set of eight lenses (three doublets and two singlets), with their respective supports and cells, as well as two subsystems: the Focusing Unit, which is a mechanism that modifies the first doublet relative position; and the Passive Displacement Unit (PDU), which uses the third doublet as thermal compensator to maintain the camera focal length and image quality when the ambient temperature changes. This article includes a brief description of the scientific instrument; describes the design criteria related to performance justification; and summarizes the specifications related to misalignment errors and generated stresses. The Camera Barrel components are described and analytical calculations, FEA simulations and error budgets are also included.

  1. Measuring Distances Using Digital Cameras

    ERIC Educational Resources Information Center

    Kendal, Dave

    2007-01-01

    This paper presents a generic method of calculating accurate horizontal and vertical object distances from digital images taken with any digital camera and lens combination, where the object plane is parallel to the image plane or tilted in the vertical plane. This method was developed for a project investigating the size, density and spatial…

  2. Camera assisted multimodal user interaction

    NASA Astrophysics Data System (ADS)

    Hannuksela, Jari; Silvén, Olli; Ronkainen, Sami; Alenius, Sakari; Vehviläinen, Markku

    2010-01-01

Since more processing power and new sensing and display technologies are already available in mobile devices, there has been increased interest in building systems that communicate via different modalities such as speech, gesture, expression, and touch. In context-identification-based user interfaces, these independent modalities are combined to create new ways for users to interact with hand-helds. While these are unlikely to completely replace traditional interfaces, they will considerably enrich and improve the user experience and task performance. We demonstrate a set of novel user interface concepts that rely on the built-in sensors of modern mobile devices for recognizing the context and sequences of actions. In particular, we use the camera to detect whether the user is watching the device, for instance, to make the decision to turn on the display backlight. In our approach the motion sensors are first employed for detecting the handling of the device. Then, based on ambient illumination information provided by a light sensor, the cameras are turned on. The frontal camera is used for face detection, while the back camera provides supplemental contextual information. The subsequent applications triggered by the context can be, for example, image capturing or bar code reading.
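The sensing chain described above (motion sensors, then ambient light, then frontal-camera face detection) can be sketched as a simple decision function; the function name and thresholds here are hypothetical, not from the paper:

```python
# Minimal sketch of the cascaded context decision (all thresholds hypothetical).
def should_enable_backlight(moving, ambient_lux, face_detector):
    """Enable the display only when handling, light level, and gaze agree."""
    if not moving:               # motion sensors: device is not being handled
        return False
    if ambient_lux < 5.0:        # light sensor: too dark for the camera stage
        return False
    return face_detector()       # frontal camera: is the user looking?

print(should_enable_backlight(True, 120.0, lambda: True))   # -> True
print(should_enable_backlight(True, 120.0, lambda: False))  # -> False
```

Cascading the cheap sensors first means the power-hungry camera is only activated when the earlier stages already suggest the user is interacting with the device.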

  3. Characterization of a PET Camera Optimized for Prostate Imaging

    SciTech Connect

Huber, Jennifer S.; Choong, Woon-Seng; Moses, William W.; Qi, Jinyi; Hu, Jicun; Wang, G.C.; Wilson, David; Oh, Sang; Huesman, Ronald H.; Derenzo, Stephen E.

    2005-11-11

We present the characterization of a positron emission tomograph for prostate imaging that centers a patient between a pair of external curved detector banks (ellipse: 45 cm minor, 70 cm major axis). The distance between detector banks adjusts to allow patient access and to position the detectors as closely as possible for maximum sensitivity with patients of various sizes. Each bank is composed of two axial rows of 20 HR+ block detectors for a total of 80 detectors in the camera. The individual detectors are angled in the transaxial plane to point towards the prostate to reduce resolution degradation in that region. The detectors are read out by modified HRRT data acquisition electronics. Compared to a standard whole-body PET camera, our dedicated prostate camera has the same sensitivity and resolution, less background (fewer randoms and a lower scatter fraction) and a lower cost. We have completed construction of the camera. Characterization data and reconstructed images of several phantoms are shown. Sensitivity for a point source in the center is 946 cps/μCi. Spatial resolution is 4 mm FWHM in the central region.
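Since 1 μCi corresponds by definition to 3.7 × 10⁴ decays per second, the quoted central sensitivity converts to an absolute detection efficiency as a rough sanity check:

```python
# Convert the reported central point-source sensitivity into an absolute
# detection efficiency; by definition 1 uCi = 3.7e4 decays per second.
cps_per_uci = 946.0                  # measured counts per second per uCi
decays_per_s_per_uci = 3.7e4         # activity of 1 uCi in Bq
efficiency = cps_per_uci / decays_per_s_per_uci
print(f"{efficiency:.2%}")           # -> 2.56%
```

So roughly 2.6% of decays at the center of the field of view yield a recorded coincidence count.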

  4. Payload topography camera of Chang'e-3

    NASA Astrophysics Data System (ADS)

    Yu, Guo-Bin; Liu, En-Hai; Zhao, Ru-Jin; Zhong, Jie; Zhou, Xiang-Dong; Zhou, Wu-Lin; Wang, Jin; Chen, Yuan-Pei; Hao, Yong-Jie

    2015-11-01

    Chang'e-3 was China's first soft-landing lunar probe that achieved a successful roving exploration on the Moon. A topography camera functioning as the lander's “eye” was one of the main scientific payloads installed on the lander. It was composed of a camera probe, an electronic component that performed image compression, and a cable assembly. Its exploration mission was to obtain optical images of the lunar topography in the landing zone for investigation and research. It also observed rover movement on the lunar surface and finished taking pictures of the lander and rover. After starting up successfully, the topography camera obtained static images and video of rover movement from different directions, 360° panoramic pictures of the lunar surface around the lander from multiple angles, and numerous pictures of the Earth. All images of the rover, lunar surface, and the Earth were clear, and those of the Chinese national flag were recorded in true color. This paper describes the exploration mission, system design, working principle, quality assessment of image compression, and color correction of the topography camera. Finally, test results from the lunar surface are provided to serve as a reference for scientific data processing and application.

  5. Angles in the Sky?

    NASA Astrophysics Data System (ADS)

    Behr, Bradford

    2005-09-01

    Tycho Brahe lived and worked in the late 1500s before the telescope was invented. He made highly accurate observations of the positions of planets, stars, and comets using large angle-measuring devices of his own design. You can use his techniques to observe the sky as well. For example, the degree, a common unit of measurement in astronomy, can be measured by holding your fist at arm's length up to the sky. Open your fist and observe the distance across the sky covered by the width of your pinky fingernail. That is, roughly, a degree! After some practice, and knowing that one degree equals four minutes, you can measure elapsed time by measuring the angle of the distance that the Moon appears to have moved and multiplying that number by four. You can also figure distances and sizes of things. These are not precise measurements, but rough estimates that can give you a "close-enough" answer.
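The degrees-to-minutes rule of thumb above follows from the sky turning 360 degrees in about 24 hours, i.e. 360°/1440 min = 1°/4 min. A minimal sketch:

```python
# The sky appears to rotate 360 degrees in ~24 hours (1440 minutes),
# so each degree of apparent motion corresponds to 4 minutes of time.
def elapsed_minutes(angle_deg):
    return angle_deg * (24 * 60 / 360.0)  # = angle_deg * 4

print(elapsed_minutes(1))    # -> 4.0 minutes per degree
print(elapsed_minutes(7.5))  # -> 30.0, i.e. half an hour
```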

  6. New data on separation and position angle of selected binaries

    NASA Astrophysics Data System (ADS)

    Muller, Rafael J.; Lopez, Andy J.; Torres, Brian S.; Mendoza, Lizyan; Vergara, Nelson; Cersosimo, Juan; Martinez, Luis

    2015-01-01

We report on a sample of the data acquired in May 2012 at the 31-inch NURO telescope at Anderson Mesa near Flagstaff, Arizona, pertaining to the separation and position angle of selected binary stars. A CCD camera coupled to the NURO telescope allows for a simple and straightforward procedure for obtaining images of the binaries. Analysis of the images is straightforward, and both direct and software methods yield the separation and position angle of the binary stars. The data obtained are suitable for insertion in the Washington Double Star Catalog of the US Naval Observatory.

  7. Optically trapped atomic resonant devices for narrow linewidth spectral imaging

    NASA Astrophysics Data System (ADS)

    Qian, Lipeng

This thesis focuses on the development of atomic resonant devices for spectroscopic applications. The primary emphasis is on the imaging properties of optically thick atomic resonant fluorescent filters and their applications. In addition, this thesis presents a new concept for producing very narrow linewidth light from an atomic vapor lamp pumped by a nanosecond pulse system. This research was motivated by applications in missile warning systems, and presents an innovative approach to a wide angle, ultra narrow linewidth imaging filter using a potassium vapor cell. The approach is to image onto and collect the fluorescent photons emitted from the surface of an optically thick potassium vapor cell, generating a 2 GHz pass-band imaging filter. This linewidth is narrow enough to fall within a Fraunhofer dark zone in the solar spectrum, thus making the detection solar-blind. Experiments are conducted to measure the absorption line shape of the potassium resonant filter, the quantum efficiency of the fluorescent behavior, and the resolution of the fluorescent image. Fluorescent images with different spatial frequency components are analyzed by using a discrete Fourier transform, and the imaging capability of the fluorescent filter is described by its Modulation Transfer Function. For the detection of radiation that is spectrally broader than the linewidth of the potassium imaging filter, the fluorescent image is seen to be blurred by diffuse fluorescence from the slightly off-resonant photons. To correct this, an ultra-thin potassium imaging filter is developed and characterized. The imaging property of the ultra-thin potassium imaging cell is tested with a potassium-seeded flame, yielding an image resolution of ~20 lines per mm. The physics behind the atomic resonant fluorescent filter is radiation trapping. The diffusion process of the resonant photons trapped in the atomic vapor is theoretically described in this thesis. 
A Monte Carlo method is used to simulate the absorption and fluorescence. The optimum resolution of the fluorescent image is predicted by simulation. Radiation trapping is also shown to be useful for the generation of ultra-narrow linewidth light from an atomic vapor flash lamp. A 2 nanosecond, high voltage pulse is used to excite low pressure mercury vapor mixed with noble gases, producing high intensity emission at the mercury resonant line at 253.7 nm. With a nanosecond pumping time and high electrical current, the radiation intensity of the mercury discharge is increased significantly compared to a normal glow discharge lamp, while simultaneously suppressing the formation of an arc discharge. By avoiding the arc discharge, the discrete spectral lines of mercury are kept at narrow bandwidth. Due to radiation trapping, the emission linewidth from the nanosecond mercury lamp decreases with time and produces ultra-narrow linewidth emission 100 ns after the excitation; this linewidth is verified by absorption measurements through a low pressure mercury absorption filter. The lamp is used along with mercury absorption filters for spectroscopic applications, including Filtered Rayleigh Scattering at different CO2 pressures and Raman scattering from methanol.
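A toy version of the Monte Carlo picture of radiation trapping, reduced to one dimension with illustrative parameters (not the thesis's actual model): a photon enters an optically thick slab, travels an exponentially distributed free path before absorption, and is re-emitted in a random direction until it escapes a face.

```python
import random

rng = random.Random(1)  # fixed seed for reproducibility

def trapped_path_count(thickness=1.0, mean_free_path=0.05):
    """Count absorption/re-emission hops before the photon escapes the slab."""
    x, direction, hops = 0.0, 1.0, 0
    while 0.0 <= x <= thickness:
        x += direction * rng.expovariate(1.0 / mean_free_path)  # free path
        direction = rng.choice([-1.0, 1.0])  # isotropic re-emission (1-D)
        hops += 1
    return hops

counts = [trapped_path_count() for _ in range(200)]
print(sum(counts) / len(counts) > 1.0)  # -> True: photons scatter repeatedly
```

Even this crude random walk reproduces the qualitative point: at an optical depth of ~20 the resonant photon is absorbed and re-emitted many times before leaving, which is what delays and spectrally narrows the escaping light.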

  8. Laser angle sensor

    NASA Technical Reports Server (NTRS)

    Pond, C. R.; Texeira, P. D.

    1985-01-01

    A laser angle measurement system was designed and fabricated for NASA Langley Research Center. The instrument is a fringe counting interferometer that monitors the pitch attitude of a model in a wind tunnel. A laser source and detector are mounted above the model. Interference fringes are generated by a small passive element on the model. The fringe count is accumulated and displayed by a processor in the wind tunnel control room. This report includes optical and electrical schematics, system maintenance and operation procedures.

  9. Small Angle Neutron Scattering

    SciTech Connect

    Urban, Volker S

    2012-01-01

Small Angle Neutron Scattering (SANS) probes structural details at the nanometer scale in a non-destructive way. This article gives an introduction to scientists who have no prior small-angle scattering knowledge, but who seek a technique that allows elucidating structural information in challenging situations that thwart approaches by other methods. SANS is applicable to a wide variety of materials including metals and alloys, ceramics, concrete, glasses, polymers, composites and biological materials. Isotope and magnetic interactions provide unique methods for labeling and contrast variation to highlight specific structural features of interest. In situ studies of a material's responses to temperature, pressure, shear, magnetic and electric fields, etc., are feasible as a result of the high penetrating power of neutrons. SANS provides statistical information on significant structural features averaged over the probed sample volume, and one can use SANS to quantify with high precision the structural details that are observed, for example, in electron microscopy. Neutron scattering is non-destructive; there is no need to cut specimens into thin sections, and neutrons penetrate deeply, providing information on the bulk material, free from surface effects. The basic principles of a SANS experiment are fairly simple, but the measurement, analysis and interpretation of small angle scattering data involves theoretical concepts that are unique to the technique and that are not widely known. This article includes a concise description of the basics, as well as practical know-how that is essential for a successful SANS experiment.

  10. Measurement of the surface wavelength distribution of narrow-band radiation by a colorimetric method

    SciTech Connect

    Kraiskii, A V; Mironova, T V; Sultanov, T T

    2010-09-10

    A method is suggested for determining the wavelength of narrow-band light from a digital photograph of a radiating surface. The digital camera used should be appropriately calibrated. The accuracy of the wavelength measurement is better than 1 nm. The method was tested on the yellow doublet of the mercury spectrum and on the adjacent continuum of the incandescent lamp radiation spectrum. The method was then applied to study the homogeneity of holographic sensor swelling in stationary and transient cases. (laser applications and other topics in quantum electronics)
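
    The core idea is an inversion of the camera's colour response: known narrow-band sources establish a colour-to-wavelength calibration, which is then interpolated for an unknown source. A toy sketch with a hypothetical hue-to-wavelength table (the paper's actual calibration and 1 nm accuracy involve far more care):

```python
import colorsys

# Hypothetical calibration table: hue (0-1) measured for known
# narrow-band sources, paired with their wavelengths in nm.
CALIBRATION = [
    (0.00, 620.0), (0.10, 590.0), (0.17, 570.0), (0.33, 530.0), (0.60, 470.0),
]

def wavelength_from_rgb(r, g, b):
    """Interpolate wavelength from pixel hue using the calibration table."""
    h, _, _ = colorsys.rgb_to_hsv(r, g, b)
    for (h0, w0), (h1, w1) in zip(CALIBRATION, CALIBRATION[1:]):
        if h0 <= h <= h1:
            t = (h - h0) / (h1 - h0)
            return w0 + t * (w1 - w0)
    return None  # hue outside calibrated range

print(round(wavelength_from_rgb(1.0, 1.0, 0.0), 1))  # pure yellow -> ~571 nm
```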

  11. Contact Angle Measurements Using a Simplified Experimental Setup

    ERIC Educational Resources Information Center

    Lamour, Guillaume; Hamraoui, Ahmed; Buvailo, Andrii; Xing, Yangjun; Keuleyan, Sean; Prakash, Vivek; Eftekhari-Bafrooei, Ali; Borguet, Eric

    2010-01-01

    A basic and affordable experimental apparatus is described that measures the static contact angle of a liquid drop in contact with a solid. The image of the drop is made with a simple digital camera by taking a picture that is magnified by an optical lens. The profile of the drop is then processed with the free software ImageJ. The ImageJ contact…
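
    For a drop small enough to be treated as a spherical cap, the static contact angle follows directly from the apex height h and contact-line radius r via tan(θ/2) = h/r. A minimal sketch of that geometry (an editor's illustration; the article itself extracts the angle from the imaged drop profile in ImageJ):

```python
import math

def contact_angle_deg(apex_height, base_radius):
    """Spherical-cap geometry: tan(theta/2) = h / r."""
    return math.degrees(2.0 * math.atan2(apex_height, base_radius))

# A hemispherical drop (h == r) gives a 90 degree contact angle.
print(contact_angle_deg(1.0, 1.0))
print(round(contact_angle_deg(0.5, 1.2), 1))  # a flatter, partially wetting drop
```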

  13. Recent advances in digital camera optics

    NASA Astrophysics Data System (ADS)

    Ishiguro, Keizo

    2012-10-01

    The digital camera market has expanded enormously in the last ten years. The zoom lens is a key determinant of digital camera body size and image quality. Its technologies build on several analog advances, including methods of aspherical lens manufacturing and image stabilization mechanisms, and Panasonic is a pioneer of both. This paper reviews trends in zoom lens optics and the original optical technologies of the Panasonic "LUMIX" digital cameras, together with the optics of 3D camera systems, and closes with a view of future trends in digital camera optics.

  14. A wide-field infrared camera for the Observatoire du mont Mégantic

    NASA Astrophysics Data System (ADS)

    Artigau, Etienne; Doyon, Rene; Nadeau, Daniel; Vallee, Philippe; Thibault, Simon

    2003-03-01

    A wide-field near-infrared (0.8–2.4 μm) camera for the 1.6 m telescope of the Observatoire du mont Mégantic (OMM) is currently under construction at the Université de Montréal. The field of view is 30' × 30' and will have very little distortion. The optics comprise 8 spherical cryogenic lenses. The instrument features two filter wheels with provision for 10 filters, including broad-band I, z, J, H, K and other narrow-band filters. The camera is based on a 2048 × 2048 HgCdTe Hawaii-2 detector driven by a 32-output SDSU-II controller operating at ~250 kHz.

  15. Directional reflectance of vegetation measured by a calibrated digital camera

    NASA Astrophysics Data System (ADS)

    Dymond, John R.; Trotter, Craig M.

    1997-06-01

    Obtaining directional reflectance information for vegetation canopies is often an expensive and time-consuming process. We present here a simple approach based on the use of an inexpensive digital camera equipped with a wide-angle lens. By imaging a large homogeneous area, a single image captures multiple views of a vegetation canopy. This gives a directional reflectance distribution fully sampled for view direction and free of variations in Sun elevation and azimuth. We determined the radiometric response of the camera sensor CCDs at the focal point and then extended this calibration to the full CCD array by using averaged images of clear blue sky. We evaluated the utility of the system by obtaining directional reflectance distributions of two vegetation targets, grass (Lolium spp.) and pine forest (Pinus radiata), for red visible light. The precision of the derived biangular pattern of reflectance was 7%.
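
    The sky-based flat-field idea — divide each raw frame by a normalized average of clear-sky frames, assumed spatially uniform in radiance — can be sketched in miniature (toy 2 × 2 frames, not the authors' code):

```python
def average_frames(frames):
    """Pixel-wise mean of a stack of equally sized 2D frames."""
    rows, cols = len(frames[0]), len(frames[0][0])
    return [[sum(f[i][j] for f in frames) / len(frames) for j in range(cols)]
            for i in range(rows)]

def flat_field_correct(raw, flat):
    """Divide out per-pixel gain; rescale so the mean level is preserved."""
    mean_flat = sum(sum(row) for row in flat) / (len(flat) * len(flat[0]))
    return [[raw[i][j] * mean_flat / flat[i][j] for j in range(len(raw[0]))]
            for i in range(len(raw))]

sky = [[[2.0, 4.0], [4.0, 2.0]], [[2.0, 4.0], [4.0, 2.0]]]  # two sky frames
flat = average_frames(sky)                                   # per-pixel gains
corrected = flat_field_correct([[2.0, 4.0], [4.0, 2.0]], flat)
print(corrected)  # uniform scene recovered: [[3.0, 3.0], [3.0, 3.0]]
```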

  16. Search for atmospheric holes with the Viking cameras

    SciTech Connect

    Frank, L.A.; Sigwarth, J.B.; Craven, J.D.

    1989-12-01

    Images taken with the two ultraviolet cameras on board the Viking spacecraft were examined for evidence of transient decreases of Earth's ultraviolet dayglow. Comparison of near-limb observations of dayglow intensities with those at smaller angles to the nadir with the camera sensitive to OI 130.4 nm emissions supports the existence of transient decreases in the near-nadir dayglow. However, the amount of near-nadir imaging is severely limited and only several significant events are found. More decisive confirmation of the existence of such transient decreases must await a larger survey from another spacecraft. The diameters of these regions as detected with Viking are ~50 to 100 km. Occurrence frequencies, intensity decreases, and dimensions for these clusters of darkened pixels are similar to those previously reported for such events, or atmospheric holes, as seen in images of the ultraviolet dayglow with Dynamics Explorer 1.

  17. Interference-induced angle-independent acoustical transparency

    SciTech Connect

    Qi, Lehua; Yu, Gaokun; Wang, Ning; Wang, Xinlong; Wang, Guibo

    2014-12-21

    It is revealed that the Fano-like interference leads to the extraordinary acoustic transmission through a slab metamaterial of thickness much smaller than the wavelength, with each unit cell consisting of a Helmholtz resonator and a narrow subwavelength slit. More importantly, both the theoretical analysis and experimental measurement show that the angle-independent acoustical transparency can be realized by grafting a Helmholtz resonator and a quarter-wave resonator to the wall of a narrow subwavelength slit in each unit cell of a slit array. The observed phenomenon results from the interferences between the waves propagating in the slit, those re-radiated by the Helmholtz resonator, and those re-radiated by the quarter-wave resonator. The proposed design may find its applications in designing angle-independent acoustical filters and controlling the phase of the transmitted waves.

  18. Creep turns linear in narrow ferromagnetic nanostrips

    NASA Astrophysics Data System (ADS)

    Leliaert, Jonathan; van de Wiele, Ben; Vansteenkiste, Arne; Laurson, Lasse; Durin, Gianfranco; Dupré, Luc; van Waeyenberge, Bartel

    2016-02-01

    The motion of domain walls in magnetic materials is a typical example of a creep process, usually characterised by a stretched exponential velocity-force relation. By performing large-scale micromagnetic simulations, and analyzing an extended 1D model which takes the effects of finite temperatures and material defects into account, we show that this creep scaling law breaks down in sufficiently narrow ferromagnetic strips. Our analysis of current-driven transverse domain wall motion in disordered Permalloy nanostrips reveals instead a creep regime with a linear dependence of the domain wall velocity on the applied field or current density. This originates from the essentially point-like nature of domain walls moving in narrow, line-like disordered nanostrips. An analogous linear relation is found also by analyzing existing experimental data on field-driven domain wall motion in perpendicularly magnetised media.
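
    The contrast between the classic stretched-exponential creep law and the linear regime reported here can be sketched numerically (schematic parameters; μ = 1/4 is the standard creep exponent for a 1D elastic interface in random-bond disorder — none of these values are from the paper):

```python
import math

def creep_velocity(force, v0=1.0, f_crit=1.0, mu=0.25):
    """Classic creep law: v = v0 * exp(-(f_crit / force)**mu)."""
    return v0 * math.exp(-((f_crit / force) ** mu))

def linear_velocity(force, mobility=0.5):
    """Linear regime reported for narrow strips: v = mobility * force."""
    return mobility * force

# Creep velocity rises steeply but stays bounded by v0;
# the linear law has no such saturation.
for f in (0.2, 1.0, 5.0):
    print(f, round(creep_velocity(f), 4), round(linear_velocity(f), 4))
```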

  19. Current Propagation in Narrow Bipolar Pulses

    NASA Astrophysics Data System (ADS)

    Watson, S. S.; Marshall, T. C.

    2005-12-01

    We model the observed electric fields of a particular narrow bipolar pulse (NBP) published in Eack [2004]. We assume an exponential growth of current carriers due to a runaway breakdown avalanche and show that this leads to a corresponding increase in current. With specific input values for discharge altitude, length, current, and propagation velocity, the model does a good job of reproducing the observed near and far electric field. The ability of the model to reproduce the observed electric fields is an indication that our assumptions concerning the runaway avalanche may be correct, and this indication is further strengthened by the inability of the simple transmission line model to reproduce simultaneously both the near and far electric fields. Eack, K. B. (2004), Electrical characteristics of narrow bipolar events, Geophys. Res. Lett., 31, L20102, doi:10.1029/2004GL021117.
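
    The assumed runaway avalanche makes the carrier number, and hence the current, grow exponentially in time. A schematic sketch of that single assumption (illustrative I0 and growth time τ, not the paper's fitted values):

```python
import math

def avalanche_current(t, i0=1.0, tau=1e-6):
    """Runaway-breakdown avalanche: carrier number, and hence current,
    grows exponentially, I(t) = I0 * exp(t / tau)."""
    return i0 * math.exp(t / tau)

# With tau = 1 microsecond, the current doubles every tau * ln(2) seconds.
t_double = 1e-6 * math.log(2.0)
print(round(avalanche_current(t_double) / avalanche_current(0.0), 6))  # 2.0
```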

  20. Narrow-Line Seyfert 1 Galaxies

    NASA Technical Reports Server (NTRS)

    Leighly, Karen M.

    2000-01-01

    The primary work during this year has been the analysis and interpretation of our HST spectra from two extreme Narrow-line Seyfert 1 galaxies (NLS1s): Infrared Astronomy Satellite (IRAS) 13224-3809 and 1H 0707-495. This work has been presented as an invited talk at the workshop "Observational and Theoretical Progress in the Study of Narrow-line Seyfert 1 Galaxies" held in Bad Honnef, Germany, December 8-11; as a contributed talk at the January 2000 AAS meeting in Atlanta, Georgia; and as a contributed talk at the workshop "Probing the Physics of Active Galactic Nuclei by Multiwavelength Monitoring" held at Goddard Space Flight Center, June 20-22, 2000.

  1. Creep turns linear in narrow ferromagnetic nanostrips

    PubMed Central

    Leliaert, Jonathan; Van de Wiele, Ben; Vansteenkiste, Arne; Laurson, Lasse; Durin, Gianfranco; Dupré, Luc; Van Waeyenberge, Bartel

    2016-01-01

    The motion of domain walls in magnetic materials is a typical example of a creep process, usually characterised by a stretched exponential velocity-force relation. By performing large-scale micromagnetic simulations, and analyzing an extended 1D model which takes the effects of finite temperatures and material defects into account, we show that this creep scaling law breaks down in sufficiently narrow ferromagnetic strips. Our analysis of current-driven transverse domain wall motion in disordered Permalloy nanostrips reveals instead a creep regime with a linear dependence of the domain wall velocity on the applied field or current density. This originates from the essentially point-like nature of domain walls moving in narrow, line-like disordered nanostrips. An analogous linear relation is found also by analyzing existing experimental data on field-driven domain wall motion in perpendicularly magnetised media. PMID:26843125

  3. Lytro camera technology: theory, algorithms, performance analysis

    NASA Astrophysics Data System (ADS)

    Georgiev, Todor; Yu, Zhan; Lumsdaine, Andrew; Goma, Sergio

    2013-03-01

    The Lytro camera is the first implementation of a plenoptic camera for the consumer market. We consider it a successful example of the miniaturization, aided by the increase in computational power, that characterizes mobile computational photography. The plenoptic camera approach to radiance capture uses a microlens array as an imaging system focused on the focal plane of the main camera lens. This paper analyzes the performance of the Lytro camera from a system-level perspective, treating the Lytro camera as a black box and using our interpretation of the Lytro image data saved by the camera. We present our findings based on our interpretation of the Lytro camera file structure, image calibration and image rendering; in this context, artifacts and final image resolution are discussed.

  4. HDTV light camera in triax version

    NASA Astrophysics Data System (ADS)

    Delmas, F.; Tichit, D.

    1992-08-01

    The history of color television cameras has been marked by three major breakthroughs: the possibility of remoting camera heads up to 1 km from the OB van (1971); portable cameras (1976); and the emergence of CCDs (1986). Professionals expect high-definition cameras to offer all the features and benefits associated with remote camera heads and portable operation. The Thomson Broadcast HD 1250 Light camera offers the same kind of operational flexibility as a standard television camera. It is a lightweight, portable unit and can be connected to the OB van by a triax link more than 1 km long. The basic characteristics of this camera are described below; special emphasis is placed on the original features that make the HD 1250 Light a pioneer in its field.

  5. Preliminary Mapping of Permanently Shadowed and Sunlit Regions Using the Lunar Reconnaissance Orbiter Camera (LROC)

    NASA Astrophysics Data System (ADS)

    Speyerer, E.; Koeber, S.; Robinson, M. S.

    2010-12-01

    The spin axis of the Moon is tilted by only 1.5° (compared with the Earth's 23.5°), leaving some areas near the poles in permanent shadow while other nearby regions remain sunlit for a majority of the year. Theory, radar data, neutron measurements, and Lunar CRater Observation and Sensing Satellite (LCROSS) observations suggest that volatiles may be present in the cold traps created inside these permanently shadowed regions, while areas of near-permanent illumination are prime locations for future lunar outposts due to benign thermal conditions and near-constant solar power. The Lunar Reconnaissance Orbiter (LRO) has two imaging systems that provide medium- and high-resolution views of the poles. During almost every orbit the LROC Wide Angle Camera (WAC) acquires images at 100 m/pixel of the polar region (80° to 90° north and south latitude). In addition, the LROC Narrow Angle Camera (NAC) targets selected regions of interest at 0.7 to 1.5 m/pixel [Robinson et al., 2010]. During the first 11 months of the nominal mission, LROC acquired almost 6,000 WAC images and over 7,300 NAC images of the polar region (i.e., within 2° of the poles). By analyzing this time series of WAC and NAC images, regions of permanent shadow and permanent, or near-permanent, illumination can be quantified. The LROC Team is producing several reduced data products that graphically illustrate the illumination conditions of the polar regions. Illumination movie sequences are being produced that show how the lighting conditions change over a calendar year. Each frame of the movie sequence is a polar stereographic projected WAC image showing the lighting conditions at that moment. With the WAC's wide field of view (~100 km at an altitude of 50 km), each frame has repeat coverage between 88° and 90° at each pole. The same WAC images are also being used to develop multi-temporal illumination maps that show the percentage of time each 100 m × 100 m area is illuminated.
These maps are derived by stacking all the WAC frames, selecting a threshold to determine if the surface is illuminated, and summing the resulting binary images. In addition, mosaics of NAC images are also being produced for regions of interest at a scale of 0.7 to 1.5 m/pixel. The mosaics produced so far have revealed small illuminated surfaces on the scale of tens of meters that were previously thought to be shadowed during that time. The LROC dataset of the polar regions complements previous illumination analyses of Clementine images [Bussey et al., 1999] and Kaguya topography [Bussey et al., 2010], as well as the current efforts underway by the Lunar Orbiter Laser Altimeter (LOLA) Team [Mazarico et al., 2010], and provides an important new dataset for science and exploration. References: Bussey et al. (1999), Illumination conditions at the lunar south pole, Geophysical Research Letters, 26(9), 1187-1190. Bussey et al. (2010), Illumination conditions of the south pole of the Moon derived from Kaguya topography, Icarus, 208, 558-564. Mazarico et al. (2010), Illumination of the lunar poles from the Lunar Orbiter Laser Altimeter (LOLA) Topography Data, paper presented at 41st LPSC, Houston, TX. Robinson et al. (2010), Lunar Reconnaissance Orbiter Camera (LROC) Instrument Overview, Space Sci Rev, 150, 81-124.
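
    The stacking scheme described above — threshold each frame, sum the binary maps, convert to a percentage — can be sketched in miniature (toy 2 × 2 frames standing in for WAC images):

```python
def illumination_map(frames, threshold):
    """Percent of frames in which each pixel exceeds the illumination
    threshold: threshold each frame to binary, then sum the stack."""
    rows, cols = len(frames[0]), len(frames[0][0])
    counts = [[0] * cols for _ in range(rows)]
    for frame in frames:
        for i in range(rows):
            for j in range(cols):
                if frame[i][j] > threshold:
                    counts[i][j] += 1
    return [[100.0 * c / len(frames) for c in row] for row in counts]

frames = [
    [[10, 200], [0, 180]],
    [[12, 190], [0, 175]],
    [[ 8,  30], [0, 160]],
]
# pixel (0,0) never lit, (0,1) lit in 2 of 3 frames, (1,1) always lit
print(illumination_map(frames, 50))
```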

  6. Laser angle measurement system

    NASA Technical Reports Server (NTRS)

    Pond, C. R.; Texeira, P. D.; Wilbert, R. E.

    1980-01-01

    The design and fabrication of a laser angle measurement system is described. The instrument is a fringe counting interferometer that monitors the pitch attitude of a model in a wind tunnel. A laser source and detector are mounted above the model. Interference fringes are generated by a small passive element on the model. The fringe count is accumulated and displayed by a processor in the wind tunnel control room. Optical and electrical schematics and system maintenance and operation procedures are included, and the results of a demonstration test are given.
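
    In a fringe-counting interferometer, each fringe corresponds to roughly λ/2 of optical path change at the moving element; given a lever arm L between that element and the pitch axis, the accumulated count N converts to a pitch change of about Nλ/(2L). A sketch under those assumptions (the abstract states neither the geometry nor the wavelength; a HeNe laser and a 5 cm lever arm are assumed here for illustration):

```python
import math

def pitch_change_deg(fringe_count, wavelength_m=632.8e-9, lever_arm_m=0.05):
    """Each fringe ~ lambda/2 of path change; with an assumed lever arm L
    on the model, the pitch change is ~ N * lambda / (2 * L)."""
    return math.degrees(fringe_count * wavelength_m / (2.0 * lever_arm_m))

print(round(pitch_change_deg(1000), 4))  # 1000 fringes -> ~0.36 degrees
```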

  7. Ortho-Rectification of Narrow Band Multi-Spectral Imagery Assisted by Dslr RGB Imagery Acquired by a Fixed-Wing Uas

    NASA Astrophysics Data System (ADS)

    Rau, J.-Y.; Jhan, J.-P.; Huang, C.-Y.

    2015-08-01

    The Miniature Multiple Camera Array (MiniMCA-12) is a frame-based multilens/multispectral sensor composed of 12 lenses with narrow-band filters. Due to its small size and light weight, it is suitable for mounting on an Unmanned Aerial System (UAS) to acquire imagery of high spectral, spatial and temporal resolution for various remote sensing applications. However, because each band spans only 10 nm, the images have low resolution and signal-to-noise ratio, which are not suitable for image matching and digital surface model (DSM) generation. At the same time, the spectral correlation among the 12 bands of MiniMCA images is low, so it is difficult to perform tie-point matching and aerial triangulation across all bands at once. In this study, we therefore propose the use of a DSLR camera to assist automatic aerial triangulation of MiniMCA-12 imagery and to produce a higher spatial resolution DSM for MiniMCA-12 ortho-image generation. Depending on the maximum payload weight of the UAS, the two sensors can be flown together or individually. In this study, we adopt a fixed-wing UAS carrying a Canon EOS 5D Mark II DSLR camera and a MiniMCA-12 multispectral camera. To perform automatic aerial triangulation between the DSLR camera and the MiniMCA-12, we choose one master band from the MiniMCA-12 whose spectral range overlaps that of the DSLR camera. However, because the lenses of the MiniMCA-12 have different perspective centers and viewing angles, the original 12 channels exhibit a significant band misregistration effect, and reducing this effect is the first issue encountered. Because all 12 MiniMCA lenses are frame-based, their spatial offsets are smaller than 15 cm and the images overlap by almost 98%; we therefore propose a modified projective transformation (MPT) method, together with two systematic error correction procedures, to register all 12 bands of imagery in the same image space.
As a result, the 12 bands of images acquired at the same exposure time share the same interior orientation parameters (IOPs) and exterior orientation parameters (EOPs) after band-to-band registration (BBR). In the aerial triangulation stage, the master band of the MiniMCA-12 is treated as a reference channel to link with the DSLR RGB images: all reference images from the master band and all RGB images are triangulated at the same time in the same coordinate system of ground control points (GCPs). Because the spatial resolution of the RGB images is higher than that of the MiniMCA-12, the GCPs can be marked on the RGB images only, even when they cannot be recognized on the MiniMCA images. Furthermore, a one-meter gridded digital surface model (DSM) is created from the RGB images and applied to the MiniMCA imagery for ortho-rectification. Quantitative error analyses show that the proposed BBR scheme achieves an average misregistration residual length of 0.33 pixels, and that the co-registration errors among the 12 MiniMCA ortho-images, and between the MiniMCA and Canon RGB ortho-images, are all less than 0.6 pixels. The experimental results demonstrate that the proposed method is robust, reliable and accurate for future remote sensing applications.
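
    Band-to-band registration ultimately warps each band through a projective (homography) mapping. The authors' modified projective transformation is not detailed in the abstract, but the plain 3 × 3 projective mapping such a method builds on looks like this (illustrative translation-only matrix, not the paper's parameters):

```python
def apply_homography(h, x, y):
    """Map pixel (x, y) through a 3x3 projective transform h (row-major)."""
    w = h[6] * x + h[7] * y + h[8]
    return ((h[0] * x + h[1] * y + h[2]) / w,
            (h[3] * x + h[4] * y + h[5]) / w)

# Identity plus a small translation: the simplest band-to-band correction.
shift = [1.0, 0.0, 3.5,
         0.0, 1.0, -2.0,
         0.0, 0.0, 1.0]
print(apply_homography(shift, 100.0, 100.0))  # (103.5, 98.0)
```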

  8. LDEF yaw and pitch angle estimates

    NASA Technical Reports Server (NTRS)

    Banks, Bruce A.; Gebauer, Linda

    1992-01-01

    Quantification of the LDEF yaw and pitch misorientations is crucial to the knowledge of atomic oxygen exposure of samples placed on LDEF. Video camera documentation of the LDEF spacecraft prior to grapple attachment, atomic oxygen shadows on experiment trays and longerons, and a pinhole atomic oxygen camera placed on LDEF provided sources of documentation of the yaw and pitch misorientation. Based on uncertainty-weighted averaging of data, the LDEF yaw offset was found to be 8.1 plus or minus 0.6 degrees, allowing higher atomic oxygen exposure of row 12 than initially anticipated. The LDEF pitch angle offset was found to be 0.8 plus or minus 0.4 degrees, such that the space end was tipped forward toward the direction of travel. The resulting consequences of the yaw and pitch misorientation of LDEF on the atomic oxygen fluence is a factor of 2.16 increase for samples located on row 12, and a factor of 1.18 increase for samples located on the space end compared to that which would be expected for perfect orientation.
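
    The uncertainty-weighted averaging mentioned above is the standard inverse-variance mean, with weights 1/σ². A sketch with hypothetical per-source yaw estimates (the abstract gives only the combined result, not the individual inputs):

```python
import math

def weighted_mean(values, sigmas):
    """Uncertainty-weighted average with weights 1/sigma^2, plus the
    standard error of the combined estimate."""
    weights = [1.0 / s ** 2 for s in sigmas]
    mean = sum(w * v for w, v in zip(weights, values)) / sum(weights)
    return mean, math.sqrt(1.0 / sum(weights))

# Hypothetical yaw estimates (degrees) from three documentation sources
mean, err = weighted_mean([8.3, 7.9, 8.0], [1.0, 0.8, 1.2])
print(round(mean, 2), round(err, 2))  # ~8.04 +/- 0.55 degrees
```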

  9. The Dark Energy Camera (DECam)

    NASA Astrophysics Data System (ADS)

    DePoy, D. L.; Abbott, T.; Annis, J.; Antonik, M.; Barceló, M.; Bernstein, R.; Bigelow, B.; Brooks, D.; Buckley-Geer, E.; Campa, J.; Cardiel, L.; Castander, F.; Castilla, J.; Cease, H.; Chappa, S.; Dede, E.; Derylo, G.; Diehl, H. T.; Doel, P.; DeVicente, J.; Estrada, J.; Finley, D.; Flaugher, B.; Gaztanaga, E.; Gerdes, D.; Gladders, M.; Guarino, V.; Gutierrez, G.; Hamilton, J.; Haney, M.; Holland, S.; Honscheid, K.; Huffman, D.; Karliner, I.; Kau, D.; Kent, S.; Kozlovsky, M.; Kubik, D.; Kuehn, K.; Kuhlmann, S.; Kuk, K.; Leger, F.; Lin, H.; Martinez, G.; Martinez, M.; Merritt, W.; Mohr, J.; Moore, P.; Moore, T.; Nord, B.; Ogando, R.; Olsen, J.; Onal, B.; Peoples, J.; Qian, T.; Roe, N.; Sanchez, E.; Scarpine, V.; Schmidt, R.; Schmitt, R.; Schubnell, M.; Schultz, K.; Selen, M.; Shaw, T.; Simaitis, V.; Slaughter, J.; Smith, C.; Spinka, H.; Stefanik, A.; Stuermer, W.; Talaga, R.; Tarle, G.; Thaler, J.; Tucker, D.; Walker, A.; Worswick, S.; Zhao, A.

    2008-07-01

    We describe the Dark Energy Camera (DECam), which will be the primary instrument used in the Dark Energy Survey. DECam will be a 3 sq. deg. mosaic camera mounted at the prime focus of the Blanco 4m telescope at the Cerro Tololo Inter-American Observatory (CTIO). DECam includes a large mosaic CCD focal plane, a five element optical corrector, five filters (g,r,i,z,Y), and the associated infrastructure for operation in the prime focus cage. The focal plane consists of 62 2K x 4K CCD modules (0.27"/pixel) arranged in a hexagon inscribed within the roughly 2.2 degree diameter field of view. The CCDs will be 250 micron thick fully-depleted CCDs that have been developed at the Lawrence Berkeley National Laboratory (LBNL). Production of the CCDs and fabrication of the optics, mechanical structure, mechanisms, and control system for DECam are underway; delivery of the instrument to CTIO is scheduled for 2010.

  10. Electronographic cameras for space astronomy.

    NASA Technical Reports Server (NTRS)

    Carruthers, G. R.; Opal, C. B.

    1972-01-01

    Magnetically-focused electronographic cameras have been under development at the Naval Research Laboratory for use in far-ultraviolet imagery and spectrography, primarily in astronomical and optical-geophysical observations from sounding rockets and space vehicles. Most of this work has been with cameras incorporating internal optics of the Schmidt or wide-field all-reflecting types. More recently, we have begun development of electronographic spectrographs incorporating an internal concave grating, operating at normal or grazing incidence. We also are developing electronographic image tubes of the conventional end-window photocathode type, for far-ultraviolet imagery at the focus of a large space telescope, with image formats up to 120 mm in diameter.

  11. Combustion pinhole-camera system

    DOEpatents

    Witte, A.B.

    1982-05-19

    A pinhole camera system is described that utilizes a sealed optical-purge assembly to provide optical access into a coal combustor or other energy conversion reactor. The camera system basically consists of a focused-purge pinhole optical port assembly, a conventional TV vidicon receiver, and an external variable-density light filter which is coupled electronically to the vidicon automatic gain control (AGC). The key component of this system is the focused-purge pinhole optical port assembly, which uses a purging inert gas to keep debris from entering the port and a lens arrangement that transfers the pinhole to the outside of the port assembly. An additional feature of the port assembly is that it is not flush with the interior of the combustor.

  12. A 10-μm infrared camera.

    PubMed

    Arens, J F; Jernigan, J G; Peck, M C; Dobson, C A; Kilk, E; Lacy, J; Gaalema, S

    1987-09-15

    An IR camera has been built at the University of California at Berkeley for astronomical observations. The camera has been used primarily for high angular resolution imaging at mid-IR wavelengths. It has been tested at the University of Arizona 61- and 90-in. telescopes near Tucson and the NASA Infrared Telescope Facility on Mauna Kea, HI. In the observations the system has been used as an imager with interference coated and Fabry-Perot filters. These measurements have demonstrated a sensitivity consistent with photon shot noise, showing that the system is limited by the radiation from the telescope and atmosphere. Measurements of read noise, crosstalk, and hysteresis have been made in our laboratory. PMID:20490151
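
    A background-limited system like this has a point-source signal-to-noise ratio set by photon shot noise, SNR = S / √(S + B + σ_read²), where S and B are the collected signal and background electrons. A sketch with illustrative electron counts (not the paper's measurements):

```python
import math

def snr(signal_e, background_e, read_noise_e):
    """Shot-noise-limited point-source SNR:
    SNR = S / sqrt(S + B + sigma_read^2), all in electrons."""
    return signal_e / math.sqrt(signal_e + background_e + read_noise_e ** 2)

# Mid-IR imaging is background-limited: telescope and atmosphere photons
# dominate the noise budget (numbers below are illustrative only).
print(round(snr(1.0e4, 1.0e6, 100.0), 2))  # ~9.9
```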

  13. Optimising Camera Traps for Monitoring Small Mammals

    PubMed Central

    Glen, Alistair S.; Cockburn, Stuart; Nichols, Margaret; Ekanayake, Jagath; Warburton, Bruce

    2013-01-01

    Practical techniques are required to monitor invasive animals, which are often cryptic and occur at low density. Camera traps have potential for this purpose, but may have problems detecting and identifying small species. A further challenge is how to standardise the size of each camera's field of view so capture rates are comparable between different places and times. We investigated the optimal specifications for a low-cost camera trap for small mammals. The factors tested were 1) trigger speed, 2) passive infrared vs. microwave sensor, 3) white vs. infrared flash, and 4) still photographs vs. video. We also tested a new approach to standardise each camera's field of view. We compared the success rates of four camera trap designs in detecting and taking recognisable photographs of captive stoats (Mustela erminea), feral cats (Felis catus) and hedgehogs (Erinaceus europaeus). Trigger speeds of 0.2–2.1 s captured photographs of all three target species unless the animal was running at high speed. The camera with a microwave sensor was prone to false triggers, and often failed to trigger when an animal moved in front of it. A white flash produced photographs that were more readily identified to species than those obtained under infrared light. However, a white flash may be more likely to frighten target animals, potentially affecting detection probabilities. Video footage achieved similar success rates to still cameras but required more processing time and computer memory. Placing two camera traps side by side achieved a higher success rate than using a single camera. Camera traps show considerable promise for monitoring invasive mammal control operations. Further research should address how best to standardise the size of each camera's field of view, maximise the probability that an animal encountering a camera trap will be detected, and eliminate visible or audible cues emitted by camera traps. PMID:23840790

  14. ISO camera array development status

    NASA Technical Reports Server (NTRS)

    Sibille, F.; Cesarsky, C.; Agnese, P.; Rouan, D.

    1989-01-01

    A short outline is given of the Infrared Space Observatory Camera (ISOCAM), one of the 4 instruments onboard the Infrared Space Observatory (ISO), with the current status of its two 32x32 arrays, an InSb charge injection device (CID) and a Si:Ga direct read-out (DRO), and the results of the in orbit radiation simulation with gamma ray sources. A tentative technique for the evaluation of the flat fielding accuracy is also proposed.

  15. Graphic design of pinhole cameras

    NASA Technical Reports Server (NTRS)

    Edwards, H. B.; Chu, W. P.

    1979-01-01

    The paper describes a graphic technique for the analysis and optimization of pinhole size and focal length. The technique is based on the use of the transfer function of optical elements described by Scott (1959) to construct the transfer function of a circular pinhole camera. This transfer function is the response of a component or system to a pattern of lines having a sinusoidally varying radiance at varying spatial frequencies. Some specific examples of graphic design are presented.
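
    A common closed-form stand-in for such an optimization balances geometric blur (≈ d, the pinhole diameter) against diffraction blur (≈ 2.44 λ f / d); the combined spot is smallest near d = √(2.44 λ f). The paper's graphic transfer-function method is more general; the simple criterion is sketched here for orientation:

```python
import math

def optimal_pinhole_diameter(focal_length_m, wavelength_m=550e-9):
    """Balance geometric blur (~d) against diffraction blur
    (~2.44 * lambda * f / d); the combined spot is smallest near
    d = sqrt(2.44 * lambda * f) (one common optimisation criterion)."""
    return math.sqrt(2.44 * wavelength_m * focal_length_m)

d = optimal_pinhole_diameter(0.1)  # 100 mm focal length, green light
print(round(d * 1000.0, 3), "mm")  # ~0.37 mm
```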

  16. Variable angle correlation spectroscopy

    SciTech Connect

    Lee, Y K

    1994-05-01

    In this dissertation, a novel nuclear magnetic resonance (NMR) technique, variable angle correlation spectroscopy (VACSY), is described and demonstrated with ¹³C nuclei in rapidly rotating samples. These experiments focus on one of the basic problems in solid state NMR: how to extract the wealth of information contained in the anisotropic component of the NMR signal while still maintaining spectral resolution. Analysis of the anisotropic spectral patterns from polycrystalline systems reveals information concerning molecular structure and dynamics, yet in all but the simplest of systems, the overlap of spectral patterns from chemically distinct sites renders the spectral analysis difficult if not impossible. One solution to this problem is to perform multi-dimensional experiments where the high-resolution, isotropic spectrum in one dimension is correlated with the anisotropic spectral patterns in the other dimensions. The VACSY technique incorporates the angle between the spinner axis and the static magnetic field as an experimental parameter that may be incremented during the course of the experiment to help correlate the isotropic and anisotropic components of the spectrum. The two-dimensional version of the VACSY experiment is used to extract the chemical shift anisotropy tensor values from multi-site organic molecules, study molecular dynamics in the intermediate time regime, and examine the ordering properties of partially oriented samples. The VACSY technique is then extended to three-dimensional experiments to study slow molecular reorientations in a multi-site polymer system.

  17. Calibration of angle standards

    NASA Astrophysics Data System (ADS)

    Henrique Brum Vieira, Luiz; Stone, Jack; Viliesid, Miguel; Gastaldi, Bruno R.; Przybylska, Joanna; Chaudhary, K. P.

    2015-01-01

    In 2000, a key comparison, CCL-K3 (optical polygon and angle blocks), was started, piloted by NMISA. Based on it, in 2007 the SIM metrological region started the SIM.L-K3 key comparison, piloted by INMETRO. The results of this regional (RMO) key comparison contribute to the Mutual Recognition Arrangement (MRA) between the national metrology institutes of the Metre Convention. It is linked with the CCL-K3 key comparison via laboratories that participated in both the CIPM and RMO comparisons; this common participation establishes the link between the comparisons and ensures the equivalence of national metrology institutes under the MRA. The SIM NMIs that took part in CCL-K3 were NIST, NRC and CENAM; however, NRC withdrew. GUM from Poland (EURAMET) and NPLI from India (APMP) were invited to participate in the SIM.L-K3 key comparison. The circulation of artefacts (a 12-face polygon and 4 angle blocks) started in 2008 and was completed in 2009. The main text appears in Appendix B of the BIPM key comparison database (kcdb.bipm.org). The final report has been peer-reviewed and approved for publication by the CCL, according to the provisions of the CIPM Mutual Recognition Arrangement (CIPM MRA).

  18. Glancing angle RF sheaths

    NASA Astrophysics Data System (ADS)

    D'Ippolito, D. A.; Myra, J. R.

    2013-10-01

    RF sheaths occur in tokamaks when ICRF waves encounter conducting boundaries. The sheath plays an important role in determining the efficiency of ICRF heating, the impurity influxes from the edge plasma, and the plasma-facing component damage. An important parameter in sheath theory is the angle θ between the equilibrium B field and the wall. Recent work with 1D and 2D sheath models has shown that the rapid variation of θ around a typical limiter can lead to enhanced sheath potentials and localized power deposition (hot spots) when the B field is near glancing incidence. The physics model used to obtain these results does not include some glancing-angle effects, e.g. possible modification of the angular dependence of the Child-Langmuir law and the role of the magnetic pre-sheath. Here, we report on calculations which explore these effects, with the goal of improving the fidelity of the rf sheath BC used in analytical and numerical calculations. Work supported by US DOE grants DE-FC02-05ER54823 and DE-FG02-97ER54392.

  19. Motion detection with camera shake

    NASA Astrophysics Data System (ADS)

    Kazui, Masato; Itoh, Masaya; Yaemori, Hiroki; Takauji, Hidenori; Kaneko, Shun'ichi

    2009-05-01

    A method is proposed for detecting an object's motion in images that suffer from camera shake or camera egomotion. The approach is based on edge orientation codes and on the entropy calculated from a histogram of the edge orientation codes; here, entropy is extended to spatio-temporal entropy. We consider that the spatio-temporal entropy calculated from time-series orientation codes can represent motion complexity, e.g., the motion of a pedestrian. Our method can reject false positives caused by camera shake or background motion. Before the motion filtering, object candidates are detected by a frame-subtraction-based method. After the filtering, over-detected candidates are evaluated using the spatio-temporal entropy, and false positives are rejected by a threshold. The method rejected 79-96% of all false positives in road-roller and escalator scenes. The motion filtering decreased the detection rate somewhat because of motion coherency or the small apparent motion of a target; in such cases, a tracking method such as a particle filter or mean-shift tracker needs to be introduced. The running speed of our method is 32 to 46 ms per frame with a 160×120-pixel image on an Intel Pentium 4 CPU at 2.8 GHz, which we consider fast enough for real-time detection. In addition, our method can be used as pre-processing for classifiers based on support vector machines or boosting.
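    The core of the filtering step can be sketched as follows. This is an assumption-laden reconstruction, not the authors' implementation: quantized edge orientation codes collected over several frames are pooled into a histogram whose entropy distinguishes the coherent shift produced by camera shake from complex object motion.

```python
import numpy as np

def spatiotemporal_entropy(codes, n_codes=16):
    """Entropy of edge orientation codes pooled over a spatio-temporal
    block.

    codes: integer array of shape (T, H, W) holding quantized edge
    orientation codes in 0..n_codes-1.  High entropy suggests complex
    motion (e.g. a pedestrian); low entropy suggests the single
    dominant code left by a uniform shift such as camera shake.
    """
    hist = np.bincount(codes.ravel(), minlength=n_codes).astype(float)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# A uniformly shifting background yields one dominant code
static = np.zeros((5, 8, 8), dtype=int)
# Complex motion spreads mass over many codes
rng = np.random.default_rng(0)
complex_motion = rng.integers(0, 16, size=(5, 8, 8))

e_static = spatiotemporal_entropy(static)
e_complex = spatiotemporal_entropy(complex_motion)
```

    Thresholding this entropy value is what separates over-detected candidates from true moving objects in the description above.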

  20. The MVACS Robotic Arm Camera

    NASA Astrophysics Data System (ADS)

    Keller, H. U.; Hartwig, H.; Kramm, R.; Koschny, D.; Markiewicz, W. J.; Thomas, N.; Fernades, M.; Smith, P. H.; Reynolds, R.; Lemmon, M. T.; Weinberg, J.; Marcialis, R.; Tanner, R.; Boss, B. J.; Oquest, C.; Paige, D. A.

    2001-08-01

    The Robotic Arm Camera (RAC) is one of the key instruments newly developed for the Mars Volatiles and Climate Surveyor payload of the Mars Polar Lander. This lightweight instrument employs a front lens with variable focus range and takes images at distances from 11 mm (image scale 1:1) to infinity. Color images with a resolution of better than 50 μm can be obtained to characterize the Martian soil. Spectral information of nearby objects is retrieved through illumination with blue, green, and red lamp sets. The design and performance of the camera are described in relation to the science objectives and operation. The RAC uses the same CCD detector array as the Surface Stereo Imager and shares the readout electronics with this camera. The RAC is mounted at the wrist of the Robotic Arm and can characterize the contents of the scoop, the samples of soil fed to the Thermal Evolved Gas Analyzer, the Martian surface in the vicinity of the lander, and the interior of trenches dug out by the Robotic Arm. It can also be used to take panoramic images and to retrieve stereo information with an effective baseline surpassing that of the Surface Stereo Imager by about a factor of 3.

  1. Unassisted 3D camera calibration

    NASA Astrophysics Data System (ADS)

    Atanassov, Kalin; Ramachandra, Vikas; Nash, James; Goma, Sergio R.

    2012-03-01

    With the rapid growth of 3D technology, 3D image capture has become a critical part of the 3D feature set on mobile phones. 3D image quality is affected by the scene geometry as well as by on-device processing. An automatic 3D system usually assumes known camera poses established by factory calibration using a special chart. In real-life settings, pose parameters estimated by factory calibration can be invalidated by movements of the lens barrel due to shaking, focusing, or a camera drop. If any of these factors displaces the optical axes of either or both cameras, vertical disparity might exceed the maximum tolerable margin and the 3D user may experience eye strain or headaches. To make 3D capture more practical, one needs to consider unassisted calibration (on arbitrary scenes). In this paper, we propose an algorithm that relies on the detection and matching of keypoints between left and right images. Frames containing erroneous matches, along with frames with insufficiently rich keypoint constellations, are detected and discarded. Roll, pitch, yaw, and scale differences between left and right frames are then estimated. The algorithm's performance is evaluated in terms of the remaining vertical disparity compared with the maximum tolerable vertical disparity.
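    The evaluation metric above, remaining vertical disparity after estimating the inter-camera roll/scale/offset, can be sketched with a simplified linearized model on synthetic matched keypoints. This is illustrative only, not the paper's algorithm; the model, helper names, and data are assumptions.

```python
import numpy as np

def estimate_rectification(left_pts, right_pts):
    """Least-squares fit of a linearized model mapping right-image
    keypoints to left-image y-coordinates:
        y_l ~ a*x_r + b*y_r + ty
    where (a, b) absorb small roll and scale and ty is a vertical
    offset.  Points are (N, 2) arrays of matched keypoints.
    """
    x_r, y_r = right_pts[:, 0], right_pts[:, 1]
    A = np.column_stack([x_r, y_r, np.ones_like(x_r)])
    coeffs, *_ = np.linalg.lstsq(A, left_pts[:, 1], rcond=None)
    return coeffs  # (a, b, ty)

def vertical_disparity(left_pts, right_pts, coeffs):
    """Residual vertical disparity after applying the fitted model."""
    a, b, ty = coeffs
    y_pred = a * right_pts[:, 0] + b * right_pts[:, 1] + ty
    return left_pts[:, 1] - y_pred

# Synthetic check: the right frame is the left frame shifted down 3 px,
# a pure vertical-disparity error that the fit should fully remove.
rng = np.random.default_rng(1)
left = rng.uniform(0, 100, size=(40, 2))
right = left + np.array([0.0, 3.0])
c = estimate_rectification(left, right)
resid = vertical_disparity(left, right, c)
```

    In this sketch the residual collapses to machine precision because the synthetic error is exactly representable by the model; real keypoint matches would leave noise-level residuals, which is the quantity the paper evaluates.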

  2. Coaxial fundus camera for ophthalmology

    NASA Astrophysics Data System (ADS)

    de Matos, Luciana; Castro, Guilherme; Castro Neto, Jarbas C.

    2015-09-01

    A fundus camera for ophthalmology is a high-definition device that must provide low-light illumination of the human retina, high resolution at the retina, and reflection-free imaging. These constraints make its optical design very sophisticated, but the most difficult to satisfy are the reflection-free illumination and the final alignment, owing to the large number of non-coaxial optical components in the system. Reflections of the illumination, both in the objective and at the cornea, mask image quality, and poor alignment renders the sophisticated optical design useless. In this work we developed a fully coaxial optical system for a non-mydriatic fundus camera. Illumination is provided by an LED ring, coaxial with the optical system and composed of IR and visible LEDs; the illumination ring is projected by the objective lens onto the cornea. The objective, LED illuminator, and CCD lens are coaxial, making the final alignment easy to perform. The CCD-plus-capture-lens module is a CCTV camera with built-in autofocus and zoom, combined with a 175 mm focal-length doublet corrected for infinity, making the system easy to operate and very compact.

  3. Speckle Camera Imaging of the Planet Pluto

    NASA Astrophysics Data System (ADS)

    Howell, Steve B.; Horch, Elliott P.; Everett, Mark E.; Ciardi, David R.

    2012-10-01

    We have obtained optical wavelength (692 nm and 880 nm) speckle imaging of the planet Pluto and its largest moon Charon. Using our DSSI speckle camera attached to the Gemini North 8 m telescope, we collected high resolution imaging with an angular resolution of ~20 mas, a value at the Gemini-N telescope diffraction limit. We have produced for this binary system the first speckle reconstructed images, from which we can measure not only the orbital separation and position angle for Charon, but also the diameters of the two bodies. Our measurements of these parameters agree, within the uncertainties, with the current best values for Pluto and Charon. The Gemini-N speckle observations of Pluto are presented to illustrate the capabilities of our instrument and the robust production of high accuracy, high spatial resolution reconstructed images. We hope our results will suggest additional applications of high resolution speckle imaging for other objects within our solar system and beyond. Based on observations obtained at the Gemini Observatory, which is operated by the Association of Universities for Research in Astronomy, Inc., under a cooperative agreement with the National Science Foundation on behalf of the Gemini partnership: the National Science Foundation (United States), the Science and Technology Facilities Council (United Kingdom), the National Research Council (Canada), CONICYT (Chile), the Australian Research Council (Australia), Ministério da Ciência, Tecnologia e Inovação (Brazil) and Ministerio de Ciencia, Tecnología e Innovación Productiva (Argentina).

  4. Cooling the Dark Energy Camera instrument

    NASA Astrophysics Data System (ADS)

    Schmitt, R. L.; Cease, H.; DePoy, D.; Diehl, H. T.; Estrada, J.; Flaugher, B.; Kuhlmann, S.; Onal, Birce; Stefanik, A.

    2008-07-01

    DECam, the camera for the Dark Energy Survey (DES), is undergoing general design and component testing. For an overview see DePoy et al. in these proceedings; for a description of the imager, see Cease et al. in these proceedings. The CCD instrument will be mounted at the prime focus of the CTIO Blanco 4 m telescope. The instrument temperature will be 173 K with a heat load of 113 W. In similar applications, cooling CCD instruments at the prime focus has been accomplished by three general methods: liquid nitrogen reservoirs have been constructed to operate in any orientation, pulse tube cryocoolers have been used when tilt angles are limited, and Joule-Thomson or Stirling cryocoolers have been used with smaller heat loads. Gifford-McMahon cooling has been used at the Cassegrain focus but not at the prime focus. For DES, the combined requirements of high heat load, temperature stability, low vibration, operation in any orientation, liquid nitrogen cost, and limited available space led to the design of a pumped, closed-loop, circulating nitrogen system. At zenith the instrument will be twelve meters above the pump/cryocooler station. The cooling system is expected to have a 10,000-hour maintenance interval. This paper describes the engineering basis, including the thermal model, unbalanced forces, cooldown time, and the single- and two-phase flow models.

  5. Cooling the dark energy camera instrument

    SciTech Connect

    Schmitt, R.L.; Cease, H.; DePoy, D.; Diehl, H.T.; Estrada, J.; Flaugher, B.; Kuhlmann, S.; Onal, Birce; Stefanik, A.; /Fermilab

    2008-06-01

    DECam, the camera for the Dark Energy Survey (DES), is undergoing general design and component testing. For an overview see DePoy et al. in these proceedings; for a description of the imager, see Cease et al. in these proceedings. The CCD instrument will be mounted at the prime focus of the CTIO Blanco 4 m telescope. The instrument temperature will be 173 K with a heat load of 113 W. In similar applications, cooling CCD instruments at the prime focus has been accomplished by three general methods: liquid nitrogen reservoirs have been constructed to operate in any orientation, pulse tube cryocoolers have been used when tilt angles are limited, and Joule-Thomson or Stirling cryocoolers have been used with smaller heat loads. Gifford-McMahon cooling has been used at the Cassegrain focus but not at the prime focus. For DES, the combined requirements of high heat load, temperature stability, low vibration, operation in any orientation, liquid nitrogen cost, and limited available space led to the design of a pumped, closed-loop, circulating nitrogen system. At zenith the instrument will be twelve meters above the pump/cryocooler station. The cooling system is expected to have a 10,000-hour maintenance interval. This paper describes the engineering basis, including the thermal model, unbalanced forces, cooldown time, and the single- and two-phase flow models.

  6. Bio-inspired hemispherical compound eye camera

    NASA Astrophysics Data System (ADS)

    Xiao, Jianliang; Song, Young Min; Xie, Yizhu; Malyarchuk, Viktor; Jung, Inhwa; Choi, Ki-Joong; Liu, Zhuangjian; Park, Hyunsung; Lu, Chaofeng; Kim, Rak-Hwan; Li, Rui; Crozier, Kenneth B.; Huang, Yonggang; Rogers, John A.

    2014-03-01

    Compound eyes in arthropods demonstrate imaging characteristics distinct from human eyes, with a wide-angle field of view, low aberrations, high acuity to motion and an infinite depth of field. Artificial imaging systems with similar geometries and properties are of great interest for many applications. However, the challenges in building such systems with hemispherical, compound apposition layouts cannot be met through established planar sensor technologies and conventional optics. We present our recent progress in combining optics, materials, mechanics and integration schemes to build fully functional artificial compound eye cameras. Nearly full hemispherical shapes (about 160 degrees) with densely packed artificial ommatidia were realized. The number of ommatidia (180) is comparable to those of the eyes of fire ants and bark beetles. The devices combine elastomeric compound optical elements with deformable arrays of thin silicon photodetectors, which were fabricated in planar geometries and then integrated and elastically transformed into hemispherical shapes. Imaging results and quantitative ray-tracing-based simulations illustrate key features of operation. These general strategies seem to be applicable to other compound eye devices, such as those inspired by moths and lacewings (refracting superposition eyes), lobster and shrimp (reflecting superposition eyes), and houseflies (neural superposition eyes).

  7. 16 CFR 501.1 - Camera film.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 16 Commercial Practices 1 2014-01-01 2014-01-01 false Camera film. 501.1 Section 501.1 Commercial Practices FEDERAL TRADE COMMISSION RULES, REGULATIONS, STATEMENT OF GENERAL POLICY OR INTERPRETATION AND... 500 § 501.1 Camera film. Camera film packaged and labeled for retail sale is exempt from the...

  8. 16 CFR 501.1 - Camera film.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 16 Commercial Practices 1 2013-01-01 2013-01-01 false Camera film. 501.1 Section 501.1 Commercial Practices FEDERAL TRADE COMMISSION RULES, REGULATIONS, STATEMENT OF GENERAL POLICY OR INTERPRETATION AND... 500 § 501.1 Camera film. Camera film packaged and labeled for retail sale is exempt from the...

  9. 16 CFR 501.1 - Camera film.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 16 Commercial Practices 1 2012-01-01 2012-01-01 false Camera film. 501.1 Section 501.1 Commercial Practices FEDERAL TRADE COMMISSION RULES, REGULATIONS, STATEMENT OF GENERAL POLICY OR INTERPRETATION AND... 500 § 501.1 Camera film. Camera film packaged and labeled for retail sale is exempt from the...

  10. 16 CFR 501.1 - Camera film.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 16 Commercial Practices 1 2011-01-01 2011-01-01 false Camera film. 501.1 Section 501.1 Commercial... 500 § 501.1 Camera film. Camera film packaged and labeled for retail sale is exempt from the net... should be expressed, provided: (a) The net quantity of contents on packages of movie film and bulk...

  11. 16 CFR 501.1 - Camera film.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 16 Commercial Practices 1 2010-01-01 2010-01-01 false Camera film. 501.1 Section 501.1 Commercial... 500 § 501.1 Camera film. Camera film packaged and labeled for retail sale is exempt from the net... should be expressed, provided: (a) The net quantity of contents on packages of movie film and bulk...

  12. An auto-focusing CCD camera mount

    NASA Astrophysics Data System (ADS)

    Arbour, R. W.

    1994-08-01

    The traditional methods of focusing a CCD camera are either time consuming, difficult or, more importantly, indecisive. This paper describes a device designed to allow the observer to be confident that the camera will always be properly focused by sensing a selected star image and automatically adjusting the camera's focal position.

  13. 21 CFR 886.1120 - Ophthalmic camera.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 8 2014-04-01 2014-04-01 false Ophthalmic camera. 886.1120 Section 886.1120 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES OPHTHALMIC DEVICES Diagnostic Devices § 886.1120 Ophthalmic camera. (a) Identification. An ophthalmic camera is an AC-powered device...

  14. Silicone Contamination Camera Developed for Shuttle Payloads

    NASA Technical Reports Server (NTRS)

    1996-01-01

    On many shuttle missions, silicone contamination from unknown sources from within or external to the shuttle payload bay has been a chronic problem plaguing experiment payloads. There is currently a wide range of silicone usage on the shuttle. Silicones are used to coat the shuttle tiles to enhance their ability to shed rain, and over 100 kg of RTV 560 silicone is used to seal white tiles to the shuttle surfaces. Silicones are also used in electronic components, potting compounds, and thermal control blankets. Efforts to date to identify and eliminate the sources of silicone contamination have not been highly successful and have created much controversy. To identify the sources of silicone contamination on the space shuttle, the NASA Lewis Research Center developed a contamination camera. This specially designed pinhole camera utilizes low-Earth-orbit atomic oxygen to develop a picture that identifies sources of silicone contamination on shuttle-launched payloads. The volatile silicone species travel through the aperture of the pinhole camera, and since volatile silicone species lose their hydrocarbon functionalities under atomic oxygen attack, the silicone adheres to the substrate as SiO_x. This glassy deposit should be spatially arranged in the image of the sources of silicone contamination. To view the contamination image, one can use ultrasensitive thickness measurement techniques, such as scanning variable-angle ellipsometry, to map the surface topography of the camera's substrate. The demonstration of a functional contamination camera would resolve the controversial debate concerning the amount and location of contamination sources, would allow corrective actions to be taken, and would demonstrate a useful tool for contamination documentation on future shuttle payloads, with near negligible effect on cost and weight.

  15. Image quality testing of assembled IR camera modules

    NASA Astrophysics Data System (ADS)

    Winters, Daniel; Erichsen, Patrik

    2013-10-01

    Infrared (IR) camera modules for the LWIR (8-12 μm) that combine IR imaging optics with microbolometer focal plane array (FPA) sensors and readout electronics are becoming more and more a mass-market product. At the same time, steady improvements in sensor resolution in the higher-priced markets raise the requirements on the imaging performance of objectives and on the proper alignment between objective and FPA. This puts pressure on camera manufacturers and system integrators to assess the image quality of finished camera modules in a cost-efficient and automated way for quality control or during end-of-line testing. In this paper we present recent development work in the field of image quality testing of IR camera modules. This technology provides a wealth of additional information in contrast to more traditional test methods like the minimum resolvable temperature difference (MRTD), which gives only a subjective overall test result. Parameters that can be measured include image quality via the modulation transfer function (MTF), broadband or with various bandpass filters, on- and off-axis, and optical parameters such as effective focal length (EFL) and distortion. If the camera module allows refocusing of the optics, additional parameters like the best focus plane, image plane tilt, autofocus quality, chief ray angle, etc. can be characterized. Additionally, the homogeneity and response of the sensor with the optics can be characterized in order to calculate the appropriate tables for non-uniformity correction (NUC). The technology can also be used to control active alignment during mechanical assembly of optics to high-resolution sensors. Other important points discussed are the flexibility of the technology to test IR modules with different form factors and electrical interfaces and, last but not least, its suitability for fully automated measurements in mass production.

  16. A method for measuring the base angle of axicon lens based on chromatic dispersion

    NASA Astrophysics Data System (ADS)

    Zhang, Yunbo; Zeng, Aijun; Wang, Ying; Huang, Huijie

    2015-07-01

    A method for measuring the base angle of an axicon lens is presented. The method utilizes two coaxial laser beams with different wavelengths. When the two laser beams pass through the axicon lens, a small divergence angle arises between them as a result of chromatic dispersion. After being collected by an achromatic lens, the two beams generate two spots on an image sensor. The base angle can be determined from the distance between the two spots recorded by the sensor. Furthermore, the method can also be used to calculate the cone angle of the axicon lens.

  17. Pinhole Camera For Viewing Electron Beam Materials Processing

    NASA Astrophysics Data System (ADS)

    Rushford, M. C.; Kuzmenko, P. J.

    1986-10-01

    A very rugged, compact (4x4x10 inches), gas-purged "pinhole camera" has been developed for viewing electron beam materials processing (e.g., melting or vaporizing metal). The video image is computer-processed on an IBM PC, providing dimensional and temperature measurements of objects within the field of view. The pinhole-camera concept is similar to a TRW optics system for viewing into a coal combustor through a 2 mm hole, with gas purged through the hole to repel particulates from optical surfaces. In our system, light from the molten metal passes through the 2 mm pinhole, reflects off an aluminum-coated glass substrate, and passes through a window into a vacuum-tight container holding the camera and optics at atmospheric pressure. The mirror filters out X-rays, which pass through the Al layer and are absorbed in the glass mirror substrate. Since metallic coatings are usually reflective, image quality is not severely degraded by the small amounts of vapor that overcome the gas purge to reach the mirror; coating thicknesses of up to 2 microns can be tolerated, and the mirror is the only element needing occasional servicing. We used a telescope eyepiece as a convenient optical design, but with the traditional optical path reversed: the eyepiece images a scene through a small entrance aperture onto an image plane where a CCD camera is placed. Since the iris of the eyepiece is fixed and the scene intensity varies, it was necessary to employ a variable neutral-density filter for brightness control. Devices used for this purpose include a PLZT light valve from Motorola, mechanically rotated linear-polarizer sheets, and nematic liquid-crystal light valves; these were placed after the mirror and entrance aperture but before the lens to operate as a voltage-variable neutral-density filter. The molten metal surface temperature being viewed varies from 4000 to 1200 K, and the resulting intensity change (at 488 nm with 10 nm bandwidth) spans seven orders of magnitude.
This intensity variation is reduced by observing in a narrow band as far toward the red as high-intensity blooming will allow while still producing an observable picture. A three-eyepiece camera provides an image plane where photo-gray glass functions as a neutral-density filter only over the high-intensity portion of the image, thus reducing blooming. The system is enclosed in a water-cooled housing that can dissipate 15 W/cm2, keeping the camera below 40 degrees Celsius. Single frames of video output are acquired for feature enhancement and location by a Data Translation DT2803 image-processing board housed in an IBM PC.

  18. Switchable viewing angle display with a compact directional backlight and striped diffuser.

    PubMed

    Wang, Yi-Jun; Lu, Jian-Gang; Chao, Wei-Chung; Shieh, Han-Ping D

    2015-08-10

    A compact high-directionality backlight module combined with a striped diffuser is proposed to achieve an adjustable viewing angle for eco-display. The micro-prisms on the compact light guide plate guide the emitting rays to the normal viewing angle, whereas a set of striped diffusers scatter the rays to a wide viewing angle. View cones of ± 10° / ± 55° were obtained for narrow/wide viewing modes with 88% / 85% uniformity of spatial luminance, respectively. Compared with the conventional backlight, the optical efficiencies were increased by factors of 1.47 and 1.38 in narrow and wide viewing modes, respectively. In addition, only 5% of power consumption was needed when the backlight worked in private narrow viewing mode to maintain the same luminance as that of a conventional backlight. PMID:26367992

  19. Mini gamma camera, camera system and method of use

    DOEpatents

    Majewski, Stanislaw (Grafton, VA); Weisenberger, Andrew G. (Grafton, VA); Wojcik, Randolph F. (Yorktown, VA)

    2001-01-01

    A gamma camera comprising, in order from the front outer (gamma-ray-impinging) surface: 1) a collimator, 2) a scintillator layer, 3) a light guide, 4) an array of position-sensitive, high-resolution photomultiplier tubes, and 5) printed circuitry for receipt of the output of the photomultipliers. Also described is a system wherein the output supplied by the high-resolution, position-sensitive photomultiplier tubes is communicated to: a) a digitizer and b) a computer, where it is processed using advanced image processing techniques and a specific algorithm to calculate the center of gravity of any abnormality observed during imaging, and c) optional image display and telecommunications ports.

  20. A systematic evaluation of intraoral cameras.

    PubMed

    Johnson, L A

    1994-11-01

    Although intraoral cameras are being used with increasing frequency in dentistry, this is the first systematic evaluation of the technology. This paper summarizes the evaluation of five intraoral cameras and discusses techniques that practitioners can use in conducting their own evaluations. Additionally, issues specific to intraoral camera evaluations, including image quality, ease of use and the differences between analog and digital cameras, are summarized. Finally, a call is made for an ongoing series of intraoral camera evaluations that would provide practitioners with the information they need to make a purchase decision. PMID:7760160

  1. Passive Millimeter Wave Camera (PMMWC) at TRW

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Engineers at TRW, Redondo Beach, California, inspect the Passive Millimeter Wave Camera, a weather-piercing camera designed to see through fog, clouds, smoke and dust. Operating in the millimeter wave portion of the electromagnetic spectrum, the camera creates visual-like video images of objects, people, runways, obstacles and the horizon. A demonstration camera (shown in photo) has been completed and is scheduled for checkout tests and flight demonstration. Engineer (left) holds a compact, lightweight circuit board containing 40 complete radiometers, including antenna, monolithic millimeter wave integrated circuit (MMIC) receivers and signal processing and readout electronics that forms the basis for the camera's 1040-element focal plane array.

  2. Passive Millimeter Wave Camera (PMMWC) at TRW

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Engineers at TRW, Redondo Beach, California, inspect the Passive Millimeter Wave Camera, a weather-piercing camera designed to 'see' through fog, clouds, smoke and dust. Operating in the millimeter wave portion of the electromagnetic spectrum, the camera creates visual-like video images of objects, people, runways, obstacles and the horizon. A demonstration camera (shown in photo) has been completed and is scheduled for checkout tests and flight demonstration. Engineer (left) holds a compact, lightweight circuit board containing 40 complete radiometers, including antenna, monolithic millimeter wave integrated circuit (MMIC) receivers and signal processing and readout electronics that forms the basis for the camera's 1040-element focal plane array.

  3. HHEBBES! All sky camera system: status update

    NASA Astrophysics Data System (ADS)

    Bettonvil, F.

    2015-01-01

    A status update is given of the HHEBBES! all-sky camera system. HHEBBES!, an automatic camera for capturing bright meteor trails, is based on a DSLR camera and a liquid crystal chopper for measuring the angular velocity. The purposes of the system are (a) to recover meteorites and (b) to identify origin/parent bodies. In 2015, two new cameras were rolled out: BINGO! (like HHEBBES!, also in The Netherlands) and POgLED in Serbia. BINGO! is the first camera equipped with a longer focal-length fisheye lens, to further increase the accuracy. Several minor improvements have been made, and the data reduction pipeline was used for processing two prominent Dutch fireballs.

  4. Heterodyne Interferometer Angle Metrology

    NASA Technical Reports Server (NTRS)

    Hahn, Inseob; Weilert, Mark A.; Wang, Xu; Goullioud, Renaud

    2010-01-01

    A compact, high-resolution angle measurement instrument has been developed that is based on a heterodyne interferometer. Common-path heterodyne interferometer metrology is used to measure displacements of a reflective target surface. In the interferometer setup, an optical mask samples the measurement laser beam reflecting back from the target surface. Angular rotations around two orthogonal axes, in a plane perpendicular to the measurement-beam propagation direction, are determined simultaneously from the relative displacement measurements of the target surface. The device is used in a tracking telescope system where pitch and yaw measurements of a flat mirror were performed simultaneously with a sensitivity of 0.1 nrad per second and a measuring range of 0.15 mrad at a working distance on the order of a meter. The nonlinearity of the device was also measured to be less than one percent over the measurement range.
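    The underlying geometry, a tilt recovered from the relative displacement of sampled portions of the beam, reduces to a small-angle ratio. The sketch below is purely illustrative; the actual mask layout and baseline of the instrument are not given in the abstract, so the numbers are assumptions.

```python
def tilt_from_displacements(d1_m, d2_m, baseline_m):
    """Small-angle tilt of a flat mirror inferred from the relative
    displacement of two sampled spots a known baseline apart:
        theta ~ (d1 - d2) / baseline
    Valid only for tilts far below 1 rad, which holds at the
    sub-microradian scales quoted in the abstract.
    """
    return (d1_m - d2_m) / baseline_m

# Hypothetical example: a 1.5 nm differential displacement across a
# 10 mm baseline corresponds to a 150 nrad tilt
theta = tilt_from_displacements(1.5e-9, 0.0, 0.01)
```

    Measuring two orthogonal displacement differences simultaneously yields both pitch and yaw, as in the tracking-telescope application described above.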

  5. Gain Narrowing in Few-Atom Systems

    SciTech Connect

    Savels, Tom; Mosk, Allard P.; Lagendijk, Ad

    2007-03-09

    Using a density matrix approach, we study the simplest systems that display both gain and feedback: clusters of 2 to 5 atoms, one of which is pumped. The other atoms supply feedback through multiple scattering of light. We show that, if the atoms are in each other's near field, the system exhibits large gain narrowing and spectral mode redistribution. The observed phenomena are more pronounced if the feedback is enhanced. Our system is to our knowledge the simplest exactly solvable microscopic system which shows the approach to laser oscillation.

  6. The Flow of Gases in Narrow Channels

    NASA Technical Reports Server (NTRS)

    Rasmussen, R E H

    1951-01-01

    Measurements were made of the flow of gases through various narrow channels a few microns wide at average pressures from 0.00003 to 40 cm. Hg. The flow rate, defined as the product of pressure and volume rate of flow at unit pressure difference, first decreased linearly with decrease in mean pressure in the channel, in agreement with laminar-flow theory, reached a minimum when the mean path length was approximately equal to the channel width, and then increased to a constant value. The product of flow rate and square root of molecular number was approximately the same function of mean path length for all gases for a given channel.
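
    The crossover described above occurs when the molecular mean free path approaches the channel width (Knudsen number near 1). A minimal kinetic-theory sketch, using an assumed molecular diameter not taken from the paper:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def mean_free_path_m(pressure_pa, temp_k=293.0, molecule_diam_m=3.7e-10):
    """Kinetic-theory mean free path: lambda = kT / (sqrt(2) * pi * d^2 * p)."""
    return K_B * temp_k / (math.sqrt(2) * math.pi * molecule_diam_m**2 * pressure_pa)

def knudsen_number(pressure_pa, channel_width_m):
    """Kn ~ 1 marks the flow-rate minimum regime described in the abstract."""
    return mean_free_path_m(pressure_pa) / channel_width_m

# At atmospheric pressure the mean free path is tens of nanometres, so a
# few-micron channel is firmly in the laminar (Kn << 1) regime; lowering the
# pressure drives Kn through 1 and into free-molecular flow.
kn = knudsen_number(101325.0, 3e-6)
```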

  7. On the absolute calibration of SO2 cameras

    NASA Astrophysics Data System (ADS)

    Lübcke, P.; Bobrowski, N.; Illing, S.; Kern, C.; Alvarez Nieves, J. M.; Vogel, L.; Zielcke, J.; Delgado Granados, H.; Platt, U.

    2013-03-01

    Sulphur dioxide emission rate measurements are an important tool for volcanic monitoring and eruption risk assessment. The SO2 camera technique remotely measures volcanic emissions by analysing the ultraviolet absorption of SO2 in a narrow spectral window between 300 and 320 nm using solar radiation scattered in the atmosphere. The SO2 absorption is selectively detected by mounting band-pass interference filters in front of a two-dimensional, UV-sensitive CCD detector. One important step for correct SO2 emission rate measurements that can be compared with other measurement techniques is a correct calibration. This requires conversion from the measured optical density to the desired SO2 column density (CD). The conversion factor is most commonly determined by inserting quartz cells (cuvettes) with known amounts of SO2 into the light path. Another calibration method uses an additional narrow field-of-view Differential Optical Absorption Spectroscopy system (NFOV-DOAS), which measures the column density simultaneously in a small area of the camera's field-of-view. This procedure combines the very good spatial and temporal resolution of the SO2 camera technique with the more accurate column densities obtainable from DOAS measurements. This work investigates the uncertainty of results gained through the two commonly used, but quite different, calibration methods (DOAS and calibration cells). Measurements with three different instruments, an SO2 camera, a NFOV-DOAS system and an Imaging DOAS (I-DOAS), are presented. We compare the calibration-cell approach with the calibration from the NFOV-DOAS system. The respective results are compared with measurements from an I-DOAS to verify the calibration curve over the spatial extent of the image. The results show that calibration cells, while working fine in some cases, can lead to an overestimation of the SO2 CD by up to 60% compared with CDs from the DOAS measurements. 
Besides these errors of calibration, radiative transfer effects (e.g. light dilution, multiple scattering) can significantly influence the results of both instrument types. The measurements presented in this work were taken at Popocatépetl, Mexico, between 1 March 2011 and 4 March 2011. Average SO2 emission rates between 4.00 and 14.34 kg s⁻¹ were observed.
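
    The calibration step described above amounts to finding a conversion factor from measured optical density to SO2 column density. A minimal sketch of that fit, assuming a purely linear response (a simplification of the real procedure; all names are illustrative):

```python
def calibration_factor(tau_samples, cd_samples):
    """Least-squares slope through the origin mapping optical density tau
    to SO2 column density, from paired cell or NFOV-DOAS calibration data."""
    num = sum(t * c for t, c in zip(tau_samples, cd_samples))
    den = sum(t * t for t in tau_samples)
    return num / den

def tau_to_cd(tau_pixels, k):
    """Convert per-pixel optical densities to column densities."""
    return [k * t for t in tau_pixels]

# Three hypothetical calibration cells of known column density (molec/cm^2):
k = calibration_factor([0.1, 0.2, 0.3], [1.0e17, 2.0e17, 3.0e17])
```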

  8. Characterization of the Series 1000 Camera System

    SciTech Connect

    Kimbrough, J; Moody, J; Bell, P; Landen, O

    2004-04-07

    The National Ignition Facility requires a compact network addressable scientific grade CCD camera for use in diagnostics ranging from streak cameras to gated x-ray imaging cameras. Due to the limited space inside the diagnostic, an analog and digital input/output option in the camera controller permits control of both the camera and the diagnostic by a single Ethernet link. The system consists of a Spectral Instruments Series 1000 camera, a PC104+ controller, and power supply. The 4k by 4k CCD camera has a dynamic range of 70 dB with less than 14 electron read noise at a 1 MHz readout rate. The PC104+ controller includes 16 analog inputs, 4 analog outputs and 16 digital input/output lines for interfacing to diagnostic instrumentation. A description of the system and performance characterization is reported.

  9. Characterization of the series 1000 camera system

    SciTech Connect

    Kimbrough, J.R.; Moody, J.D.; Bell, P.M.; Landen, O.L.

    2004-10-01

    The National Ignition Facility requires a compact network addressable scientific grade charge coupled device (CCD) camera for use in diagnostics ranging from streak cameras to gated x-ray imaging cameras. Due to the limited space inside the diagnostic, an analog and digital input/output option in the camera controller permits control of both the camera and the diagnostic by a single Ethernet link. The system consists of a Spectral Instruments Series 1000 camera, a PC104+ controller, and power supply. The 4k by 4k CCD camera has a dynamic range of 70 dB with less than 14 electron read noise at a 1 MHz readout rate. The PC104+ controller includes 16 analog inputs, four analog outputs, and 16 digital input/output lines for interfacing to diagnostic instrumentation. A description of the system and performance characterization is reported.

  10. Gamma cameras--state of the art.

    PubMed

    Zimmerman, R E

    1979-01-01

    The growth of nuclear medicine has paralleled the development and acceptance of gamma cameras, which are stationary instruments that produce an image of an isotope distribution. The most successful gamma camera is based on a large, single crystal of the scintillator sodium iodide coupled to an array of photomultiplier tubes and electronic networks to determine the location of a gamma ray. Current cameras are capable of spatial resolution of 4.0 mm or better. Other types of gamma cameras have been developed, including the multi-crystal camera and a limited number of instruments using gas detectors and semiconductor detectors. The scintillation type of gamma camera will continue to be the dominant camera for the next 5 to 10 years, and will continue to gradually improve. Most significant improvements will result from application of digital electronics to position circuits. PMID:440176
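
    The position circuitry mentioned above, classic Anger arithmetic, estimates the event location as the signal-weighted centroid of the photomultiplier-tube positions. A simplified sketch (real cameras add energy windowing and linearity corrections; the layout below is illustrative):

```python
def anger_position(pmt_signals, pmt_coords):
    """Estimate the scintillation event position as the signal-weighted
    centroid of photomultiplier-tube coordinates (classic Anger logic)."""
    total = sum(pmt_signals)
    x = sum(s * xy[0] for s, xy in zip(pmt_signals, pmt_coords)) / total
    y = sum(s * xy[1] for s, xy in zip(pmt_signals, pmt_coords)) / total
    return x, y

# Four PMTs at the corners of a unit square; stronger signals pull the
# estimate toward the brighter tubes.
coords = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
signals = [1.0, 3.0, 1.0, 3.0]
x, y = anger_position(signals, coords)  # x = 0.75, y = 0.5
```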

  11. Rover mast calibration, exact camera pointing, and camera handoff for visual target tracking

    NASA Technical Reports Server (NTRS)

    Kim, Won S.; Ansar, Adnan I.; Steele, Robert D.

    2005-01-01

    This paper presents three technical elements that we have developed to improve the accuracy of visual target tracking for single-sol approach-and-instrument placement in future Mars rover missions. An accurate, straightforward method of rover mast calibration is achieved by using a total station, a camera calibration target, and four prism targets mounted on the rover. The method was applied to the Rocky8 rover mast calibration and yielded a 1.1-pixel rms residual error. Camera pointing requires inverse kinematic solutions for the mast pan and tilt angles such that the target image appears right at the center of the camera image. Two issues were raised. Mast camera frames are in general not parallel to the masthead base frame. Further, the optical axis of the camera model in general does not pass through the center of the image. Despite these issues, we derived non-iterative closed-form exact solutions, which were verified with Matlab routines. Actual camera pointing experiments over 50 random target image points yielded less than 1.3-pixel rms pointing error. Finally, a purely geometric method for camera handoff using stereo views of the target has been developed. Experimental test runs show less than 2.5 pixels error on the high-resolution Navcam for Pancam-to-Navcam handoff, and less than 4 pixels error on the lower-resolution Hazcam for Navcam-to-Hazcam handoff.
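
    For intuition, the idealized pan/tilt solution, assuming the camera frame is parallel to the masthead base frame and the optical axis passes through the image center (precisely the simplifications the paper's exact solutions avoid), reduces to two arctangents:

```python
import math

def pan_tilt_to_target(x, y, z):
    """Pan and tilt (rad) that point a camera's optical axis at a target
    (x, y, z) in an assumed masthead frame: x forward, y left, z up.
    This idealized model ignores the camera-to-masthead frame offsets
    and boresight misalignment that the paper solves for exactly."""
    pan = math.atan2(y, x)
    tilt = math.atan2(z, math.hypot(x, y))
    return pan, tilt

# Target up and to the left at equal offsets: both angles come out 45 degrees.
pan, tilt = pan_tilt_to_target(1.0, 1.0, math.sqrt(2.0))
```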

  12. Advanced camera image data acquisition system for Pi-of-the-Sky

    NASA Astrophysics Data System (ADS)

    Kwiatkowski, Maciej; Kasprowicz, Grzegorz; Pozniak, Krzysztof; Romaniuk, Ryszard; Wrochna, Grzegorz

    2008-11-01

    The paper describes a new generation of high-performance, remotely controlled CCD cameras designed for astronomical applications. A completely new camera PCB was designed, manufactured, tested and commissioned. The CCD chip was positioned differently than in the previous design, resulting in better performance of the astronomical video data acquisition system. The camera was built using a low-noise, 4-Mpixel CCD circuit by STA. The electronic circuit of the camera is highly parameterized, reconfigurable, and modular in comparison with the first-generation design, due to the application of open software solutions and an FPGA circuit, an Altera Cyclone EP1C6. New algorithms were implemented in the FPGA chip. The camera system uses the following electronic components: a Cypress CY7C68013A microcontroller (8051 core), an Analog Devices AD9826 image processor, a Realtek RTL8169s Gigabit Ethernet interface, Atmel AT45DB642 memory, and an ARM926EJ-S microprocessor (Atmel AT91SAM9260). Software for the camera, its remote control, and image data acquisition is based entirely on open-source platforms, using the ISI image interface, the V4L2 API, the AMBA AHB data bus, and the INDI protocol. The camera will be replicated in 20 pieces and is designed for continuous on-line, wide-angle observations of the sky in the Pi-of-the-Sky research program.

  13. NARROW-K-BAND OBSERVATIONS OF THE GJ 1214 SYSTEM

    SciTech Connect

    Colón, Knicole D.; Gaidos, Eric

    2013-10-10

    GJ 1214 is a nearby M dwarf star that hosts a transiting super-Earth-size planet, making this system an excellent target for atmospheric studies. Most studies find that the transmission spectrum of GJ 1214b is flat, which favors either a high-mean-molecular-weight or a cloudy/hazy hydrogen (H) rich atmosphere model. Photometry at short wavelengths (<0.7 μm) and in the K band can discriminate the most between these different atmosphere models for GJ 1214b, but current observations do not have sufficiently high precision. We present photometry of seven transits of GJ 1214b through a narrow K-band (2.141 μm) filter with the Wide Field Camera on the 3.8 m United Kingdom Infrared Telescope. Our photometric precision is typically 1.7 × 10⁻³ (for a single transit), comparable with other ground-based observations of GJ 1214b. We measure a planet-star radius ratio of 0.1158 ± 0.0013, which, along with other studies, also supports a flat transmission spectrum for GJ 1214b. Since this does not exclude a scenario where GJ 1214b has an H-rich envelope with heavy elements that are sequestered below a cloud/haze layer, we compare K-band observations with models of H₂ collision-induced absorption in an atmosphere for a range of temperatures. While we find no evidence for deviation from a flat spectrum (slope s = 0.0016 ± 0.0038), an H₂-dominated upper atmosphere (<60 mbar) cannot be excluded. More precise observations at <0.7 μm and in the K band, as well as a uniform analysis of all published data, would be useful for establishing more robust limits on atmosphere models for GJ 1214b.
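
    The quoted radius ratio maps directly to the transit depth, the fractional dimming the photometry measures:

```python
def transit_depth(radius_ratio):
    """Fractional flux drop during transit: depth = (Rp/Rs)^2."""
    return radius_ratio ** 2

# The measured ratio of 0.1158 implies a ~1.3% flux drop, which is why a
# per-transit photometric precision of ~1.7e-3 is sufficient to detect it.
depth = transit_depth(0.1158)
```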

  14. On the absolute calibration of SO2 cameras

    NASA Astrophysics Data System (ADS)

    Lübcke, P.; Bobrowski, N.; Illing, S.; Kern, C.; Alvarez Nieves, J. M.; Vogel, L.; Zielcke, J.; Delgado Granados, H.; Platt, U.

    2012-09-01

    Sulphur dioxide emission flux measurements are an important tool for volcanic monitoring and eruption risk assessment. The SO2 camera technique remotely measures volcanic emissions by analysing the ultraviolet absorption of SO2 in a narrow spectral window between 305 nm and 320 nm using solar radiation scattered in the atmosphere. The SO2 absorption is selectively detected by mounting band-pass interference filters in front of a two-dimensional, UV-sensitive CCD detector. While this approach is simple and delivers valuable insights into the two-dimensional SO2 distribution, absolute calibration has proven to be difficult. An accurate calibration of the SO2 camera (i.e., conversion from optical density to SO2 column density, CD) is crucial to obtain correct SO2 CDs and flux measurements that are comparable to other measurement techniques and can be used for volcanological applications. The most common approach for calibrating SO2 camera measurements is based on inserting quartz cells (cuvettes) containing known amounts of SO2 into the light path. It has been found, however, that reflections from the windows of the calibration cell can considerably affect the signal measured by the camera. Another possibility for calibration relies on performing simultaneous measurements in a small area of the camera's field-of-view (FOV) by a narrow-field-of-view Differential Optical Absorption Spectroscopy (NFOV-DOAS) system. This procedure combines the very good spatial and temporal resolution of the SO2 camera technique with the more accurate column densities obtainable from DOAS measurements. This work investigates the uncertainty of results gained through the two commonly used, but quite different calibration methods (DOAS and calibration cells). Measurements with three different instruments, an SO2 camera, a NFOV-DOAS system and an Imaging DOAS (IDOAS), are presented. We compare the calibration-cell approach with the calibration from the NFOV-DOAS system. 
The respective results are compared with measurements from an IDOAS to verify the calibration curve over the spatial extent of the image. Our results show that calibration cells can lead to an overestimation of the SO2 CD by up to 60% compared with CDs from the DOAS measurements. Besides these errors of calibration, radiative transfer effects (e.g. light dilution, multiple scattering) can significantly influence the results of both instrument types. These effects can lead to an even more significant overestimation or, depending on the measurement conditions, an underestimation of the true CD. Previous investigations found that possible errors can be more than an order of magnitude. However, the spectral information from the DOAS measurements allows these radiative transfer effects to be corrected. The measurements presented in this work were taken at Popocatépetl, Mexico, between 1 March 2011 and 4 March 2011. Average SO2 emission rates between 4.00 kg s⁻¹ and 14.34 kg s⁻¹ were observed.

  15. Equilibrium contact angle or the most-stable contact angle?

    PubMed

    Montes Ruiz-Cabello, F. J.; Rodríguez-Valverde, M. A.; Cabrerizo-Vílchez, M. A.

    2014-04-01

    It is well established that the equilibrium contact angle in a thermodynamic framework is an "unattainable" contact angle. Instead, the most-stable contact angle, obtained by applying mechanical stimuli to the system, is experimentally accessible. Monitoring the susceptibility of a sessile drop to a mechanical stimulus enables identification of the most stable drop configuration within the practical range of contact angle hysteresis. Two different stimuli may be used with sessile drops: mechanical vibration and tilting. The drop most stable against vibration should show an unchanging contact angle, whereas the drop most stable against gravity should show the highest resistance to sliding. After the corresponding mechanical stimulus, once the excited drop configuration is examined, the focus is on the contact angle of the initial drop configuration. This methodology requires extensive mapping of the static drop configurations with different stable contact angles. The most-stable contact angle, together with the advancing and receding contact angles, completes the description of the physically realizable configurations of a solid-liquid system. Since the most-stable contact angle is energetically significant, it may be used in the Wenzel, Cassie or Cassie-Baxter equations accordingly, or for surface energy evaluation. PMID:24140073

  16. Narrow field electromagnetic sensor system and method

    DOEpatents

    McEwan, T.E.

    1996-11-19

    A narrow field electromagnetic sensor system and method of sensing a characteristic of an object provide the capability to realize a characteristic of an object such as density, thickness, or presence, for any desired coordinate position on the object. One application is imaging. The sensor can also be used as an obstruction detector or an electronic trip wire with a narrow field without the disadvantages of impaired performance when exposed to dirt, snow, rain, or sunlight. The sensor employs a transmitter for transmitting a sequence of electromagnetic signals in response to a transmit timing signal, a receiver for sampling only the initial direct RF path of the electromagnetic signal while excluding all other electromagnetic signals in response to a receive timing signal, and a signal processor for processing the sampled direct RF path electromagnetic signal and providing an indication of the characteristic of an object. Usually, the electromagnetic signal is a short RF burst and the obstruction must provide a substantially complete eclipse of the direct RF path. By employing time-of-flight techniques, a timing circuit controls the receiver to sample only the initial direct RF path of the electromagnetic signal while not sampling indirect path electromagnetic signals. The sensor system also incorporates circuitry for ultra-wideband spread spectrum operation that reduces interference to and from other RF services while allowing co-location of multiple electronic sensors without the need for frequency assignments. 12 figs.
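
    The time-of-flight gating described above can be sketched as computing when the one-way direct path should arrive and opening the receive sample window only around that instant; the 1 ns window width below is an assumed figure for illustration, not from the patent:

```python
C = 299_792_458.0  # speed of light, m/s

def direct_path_gate_s(separation_m, window_s=1e-9):
    """Receive-gate open/close times (seconds after transmit) that accept
    only the one-way direct path between transmitter and receiver,
    rejecting the later-arriving indirect (multipath) signals."""
    t_direct = separation_m / C
    return t_direct, t_direct + window_s

# A 3 m transmitter-receiver separation puts the direct path at ~10 ns.
open_t, close_t = direct_path_gate_s(3.0)
```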

  17. Narrow field electromagnetic sensor system and method

    DOEpatents

    McEwan, Thomas E.

    1996-01-01

    A narrow field electromagnetic sensor system and method of sensing a characteristic of an object provide the capability to realize a characteristic of an object such as density, thickness, or presence, for any desired coordinate position on the object. One application is imaging. The sensor can also be used as an obstruction detector or an electronic trip wire with a narrow field without the disadvantages of impaired performance when exposed to dirt, snow, rain, or sunlight. The sensor employs a transmitter for transmitting a sequence of electromagnetic signals in response to a transmit timing signal, a receiver for sampling only the initial direct RF path of the electromagnetic signal while excluding all other electromagnetic signals in response to a receive timing signal, and a signal processor for processing the sampled direct RF path electromagnetic signal and providing an indication of the characteristic of an object. Usually, the electromagnetic signal is a short RF burst and the obstruction must provide a substantially complete eclipse of the direct RF path. By employing time-of-flight techniques, a timing circuit controls the receiver to sample only the initial direct RF path of the electromagnetic signal while not sampling indirect path electromagnetic signals. The sensor system also incorporates circuitry for ultra-wideband spread spectrum operation that reduces interference to and from other RF services while allowing co-location of multiple electronic sensors without the need for frequency assignments.

  18. Studies of narrow autoionizing resonances in gadolinium

    SciTech Connect

    Bushaw, Bruce A.; Nörtershäuser, W.; Blaum, K.; Wendt, Klaus

    2003-06-30

    The autoionization (AI) spectrum of gadolinium between the first and second limits has been investigated by triple-resonance excitation with high-resolution cw lasers. A large number of narrow AI resonances have been observed and assigned total angular momentum J values. The resonances are further divided into members of AI Rydberg series converging to the second limit or other "interloping" levels. Fine structure in the Rydberg series has been identified and interpreted in terms of Jc·j coupling. A number of detailed studies have been performed on the interloping resonances: these include lifetime determination by lineshape analysis, isotope shifts, hyperfine structure, and photoionization saturation parameters. The electronic structure of the interloping levels is discussed in terms of these studies. Linewidths generally decrease with increasing total angular momentum, and the J = 7 resonances are extremely narrow, with Lorentzian widths ranging from <1 MHz up to 157 MHz. The strongest resonances are found to have cross-sections of ~10⁻¹² cm² and photoionization can be saturated with powers available from cw diode lasers.

  19. The Tololo All Sky Camera

    NASA Astrophysics Data System (ADS)

    Smith, R.; Walker, D.; Schwarz, H. E.

    An all sky camera using commercial parts has been assembled at Cerro Tololo Inter-American Observatory to monitor a wide range of atmospheric effects of interest to astronomers, including thin cirrus and contrails which are otherwise invisible on moonless nights, air glow variations, light pollution in the Sodium and Mercury bands, and aircraft lights. To enhance the visibility of clouds, raw images are divided by the corresponding clear sky image, inferred from images taken at the corresponding sidereal time on previous nights. Frames are taken every 30 seconds so that subsequent animation is smooth enough for the eye to pick out motion of clouds at or below the noise floor.
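
    The clear-sky division described above can be sketched as a per-pixel ratio of the raw frame to the reference frame for the same sidereal time: stars largely cancel, while pixels brighter than the reference flag cloud or contrail. A minimal sketch with nested lists standing in for image arrays:

```python
def cloud_enhance(raw, clear_reference, eps=1e-6):
    """Divide each pixel of a raw all-sky frame by the clear-sky reference
    taken at the same sidereal time; ratios above 1 indicate added light
    (cloud, contrail, aircraft), ratios below 1 indicate obscuration."""
    return [[p / max(r, eps) for p, r in zip(row_p, row_r)]
            for row_p, row_r in zip(raw, clear_reference)]

raw = [[10.0, 20.0], [30.0, 15.0]]
ref = [[10.0, 10.0], [30.0, 30.0]]
ratio = cloud_enhance(raw, ref)  # [[1.0, 2.0], [1.0, 0.5]]
```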

  20. Disequilibrium dihedral angles in dolerite sills

    USGS Publications Warehouse

    Holness, Marian B.; Richardson, Chris; Helz, Rosalind T.

    2012-01-01

    The geometry of clinopyroxene-plagioclase-plagioclase junctions in mafic rocks, measured by the median dihedral angle Θcpp, is created during solidification. In the solidifying Kilauea Iki (Hawaii) lava lake, the wider junctions between plagioclase grains are the first to be filled by pyroxene, followed by the narrower junctions. The final Θcpp, attained when all clinopyroxene-plagioclase-plagioclase junctions are formed, is 78° in the upper crust of the lake, and 85° in the lower solidification front. Θcpp in the 3.5-m-thick Traigh Bhàn na Sgùrra sill (Inner Hebrides) is everywhere 78°. In the Whin Sill (northern England, 38 m thick) and the Portal Peak sill (Antarctica, 129 m thick), Θcpp varies symmetrically, with the lowest values at the margins. The 266-m-thick Basement Sill (Antarctica) has an asymmetric variation of Θcpp, attributed to a complex filling history. The chilled margins of the Basement Sill are partially texturally equilibrated, with high Θcpp. The plagioclase grain size in the two widest sills varies asymmetrically, with the coarsest rocks found in the upper third. Both Θcpp and average grain size are functions of model crystallization times. Θcpp increases from 78° to a maximum of ~100° as the crystallization time increases from 1 to 500 yr. Because the use of grain size as a measure of crystallization time depends on an estimate of crystal growth rates, dihedral angles provide a more direct proxy for cooling rates in dolerites.

  1. High-speed measurement of nozzle swing angle of rocket engine based on monocular vision

    NASA Astrophysics Data System (ADS)

    Qu, Yufu; Yang, Haijuan

    2015-02-01

    A nozzle angle measurement system based on monocular vision is proposed to achieve high-speed, non-contact angle measurement of a rocket engine nozzle. The measurement system consists of two illumination sources, a lens, a target board with spots, a high-speed camera, an image acquisition card and a PC. The target board with spots was fixed on the end of the rocket engine nozzle. The image of the target board, moving along with the rocket engine nozzle swing, was captured by the high-speed camera and transferred to the PC by the image acquisition card. A data processing algorithm was then used to obtain the swing angle of the engine nozzle. Experiments show that the accuracy of the swing angle measurement was 0.2° and the measurement frequency was up to 500 Hz.

  2. Optimization of sharp and viewing-angle-independent structural color

    NASA Astrophysics Data System (ADS)

    Hsu, Chia Wei; Miller, Owen D.; Johnson, Steven G.; Soljačić, Marin

    2015-04-01

    Structural coloration produces some of the most brilliant colors in nature and has many applications. However, the two competing properties of narrow bandwidth and broad viewing angle have not been achieved simultaneously in previous studies. Here, we use numerical optimization to discover geometries where a sharp 7% bandwidth in scattering is achieved, yet the peak wavelength varies less than 1%, and the peak height and peak width vary less than 6% over broad viewing angles (0° to 90°) under directional illumination. Our model system consists of dipole scatterers arranged into several rings; interference among the scattered waves is optimized to yield the wavelength-selective and angle-insensitive response. Such designs can be useful for the recently proposed transparent displays that are based on wavelength-selective scattering.

  3. Limbus Impact on Off-angle Iris Degradation

    SciTech Connect

    Karakaya, Mahmut; Barstow, Del R; Santos-Villalobos, Hector J; Thompson, Joseph W; Bolme, David S; Boehnen, Chris Bensing

    2013-01-01

    The accuracy of iris recognition depends on the quality of data capture and is negatively affected by several factors such as angle, occlusion, and dilation. Off-angle iris recognition is a new research focus in biometrics that tries to address several issues including corneal refraction, complex 3D iris texture, and blur. In this paper, we present an additional significant challenge that degrades the performance of off-angle iris recognition systems, called the limbus effect. The limbus is the region at the border of the cornea where the cornea joins the sclera. The limbus is a semitransparent tissue that occludes a side portion of the iris plane. The amount of occluded iris texture on the side nearest the camera increases as the image acquisition angle increases. Without considering the role of the limbus effect, it is difficult to design an accurate off-angle iris recognition system. To the best of our knowledge, this is the first work that investigates the limbus effect in detail from a biometrics perspective. Based on results from real images and simulated experiments with real iris texture, the limbus effect increases the Hamming distance score between frontal and off-angle iris images by 0.05 to 0.2, depending on the limbus height.

  4. Optimum Projection Angle for Attaining Maximum Distance in a Soccer Punt Kick

    PubMed Central

    Linthorne, Nicholas P.; Patel, Dipesh S.

    2011-01-01

    To produce the greatest horizontal distance in a punt kick the ball must be projected at an appropriate angle. Here, we investigated the optimum projection angle that maximises the distance attained in a punt kick by a soccer goalkeeper. Two male players performed many maximum-effort kicks using projection angles of between 10° and 90°. The kicks were recorded by a video camera at 100 Hz and a 2D biomechanical analysis was conducted to obtain measures of the projection velocity, projection angle, projection height, ball spin rate, and foot velocity at impact. The players' optimum projection angle was calculated by substituting mathematical equations for the relationships between the projection variables into the equations for the aerodynamic flight of a soccer ball. The calculated optimum projection angles were in agreement with the players' preferred projection angles (40° and 44°). In projectile sports even a small dependence of projection velocity on projection angle is sufficient to produce a substantial shift in the optimum projection angle away from 45°. In the punt kicks studied here, the optimum projection angle was close to 45° because the projection velocity of the ball remained almost constant across all projection angles. This result is in contrast to throwing and jumping for maximum distance, where the projection velocity the athlete is able to achieve decreases substantially with increasing projection angle and so the optimum projection angle is well below 45°. Key points: The optimum projection angle that maximizes the distance of a punt kick by a soccer goalkeeper is about 45°. The optimum projection angle is close to 45° because the projection velocity of the ball is almost the same at all projection angles. This result is in contrast to throwing and jumping for maximum distance, where the optimum projection angle is well below 45° because the projection velocity the athlete is able to achieve decreases substantially with increasing projection angle. 
PMID:24149315
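
    The role of a constant projection velocity can be seen in the drag-free range formula R = v² sin(2θ)/g, which peaks at 45°; the paper's full model adds aerodynamics and an angle-dependent velocity, so this is only the baseline sketch:

```python
import math

def punt_range(v0, angle_deg, g=9.81):
    """Drag-free range of a projectile launched from ground level:
    R = v0^2 * sin(2*theta) / g. With v0 held constant across angles,
    the maximum falls at 45 degrees."""
    return v0 ** 2 * math.sin(math.radians(2 * angle_deg)) / g

# Scan integer angles from 10 to 90 degrees at an assumed 25 m/s launch speed.
best_angle = max(range(10, 91), key=lambda a: punt_range(25.0, a))
```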

  5. Digital cameras with designs inspired by the arthropod eye.

    PubMed

    Song, Young Min; Xie, Yizhu; Malyarchuk, Viktor; Xiao, Jianliang; Jung, Inhwa; Choi, Ki-Joong; Liu, Zhuangjian; Park, Hyunsung; Lu, Chaofeng; Kim, Rak-Hwan; Li, Rui; Crozier, Kenneth B; Huang, Yonggang; Rogers, John A

    2013-05-01

    In arthropods, evolution has created a remarkably sophisticated class of imaging systems, with a wide-angle field of view, low aberrations, high acuity to motion and an infinite depth of field. A challenge in building digital cameras with the hemispherical, compound apposition layouts of arthropod eyes is that essential design requirements cannot be met with existing planar sensor technologies or conventional optics. Here we present materials, mechanics and integration schemes that afford scalable pathways to working, arthropod-inspired cameras with nearly full hemispherical shapes (about 160 degrees). Their surfaces are densely populated by imaging elements (artificial ommatidia), which are comparable in number (180) to those of the eyes of fire ants (Solenopsis fugax) and bark beetles (Hylastes nigrinus). The devices combine elastomeric compound optical elements with deformable arrays of thin silicon photodetectors into integrated sheets that can be elastically transformed from the planar geometries in which they are fabricated to hemispherical shapes for integration into apposition cameras. Our imaging results and quantitative ray-tracing-based simulations illustrate key features of operation. These general strategies seem to be applicable to other compound eye devices, such as those inspired by moths and lacewings (refracting superposition eyes), lobster and shrimp (reflecting superposition eyes), and houseflies (neural superposition eyes). PMID:23636401

  6. 2. Photocopied July 1971 from photostat Jordan Narrows Folder #1, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. Photocopied July 1971 from photostat Jordan Narrows Folder #1, Engineering Department, Utah Power and Light Co., Salt Lake City, Utah. JORDAN NARROWS STATION. PLAN AND SECTION. - Salt Lake City Water & Electrical Power Company, Jordan Narrows Hydroelectric Plant, Jordan River, Riverton, Salt Lake County, UT

  7. Promoting L2 Vocabulary Learning through Narrow Reading

    ERIC Educational Resources Information Center

    Kang, Eun Young

    2015-01-01

    Krashen (2004) has advocated that narrow reading, i.e., reading a series of texts addressing one specific topic, is an effective method to grow vocabulary. While narrow reading has been championed to have many advantages for L2 vocabulary learning, there remains a relative dearth of empirical studies that test the impact of narrow reading on L2…

  9. 33 CFR 117.561 - Kent Island Narrows.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 33 Navigation and Navigable Waters 1 2010-07-01 2010-07-01 false Kent Island Narrows. 117.561... DRAWBRIDGE OPERATION REGULATIONS Specific Requirements Maryland 117.561 Kent Island Narrows. The draw of the U.S. Route 50/301 bridge, mile 1.0, Kent Island Narrows, operates as follows: (a) From November...

  10. 33 CFR 117.561 - Kent Island Narrows.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 33 Navigation and Navigable Waters 1 2013-07-01 2013-07-01 false Kent Island Narrows. 117.561... DRAWBRIDGE OPERATION REGULATIONS Specific Requirements Maryland 117.561 Kent Island Narrows. The draw of the U.S. Route 50/301 bridge, mile 1.0, Kent Island Narrows, operates as follows: (a) From November...

  11. 33 CFR 117.561 - Kent Island Narrows.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 33 Navigation and Navigable Waters 1 2011-07-01 2011-07-01 false Kent Island Narrows. 117.561... DRAWBRIDGE OPERATION REGULATIONS Specific Requirements Maryland 117.561 Kent Island Narrows. The draw of the U.S. Route 50/301 bridge, mile 1.0, Kent Island Narrows, operates as follows: (a) From November...

  12. 33 CFR 117.561 - Kent Island Narrows.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 33 Navigation and Navigable Waters 1 2014-07-01 2014-07-01 false Kent Island Narrows. 117.561... DRAWBRIDGE OPERATION REGULATIONS Specific Requirements Maryland 117.561 Kent Island Narrows. The draw of the U.S. Route 50/301 bridge, mile 1.0, Kent Island Narrows, operates as follows: (a) From November...

  13. 33 CFR 117.561 - Kent Island Narrows.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 33 Navigation and Navigable Waters 1 2012-07-01 2012-07-01 false Kent Island Narrows. 117.561... DRAWBRIDGE OPERATION REGULATIONS Specific Requirements Maryland 117.561 Kent Island Narrows. The draw of the U.S. Route 50/301 bridge, mile 1.0, Kent Island Narrows, operates as follows: (a) From November...

  14. A frequency selective bolometer camera for measuring millimeter spectral energy distributions

    NASA Astrophysics Data System (ADS)

    Logan, Daniel William

    2009-06-01

    Bolometers are the most sensitive detectors for measuring millimeter and submillimeter wavelength astrophysical signals. Cameras composed of arrays of bolometers have already made significant contributions to the field of astronomy. A challenge for bolometer cameras is obtaining observations at multiple wavelengths. Traditionally, observing in multiple bands requires a partial disassembly of the instrument to replace bandpass filters, a task which prevents immediate spectral interrogation of a source. More complex cameras have been constructed to observe in several bands using beam splitters and dichroic filters, but the added complexity leads to physically larger instruments with reduced efficiencies. The SPEctral Energy Distribution camera (SPEED) is a new type of bolometer camera designed to efficiently observe in multiple wavebands without the need for excess bandpass filters and beam splitters. SPEED is a ground-based millimeter-wave bolometer camera designed to observe at 2.1, 1.3, 1.1, and 0.85 mm simultaneously. SPEED makes use of a new type of bolometer, the frequency selective bolometer (FSB), to observe all of the wavebands within each of the camera's four pixels. FSBs incorporate frequency selective dipole surfaces as absorbing elements, allowing each detector to absorb a single, narrow band of radiation and pass all other radiation with low loss. Each FSB also contains a superconducting transition-edge sensor (TES) that acts as a sensitive thermistor for measuring the temperature of the FSB. This thesis describes the development of the SPEED camera and FSB detectors. The design of the detectors used in the instrument is described, as well as the general optical performance of frequency selective dipole surfaces. Laboratory results on both the optical and thermal properties of millimeter-wave FSBs are also presented. The SPEED instrument and its components are highlighted, and the optical design of the optics that couple SPEED to the Heinrich Hertz Telescope is given. This thesis concludes with an introduction to the jiggle-mapping data analysis of bolometer instruments like SPEED.

  15. CCD Camera Lens Interface for Real-Time Theodolite Alignment

    NASA Technical Reports Server (NTRS)

    Wake, Shane; Scott, V. Stanley, III

    2012-01-01

    Theodolites are a common instrument in the testing, alignment, and building of various systems ranging from a single optical component to an entire instrument. They provide a precise way to measure horizontal and vertical angles. They can be used to align multiple objects in a desired way at specific angles. They can also be used to reference a specific location or orientation of an object that has moved. Some systems may require a small margin of error in position of components. A theodolite can assist with accurately measuring and/or minimizing that error. The technology is an adapter for a CCD camera with lens to attach to a Leica Wild T3000 Theodolite eyepiece that enables viewing on a connected monitor, and thus can be utilized with multiple theodolites simultaneously. This technology removes a substantial part of human error by relying on the CCD camera and monitors. It also allows image recording of the alignment, and therefore provides a quantitative means to measure such error.

  16. Cerebellopontine Angle Epidermoids

    PubMed Central

    Doyle, Karen Jo; De la Cruz, Antonio

    1996-01-01

    Epidermoids, or congenital cholesteatomas, constitute about 0.2% to 1.5% of intracranial tumors, and 3% to 5% of tumors of the cerebellopontine angle (CPA). We review the surgical management of CPA epidermoids in 13 patients at the House Ear Clinic for the years 1978 to 1993. There were seven male and six female patients, ranging in age from 27 to 59 years (average, 40 years). Tumors ranged in size from 3.5 cm to 7.0 cm, and the surgical approach was tailored to the tumor extent and location. All patients complained at presentation of unilateral hearing loss, and nine had poor speech discrimination (less than 50%) preoperatively. Serviceable hearing was preserved in two patients. Two patients presented with facial nerve symptoms, and four cases had postoperative permanent facial nerve paralysis (House-Brackmann Grade V or VI). There were no surgical deaths. Four patients required second surgeries to remove residual cholesteatoma. Compared with prior series, we describe a higher rate of total tumor removal, as well as a higher rate of second operations, indicating a more aggressive approach to these lesions. PMID:17170950

  17. An attentive multi-camera system

    NASA Astrophysics Data System (ADS)

    Napoletano, Paolo; Tisato, Francesco

    2014-03-01

    Intelligent multi-camera systems that integrate computer vision algorithms are not error-free, so both false positive and false negative detections need to be reviewed by a specialized human operator. Traditional multi-camera systems usually include a control center with a wall of monitors displaying video from each camera in the network. However, as the number of cameras increases, switching from one camera to another becomes hard for a human operator. In this work we propose a new method that dynamically selects and displays the content of one video camera from all the available contents in the multi-camera system. The proposed method is based on a computational model of human visual attention that integrates top-down and bottom-up cues. To our knowledge, this is the first work that uses a model of human visual attention for dynamic selection of the camera view in a multi-camera system. The proposed method has been tested in a given scenario and has demonstrated its effectiveness with respect to other methods and manually generated ground truth. Effectiveness was evaluated in terms of the number of correct best views generated by the method with respect to the camera views manually chosen by a human operator.

  18. Multiview panoramic cameras using mirror pyramids.

    PubMed

    Tan, Kar-Han; Hua, Hong; Ahuja, Narendra

    2004-07-01

    A mirror pyramid consists of a set of planar mirror faces arranged around an axis of symmetry and inclined to form a pyramid. By strategically positioning a number of conventional cameras around a mirror pyramid, the viewpoints of the cameras' mirror images can be located at a single point within the pyramid and their optical axes pointed in different directions to effectively form a virtual camera with a panoramic field of view. Mirror pyramid-based panoramic cameras have a number of attractive properties, including single-viewpoint imaging, high resolution, and video rate capture. It is also possible to place multiple viewpoints within a single mirror pyramid, yielding compact designs for simultaneous multiview panoramic video rate imaging. Nalwa [4] first described some of the basic ideas behind mirror pyramid cameras. In this paper, we analyze the general class of multiview panoramic cameras, provide a method for designing these cameras, and present experimental results using a prototype we have developed to validate single-pyramid multiview designs. We first give a description of mirror pyramid cameras, including the imaging geometry, and investigate the relationship between the placement of viewpoints within the pyramid and the cameras' field of view (FOV), using simulations to illustrate the concepts. A method for maximizing sensor utilization in a mirror pyramid-based multiview panoramic camera is also presented. Images acquired using the experimental prototype for two viewpoints are shown. PMID:18579952
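As a side note on the imaging geometry described above: the virtual viewpoint contributed by each planar mirror face is simply the mirror image of the physical camera's center of projection across that face's plane. A minimal sketch of that reflection (all coordinates invented for illustration; this is not the paper's design procedure):

```python
import numpy as np

def reflect_point(p, plane_point, plane_normal):
    """Reflect point p across the plane through plane_point with normal plane_normal.
    Reflecting a camera's center of projection this way gives the virtual
    viewpoint contributed by one mirror face of the pyramid."""
    n = np.asarray(plane_normal, dtype=float)
    n = n / np.linalg.norm(n)          # unit normal
    p = np.asarray(p, dtype=float)
    d = np.dot(p - np.asarray(plane_point, dtype=float), n)
    return p - 2.0 * d * n             # subtract twice the signed distance

# Example: camera center at (0, 0, 1), mirror face lying in the z = 0 plane.
virtual = reflect_point([0.0, 0.0, 1.0], [0.0, 0.0, 0.0], [0.0, 0.0, 1.0])
print(virtual)  # [ 0.  0. -1.]
```

Positioning several cameras so that their reflected centers coincide at one point inside the pyramid is what yields the single-viewpoint panoramic behavior described in the abstract.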

  19. Light field panorama by a plenoptic camera

    NASA Astrophysics Data System (ADS)

    Xue, Zhou; Baboulaz, Loic; Prandoni, Paolo; Vetterli, Martin

    2013-03-01

    The consumer-grade plenoptic camera Lytro draws a lot of interest from both academia and industry. However, its low resolution in both the spatial and angular domains prevents it from being used for fine and detailed light field acquisition. This paper proposes to use a plenoptic camera as an image scanner and to perform light field stitching to increase the size of the acquired light field data. We consider a simplified plenoptic camera model comprising a pinhole camera moving behind a thin lens. Based on this model, we describe how to perform light field acquisition and stitching under two different scenarios: by camera translation, or by camera translation and rotation. In both cases, we assume the camera motion to be known. In the case of camera translation, we show how the acquired light fields should be resampled to increase the spatial range and ultimately obtain a wider field of view. In the case of camera translation and rotation, the camera motion is calculated such that the light fields can be directly stitched and extended in the angular domain. Simulation results verify our approach and demonstrate the potential of the motion model for further light field applications such as registration and super-resolution.

  20. Optimising camera traps for monitoring small mammals.

    PubMed

    Glen, Alistair S; Cockburn, Stuart; Nichols, Margaret; Ekanayake, Jagath; Warburton, Bruce

    2013-01-01

    Practical techniques are required to monitor invasive animals, which are often cryptic and occur at low density. Camera traps have potential for this purpose, but may have problems detecting and identifying small species. A further challenge is how to standardise the size of each camera's field of view so capture rates are comparable between different places and times. We investigated the optimal specifications for a low-cost camera trap for small mammals. The factors tested were 1) trigger speed, 2) passive infrared vs. microwave sensor, 3) white vs. infrared flash, and 4) still photographs vs. video. We also tested a new approach to standardise each camera's field of view. We compared the success rates of four camera trap designs in detecting and taking recognisable photographs of captive stoats (Mustela erminea), feral cats (Felis catus) and hedgehogs (Erinaceus europaeus). Trigger speeds of 0.2-2.1 s captured photographs of all three target species unless the animal was running at high speed. The camera with a microwave sensor was prone to false triggers, and often failed to trigger when an animal moved in front of it. A white flash produced photographs that were more readily identified to species than those obtained under infrared light. However, a white flash may be more likely to frighten target animals, potentially affecting detection probabilities. Video footage achieved similar success rates to still cameras but required more processing time and computer memory. Placing two camera traps side by side achieved a higher success rate than using a single camera. Camera traps show considerable promise for monitoring invasive mammal control operations. Further research should address how best to standardise the size of each camera's field of view, maximise the probability that an animal encountering a camera trap will be detected, and eliminate visible or audible cues emitted by camera traps. PMID:23840790

  1. Second-Generation Multi-Angle Imaging Spectroradiometer

    NASA Technical Reports Server (NTRS)

    Macenka, Steven; Hovland, Larry; Preston, Daniel; Zellers, Brian; Downing, Kevin

    2004-01-01

    A report discusses an early phase in the development of the MISR-2 C, a second, improved version of the Multi-angle Imaging SpectroRadiometer (MISR), which has been in orbit around the Earth aboard NASA's Terra spacecraft since 1999. Like the MISR, the MISR-2 would contain a pushbroom array of nine charge-coupled-device (CCD) cameras: one aimed at the nadir and the others aimed at different angles sideways from the nadir. The major improvements embodied in the MISR-2 would be the following: A new folded-reflective-optics design would render the MISR-2 only a third as massive as the MISR. Smaller filters and electronic circuits would enable a reduction in volume to a sixth of that of the MISR. The MISR-2 would generate images in two infrared spectral bands in addition to the blue, green, red, and near-infrared spectral bands of the MISR. Miniature polarization filters would be incorporated to add a polarization-sensing capability. Calibration would be performed nonintrusively by use of a gimbaled tenth camera. The main accomplishment thus far has been the construction of an extremely compact all-reflective-optics CCD camera to demonstrate feasibility.

  2. Toward the camera rain gauge

    NASA Astrophysics Data System (ADS)

    Allamano, P.; Croci, A.; Laio, F.

    2015-03-01

    We propose a novel technique based on the quantitative detection of rain intensity from images, i.e., from pictures taken in rainy conditions. The method is fully analytical and based on the fundamentals of camera optics. A rigorous statistical framing of the technique allows one to obtain the rain rate estimates in terms of expected values and associated uncertainty. We show that the method can be profitably applied to real rain events, and we obtain promising results with errors of the order of ±25%. A precise quantification of the method's accuracy will require a more systematic and long-term comparison with benchmark measures. The significant step forward with respect to standard rain gauges lies in the possibility of retrieving measurements at very high temporal resolution (e.g., 30 measurements per minute) at a very low cost. Prospective applications include the possibility of dramatically increasing the spatial density of rain observations by exporting the technique to crowdsourced pictures of rain acquired with cameras and smartphones.

  3. Angle-resolved scattering spectroscopy of explosives using an external cavity quantum cascade laser

    SciTech Connect

    Suter, Jonathan D.; Bernacki, Bruce E.; Phillips, Mark C.

    2012-04-01

    We investigate angle-resolved scattering from solid explosive residues on a car door for non-contact sensing geometries. Scattered light from a mid-infrared external cavity quantum cascade laser tuned between 7 and 8 microns was detected both with a sensitive single-point detector and with a hyperspectral imaging camera. Spectral scattering phenomena are discussed, and possibilities for hyperspectral imaging at large scattering angles are outlined.

  4. Dynamic-angle spinning and double rotation of quadrupolar nuclei

    SciTech Connect

    Mueller, K.T. (California Univ., Berkeley, CA. Dept. of Chemistry)

    1991-07-01

    Nuclear magnetic resonance (NMR) spectroscopy of quadrupolar nuclei is complicated by the coupling of the electric quadrupole moment of the nucleus to local variations in the electric field. The quadrupolar interaction is a useful source of information about local molecular structure in solids, but it tends to broaden resonance lines causing crowding and overlap in NMR spectra. Magic-angle spinning, which is routinely used to produce high resolution spectra of spin-1/2 nuclei like carbon-13 and silicon-29, is incapable of fully narrowing resonances from quadrupolar nuclei when anisotropic second-order quadrupolar interactions are present. Two new sample-spinning techniques are introduced here that completely average the second-order quadrupolar coupling. Narrow resonance lines are obtained and individual resonances from distinct nuclear sites are identified. In dynamic-angle spinning (DAS) a rotor containing a powdered sample is reoriented between discrete angles with respect to high magnetic field. Evolution under anisotropic interactions at the different angles cancels, leaving only the isotropic evolution of the spin system. In the second technique, double rotation (DOR), a small rotor spins within a larger rotor so that the sample traces out a complicated trajectory in space. The relative orientation of the rotors and the orientation of the larger rotor within the magnetic field are selected to average both first- and second-order anisotropic broadening. The theory of quadrupolar interactions, coherent averaging theory, and motional narrowing by sample reorientation are reviewed with emphasis on the chemical shift anisotropy and second-order quadrupolar interactions experienced by half-odd integer spin quadrupolar nuclei. The DAS and DOR techniques are introduced and illustrated with application to common quadrupolar systems such as sodium-23 and oxygen-17 nuclei in solids.
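The DAS condition described above can be made concrete: with equal time spent at the two rotor angles, the time-averaged second- and fourth-rank Legendre polynomials P2(cos θ) and P4(cos θ) must both vanish. Writing a = cos²θ1 and b = cos²θ2 reduces the pair of conditions to a quadratic; a short sketch recovering the well-known equal-time DAS angle pair (≈ 37.38° and 79.19°):

```python
import math

# For equal dwell times at the two DAS angles, the averages of P2(cos t) and
# P4(cos t) must both vanish.  With a = cos^2(t1), b = cos^2(t2):
#   P2 condition: 3(a + b) - 2 = 0                ->  a + b = 2/3
#   P4 condition: 35(a^2 + b^2) - 30(a + b) + 6 = 0  ->  a^2 + b^2 = 2/5
s = 2.0 / 3.0                        # a + b
q = (s * s - 2.0 / 5.0) / 2.0        # ab, from (a+b)^2 = (a^2+b^2) + 2ab
disc = math.sqrt(s * s - 4.0 * q)    # discriminant of t^2 - s t + q = 0
a = (s + disc) / 2.0
b = (s - disc) / 2.0
t1 = math.degrees(math.acos(math.sqrt(a)))
t2 = math.degrees(math.acos(math.sqrt(b)))
print(round(t1, 2), round(t2, 2))  # 37.38 79.19
```

Unequal dwell times give other valid angle pairs; the equal-time pair above is the one most often quoted for DAS.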

  5. Focal lengths of Venus Monitoring Camera from limb locations

    NASA Astrophysics Data System (ADS)

    Limaye, Sanjay S.; Markiewicz, W. J.; Krauss, R.; Ignatiev, N.; Roatsch, T.; Matz, K. D.

    2015-08-01

    The Venus Monitoring Camera (VMC) carried by European Space Agency's Venus Express orbiter (Svedhem et al., 2007) consists of four optical units, each with a separate filter casting an image on a single CCD (Markiewicz et al., 2007a, 2007b). The desire to capture as much of the planet as possible in a single frame during the spacecraft's 24 h, 0.84 eccentricity orbit led to optics with an 18° field of view. Analysis of Venus images obtained by the VMC indicated that the computed limb radius and altitude of haze layers were somewhat inconsistent with prior knowledge and expectations. Possible causes include errors in the knowledge of image geometry, misalignment of the optic axis from the pointing direction, and optical distortion. These were explored and eliminated, leaving deviations from the ground-based, pre-flight estimates of the focal lengths as the most likely reason. We use the location of the planet's limb to estimate the focal length of each camera, using images taken when the orbiter was more than 20,000 km from the planet's center. The method relies on the limb radius being constant at least over a small range of solar zenith angles. We were able to achieve better estimates of the focal lengths for all four cameras and also to estimate small offsets to the boresight alignment. An outcome of this analysis is the finding that the slant unit optical depth varies more rapidly with solar zenith angle in the afternoon than in the morning, with the lowest values at local noon. A variation of this level is also observed with latitude. Both are indicative of overlying haze above the clouds, and the morning/afternoon asymmetry suggests different photochemical processes in the destruction and production of the haze.
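The limb-based estimate rests on simple pinhole geometry: from range d, a planet of radius R subtends a half-angle α = asin(R/d), which projects to a circle of radius f·tan(α) on the detector when the planet's center lies on the optical axis. A minimal sketch of inverting that relation (all numbers hypothetical, not actual VMC parameters):

```python
import math

def focal_length_from_limb(limb_radius_px, pixel_pitch_mm, planet_radius_km, range_km):
    """Estimate focal length (mm) from the apparent limb radius in an image.

    Pinhole model: the limb subtends a half-angle alpha = asin(R / d) as seen
    from the camera, and projects to a circle of radius rho = f * tan(alpha)
    on the detector when the planet's center lies on the optical axis.
    """
    alpha = math.asin(planet_radius_km / range_km)
    rho_mm = limb_radius_px * pixel_pitch_mm
    return rho_mm / math.tan(alpha)

# Hypothetical numbers for illustration: limb circle of 200 px radius,
# 0.01 mm pixels, Venus radius ~6052 km, range 40,000 km from planet center.
f = focal_length_from_limb(200.0, 0.01, 6052.0, 40000.0)
print(round(f, 2))
```

In practice the limb radius itself (the cloud-top altitude) is an additional unknown, which is why the paper leans on its constancy over a small range of solar zenith angles.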

  6. Driven polymer translocation through a narrow pore.

    PubMed Central

    Lubensky, D K; Nelson, D R

    1999-01-01

    Motivated by experiments in which a polynucleotide is driven through a proteinaceous pore by an electric field, we study the diffusive motion of a polymer threaded through a narrow channel with which it may have strong interactions. We show that there is a range of polymer lengths in which the system is approximately translationally invariant, and we develop a coarse-grained description of this regime. From this description, general features of the distribution of times for the polymer to pass through the pore may be deduced. We also introduce a more microscopic model. This model provides a physically reasonable scenario in which, as in experiments, the polymer's speed depends sensitively on its chemical composition, and even on its orientation in the channel. Finally, we point out that the experimental distribution of times for the polymer to pass through the pore is much broader than expected from simple estimates, and speculate on why this might be. PMID:10512806
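The broad passage-time distributions discussed above are easy to reproduce in a toy model: even a driven 1-D random walk gives a first-passage-time spread around the naive length/drift estimate. A sketch (all parameters invented; this is a caricature, not the paper's coarse-grained model):

```python
import random

def passage_time(length=50, bias=0.6, rng=random):
    """First-passage time of a biased 1-D random walk from 0 to `length`,
    a toy stand-in for field-driven translocation through a pore."""
    x, t = 0, 0
    while x < length:
        x += 1 if rng.random() < bias else -1
        x = max(x, 0)  # reflecting boundary: assume the chain cannot back out
        t += 1
    return t

random.seed(1)
times = [passage_time() for _ in range(2000)]
mean = sum(times) / len(times)
std = (sum((t - mean) ** 2 for t in times) / len(times)) ** 0.5
print(round(mean, 1), round(std, 1))
```

With drift velocity 2·bias − 1 = 0.2 per step, the mean passage time is near length/0.2 = 250 steps, but the distribution has a substantial width, illustrating why simple drift estimates understate the spread.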

  7. Robotic chair at steep and narrow stairways

    NASA Astrophysics Data System (ADS)

    Imazato, Masahiro; Yamaguchi, Masahiro; Moromugi, Shunji; Ishimatsu, Takakazu

    2007-12-01

    A robotic chair has been developed to support the mobility of elderly and disabled people living in houses with steep and narrow stairways. To deal with this mobility problem, the robotic chair has a compact, original configuration: it moves vertically by actuation of electric cylinders and horizontally by push-pull operation from a caregiver. For safe navigation, every action of the chair is checked by the operator. Up-and-down motions of the chair on the stairway are executed through combinations of motor and cylinder actuations. The performance of the robotic chair was evaluated through two kinds of experiments, which confirmed its capabilities.

  8. Active Brownian motion in a narrow channel

    NASA Astrophysics Data System (ADS)

    Ao, X.; Ghosh, P. K.; Li, Y.; Schmid, G.; Hänggi, P.; Marchesoni, F.

    2014-12-01

    We review recent advances in rectification control of artificial microswimmers, also known as Janus particles, diffusing along narrow, periodically corrugated channels. The swimmer self-propulsion mechanism is modeled so as to incorporate a nonzero torque (propulsion chirality). We first summarize the effects of chirality on the autonomous current of microswimmers freely diffusing in channels of different geometries. In particular, left-right and upside-down asymmetric channels are shown to exhibit different transport properties. We then report new results on the dependence of the diffusivity of chiral microswimmers on the channel geometry and their own self-propulsion mechanism. The self-propulsion torque turns out to play a key role as a transport control parameter.

  9. Ultra-narrow metallic armchair graphene nanoribbons

    NASA Astrophysics Data System (ADS)

    Kimouche, Amina; Ervasti, Mikko M.; Drost, Robert; Halonen, Simo; Harju, Ari; Joensuu, Pekka M.; Sainio, Jani; Liljeroth, Peter

    2015-12-01

    Graphene nanoribbons (GNRs)--narrow stripes of graphene--have emerged as promising building blocks for nanoelectronic devices. Recent advances in bottom-up synthesis have allowed production of atomically well-defined armchair GNRs with different widths and doping. While all experimentally studied GNRs have exhibited wide bandgaps, theory predicts that every third armchair GNR (widths of N=3m+2, where m is an integer) should be nearly metallic with a very small bandgap. Here, we synthesize the narrowest possible GNR belonging to this family (five carbon atoms wide, N=5). We study the evolution of the electronic bandgap and orbital structure of GNR segments as a function of their length using low-temperature scanning tunnelling microscopy and density-functional theory calculations. GNRs as short as 5 nm already reach almost metallic behaviour, with a ~100 meV bandgap. Finally, we show that defects (kinks) in the GNRs do not strongly modify their electronic structure.
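The width rule quoted above (armchair ribbons with N = 3m + 2 are nearly metallic) is easy to encode; a toy classifier following the standard tight-binding family rule:

```python
def armchair_gnr_family(n):
    """Classify an N-atom-wide armchair GNR by the tight-binding width rule:
    widths N = 3m + 2 are (nearly) metallic, the others are semiconducting.
    (Illustrative only; real gaps also depend on edge effects and length.)"""
    return "nearly metallic" if n % 3 == 2 else "semiconducting"

# The ribbon synthesized in this work is five atoms wide (N = 5, i.e. m = 1):
for n in range(3, 9):
    print(n, armchair_gnr_family(n))
```

Only every third width falls in the metallic family, which is why the N=5 ribbon is singled out as the narrowest member.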

  10. Indium nitride: A narrow gap semiconductor

    SciTech Connect

    Wu, J.; Walukiewicz, W.; Yu, K.M.; Ager III, J.W.; Haller, E.E.; Lu, H.; Schaff, W.J.

    2002-08-14

    The optical properties of wurtzite InN grown on sapphire substrates by molecular-beam epitaxy have been characterized by optical absorption, photoluminescence, and photomodulated reflectance techniques. All these three characterization techniques show an energy gap for InN between 0.7 and 0.8 eV, much lower than the commonly accepted value of 1.9 eV. The photoluminescence peak energy is found to be sensitive to the free electron concentration of the sample. The peak energy exhibits a very weak hydrostatic pressure dependence and a small, anomalous blueshift with increasing temperature. The bandgap energies of In-rich InGaN alloys were found to be consistent with the narrow gap of InN. The bandgap bowing parameter was determined to be 1.43 eV in InGaN.
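The bowing parameter reported above plugs into the standard quadratic interpolation for alloy bandgaps. A sketch for In_x Ga_(1-x) N, where the endpoint gaps (0.77 eV for InN following this work, a nominal 3.42 eV for GaN) are illustrative assumptions:

```python
def ingan_bandgap(x, eg_inn=0.77, eg_gan=3.42, bowing=1.43):
    """Bandgap (eV) of In_x Ga_(1-x) N from the standard bowing relation
    Eg(x) = x*Eg(InN) + (1-x)*Eg(GaN) - b*x*(1-x).
    b = 1.43 eV follows the abstract; the endpoint gaps are nominal values."""
    return x * eg_inn + (1 - x) * eg_gan - bowing * x * (1 - x)

for x in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(x, round(ingan_bandgap(x), 3))
```

The bowing term pulls the alloy gap below the linear (Vegard-like) interpolation between the endpoints, most strongly at x = 0.5.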

  11. Ultra-narrow metallic armchair graphene nanoribbons.

    PubMed

    Kimouche, Amina; Ervasti, Mikko M; Drost, Robert; Halonen, Simo; Harju, Ari; Joensuu, Pekka M; Sainio, Jani; Liljeroth, Peter

    2015-01-01

    Graphene nanoribbons (GNRs)--narrow stripes of graphene--have emerged as promising building blocks for nanoelectronic devices. Recent advances in bottom-up synthesis have allowed production of atomically well-defined armchair GNRs with different widths and doping. While all experimentally studied GNRs have exhibited wide bandgaps, theory predicts that every third armchair GNR (widths of N=3m+2, where m is an integer) should be nearly metallic with a very small bandgap. Here, we synthesize the narrowest possible GNR belonging to this family (five carbon atoms wide, N=5). We study the evolution of the electronic bandgap and orbital structure of GNR segments as a function of their length using low-temperature scanning tunnelling microscopy and density-functional theory calculations. GNRs as short as 5 nm already reach almost metallic behaviour, with a ~100 meV bandgap. Finally, we show that defects (kinks) in the GNRs do not strongly modify their electronic structure. PMID:26658960

  12. Nondecaying Hydrodynamic Interactions along Narrow Channels

    NASA Astrophysics Data System (ADS)

    Misiunas, Karolis; Pagliara, Stefano; Lauga, Eric; Lister, John R.; Keyser, Ulrich F.

    2015-07-01

    Particle-particle interactions are of paramount importance in every multibody system as they determine the collective behavior and coupling strength. Many well-known interactions such as electrostatic, van der Waals, or screened Coulomb interactions, decay exponentially or with negative powers of the particle spacing r. Similarly, hydrodynamic interactions between particles undergoing Brownian motion decay as 1/r in bulk, and are assumed to decay in small channels. Such interactions are ubiquitous in biological and technological systems. Here we confine two particles undergoing Brownian motion in narrow, microfluidic channels and study their coupling through hydrodynamic interactions. Our experiments show that the hydrodynamic particle-particle interactions are distance independent in these channels. This finding is of fundamental importance for the interpretation of experiments where dense mixtures of particles or molecules diffuse through finite length, water-filled channels or pore networks.

  13. [Differential diagnosis of a narrow QRS tachycardia].

    PubMed

    Lewalter, Thorsten

    2015-09-01

    The differential diagnosis of a narrow QRS tachycardia requires knowledge of the patient's clinical data on the one hand and, on the other, a systematic step-by-step analysis of the electrocardiogram (ECG), which is the most successful approach. Apart from the question of regularity or irregularity of the QRS complexes, the presence and detection of P waves is also important. The P wave timing in relation to the preceding and following QRS complexes, as well as the numerical relationship of P waves to QRS complexes, allows a well-founded suspected diagnosis to be reached in most cases. Even the differentiation between atrioventricular (AV) nodal reentrant tachycardia (AVNRT) and orthodromic AV reentrant tachycardia (AVRT) via an accessory pathway is possible in most cases from the surface ECG. Obviously, there are constellations which need an invasive electrophysiological procedure for a definitive diagnosis. PMID:26287273
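The stepwise reasoning sketched in the abstract (QRS regularity, P-wave visibility, the P:QRS ratio, and P-wave timing) can be caricatured as a decision helper. This is a teaching illustration only, with simplified labels and thresholds of my own choosing and no clinical validity:

```python
def narrow_qrs_hint(regular, p_visible, p_per_qrs=1.0, rp_relation="long"):
    """Illustrative (non-clinical) sketch of stepwise narrow-QRS ECG reasoning:
    each branch mirrors one of the criteria named in the abstract.
    All labels and cutoffs are simplified teaching assumptions."""
    if not regular:
        return "consider atrial fibrillation or flutter with variable conduction"
    if p_visible and p_per_qrs > 1.0:
        return "consider atrial flutter or atrial tachycardia with block"
    if not p_visible or rp_relation == "very short":
        return "consider AVNRT (P wave hidden in or just after the QRS)"
    if rp_relation == "short":
        return "consider orthodromic AVRT via an accessory pathway"
    return "consider sinus or atrial tachycardia (long RP)"

print(narrow_qrs_hint(regular=True, p_visible=False))
```

A real work-up layers clinical context, vagal maneuvers, and, where needed, invasive electrophysiology on top of any such surface-ECG reasoning.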

  14. Our "idées fixes" in rhinoplasty: the naso-frontal angle and the alar lateral crus.

    PubMed

    Fontana, A M; Muti, E

    2001-01-01

    An examination of the naso-frontal angle shows that a cranial or caudal shift in position affects the dorsum length. The fundamental aspect characterizing the naso-frontal angle is its width: it may be normal, too wide, or too narrow. It is easier to modify a too-narrow angle than a too-wide one. We show surgical techniques, emphasizing their method applied to the over-wide angle. The second "idée fixe" is the morphology of the alar lateral crus. Every variety of this structure has a peculiar property of shape and relationship with other elements that predetermines the therapeutic action. Diagrams and photographs are shown to describe the surgical techniques and the clinical results. There is also a demonstration of the method for alar lateral crus resection, which we call "the long keel shape," since it resembles the shape of a sailboat keel. PMID:11322390

  15. Dependence of astigmatism, far-field pattern, and spectral envelope width on active layer thickness of gain guided lasers with narrow stripe geometry

    SciTech Connect

    Mamine, T.

    1984-06-15

    The effects of active layer thickness on the astigmatism, the angular width of the far-field pattern parallel to the junction, and the spectral envelope width of a gain guided laser with a narrow stripe geometry have been investigated analytically and experimentally. It is concluded that a large level of astigmatism, a narrow far-field pattern width, and a rapid convergence of the spectral envelope width are inherent to gain guided lasers with thin active layers.

  16. A New Development Of The Method Of Calculating The Approximating Circle In Rotating Mirror Framing Camera

    NASA Astrophysics Data System (ADS)

    Zhao, Yuejin; Luo, Wenbi

    1990-01-01

    This paper derives a formula that calculates the parameters of two approximating circles simultaneously. With the help of a computer and optimization techniques, we calculate the parameters of the two approximating circles under constraints on the angle between the optical axis of the relay lens and the reflected optical axis of the rotating mirror, on the uniformity of the photographic frequency, and on the uniformity of the magnification of the relay lens. Rotating-mirror high-speed cameras have many advantages, such as high efficiency, high resolution, and convenience of use, and are therefore found in most high-speed process laboratories. In designing such a camera, some calculation methods need to be improved. Weighing the effect of each factor on image quality, this paper combines the advantages of several methods of selecting the approximating-circle parameters of a framing camera and seeks a more reasonable way to determine them. The method keeps the optical axis of the relay lens consistent with the reflected optical axis of the rotating mirror over the working angle, and also keeps the defocus between the approximating circle and the actual image plane as small as possible. Considering these two aspects together with the uniformity of the photographic frequency and of the relay-lens magnification, the paper derives the relations between the relay-lens approximating circle and the focal-plane approximating circle using relay-lens imaging. Determining the two approximating circles simultaneously improves the image quality of points both on and off the axis.

  17. A state observer for using a slow camera as a sensor for fast control applications

    NASA Astrophysics Data System (ADS)

    Gahleitner, Reinhard; Schagerl, Martin

    2013-03-01

    This contribution concerns a problem that often arises in vision-based control, when a camera is used as a sensor for a fast control application, or more precisely, when the sample rate of the control loop is higher than the frame rate of the camera. In control applications for mechanical axes, e.g. in robotics or automated production, a camera with some image processing can serve as a sensor to detect positions or angles. The sample time in these applications is typically in the range of a few milliseconds or less, which would demand a camera with a high frame rate of up to 1000 fps. The presented solution is a special state observer that can work with a slower, and therefore cheaper, camera to estimate the state variables at the higher sample rate of the control loop. To simplify the image processing for the determination of positions or angles and to make it more robust, LED markers are applied to the plant. Simulation and experimental results show that the concept can be used even if the plant is unstable, like the inverted pendulum.
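
    The multirate idea described above can be sketched as a Luenberger-style observer that predicts at the control rate and corrects only when a camera frame arrives. The plant, gains, and rates below are illustrative assumptions, not the paper's actual design:

```python
import numpy as np

# Multirate observer sketch: control loop at 1 kHz, camera at 50 fps.
dt = 1e-3                     # control sample time [s]
frame_every = 20              # camera frame every 20 control steps (50 fps)
A = np.array([[1.0, dt], [0.0, 1.0]])   # double-integrator plant (pos, vel)
B = np.array([[0.5 * dt**2], [dt]])
C = np.array([[1.0, 0.0]])              # camera measures position only
L = np.array([[0.4], [4.0]])            # observer gain (hand-tuned)

x = np.array([[0.0], [0.0]])            # true state
xh = np.array([[0.5], [-1.0]])          # observer state, deliberately wrong

for k in range(2000):
    u = np.array([[np.sin(0.01 * k)]])  # some excitation input
    x = A @ x + B @ u                   # plant propagates every step
    xh = A @ xh + B @ u                 # observer predicts every step
    if k % frame_every == 0:            # slow camera: correct only on frames
        y = C @ x                       # (noise-free for brevity)
        xh = xh + L @ (y - C @ xh)

print(float(abs(x[0, 0] - xh[0, 0])))   # position estimation error shrinks
```

    Between frames the observer runs open-loop on the model, which is exactly what makes state estimates available at every control sample despite the low frame rate.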

  18. Small angle neutron scattering

    NASA Astrophysics Data System (ADS)

    Cousin, Fabrice

    2015-10-01

    Small Angle Neutron Scattering (SANS) is a technique that probes the 3-D structure of materials on a typical size range from 1 nm up to a few 100 nm, the obtained information being statistically averaged over a sample whose volume is on the order of 1 cm³. This very rich technique enables a full structural characterization of a given object of nanometric dimensions (radius of gyration, shape, volume or mass, fractal dimension, specific area) through the determination of the form factor, as well as a description of the way objects are organized within a continuous medium, and therefore of the interactions between them, through the determination of the structure factor. The specific properties of neutrons (the possibility of tuning the scattering intensity by isotopic substitution, sensitivity to magnetism, negligible absorption, low energy of the incident neutrons) make the technique particularly interesting in the fields of soft matter, biophysics, magnetic materials and metallurgy. In particular, contrast variation methods allow information to be extracted that cannot be obtained by any other experimental technique. This course is divided into two parts. The first is devoted to the principles of SANS: basics (formalism, coherent versus incoherent scattering, the notion of an elementary scatterer), form factor analysis (I(q→0), Guinier regime, intermediate regime, Porod regime, polydisperse systems), structure factor analysis (second virial coefficient, integral equations, characterization of aggregates), and contrast variation methods (how to create contrast in a homogeneous system, matching in ternary systems, extrapolation to zero concentration, Zero Average Contrast). It is illustrated by some representative examples.
The second part describes the experimental aspects of SANS to guide users in their future experiments: description of a SANS spectrometer, resolution of the spectrometer, optimization of spectrometer configurations, optimization of sample characteristics prior to measurement (thickness, volume, hydrogen content), standard measurements to be made, and the principles of data reduction.
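
    The Guinier regime mentioned above, I(q) ≈ I(0)·exp(−q²Rg²/3) for qRg ≪ 1, can be illustrated with the classic homogeneous-sphere form factor. This is a hypothetical numerical example, not tied to any instrument or sample from the course:

```python
import numpy as np

R = 5.0                                  # sphere radius [nm], illustrative
Rg = np.sqrt(3.0 / 5.0) * R              # radius of gyration of a sphere

def sphere_form_factor(q, R):
    """Normalized P(q) for a homogeneous sphere, with P(0) = 1."""
    x = q * R
    return (3.0 * (np.sin(x) - x * np.cos(x)) / x**3) ** 2

q = np.array([0.01, 0.02, 0.05])         # small q, inside Guinier regime [1/nm]
exact = sphere_form_factor(q, R)
guinier = np.exp(-(q * Rg) ** 2 / 3.0)
print(np.max(np.abs(exact - guinier)))   # close agreement at small q*Rg
```

    Fitting ln I(q) versus q² at small q is how Rg is usually extracted in practice, independent of the particle shape.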

  19. Generalization of the Euler Angles

    NASA Technical Reports Server (NTRS)

    Bauer, Frank H. (Technical Monitor); Shuster, Malcolm D.; Markley, F. Landis

    2002-01-01

    It is shown that the Euler angles can be generalized to axes other than members of an orthonormal triad. As first shown by Davenport, the three generalized Euler axes, hereafter: Davenport axes, must still satisfy the constraint that the first two and the last two axes be mutually perpendicular if these axes are to define a universal set of attitude parameters. Expressions are given which relate the generalized Euler angles, hereafter: Davenport angles, to the 3-1-3 Euler angles of an associated direction-cosine matrix. The computation of the Davenport angles from the attitude matrix and their kinematic equation are presented. The present work offers a more direct development of the Davenport angles than Davenport's original publication and offers additional results.
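
    The 3-1-3 Euler angles referenced above follow the standard convention A = R3(ψ)·R1(θ)·R3(φ); the sketch below builds the direction-cosine matrix and recovers the angles from it (standard textbook formulas, not Davenport's generalized axes):

```python
import numpy as np

def R1(a):
    # Rotation about the first (x) axis
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, s], [0, -s, c]])

def R3(a):
    # Rotation about the third (z) axis
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, s, 0], [-s, c, 0], [0, 0, 1]])

def euler313_to_matrix(phi, theta, psi):
    return R3(psi) @ R1(theta) @ R3(phi)

def matrix_to_euler313(A):
    # Valid away from the theta = 0 or pi singularity
    theta = np.arccos(A[2, 2])
    phi = np.arctan2(A[2, 0], -A[2, 1])
    psi = np.arctan2(A[0, 2], A[1, 2])
    return phi, theta, psi

angles = (0.3, 0.7, -1.1)
A = euler313_to_matrix(*angles)
print(matrix_to_euler313(A))   # recovers (0.3, 0.7, -1.1)
```

    The extraction formulas read the angles off the last row and last column of A, which is why symmetric sequences like 3-1-3 are singular when the middle angle is 0 or π.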

  20. Fruit detectability analysis for different camera positions in sweet-pepper.

    PubMed

    Hemming, Jochen; Ruizendaal, Jos; Hofstee, Jan Willem; van Henten, Eldert J

    2014-01-01

    For robotic harvesting of sweet-pepper fruits in greenhouses, a sensor system is required to detect and localize the fruits on the plants. Due to the complex structure of the plant, most fruits are (partially) occluded when an image is taken from one viewpoint only. In this research the effect of multiple camera positions and viewing angles on fruit visibility and detectability was investigated. A recording device was built which allowed the camera to be placed under different azimuth and zenith angles and to be moved horizontally along the crop row. Fourteen camera positions were chosen and the fruit visibility in the recorded images was manually determined for each position. For images taken from one position only, with the criterion of at most 50% occlusion per fruit, the fruit detectability (FD) was in no case higher than 69%. The best single positions were the front views and looking upwards at a zenith angle of 60°. The FD increased when multiple viewpoint positions were combined. With a combination of the five most favourable positions the maximum FD was 90%. PMID:24681670

  1. Fruit Detectability Analysis for Different Camera Positions in Sweet-Pepper †

    PubMed Central

    Hemming, Jochen; Ruizendaal, Jos; Hofstee, Jan Willem; van Henten, Eldert J.

    2014-01-01

    For robotic harvesting of sweet-pepper fruits in greenhouses, a sensor system is required to detect and localize the fruits on the plants. Due to the complex structure of the plant, most fruits are (partially) occluded when an image is taken from one viewpoint only. In this research the effect of multiple camera positions and viewing angles on fruit visibility and detectability was investigated. A recording device was built which allowed the camera to be placed under different azimuth and zenith angles and to be moved horizontally along the crop row. Fourteen camera positions were chosen and the fruit visibility in the recorded images was manually determined for each position. For images taken from one position only, with the criterion of at most 50% occlusion per fruit, the fruit detectability (FD) was in no case higher than 69%. The best single positions were the front views and looking upwards at a zenith angle of 60°. The FD increased when multiple viewpoint positions were combined. With a combination of the five most favourable positions the maximum FD was 90%. PMID:24681670
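
    The viewpoint-combination logic above (a fruit counts as detectable from a position if it is at most 50% occluded there; combining positions takes the union) can be sketched on synthetic visibility data. The numbers below are randomly generated, not the paper's measurements:

```python
import numpy as np

rng = np.random.default_rng(0)
n_fruits, n_positions = 200, 14
# Synthetic per-fruit, per-position occlusion fractions in [0, 1]
occlusion = rng.uniform(0.0, 1.0, size=(n_fruits, n_positions))
visible = occlusion <= 0.5               # detectable from that position

fd_single = visible.mean(axis=0)         # FD for each single position
best_five = np.argsort(fd_single)[-5:]   # the five best positions
fd_combined = np.any(visible[:, best_five], axis=1).mean()
print(fd_single.max(), fd_combined)      # union FD exceeds any single FD
```

    Because the union over positions includes the best single position, the combined FD can never be lower than the best single-view FD, which mirrors the 69% vs. 90% result reported above.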

  2. High Speed Digital Camera Technology Review

    NASA Technical Reports Server (NTRS)

    Clements, Sandra D.

    2009-01-01

    A High Speed Digital Camera Technology Review (HSD Review) is being conducted to evaluate the state-of-the-shelf in this rapidly progressing industry. Five HSD cameras supplied by four camera manufacturers participated in a Field Test during the Space Shuttle Discovery STS-128 launch. Each camera was also subjected to Bench Tests in the ASRC Imaging Development Laboratory. Evaluation of the data from the Field and Bench Tests is underway. Representatives from the imaging communities at NASA / KSC and the Optical Systems Group are participating as reviewers. A High Speed Digital Video Camera Draft Specification was updated to address Shuttle engineering imagery requirements based on findings from this HSD Review. This draft specification will serve as the template for a High Speed Digital Video Camera Specification to be developed for the wider OSG imaging community under OSG Task OS-33.

  3. Television camera video level control system

    NASA Technical Reports Server (NTRS)

    Kravitz, M.; Freedman, L. A.; Fredd, E. H.; Denef, D. E. (Inventor)

    1985-01-01

    A video level control system is provided which generates a normalized video signal for a camera processing circuit. The video level control system includes a lens iris which provides a controlled light signal to a camera tube. The camera tube converts the light signal provided by the lens iris into electrical signals. A feedback circuit, in response to the electrical signals generated by the camera tube, provides feedback signals to the lens iris and the camera tube. This assures that a normalized video signal is provided in a first illumination range. An automatic gain control loop, which is also responsive to the electrical signals generated by the camera tube, operates in tandem with the feedback circuit. This assures that the normalized video signal is maintained in a second illumination range.

  4. A telephoto camera system with shooting direction control by gaze detection

    NASA Astrophysics Data System (ADS)

    Teraya, Daiki; Hachisu, Takumi; Yendo, Tomohiro

    2015-05-01

    For safe driving, it is important for the driver to check traffic conditions such as traffic lights or traffic signs as early as possible. If an on-vehicle camera captures images of the objects needed to understand traffic conditions from a long distance and shows them to the driver, the driver can understand the traffic conditions earlier. To image distant objects clearly, the focal length of the camera must be long. When the focal length is long, however, the on-vehicle camera does not have a large enough field of view to check traffic conditions. Therefore, in order to obtain the necessary images from a long distance, the camera must have a long focal length and a controllable shooting direction. In a previous study, the driver indicated the shooting direction on a displayed image taken by a wide-angle camera, and a direction-controllable camera took a telescopic image and displayed it to the driver. However, that study used a touch panel to indicate the shooting direction, which can disturb driving. We therefore propose a telephoto camera system for driving support whose shooting direction is controlled by the driver's gaze, to avoid disturbing driving. The proposed system is composed of a gaze detector and an active telephoto camera whose shooting direction is controlled. We adopt a non-wearable detection method to avoid hindering driving. The gaze detector measures the driver's gaze by image processing. The shooting direction of the active telephoto camera is controlled by galvanometer scanners, and the direction can be switched within a few milliseconds. Experiments confirmed that the proposed system captures images in the direction of the subject's gaze.

  5. Feasibility study for the application of the large format camera as a payload for the Orbiter program

    NASA Technical Reports Server (NTRS)

    1978-01-01

    The large format camera (LFC) designed as a 30 cm focal length cartographic camera system that employs forward motion compensation in order to achieve the full image resolution provided by its 80 degree field angle lens is described. The feasibility of application of the current LFC design to deployment in the orbiter program as the Orbiter Camera Payload System was assessed and the changes that are necessary to meet such a requirement are discussed. Current design and any proposed design changes were evaluated relative to possible future deployment of the LFC on a free flyer vehicle or in a WB-57F. Preliminary mission interface requirements for the LFC are given.

  6. Multiple narrow bandpass optical filters based on one-dimensional rugate photonic structures of two periodicities.

    PubMed

    Lu, Zhao

    2011-02-15

    A theoretical analysis of a design of multiple narrow bandpass filters based on one-dimensional (1D) rugate photonic structures with a period jump defect is presented. The optical properties, including transmittance and energy density distributions, are numerically calculated using the propagation matrix method. Our results show that multiple resonance transmission modes are produced when the period jump defect is introduced into the 1D rugate film. Both blueshift and redshift of the stop band of the rugate structure and wavelengths of resonant modes are observed, depending on the change of period jump. The number, the wavelengths, the band intervals, and the intensities of multiple resonance transmission modes are tunable by adjusting structure parameters of the rugate structure. Experimental feasibility of the proposed multiple narrow bandpass optical filters using the technique of glancing angle deposition is also discussed. PMID:21326460
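
    The propagation matrix method used in the analysis above can be sketched for a discretized dielectric stack at normal incidence, standing in for the continuous rugate index profile. The stack parameters below are illustrative, not the paper's rugate design:

```python
import numpy as np

def layer_matrix(n, d, lam):
    """2x2 characteristic matrix of one homogeneous layer (normal incidence)."""
    delta = 2.0 * np.pi * n * d / lam      # phase thickness
    return np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                     [1j * n * np.sin(delta), np.cos(delta)]])

def transmittance(ns, ds, lam, n_in=1.0, n_out=1.0):
    """Multiply the layer matrices and convert to power transmittance."""
    M = np.eye(2, dtype=complex)
    for n, d in zip(ns, ds):
        M = M @ layer_matrix(n, d, lam)
    t = 2.0 * n_in / (n_in * M[0, 0] + n_in * n_out * M[0, 1]
                      + M[1, 0] + n_out * M[1, 1])
    return (n_out / n_in) * abs(t) ** 2

# Quarter-wave stack designed for 550 nm: a stop band forms around it.
lam0 = 550.0
ns = [2.3, 1.38] * 10                     # alternating high/low index
ds = [lam0 / (4 * n) for n in ns]         # quarter-wave thicknesses
print(transmittance(ns, ds, lam0))        # deep in the stop band: near 0
print(transmittance(ns, ds, 800.0))       # outside the stop band: much higher
```

    Introducing a defect (such as the period jump discussed above) into such a stack opens narrow resonant transmission modes inside the stop band, which is the basis of the proposed bandpass filters.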

  7. Electrostatic camera system functional design study

    NASA Technical Reports Server (NTRS)

    Botticelli, R. A.; Cook, F. J.; Moore, R. F.

    1972-01-01

    A functional design study for an electrostatic camera system for application to planetary missions is presented. The electrostatic camera can produce and store a large number of pictures and provide for transmission of the stored information at arbitrary times after exposure. Preliminary configuration drawings and circuit diagrams for the system are illustrated. The camera system's size, weight, power consumption, and performance are characterized. Tradeoffs between system weight, power, and storage capacity are identified.

  8. The Mars Science Laboratory Engineering Cameras

    NASA Astrophysics Data System (ADS)

    Maki, J.; Thiessen, D.; Pourangi, A.; Kobzeff, P.; Litwin, T.; Scherr, L.; Elliott, S.; Dingizian, A.; Maimone, M.

    2012-09-01

    NASA's Mars Science Laboratory (MSL) Rover is equipped with a set of 12 engineering cameras. These cameras are build-to-print copies of the Mars Exploration Rover cameras described in Maki et al. (J. Geophys. Res. 108(E12): 8071, 2003). Images returned from the engineering cameras will be used to navigate the rover on the Martian surface, deploy the rover robotic arm, and ingest samples into the rover sample processing system. The Navigation cameras (Navcams) are mounted to a pan/tilt mast and have a 45-degree square field of view (FOV) with a pixel scale of 0.82 mrad/pixel. The Hazard Avoidance Cameras (Hazcams) are body-mounted to the rover chassis in the front and rear of the vehicle and have a 124-degree square FOV with a pixel scale of 2.1 mrad/pixel. All of the cameras utilize a 1024 × 1024 pixel detector and red/near IR bandpass filters centered at 650 nm. The MSL engineering cameras are grouped into two sets of six: one set of cameras is connected to rover computer "A" and the other set is connected to rover computer "B". The Navcams and Front Hazcams each provide similar views from either computer. The Rear Hazcams provide different views from the two computers due to the different mounting locations of the "A" and "B" Rear Hazcams. This paper provides a brief description of the engineering camera properties, the locations of the cameras on the vehicle, and camera usage for surface operations.

  9. Development of biostereometric experiments. [stereometric camera system

    NASA Technical Reports Server (NTRS)

    Herron, R. E.

    1978-01-01

    The stereometric camera was designed for close-range techniques in biostereometrics. The camera focusing distance of 360 mm to infinity covers a broad field of close-range photogrammetry. The design provides for a separate unit for the lens system and interchangeable backs on the camera for the use of single frame film exposure, roll-type film cassettes, or glass plates. The system incorporates the use of a surface contrast optical projector.

  10. [Angle-closure chronic glaucoma].

    PubMed

    Lachkar, Y

    2003-10-01

    The incidence of chronic angle closure glaucoma is considerably greater than the incidence of the acute type. This type of glaucoma may mimic primary open angle glaucoma with visual field deterioration, optic nerve alteration and intraocular pressure elevation with a quiet painless eye. Its diagnosis is based on indentation gonioscopy showing peripheral anterior synechiae. The mechanisms of angle closure are the pupillary block, the plateau iris configuration and the creeping form. The treatment of chronic angle closure glaucoma is based on laser peripheral iridotomy. PMID:14646832

  11. Advantages of improved timing accuracy in PET cameras using LSO scintillator

    SciTech Connect

    Moses, William W.

    2002-12-02

    PET scanners based on LSO have the potential for significantly better coincidence timing resolution than the 6 ns fwhm typically achieved with BGO. This study analyzes the performance enhancements made possible by improved timing as a function of the coincidence time resolution. If 500 ps fwhm coincidence timing resolution can be achieved in a complete PET camera, the following four benefits can be realized for whole-body FDG imaging: 1) The random event rate can be reduced by using a narrower coincidence timing window, increasing the peak NECR by ~50 percent. 2) Using time-of-flight in the reconstruction algorithm will reduce the noise variance by a factor of 5. 3) Emission and transmission data can be acquired simultaneously, reducing the total scan time. 4) Axial blurring can be reduced by using time-of-flight to determine the correct axial plane that each event originated from. While time-of-flight was extensively studied in the 1980's, practical factors limited its effectiveness at that time and little attention has been paid to timing in PET since then. As these potential improvements are substantial and the advent of LSO PET cameras gives us the means to obtain them without other sacrifices, efforts to improve PET timing should resume after their long dormancy.
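
    The factor-of-5 variance reduction quoted above follows from a standard back-of-envelope time-of-flight argument: the timing resolution localizes each event to a segment of length c·Δt/2 along the line of response, and the variance reduction scales roughly as the object diameter divided by that segment. The 40 cm patient diameter below is an assumed illustrative value:

```python
# Time-of-flight localization and approximate noise-variance reduction.
c = 3.0e10                  # speed of light [cm/s]
dt = 500e-12                # coincidence timing resolution, fwhm [s]
D = 40.0                    # object (patient) diameter [cm], assumed

dx = c * dt / 2.0           # position uncertainty along the LOR [cm]
reduction = D / dx          # approximate noise-variance reduction factor
print(dx, reduction)        # ~7.5 cm localization, ~5x variance reduction
```

    With 500 ps fwhm this gives about 7.5 cm localization and a reduction factor of roughly 5, consistent with the abstract's figure.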

  12. Optical design and performance of MIRIS near-infrared camera

    NASA Astrophysics Data System (ADS)

    Ree, Chang Hee; Park, Sung-Joon; Moon, Bongkon; Cha, Sang-Mok; Park, Youngsik; Jeong, Woong-Seob; Lee, Dae-Hee; Nam, Uk-Won; Park, Jang-Hyun; Ka, Nung Hyun; Lee, Mi Hyun; Pyo, Jeonghyun; Lee, Duk-Hang; Rhee, Seung-Woo; Park, Jong-Oh; Lee, Hyung Mok; Matsumoto, Toshio; Yang, Sun Choel; Han, Wonyong

    2010-07-01

    Multi-purpose Infra-Red Imaging System (MIRIS) is a near-infrared camera on board the Korea Science and Technology Satellite 3 (STSAT-3). The MIRIS is a wide-field (3.67° × 3.67°) infrared imaging system which employs fast (F/2) refractive optics with an 80 mm diameter aperture. The MIRIS optics consists of five lenses, among which the rear surface of the fifth lens is aspheric. By passive cooling on a Sun-synchronous orbit, the telescope will be cooled below 200 K in order to deliver the designed performance. As fabrication and assembly must be carried out at room temperature, however, we convert all the lens data from cold temperature to room temperature. The sophisticated opto-mechanical design accommodates the effects of thermal contraction after launch, and the optical elements are protected by flexure structures from shock (10 G) during launch. The MIRIS incorporates wide-band filters, I (1.05 µm) and H (1.6 µm), for Cosmic Infrared Background observations, and also narrow-band filters, Paα (1.876 µm) and a specially designed dual-band continuum, for emission-line mapping of the Galactic interstellar medium. We present the optical design, fabrication of components, assembly procedure, and performance test results for the qualification model of the MIRIS near-infrared camera.

  13. Wide field and diffraction limited array camera for SIRTF

    NASA Technical Reports Server (NTRS)

    Fazio, G. G.; Koch, D. G.; Melnick, G. J.; Tresch-Fienberg, R. M.; Willner, S. P.; Gezari, D. Y.; Lamb, G.; Shu, P.; Chin, G.; Mccreight, C. R.

    1986-01-01

    The Infrared Array Camera for the Space Infrared Telescope Facility (SIRTF/IRAC) is capable of two-dimensional photometry in either a wide field or diffraction-limited mode over the wavelength interval from 2 to 30 microns. Three different two-dimensional direct readout (DRO) array detectors are being considered: Band 1-InSb or Si:In (2-5 microns) 128 x 128 pixels, Band 2-Si:Ga (5-18 microns) 64 x 64 pixels, and Band 3-Si:Sb (18-30 microns) 64 x 64 pixels. The hybrid DRO readout architecture has the advantages of low read noise, random pixel access with individual readout rates, and nondestructive readout. The scientific goals of IRAC are discussed, which are the basis for several important requirements and capabilities of the array camera: (1) diffraction-limited resolution from 2-30 microns, (2) use of the maximum unvignetted field of view of SIRTF, (3) simultaneous observations within the three infrared spectral bands, and (4) the capability for broad and narrow bandwidth spectral resolution. A strategy has been developed to minimize the total electronic and environmental noise sources to satisfy the scientific requirements.

  14. Wide field and diffraction limited array camera for SIRTF

    NASA Technical Reports Server (NTRS)

    Fazio, G. G.; Koch, D. G.; Melnick, G. J.; Tresch-Fienberg, R. M.; Willner, S. P.; Gezari, D. Y.; Lamb, G.; Shu, P.; Chin, G.; Mccreight, C. R.

    1986-01-01

    The Infrared Array Camera for the Space Infrared Telescope Facility (SIRTF/IRAC) is capable of two-dimensional photometry in either a wide field or diffraction-limited mode over the wavelength interval from 2 to 30 microns. Three different two-dimensional direct readout (DRO) array detectors will be used: Band 1-InSb or Si:In (2-5 microns) 128 x 128 pixels, Band 2-Si:Ga (5-18 microns) 64 x 64 pixels, and Band 3-Si:Sb (18-30 microns) 64 x 64 pixels. The hybrid DRO readout architecture has the advantages of low read noise, random pixel access with individual readout rates, and nondestructive readout. The scientific goals of IRAC are discussed, which are the basis for several important requirements and capabilities of the array camera: (1) diffraction-limited resolution from 2-30 microns, (2) use of the maximum unvignetted field of view of SIRTF, (3) simultaneous observations within the three infrared spectral bands, and (4) the capability for broad and narrow bandwidth spectral resolution. A strategy has been developed to minimize the total electronic and environmental noise sources to satisfy the scientific requirements.

  15. A Second Generation Multi-Angle Imaging SpectroRadiometer (MISR-2)

    NASA Technical Reports Server (NTRS)

    Bothwell, Graham; Diner, David J.; Pagano, Thomas S.; Duval, Valerie G.; Beregovski, Yuri; Hovland, Larry E.; Preston, Daniel J.

    2001-01-01

    The Multi-angle Imaging SpectroRadiometer (MISR) has been in Earth orbit since December 1999 on NASA's Terra spacecraft. This instrument provides new ways of looking at the Earth's atmosphere, clouds, and surface for the purpose of understanding the Earth's ecology, environment, and climate. To facilitate the potential future continuation of MISR's multi-angle observations, a study was undertaken in 1999 and 2000 under the Instrument Incubator Program (IIP) of NASA Code Y's Earth Science Technology Office (ESTO) to investigate and demonstrate the feasibility of a successor to MISR that will have greatly reduced size and mass. The kernel of the program was the design, construction, and testing of a highly miniaturized camera, one of the nine that would probably be used on a future spaceborne MISR-like instrument. This demonstrated that the size and mass reduction of the optical system and camera electronics are possible and that filters can be assembled to meet the miniaturized packaging requirements. An innovative, reflective optics design was used, enabling the wavelength range to be extended into the shortwave infrared. This was the smallest all-reflective camera ever produced by the contractor. A study was undertaken to determine the feasibility of implementing nine (multi-angle) cameras within a single structure. This resulted in several possible configurations. It would also be possible to incorporate one of the cameras into an airborne instrument.

  16. The Critical Angle Can Override the Brewster Angle

    ERIC Educational Resources Information Center

    Froehle, Peter H.

    2009-01-01

    As a culminating activity in their study of optics, my students investigate polarized light and the Brewster angle. In this exercise they encounter a situation in which it is impossible to measure the Brewster angle for light reflecting from a particular surface. This paper describes the activity and explains the students' observations.

  17. Comparative evaluation of consumer grade cameras and mobile phone cameras for close range photogrammetry

    NASA Astrophysics Data System (ADS)

    Chikatsu, Hirofumi; Takahashi, Yoji

    2009-08-01

    The authors have been concentrating on developing convenient 3D measurement methods using consumer-grade digital cameras, and have concluded that consumer-grade digital cameras can be expected to become a useful photogrammetric device for various close-range application fields. Meanwhile, mobile phone cameras with 10 megapixels have appeared on the market in Japan. In these circumstances, we face the epoch-making question of whether mobile phone cameras are able to take the place of consumer-grade digital cameras in close-range photogrammetric applications. In order to evaluate the potential of mobile phone cameras in close-range photogrammetry, a comparative evaluation between mobile phone cameras and consumer-grade digital cameras is carried out in this paper with respect to lens distortion, reliability, stability, and robustness. Calibration tests for 16 mobile phone cameras and 50 consumer-grade digital cameras were conducted indoors using a test target. Furthermore, the practicability of mobile phone cameras for close-range photogrammetry was evaluated outdoors. This paper shows that mobile phone cameras have the ability to take the place of consumer-grade digital cameras and to develop the market in digital photogrammetric fields.
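
    One of the quantities compared above, lens distortion, is commonly fitted with a radial polynomial model in such calibrations. A minimal sketch follows, assuming the Brown radial model with illustrative coefficients (the paper does not state its exact parameterization):

```python
# Brown radial distortion model: ideal -> distorted normalized coordinates.
def distort(x, y, k1, k2):
    """Apply radial distortion with coefficients k1, k2 (illustrative)."""
    r2 = x**2 + y**2
    f = 1.0 + k1 * r2 + k2 * r2**2
    return x * f, y * f

# With barrel distortion (k1 < 0) an off-axis point moves toward the center.
xd, yd = distort(0.4, 0.3, k1=-0.2, k2=0.05)
print(xd, yd)
```

    Calibration estimates k1 and k2 (per camera) by minimizing reprojection error over images of a known test target; comparing the fitted coefficients is one way the stability of the two camera classes can be judged.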

  18. True-color night vision cameras

    NASA Astrophysics Data System (ADS)

    Kriesel, Jason; Gat, Nahum

    2007-04-01

    This paper describes True-Color Night Vision cameras that are sensitive to the visible to near-infrared (V-NIR) portion of the spectrum allowing for the "true-color" of scenes and objects to be displayed and recorded under low-light-level conditions. As compared to traditional monochrome (gray or green) night vision imagery, color imagery has increased information content and has proven to enable better situational awareness, faster response time, and more accurate target identification. Urban combat environments, where rapid situational awareness is vital, and marine operations, where there is inherent information in the color of markings and lights, are example applications that can benefit from True-Color Night Vision technology. Two different prototype cameras, employing two different true-color night vision technological approaches, are described and compared in this paper. One camera uses a fast-switching liquid crystal filter in front of a custom Gen-III image intensified camera, and the second camera is based around an EMCCD sensor with a mosaic filter applied directly to the sensor. In addition to visible light, both cameras utilize NIR to (1) increase the signal and (2) enable the viewing of laser aiming devices. The performance of the true-color cameras, along with the performance of standard (monochrome) night vision cameras, are reported and compared under various operating conditions in the lab and the field. In addition to subjective criterion, figures of merit designed specifically for the objective assessment of such cameras are used in this analysis.

  19. Multiple LED camera for dynamic photoelasticity

    NASA Astrophysics Data System (ADS)

    Asundi, A.; Sajan, M. R.

    1995-05-01

    Dynamic photoelasticity involves the high-speed recording of rapidly moving fringe patterns in synchronization with loading. Cranz-Schardin cameras are routinely utilized in the recording of dynamic photoelastic patterns. There are no moving components in these cameras, making the setup simple and attractive. A multiple LED camera based on the Cranz-Schardin format is presented. High-speed instant Polaroid film is used for recording the photoelastic fringes. Low cost, simplicity of the experimental setup, and rapid repeatability are the advantages of the camera.

  20. Multiple LED camera for dynamic photoelasticity.

    PubMed

    Asundi, A; Sajan, M R

    1995-05-01

    Dynamic photoelasticity involves the high-speed recording of rapidly moving fringe patterns in synchronization with loading. Cranz-Schardin cameras are routinely utilized in the recording of dynamic photoelastic patterns. There are no moving components in these cameras, making the setup simple and attractive. A multiple LED camera based on the Cranz-Schardin format is presented. High-speed instant Polaroid film is used for recording the photoelastic fringes. Low cost, simplicity of the experimental setup, and rapid repeatability are the advantages of the camera. PMID:21037772