Science.gov

Sample records for camera narrow angle

  1. Reconditioning of Cassini Narrow-Angle Camera

    NASA Technical Reports Server (NTRS)

    2002-01-01

    These five images of single stars, taken at different times with the narrow-angle camera on NASA's Cassini spacecraft, show the effects of haze collecting on the camera's optics, then successful removal of the haze by warming treatments.

    The image on the left was taken on May 25, 2001, before the haze problem occurred. It shows a star named HD339457.

    The second image from left, taken May 30, 2001, shows the effect of haze that collected on the optics when the camera cooled back down after a routine-maintenance heating to 30 degrees Celsius (86 degrees Fahrenheit). The star is Maia, one of the Pleiades.

    The third image was taken on October 26, 2001, after a weeklong decontamination treatment at minus 7 C (19 F). The star is Spica.

    The fourth image was taken of Spica January 30, 2002, after a weeklong decontamination treatment at 4 C (39 F).

    The final image, also of Spica, was taken July 9, 2002, following three additional decontamination treatments at 4 C (39 F) for two months, one month, then another month.

    Cassini, on its way toward arrival at Saturn in 2004, is a cooperative project of NASA, the European Space Agency and the Italian Space Agency. The Jet Propulsion Laboratory, a division of the California Institute of Technology in Pasadena, manages the Cassini mission for NASA's Office of Space Science, Washington, D.C.

  2. Flight Calibration of the LROC Narrow Angle Camera

    NASA Astrophysics Data System (ADS)

    Humm, D. C.; Tschimmel, M.; Brylow, S. M.; Mahanti, P.; Tran, T. N.; Braden, S. E.; Wiseman, S.; Danton, J.; Eliason, E. M.; Robinson, M. S.

    2016-04-01

    Characterization and calibration are vital for instrument commanding and image interpretation in remote sensing. The Lunar Reconnaissance Orbiter Camera Narrow Angle Camera (LROC NAC) takes 500 Mpixel greyscale images of lunar scenes at 0.5 meters/pixel. It uses two nominally identical line scan cameras for a larger crosstrack field of view. Stray light, spatial crosstalk, and nonlinearity were characterized using flight images of the Earth and the lunar limb. These are important for imaging shadowed craters, studying ~1 meter size objects, and photometry, respectively. Background, nonlinearity, and flatfield corrections have been implemented in the calibration pipeline. An eight-column pattern in the background is corrected. The detector is linear for DN = 600-2000 but a signal-dependent additive correction is required and applied for DN<600. A predictive model of detector temperature and dark level was developed to command dark level offset. This avoids images with a cutoff at DN=0 and minimizes quantization error in companding. Absolute radiometric calibration is derived from comparison of NAC images with ground-based images taken with the Robotic Lunar Observatory (ROLO) at much lower spatial resolution but with the same photometric angles.
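
    A minimal sketch of the correction order described above (background subtraction, low-DN nonlinearity, flatfield), in Python with NumPy; the array names, frame size, and the shape of the low-DN additive term are illustrative assumptions, not the actual LROC calibration pipeline:

    ```python
    import numpy as np

    def calibrate_nac_frame(raw_dn, dark_level, flatfield, lin_knee=600.0):
        """Illustrative NAC-style calibration chain: background, nonlinearity,
        flatfield. The low-DN additive correction below is a placeholder shape,
        not the actual LROC correction."""
        # 1. Remove the commanded dark/background level (incl. any column pattern).
        img = raw_dn.astype(float) - dark_level

        # 2. Signal-dependent additive correction applied only below the linear knee.
        low = img < lin_knee
        img[low] += 0.05 * (lin_knee - img[low])   # hypothetical correction curve

        # 3. Flatfield: divide out pixel-to-pixel responsivity.
        img /= flatfield

        return img  # still in corrected DN; radiometric scaling would follow

    # toy usage with a synthetic frame and a unity flatfield
    raw = np.random.randint(400, 2000, size=(1024, 5064)).astype(float)
    flat = np.ones_like(raw)
    cal = calibrate_nac_frame(raw, dark_level=100.0, flatfield=flat)
    ```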

  3. Extracting Accurate and Precise Topography from Lroc Narrow Angle Camera Stereo Observations

    NASA Astrophysics Data System (ADS)

    Henriksen, M. R.; Manheim, M. R.; Speyerer, E. J.; Robinson, M. S.; LROC Team

    2016-06-01

    The Lunar Reconnaissance Orbiter Camera (LROC) includes two identical Narrow Angle Cameras (NAC) that acquire meter-scale images. Stereo observations are acquired by imaging from two or more orbits, including at least one off-nadir slew. Digital terrain models (DTMs) generated from the stereo observations are controlled to Lunar Orbiter Laser Altimeter (LOLA) elevation profiles. With current processing methods, the DTMs have absolute accuracies commensurate with the uncertainties of the LOLA profiles (~10 m horizontally and ~1 m vertically) and relative horizontal and vertical precisions better than the pixel scale of the DTMs (2 to 5 m). The NAC stereo pairs and derived DTMs represent an invaluable tool for science and exploration purposes. We computed slope statistics from 81 highland and 31 mare DTMs across a range of baselines. Overlapping DTMs of single stereo sets were also combined to form larger-area DTM mosaics, enabling detailed characterization of large geomorphic features and providing a key resource for future exploration planning. Currently, two percent of the lunar surface is imaged in NAC stereo, and continued acquisition of stereo observations will serve to strengthen our knowledge of the Moon and the geologic processes that occur on all the terrestrial planets.
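
    The baseline-dependent slope statistics mentioned above can be illustrated with a generic finite-difference calculation on a gridded DTM; this sketch assumes a square-grid DTM array and is not the LROC team's processing code:

    ```python
    import numpy as np

    def slope_deg(dtm, baseline_px, gsd_m):
        """Bidirectional slope (degrees) of a DTM at a given baseline, computed
        from finite differences over that baseline; a generic approach, not the
        exact method used for the published statistics."""
        dz_x = dtm[:, baseline_px:] - dtm[:, :-baseline_px]
        dz_y = dtm[baseline_px:, :] - dtm[:-baseline_px, :]
        run = baseline_px * gsd_m
        sx = np.degrees(np.arctan(np.abs(dz_x) / run))
        sy = np.degrees(np.arctan(np.abs(dz_y) / run))
        return np.concatenate([sx.ravel(), sy.ravel()])

    dtm = np.cumsum(np.random.randn(500, 500), axis=0)  # synthetic terrain
    slopes = slope_deg(dtm, baseline_px=4, gsd_m=2.0)   # ~8 m baseline on a 2 m DTM
    print(np.median(slopes), np.percentile(slopes, 95))
    ```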

  4. Two Years of Digital Terrain Model Production Using the Lunar Reconnaissance Orbiter Narrow Angle Camera

    NASA Astrophysics Data System (ADS)

    Burns, K.; Robinson, M. S.; Speyerer, E.; LROC Science Team

    2011-12-01

    One of the primary objectives of the Lunar Reconnaissance Orbiter Camera (LROC) is to gather stereo observations with the Narrow Angle Camera (NAC). These stereo observations are used to generate digital terrain models (DTMs). The NAC has a pixel scale of 0.5 to 2.0 meters but was not designed for stereo observations and thus requires the spacecraft to roll off-nadir to acquire these images. Slews interfere with the data collection of the other instruments, so opportunities are currently limited to four per day. Arizona State University has produced DTMs from 95 stereo pairs for 11 Constellation Project (CxP) sites (Aristarchus, Copernicus crater, Gruithuisen domes, Hortensius domes, Ina D-caldera, Lichtenberg crater, Mare Ingenii, Marius Hills, Reiner Gamma, South Pole-Aitken Rim, Sulpicius Gallus) as well as 30 other regions of scientific interest (including: Bhabha crater, highest and lowest elevation points, Highland Ponds, Kugler Anuchin, Linne Crater, Planck Crater, Slipher crater, Sears Crater, Mandel'shtam Crater, Virtanen Graben, Compton/Belkovich, Rumker Domes, King Crater, Luna 16/20/23/24 landing sites, Ranger 6 landing site, Wiener F Crater, Apollo 11/14/15/17, fresh craters, impact melt flows, Larmor Q crater, Mare Tranquillitatis pit, Hansteen Alpha, Moore F Crater, and Lassell Massif). To generate DTMs, the USGS ISIS software and SOCET SET from BAE Systems are used. To increase the absolute accuracy of the DTMs, data obtained from the Lunar Orbiter Laser Altimeter (LOLA) are used to coregister the NAC images and define the geodetic reference frame. NAC DTMs have been used in examination of several sites, e.g. Compton-Belkovich, Marius Hills and Ina D-caldera [1-3]. LROC will continue to acquire high-resolution stereo images throughout the science phase of the mission and any extended mission opportunities, thus providing a vital dataset for scientific research as well as future human and robotic exploration. [1] B.L. Jolliff (2011) Nature

  5. Narrow Angle movie

    NASA Technical Reports Server (NTRS)

    1999-01-01

    This brief three-frame movie of the Moon was made from three Cassini narrow-angle images as the spacecraft passed by the Moon on the way to its closest approach to Earth on August 17, 1999. The purpose of this particular set of images was to calibrate the spectral response of the narrow-angle camera and to test its 'on-chip summing mode' data compression technique in flight. From left to right, they show the Moon in the green, blue and ultraviolet regions of the spectrum in 40, 60 and 80 millisecond exposures, respectively. All three images have been scaled so that the brightness of Crisium basin, the dark circular region in the upper right, is the same in each image. The spatial scale in the blue and ultraviolet images is 1.4 miles per pixel (2.3 kilometers). The original scale in the green image (which was captured in the usual manner and then reduced in size by 2x2 pixel summing within the camera system) was 2.8 miles per pixel (4.6 kilometers). It has been enlarged for display to the same scale as the other two. The imaging data were processed and released by the Cassini Imaging Central Laboratory for Operations (CICLOPS) at the University of Arizona's Lunar and Planetary Laboratory, Tucson, AZ.
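
    The 'on-chip summing mode' referred to above combines 2x2 blocks of pixels into one, halving the image dimensions; a plain array-binning sketch of the equivalent operation (illustrative only, not the flight implementation):

    ```python
    import numpy as np

    def bin2x2(img):
        """Sum 2x2 blocks of pixels, halving the resolution in each axis --
        the same net effect as the camera's on-chip summing mode, implemented
        here as ordinary array binning for illustration."""
        h, w = img.shape
        img = img[:h - h % 2, :w - w % 2]            # trim odd edges if present
        return img.reshape(h // 2, 2, w // 2, 2).sum(axis=(1, 3))

    full = np.arange(16, dtype=float).reshape(4, 4)
    print(bin2x2(full))   # each output pixel is the sum of a 2x2 block
    ```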

    Photo Credit: NASA/JPL/Cassini Imaging Team/University of Arizona

    Cassini, launched in 1997, is a joint mission of NASA, the European Space Agency and Italian Space Agency. The mission is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Space Science, Washington DC. JPL is a division of the California Institute of Technology, Pasadena, CA.

  6. NFC - Narrow Field Camera

    NASA Astrophysics Data System (ADS)

    Koukal, J.; Srba, J.; Gorková, S.

    2015-01-01

    We introduce a low-cost CCTV video system for faint meteor monitoring and describe the first results from 5 months of two-station operation. Our system, called NFC (Narrow Field Camera), has a meteor limiting magnitude around +6.5 mag and allows research on the trajectories of less massive meteoroids within individual parent meteor showers and the sporadic background. At present, 4 stations of the NFC system (2 pairs with coordinated fields of view) are operated in the frame of CEMeNt (Central European Meteor Network). The heart of each NFC station is a sensitive CCTV camera Watec 902 H2 and a fast cinematographic lens Meopta Meostigmat 1/50 - 52.5 mm (50 mm focal length and fixed aperture f/1.0). In this paper we present the first results based on 1595 individual meteors, 368 of which were recorded from two stations simultaneously. This data set allows the first empirical verification of theoretical assumptions about the NFC system's capabilities (stellar and meteor magnitude limit, meteor apparent brightness distribution and accuracy of single-station measurements) and the first low-mass meteoroid trajectory calculations. Our experimental data clearly show the capability of the proposed system to register low-mass meteors and demonstrate that calculations based on NFC data lead to a significant refinement of the orbital elements of low-mass meteoroids.

  7. High-resolution topomapping of candidate MER landing sites with Mars Orbiter Camera narrow-angle images

    USGS Publications Warehouse

    Kirk, R.L.; Howington-Kraus, E.; Redding, B.; Galuszka, D.; Hare, T.M.; Archinal, B.A.; Soderblom, L.A.; Barrett, J.M.

    2003-01-01

    We analyzed narrow-angle Mars Orbiter Camera (MOC-NA) images to produce high-resolution digital elevation models (DEMs) in order to provide topographic and slope information needed to assess the safety of candidate landing sites for the Mars Exploration Rovers (MER) and to assess the accuracy of our results by a variety of tests. The mapping techniques developed also support geoscientific studies and can be used with all present and planned Mars-orbiting scanner cameras. Photogrammetric analysis of MOC stereopairs yields DEMs with 3-pixel (typically 10 m) horizontal resolution, vertical precision consistent with ±0.22 pixel matching errors (typically a few meters), and slope errors of 1-3°. These DEMs are controlled to the Mars Orbiter Laser Altimeter (MOLA) global data set and consistent with it at the limits of resolution. Photoclinometry yields DEMs with single-pixel (typically ~3 m) horizontal resolution and submeter vertical precision. Where the surface albedo is uniform, the dominant error is 10-20% relative uncertainty in the amplitude of topography and slopes after "calibrating" photoclinometry against a stereo DEM to account for the influence of atmospheric haze. We mapped portions of seven candidate MER sites and the Mars Pathfinder site. Safety of the final four sites (Elysium, Gusev, Isidis, and Meridiani) was assessed by mission engineers by simulating landings on our DEMs of "hazard units" mapped in the sites, with results weighted by the probability of landing on those units; summary slope statistics show that most hazard units are smooth, with only small areas of etched terrain in Gusev crater posing a slope hazard.
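
    The step of "calibrating" photoclinometry against a stereo DEM can be thought of as solving for a single amplitude factor that best reconciles the two surfaces; a simplified least-squares sketch under the assumption that both DEMs are already resampled to a common grid (not the authors' actual procedure):

    ```python
    import numpy as np

    def calibrate_pc_against_stereo(pc_dem, stereo_dem):
        """Scale a photoclinometry DEM so its relief best matches a coarser
        stereo DEM of the same area via a least-squares amplitude factor.
        Assumes both arrays share a common grid and units of meters."""
        pc = pc_dem - pc_dem.mean()
        st = stereo_dem - stereo_dem.mean()
        scale = np.sum(pc * st) / np.sum(pc * pc)   # best-fit amplitude factor
        return scale * pc + stereo_dem.mean(), scale

    pc = 2.0 * np.random.randn(200, 200)       # synthetic, over-amplified relief
    stereo = 0.5 * pc + 100.0                  # synthetic "truth" with offset
    calibrated, k = calibrate_pc_against_stereo(pc, stereo)
    print(k)   # recovered amplitude factor (~0.5 here)
    ```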

  8. Methane Band and Continuum Band Imaging of Titan's Atmosphere Using Cassini ISS Narrow Angle Camera Pictures from the CURE/Cassini Imaging Project

    NASA Astrophysics Data System (ADS)

    Shitanishi, Jennifer; Gillam, S. D.

    2009-05-01

    The study of Titan's atmosphere, which bears resemblance to early Earth's, may help us understand more of our own. Constructing a Monte Carlo model of Titan's atmosphere is helpful to achieve this goal. Methane band (MT) and continuum band (CB) images of Titan taken by the CURE/Cassini Imaging Project, using the Cassini Narrow Angle Camera (NAC), were analyzed. They were scheduled by Cassini Optical Navigation. Images were obtained at phase angles 53°, 112°, 161°, and 165°. They include 22 total MT1 (center wavelength at 619 nm), MT2 (727 nm), MT3 (889 nm), CB1 (635 nm), CB2 (751 nm), and CB3 (938 nm) images. They were reduced with previously written scripts using the National Optical Astronomy Observatory Image Reduction and Analysis Facility scientific analysis suite. Corrections for horizontal and vertical banding and for cosmic ray hits were made. The MT images were registered with the corresponding CB images to ensure that the subsequently measured flux ratios came from the same parts of the atmosphere. Preliminary DN limb-to-limb scans and loci of the haze layers will be presented. Accurate estimates of the sub-spacecraft points on each picture will be presented. Flux ratios (FMT/FCB=Q0) along the scans and total absorption coefficients along the lines of sight from the spacecraft through the pixels (and into Titan) will also be presented.
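
    A sketch of the flux-ratio measurement described above, assuming the methane-band and continuum-band frames are already reduced and co-registered; the array shapes and disk mask are illustrative:

    ```python
    import numpy as np

    def limb_to_limb_ratio(mt_img, cb_img, row, disk_mask):
        """Q0 = F_MT / F_CB along one limb-to-limb scan of co-registered
        methane-band and continuum-band frames. Inputs are assumed already
        reduced (banding and cosmic rays removed) and registered."""
        scan = disk_mask[row]                    # pixels on Titan's disk in this row
        return mt_img[row, scan] / cb_img[row, scan]

    # toy example with synthetic frames and a rectangular "disk"
    mt = np.random.uniform(50, 100, (512, 512))
    cb = np.random.uniform(100, 200, (512, 512))
    mask = np.zeros((512, 512), bool)
    mask[200:300, 150:350] = True
    print(limb_to_limb_ratio(mt, cb, row=250, disk_mask=mask).mean())
    ```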

  9. Wide angle pinhole camera

    NASA Technical Reports Server (NTRS)

    Franke, J. M.

    1978-01-01

    Hemispherical refracting element gives pinhole camera 180 degree field-of-view without compromising its simplicity and depth-of-field. Refracting element, located just behind pinhole, bends light coming in from sides so that it falls within image area of film. In contrast to earlier pinhole cameras that used water or other transparent fluids to widen field, this model is not subject to leakage and is easily loaded and unloaded with film. Moreover, by selecting glass with different indices of refraction, field at film plane can be widened or reduced.
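
    The field compression provided by the refracting element follows directly from Snell's law: a ray arriving at a large off-axis angle is bent toward the axis inside the glass, so it can still reach the film. A small numerical illustration, assuming a nominal index of 1.5 (not a value given in the brief):

    ```python
    import numpy as np

    def refracted_angle_deg(incidence_deg, n_glass=1.5):
        """Snell's law at the refracting element behind the pinhole: rays from
        grazing field angles are bent toward the axis so they land on the film.
        n_glass = 1.5 is an assumed index, not a value from the brief."""
        return np.degrees(np.arcsin(np.sin(np.radians(incidence_deg)) / n_glass))

    # Light from 85 degrees off-axis is compressed to roughly 42 degrees inside
    # the glass, which is how a ~180-degree field fits on a flat film plane.
    print(refracted_angle_deg(85.0))
    ```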

  10. Photogrammetric measurement of 3D freeform millimetre-sized objects with micro features: an experimental validation of the close-range camera calibration model for narrow angles of view

    NASA Astrophysics Data System (ADS)

    Percoco, Gianluca; Sánchez Salmerón, Antonio J.

    2015-09-01

    The measurement of millimetre- and micro-scale features is performed by high-cost systems based on technologies with narrow working ranges to accurately control the position of the sensors. Photogrammetry would lower the costs of 3D inspection of micro-features and would also be applicable to the inspection of non-removable micro parts of large objects. Unfortunately, the behaviour of photogrammetry when applied to micro-features is not well known. In this paper, the authors address these issues towards the application of digital close-range photogrammetry (DCRP) at the micro-scale, taking into account that the literature reports an angle of view (AOV) of around 10° as the lower limit for applying the traditional pinhole close-range calibration model (CRCM), which is the basis of DCRP. First, a general calibration procedure is introduced, with the aid of an open-source software library, to calibrate narrow-AOV cameras with the CRCM. Subsequently the procedure is validated using a reflex camera with a 60 mm macro lens, equipped with extension tubes (20 and 32 mm) achieving magnification of up to approximately 2×, to verify the literature findings with experimental photogrammetric 3D measurements of millimetre-sized objects with micro-features. The limitation of the laser printing technology used to produce the two-dimensional pattern on common paper has been overcome using an accurate pattern manufactured with a photolithographic process. The results of the experimental activity prove that the CRCM is valid for AOVs down to 3.4° and that DCRP results are comparable with those of existing and more expensive commercial techniques.

  11. Narrow-angle Astrometry with SUSI

    NASA Astrophysics Data System (ADS)

    Kok, Y.; Ireland, M. J.; Robertson, J. G.; Tuthill, P. G.; Warrington, B. A.; Tango, W. J.

    2014-09-01

    SUSI (Sydney University Stellar Interferometer) is currently being fitted with a 2nd beam combiner, MUSCA (Micro-arcsecond University of Sydney Companion Astrometry), for the purpose of narrow-angle astrometry. With an aim to achieve ~10 micro-arcseconds of angular resolution at its best, MUSCA allows SUSI to search for planets around bright binary stars, which are its primary targets. While the first beam combiner, PAVO (Precision Astronomical Visible Observations), is used to track stellar fringes during an observation, MUSCA will be used to measure separations of binary stars. MUSCA is a Michelson interferometer and its setup at SUSI will be described in this poster.

  12. Inflight Calibration of the Lunar Reconnaissance Orbiter Camera Wide Angle Camera

    NASA Astrophysics Data System (ADS)

    Mahanti, P.; Humm, D. C.; Robinson, M. S.; Boyd, A. K.; Stelling, R.; Sato, H.; Denevi, B. W.; Braden, S. E.; Bowman-Cisneros, E.; Brylow, S. M.; Tschimmel, M.

    2016-04-01

    The Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) has acquired more than 250,000 images of the illuminated lunar surface and over 190,000 observations of space and non-illuminated Moon since 1 January 2010. These images, along with images from the Narrow Angle Camera (NAC) and other Lunar Reconnaissance Orbiter instrument datasets are enabling new discoveries about the morphology, composition, and geologic/geochemical evolution of the Moon. Characterizing the inflight WAC system performance is crucial to scientific and exploration results. Pre-launch calibration of the WAC provided a baseline characterization that was critical for early targeting and analysis. Here we present an analysis of WAC performance from the inflight data. In the course of our analysis we compare and contrast with the pre-launch performance wherever possible and quantify the uncertainty related to various components of the calibration process. We document the absolute and relative radiometric calibration, point spread function, and scattered light sources and provide estimates of sources of uncertainty for spectral reflectance measurements of the Moon across a range of imaging conditions.

  13. Non-contact measurement of rotation angle with solo camera

    NASA Astrophysics Data System (ADS)

    Gan, Xiaochuan; Sun, Anbin; Ye, Xin; Ma, Liqun

    2015-02-01

    For the purpose of measuring the rotation angle of an object about its axis, a non-contact rotation angle measurement method based on a single camera is proposed. The intrinsic parameters of the camera were calibrated using a chessboard, following plane-based calibration theory. The translation matrix and rotation matrix between the object coordinate frame and the camera coordinate frame were calculated from the relationship between the corner positions on the object and their coordinates in the image. The rotation angle between the measured object and the camera could then be resolved from the rotation matrix. A precise angle dividing table (PADT) was chosen as the reference to verify the angle measurement error of this method. Test results indicated that the rotation angle measurement error of this method did not exceed ±0.01 degree.
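
    A sketch of the same measurement chain using OpenCV, assuming the intrinsic matrix K and distortion coefficients dist have already been obtained with cv2.calibrateCamera; the chessboard geometry is an illustrative assumption, not the authors' setup:

    ```python
    import cv2
    import numpy as np

    # Chessboard geometry (assumed 9x6 inner corners, 10 mm squares -- illustrative).
    pattern, square = (9, 6), 10.0
    objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square

    def board_pose(gray, K, dist):
        """Return the rotation matrix of the chessboard relative to the camera."""
        found, corners = cv2.findChessboardCorners(gray, pattern)
        if not found:
            raise RuntimeError("chessboard not found")
        ok, rvec, _ = cv2.solvePnP(objp, corners, K, dist)
        R, _ = cv2.Rodrigues(rvec)
        return R

    def rotation_angle_deg(R0, R1):
        """Angle (degrees) of the relative rotation between two measured poses."""
        dR = R1 @ R0.T
        return np.degrees(np.arccos(np.clip((np.trace(dR) - 1) / 2, -1.0, 1.0)))
    ```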

  14. A wide-angle camera module for disposable endoscopy

    NASA Astrophysics Data System (ADS)

    Shim, Dongha; Yeon, Jesun; Yi, Jason; Park, Jongwon; Park, Soo Nam; Lee, Nanhee

    2016-08-01

    A wide-angle miniaturized camera module for disposable endoscopes is demonstrated in this paper. A lens module with a 150° angle of view (AOV) is designed and manufactured. All-plastic injection-molded lenses and a commercial CMOS image sensor are employed to reduce the manufacturing cost. The image sensor and an LED illumination unit are assembled with the lens module. The camera module does not include a camera processor, to further reduce its size and cost. The size of the camera module is 5.5 × 5.5 × 22.3 mm³. The diagonal field of view (FOV) of the camera module is measured to be 110°. A prototype of a disposable endoscope is implemented to perform pre-clinical animal testing. The esophagus of an adult beagle dog is observed. These results demonstrate the feasibility of a cost-effective and high-performance camera module for disposable endoscopy.

  15. Associations between Narrow Angle and Adult Anthropometry: The Liwan Eye Study

    PubMed Central

    Jiang, Yuzhen; He, Mingguang; Friedman, David S.; Khawaja, Anthony P.; Lee, Pak Sang; Nolan, Winifred P.; Yin, Qiuxia; Foster, Paul J.

    2015-01-01

    Purpose To assess the associations between narrow angle and adult anthropometry. Methods Chinese adults aged 50 years and older were recruited from a population-based survey in the Liwan District of Guangzhou, China. Narrow angle was defined as the posterior trabecular meshwork not visible under static gonioscopy in at least three quadrants (i.e. a circumference of at least 270°). Logistic regression models were used to examine the associations between narrow angle and anthropometric measures (height, weight and body mass index, BMI). Results Among the 912 participants, lower weight, shorter height, and lower BMI were significantly associated with narrower angle width (tests for trend: mean angle width in degrees vs weight p<0.001; vs height p<0.001; vs BMI p = 0.012). In univariate analyses, shorter height, lower weight and lower BMI were all significantly associated with greater odds of narrow angle. The crude association between height and narrow angle was largely attributable to a stronger association with age and sex. Lower BMI and weight remained significantly associated with narrow angle after adjustment for height, age, sex, axial ocular biometric measures and education. In analyses stratified by sex, the association between BMI and narrow angle was only observed in women. Conclusion Lower BMI and weight were associated with significantly greater odds of narrow angle after adjusting for age, education, axial ocular biometric measures and height. The odds of narrow angle increased 7% per 1 unit decrease in BMI. This association was most evident in women. PMID:24707840
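
    The reported effect size translates between "per unit increase" and "per unit decrease" odds ratios as simple exponential arithmetic on the logistic regression coefficient; the snippet below only illustrates that conversion and does not reproduce the study's fitted model:

    ```python
    import numpy as np

    # A ~7% increase in odds per 1-unit *decrease* in BMI corresponds to an odds
    # ratio of about 0.93 per 1-unit *increase*; both follow from the same
    # logistic coefficient beta via OR = exp(beta). Illustrative arithmetic only.
    beta_per_unit_bmi = np.log(1.0 / 1.07)            # implied coefficient
    odds_ratio_increase = np.exp(beta_per_unit_bmi)   # ~0.935 per +1 kg/m^2
    odds_ratio_decrease = np.exp(-beta_per_unit_bmi)  # ~1.07 per -1 kg/m^2
    print(round(odds_ratio_increase, 3), round(odds_ratio_decrease, 2))
    ```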

  16. Association between choroidal thickness and anterior chamber segment in eyes with narrow or open-angle

    PubMed Central

    Li, Song-Feng; Wu, Ge-Wei; Chen, Chang-Xi; Shen, Ling; Zhang, Zhi-Bao; Gao, Fei; Wang, Ning-Li

    2016-01-01

    AIM To investigate the relationship between choroidal thickness and the anterior chamber segment in eyes with narrow or open angles. METHODS The subfoveal choroidal thickness was measured with enhanced depth-imaging optical coherence tomography and anterior chamber parameters were measured with ultrasound biomicroscopy in one eye of 23 subjects with open-angle eyes and 38 subjects with narrow-angle eyes. The mean age was 59.52±7.04y for narrow-angle subjects and 60.76±7.23y for open-angle subjects (P=0.514). Multivariate linear regression analysis was performed to assess the association between choroidal thickness and narrow-angle parameters. RESULTS There were no differences in subfoveal choroidal thickness between open- and narrow-angle subjects (P=0.231). Anterior chamber parameters, including central anterior chamber depth, trabecular iris angle, iris thickness 500 µm from the scleral spur (IT500), and ciliary body thickness at 1 mm and 2 mm from the scleral spur (CBT1, CBT2) showed significant differences between the two groups (P<0.05). Subfoveal choroidal thickness showed a negative correlation only with anterior chamber depth (β=-0.496, P=0.016) in the open-angle group, and with age (β=-0.442, P=0.003) and IT500 (β=-0.399, P=0.008) in the narrow-angle group. However, subfoveal choroidal thickness was not correlated with trabecular iris angle, anterior chamber depth, ciliary body thickness, or central corneal thickness in the narrow-angle group. CONCLUSION Choroidal thickness does not differ between the two groups and is not correlated with anterior chamber parameters in narrow-angle subjects, suggesting a lack of relationship between choroidal thickness and primary angle-closure glaucoma. PMID:27588269

  17. Association of Ocular Conditions with Narrow Angles in Different Ethnicities

    PubMed Central

    Lee, Roland Y.; Chon, Brian H.; Lin, Shuai-Chun; He, Mingguang; Lin, Shan C.

    2015-01-01

    Purpose To quantify the predictive strength of anterior chamber area (ACA), anterior chamber volume (ACV), anterior chamber width (ACW), lens vault (LV), iris thickness (IT), and iris area (IArea) for two angle width parameters, trabecular-iris space area (TISA750) and angle opening distance (AOD750) at 750 μm from the scleral spur, in different ethnicities. Design Prospective, cross-sectional study. Methods Anterior segment optical coherence tomography images for 166 white, 90 African, 75 Hispanic, and 132 Chinese subjects were analyzed. First, ACA, ACV, ACW, LV, IT, and IArea were compared among ethnic groups. Second, associations of TISA750 and AOD750 with ACA, ACV, ACW, LV, IT and IArea were investigated within each ethnic group using multivariable linear regression models, standardized regression coefficients (β), and coefficients of determination (R2). Results Significant ethnic differences were observed in ACA, ACV, ACW, LV, IT, and IArea (all P<0.05). ACA, ACV, and LV were significant predictors of TISA750 and AOD750 in all ethnic groups (all P<0.001). ACW and IT were significant predictors of AOD750 in whites and Africans (all P<0.05). ACW and IT were significant predictors of TISA750 in whites (all P<0.05). IArea was a significant predictor of AOD750 in Chinese (P<0.05). ACA, ACV, and LV had the highest predictive strength for both TISA750 and AOD750 in all ethnic groups based on β and R2. Conclusions Despite ethnic differences in ACA, ACV, ACW, LV, IT, and IArea, the same three anterior segment parameters (ACA, ACV, and LV) were the strongest predictors of angle width (TISA750 and AOD750) in all four ethnic groups. PMID:26093287

  18. Narrow Field-of-View Visual Odometry Based on a Focused Plenoptic Camera

    NASA Astrophysics Data System (ADS)

    Zeller, N.; Quint, F.; Stilla, U.

    2015-03-01

    In this article we present a new method for visual odometry based on a focused plenoptic camera. This method fuses the depth data gained by a monocular Simultaneous Localization and Mapping (SLAM) algorithm with that received from a focused plenoptic camera. Our algorithm uses the depth data and the totally focused images supplied by the plenoptic camera to run a real-time semi-dense direct SLAM algorithm. Based on this combined approach, the scale ambiguity of a monocular SLAM system can be overcome. Furthermore, the additional light-field information greatly improves the tracking capabilities of the algorithm. Thus, visual odometry is possible even for narrow field of view (FOV) cameras. We show that not only tracking benefits from the additional light-field information. By accumulating the depth information over multiple tracked images, the depth accuracy of the focused plenoptic camera can also be greatly improved. This novel approach reduces the depth error by one order of magnitude compared to that obtained from a single light-field image.
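
    One generic way to realize the "accumulating depth over multiple tracked images" idea is inverse-variance fusion of repeated depth estimates of the same point; this is a simplified sketch, not the authors' filter:

    ```python
    import numpy as np

    def fuse_inverse_depth(estimates, variances):
        """Inverse-variance fusion of repeated inverse-depth estimates of one
        scene point observed in multiple tracked frames. A generic estimator,
        shown only to illustrate how accumulation reduces depth error."""
        w = 1.0 / np.asarray(variances, dtype=float)
        fused = np.sum(w * np.asarray(estimates, dtype=float)) / np.sum(w)
        fused_var = 1.0 / np.sum(w)           # always smaller than any single input
        return fused, fused_var

    # five noisy estimates of the same inverse depth (1/m)
    print(fuse_inverse_depth([0.48, 0.52, 0.50, 0.49, 0.51],
                             [0.01, 0.01, 0.02, 0.01, 0.02]))
    ```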

  19. Improved iris localization by using wide and narrow field of view cameras for iris recognition

    NASA Astrophysics Data System (ADS)

    Kim, Yeong Gon; Shin, Kwang Yong; Park, Kang Ryoung

    2013-10-01

    Biometrics is a method of identifying individuals by their physiological or behavioral characteristics. Among other biometric identifiers, iris recognition has been widely used for various applications that require a high level of security. When a conventional iris recognition camera is used, the size and position of the iris region in a captured image vary according to the X, Y positions of a user's eye and the Z distance between a user and the camera. Therefore, the searching area of the iris detection algorithm is increased, which can inevitably decrease both the detection speed and accuracy. To solve these problems, we propose a new method of iris localization that uses wide field of view (WFOV) and narrow field of view (NFOV) cameras. Our study is new as compared to previous studies in the following four ways. First, the device used in our research acquires three images, one each of the face and both irises, using one WFOV and two NFOV cameras simultaneously. The relation between the WFOV and NFOV cameras is determined by simple geometric transformation without complex calibration. Second, the Z distance (between a user's eye and the iris camera) is estimated based on the iris size in the WFOV image and anthropometric data of the size of the human iris. Third, the accuracy of the geometric transformation between the WFOV and NFOV cameras is enhanced by using multiple matrices of the transformation according to the Z distance. Fourth, the searching region for iris localization in the NFOV image is significantly reduced based on the detected iris region in the WFOV image and the matrix of geometric transformation corresponding to the estimated Z distance. Experimental results showed that the performance of the proposed iris localization method is better than that of conventional methods in terms of accuracy and processing time.
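
    The Z-distance estimate from the WFOV image follows the pinhole relation Z = f·D/d, using an anthropometric iris diameter; a minimal sketch with an assumed average iris size and focal length (illustrative values, not those of the actual device):

    ```python
    def estimate_eye_distance_mm(iris_diameter_px, focal_length_px,
                                 iris_diameter_mm=11.7):
        """Estimate camera-to-eye distance Z from the iris size in the WFOV image
        via the pinhole model Z = f * D / d. The 11.7 mm iris diameter is a
        commonly cited anthropometric average, used here only for illustration."""
        return focal_length_px * iris_diameter_mm / iris_diameter_px

    # e.g. a 120 px iris seen by a WFOV camera with an effective f of 3000 px
    print(estimate_eye_distance_mm(120, 3000))   # ~292 mm
    ```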

  20. Spectral data of specular reflectance, narrow-angle transmittance and angle-resolved surface scattering of materials for solar concentrators

    PubMed Central

    Good, Philipp; Cooper, Thomas; Querci, Marco; Wiik, Nicolay; Ambrosetti, Gianluca; Steinfeld, Aldo

    2015-01-01

    The spectral specular reflectance of conventional and novel reflective materials for solar concentrators is measured with an acceptance angle of 17.5 mrad over the wavelength range 300−2500 nm at incidence angles 15–60° using a spectroscopic goniometry system. The same experimental setup is used to determine the spectral narrow-angle transmittance of semi-transparent materials for solar collector covers at incidence angles 0–60°. In addition, the angle-resolved surface scattering of reflective materials is recorded by an area-scan CCD detector over the spectral range 350–1050 nm. A comprehensive summary, discussion, and interpretation of the results are included in the associated research article “Spectral reflectance, transmittance, and angular scattering of materials for solar concentrators” in Solar Energy Materials and Solar Cells. PMID:26862556

  1. Spectral data of specular reflectance, narrow-angle transmittance and angle-resolved surface scattering of materials for solar concentrators.

    PubMed

    Good, Philipp; Cooper, Thomas; Querci, Marco; Wiik, Nicolay; Ambrosetti, Gianluca; Steinfeld, Aldo

    2016-03-01

    The spectral specular reflectance of conventional and novel reflective materials for solar concentrators is measured with an acceptance angle of 17.5 mrad over the wavelength range 300-2500 nm at incidence angles 15-60° using a spectroscopic goniometry system. The same experimental setup is used to determine the spectral narrow-angle transmittance of semi-transparent materials for solar collector covers at incidence angles 0-60°. In addition, the angle-resolved surface scattering of reflective materials is recorded by an area-scan CCD detector over the spectral range 350-1050 nm. A comprehensive summary, discussion, and interpretation of the results are included in the associated research article "Spectral reflectance, transmittance, and angular scattering of materials for solar concentrators" in Solar Energy Materials and Solar Cells.

  2. Characterizing Geometric Distortion of the Lunar Reconnaissance Orbiter Wide Angle Camera

    NASA Astrophysics Data System (ADS)

    Speyerer, E.; Wagner, R.; Robinson, M. S.; Becker, K. J.; Anderson, J.; Thomas, P. C.

    2011-12-01

    Each month the Lunar Reconnaissance Orbiter (LRO) Wide Angle Camera (WAC) provides 100 m scale images of nearly the entire Moon, with a different range of lighting conditions each month [1]. Pre-flight calibration efforts provided a baseline for correcting the geometric distortion present in the WAC. However, residual errors of 1-2 pixels existed with this original model. In-flight calibration enables the derivation of a precise correction for geometric distortion to provide sub-pixel map projection accuracy. For the in-flight calibration, we compared WAC images to high-resolution (0.5 - 2.0 meter scale) images provided by the Narrow Angle Camera (NAC). Since the NAC has a very narrow field of view (2.86°), its geometric accuracy is well characterized. The additions of the WAC-derived 100 m/pixel digital terrain model (GLD100) [2] and the refined ephemeris provided by LOLA [3] have improved our efforts to remove small distortion artifacts in the WAC camera model. Since the NAC field of view is always in the same cross-track location in the WAC frame, NAC and WAC images of the same regions, under similar lighting conditions, were map projected. Hundreds of NAC (truth image) and WAC images were then co-registered using an automatic registration algorithm in ISIS [4]. This output was fed into a second ISIS program (fplanemap) that converted the registration offsets to focal plane coordinates for the distorted (original) and undistorted (corrected location derived from the truth image) pixels [4]. With this dataset, offsets in the WAC distortion model were identified and accounted for with a new 2D Taylor series function that has been added to the existing radial model. This technique improves the accurate placement of each pixel across the sensor in target space. We have applied this correction to the 643 nm band and will derive the coefficients for the remaining bands. Once this study is complete, a new camera model, instrument kernel (IK), and frames kernel (FK) will be
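
    A sketch of the distortion-model structure described above, i.e. a radial model with an additive low-order 2D Taylor-series term in focal-plane coordinates; the coefficients and basis order are illustrative, not the derived WAC kernel values:

    ```python
    import numpy as np

    def undistort_focal_plane(x, y, k, taylor_cx, taylor_cy):
        """Radial distortion model plus an additive 2D Taylor-series term,
        mirroring the structure described above; all coefficient values used
        with this function are illustrative, not the LROC WAC IK values."""
        r2 = x * x + y * y
        radial = 1.0 + k[0] * r2 + k[1] * r2 * r2
        # low-order 2D Taylor basis in (x, y): 1, x, y, x^2, xy, y^2
        basis = np.stack([np.ones_like(x), x, y, x * x, x * y, y * y])
        dx = np.tensordot(taylor_cx, basis, axes=1)
        dy = np.tensordot(taylor_cy, basis, axes=1)
        return x * radial + dx, y * radial + dy

    # toy usage with made-up coefficients
    x = np.linspace(-300.0, 300.0, 5)      # focal-plane coordinates, arbitrary units
    y = np.zeros_like(x)
    k = (1e-8, 0.0)
    cx = np.array([0.0, 1e-4, 0.0, 0.0, 0.0, 0.0])
    cy = np.zeros(6)
    print(undistort_focal_plane(x, y, k, cx, cy))
    ```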

  3. 12. 22"X34" original vellum, Variable-Angle Launcher, 'SIDE VIEW CAMERA TRACK ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    12. 22"X34" original vellum, Variable-Angle Launcher, 'SIDE VIEW CAMERA TRACK H-20 BRIDGE MODIFICATIONS' drawn at 3/16"=1'-0" and 1/2"=1'-0". (BUORD Sketch # 208784, PAPW 907). - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  4. 13. 22"X34" original vellum, Variable-Angle Launcher, 'SIDEVIEW CAMERA CAR TRACK ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    13. 22"X34" original vellum, Variable-Angle Launcher, 'SIDEVIEW CAMERA CAR TRACK DETAILS' drawn at 1/4"=1'-0" (BUORD Sketch # 208078, PAPW 908). - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  5. 10. 22"X34" original blueprint, Variable-Angle Launcher, 'SIDE VIEW CAMERA CAR-STEEL ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    10. 22"X34" original blueprint, Variable-Angle Launcher, 'SIDE VIEW CAMERA CAR-STEEL FRAME AND AXLES' drawn at 1/2"=1'-0". (BUORD Sketch # 209124). - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  6. The Wide Angle Camera for the Rosetta Mission

    NASA Astrophysics Data System (ADS)

    Debei, S.; Angrilli, F.; Barbieri, C.; Bianchini, G.; da Deppo, V.; de Cecco, M.; Fornasier, S.; Guizzo, G.; Naletto, G.; Ragazzoni, R.; Saggin, B.; Tondello, G.; Zaccariotto, M.; Brunello, F.; Peron, F.

    1999-12-01

    The Wide Angle Camera (WAC) for the Rosetta mission had to fulfil many scientific requirements: a field of view of 12x12 sq deg and a focal length of 140 mm, excellent optical throughput in the range 240 - 900 nm after 10 years in space, an Encircled Energy of 80%, a contrast ratio of 10E+4 in order to detect faint gaseous emission features around a bright nucleus, minimum exposure times of 10 msec with photometric accuracy better than 5x102, scattered light rejection for sources out of the FoV (e.g. the Sun) and in the FoV (e.g. the cometary nucleus), a cover to close off the cometary dust, and an optical bench capable of maintaining the optical alignment in a passive way and supporting the shutter, the baffle, a double filter wheel and the Focal Plane Assembly. To these initial requirements several other constraints were added in the course of the design, in particular a very complex thermal profile, a massive shielding of the front FoV in order to protect the CCD from cosmic radiation, and a very strict total mass envelope. These requirements called for an unconventional optical design, with 2 aspherical mirrors in an off-axis configuration (the primary mirror being convex), and a carefully studied 3-stage baffle. Both the shutter and the front cover posed extremely challenging technological goals, in both their mechanical and electronic aspects. The paper describes all the main elements of the WAC. At present, the Structural Thermal Model has been delivered after successful completion of vibration and vacuum tests. The STM optical bench configuration has been slightly revised for the Flight Model in order to provide more attenuation to the internal baffle.

  7. High-Precision Narrow Angle Astrometry with a Space-Borne Interferometer

    NASA Technical Reports Server (NTRS)

    Milman, Mark H.; Murphy, Dave

    2008-01-01

    This paper develops an observing and processing scheme for narrow angle astrometry using a single baseline interferometer without the aid of "grid" stars to characterize the interferometer baseline vector in inertial space. The basic concept derives from the recognition that over a narrow field the set of fundamental unknown instrument parameters that arise because the interferometer baseline vector has large uncertainties (since there are no grid star measurements) is indistinguishable from a particular set of unobservable errors in the determination of star positions within the field. Reference stars within the narrow field of regard are used to circumvent the unobservable modes. Feasibility of the approach is demonstrated through analysis and example simulations.

  8. Multi-frequency properties of a narrow angle tail radio galaxy J 0037+18

    NASA Astrophysics Data System (ADS)

    Patra, Dusmanta; Chakrabarti, Sandip Kumar; Pal, Sabyasachi; Konar, Chiranjib

    2016-07-01

    We will present multi-frequency properties of the narrow angle tailed radio galaxy J 0037+18 using data from the Giant Metrewave Radio Telescope (GMRT) and the Jansky Very Large Array (JVLA). The angle between the two lobes is only 38 degrees. We will discuss the magnetic field and the particle lifetime of the jet. Spectral properties of the source will be discussed. We also used optical and X-ray data to investigate the host environment.

  9. 93. 22"X34" original blueprint, Variable-Angle Launcher, 'OVERHEAD CAMERA SUSPENSION SYSTEM, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    93. 22"X34" original blueprint, Variable-Angle Launcher, 'OVERHEAD CAMERA SUSPENSION SYSTEM, TOWER STAY CABLES' drawn at 3/4"=1'-0". (BUORD Sketch # 208783). - Variable Angle Launcher Complex, Variable Angle Launcher, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  10. 92. 22"X34" original blueprint, Variable-Angle Launcher, 'CAMERA CABLE TOWER PLAN ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    92. 22"X34" original blueprint, Variable-Angle Launcher, 'CAMERA CABLE TOWER PLAN AND ELEVATION' drawn at 3/8"=1'-0" (BUORD Sketch # 208580). - Variable Angle Launcher Complex, Variable Angle Launcher, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  11. Ultra-narrow bandpass filters for infrared applications with improved angle of incidence performance

    NASA Astrophysics Data System (ADS)

    Rahmlow, Thomas D.; Fredell, Markus; Chanda, Sheetal; Johnson, Robert

    2016-05-01

    Narrow band-pass optical interference filters are used in a variety of applications to improve signal quality in laser-based systems. Applications include LIDAR, sensor processing and free-space communications. A narrow-bandwidth optical filter allows passage of the laser signal while rejecting ambient light. The narrower the bandwidth, the better the signal-to-noise ratio. However, the bandwidth of a design for a particular application is typically limited by a number of factors, including spectral shift over the operational angles of incidence, thermal shift over the range of operating temperatures and, in the case of laser communication, rejection of adjacent laser channels. The trade-off of these parameters can significantly impact system design and performance. This paper presents design and material approaches to maximize the performance of narrow bandpass filters in the infrared.
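
    The angle-of-incidence spectral shift that limits the usable bandwidth is commonly approximated with the effective-index model lambda(theta) = lambda0 * sqrt(1 - (sin(theta)/n_eff)^2); a small sketch of that textbook relation (illustrative values, not design data from this paper):

    ```python
    import numpy as np

    def center_wavelength_shift(lam0_nm, aoi_deg, n_eff):
        """First-order blue-shift of an interference filter's center wavelength
        with angle of incidence, using the standard effective-index formula.
        A textbook approximation, not data from the paper above."""
        s = np.sin(np.radians(aoi_deg)) / n_eff
        return lam0_nm * np.sqrt(1.0 - s * s)

    # A 1550 nm filter with an assumed effective index of 1.8 tilted to 10 degrees:
    print(center_wavelength_shift(1550.0, 10.0, 1.8))  # ~1542.8 nm
    ```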

  12. Calibration of a trinocular system formed with wide angle lens cameras.

    PubMed

    Ricolfe-Viala, Carlos; Sanchez-Salmeron, Antonio-Jose; Valera, Angel

    2012-12-01

    To obtain 3D information over large areas, wide angle lens cameras are used to reduce the number of cameras as much as possible. However, since the images are highly distorted, errors in point correspondences increase and the 3D information can be erroneous. To increase the amount of data from the images and to improve the 3D information, trinocular sensors are used. In this paper a calibration method for a trinocular sensor formed with wide angle lens cameras is proposed. First, pixel locations in the images are corrected using a set of constraints which define the image formation in a trinocular system. Once the pixel locations are corrected, the lens distortion and the trifocal tensor are computed.

  13. What are the benefits of having multiple camera angles?

    Atmospheric Science Data Center

    2014-12-08

    ... such as soil, vegetation and snow or ice) reflect solar light differently in different directions. In fact, the variations between the ... angles allowed by the scanner or push-broom design of the sensor. The accumulation of multiangular observations with such instruments ...

  14. Sheath effects on current collection by particle detectors with narrow acceptance angles

    NASA Technical Reports Server (NTRS)

    Singh, N.; Baugher, C. R.

    1981-01-01

    Restriction of the aperture acceptance angle of an ion or electron trap on an attracting spacecraft significantly alters the volt-ampere characteristics of the instrument in a low Mach number plasma. It is shown that when the angular acceptance of the aperture is restricted, the current to the collector tends to be independent of the Debye length. Expressions for the RPA characteristics for both a thin sheath and a thick sheath are derived, and it is shown that as the aperture is narrowed the curves tend toward equivalence.

  15. Narrow-angle tail radio sources and the distribution of galaxy orbits in Abell clusters

    NASA Technical Reports Server (NTRS)

    O'Dea, Christopher P.; Sarazin, Craig L.; Owen, Frazer N.

    1987-01-01

    The present data on the orientations of the tails with respect to the cluster centers of a sample of 70 narrow-angle-tail (NAT) radio sources in Abell clusters show the distribution of tail angles to be inconsistent with purely radial or circular orbits in all the samples, while being consistent with isotropic orbits in (1) the whole sample, (2) the sample of NATs far from the cluster center, and (3) the samples of morphologically regular Abell clusters. Evidence for very radial orbits is found, however, in the sample of NATs near the cluster center. If these results can be generalized to all cluster galaxies, then the presence of radial orbits near the center of Abell clusters suggests that violent relaxation may not have been fully effective even within the cores of the regular clusters.

  16. Narrow Angle Wide Spectral Range Radiometer Design FEANICS/REEFS Radiometer Design Report

    NASA Technical Reports Server (NTRS)

    Camperchioli, William

    2005-01-01

    A critical measurement for the Radiative Enhancement Effects on Flame Spread (REEFS) microgravity combustion experiment is the net radiative flux emitted from the gases and from the solid fuel bed. These quantities are measured using a set of narrow angle, wide spectral range radiometers. The radiometers are required to have an angular field of view of 1.2 degrees and measure over the spectral range of 0.6 to 30 microns, which presents a challenging design effort. This report details the design of this radiometer system including field of view, radiometer response, radiometric calculations, temperature effects, error sources, baffling and amplifiers. This report presents some radiometer specific data but does not present any REEFS experiment data.

  17. Simulating a dual beam combiner at SUSI for narrow-angle astrometry

    NASA Astrophysics Data System (ADS)

    Kok, Yitping; Maestro, Vicente; Ireland, Michael J.; Tuthill, Peter G.; Robertson, J. Gordon

    2013-08-01

    The Sydney University Stellar Interferometer (SUSI) has two beam combiners, i.e. the Precision Astronomical Visible Observations (PAVO) and the Microarcsecond University of Sydney Companion Astrometry (MUSCA). The primary beam combiner, PAVO, can be operated independently and is typically used to measure properties of binary stars of less than 50 milliarcsec (mas) separation and the angular diameters of single stars. On the other hand, MUSCA was recently installed and must be used in tandem with the former. It is dedicated to microarcsecond-precision narrow-angle astrometry of close binary stars. The performance evaluation and the development of the data reduction pipeline for the new setup were assisted by an in-house computer simulation tool developed for this and related purposes. This paper describes the framework of the simulation tool, simulations carried out to evaluate the performance of each beam combiner and the expected astrometric precision of the dual beam combiner setup, both at SUSI and possible future sites.

  18. Narrow-angle tail radio sources and evidence for radial orbits in Abell clusters

    NASA Technical Reports Server (NTRS)

    O'Dea, Christopher P.; Owen, Frazer N.; Sarazin, Craig L.

    1986-01-01

    Published observational data on the tail orientations (TOs) of 60 narrow-angle-tail (NAT) radio sources in Abell clusters of galaxies are analyzed statistically using a maximum-likelihood approach. The results are presented in a table, and it is found that the observed TO distributions in the whole sample and in subsamples of morphologically regular NATs and NATs with pericentric distances d greater than 500 kpc are consistent with isotropic orbits, whereas the TOs for NATs with d less than 500 kpc are consistent with highly radial orbits. If radial orbits were observed near the centers of other types of cluster galaxies as well, it could be inferred that violent relaxation during cluster formation was incomplete, and that clusters form by spherical collapse and secondary infall, as proposed by Gunn (1977).

  19. In situ measurements of particle friction angles in steep, narrow channels

    NASA Astrophysics Data System (ADS)

    Prancevic, J.; Lamb, M. P.

    2013-12-01

    vary in a consistent manner with bed slope (φ = 51°, 67°, and 65°, respectively). At an individual site the degree of interlocking is the primary control on particle friction angle. However, the degree of interlocking was not higher in the steep (θ = 9.0°), narrow (W/D50 = 12.5) channel. This indicates that increased grain stability may not play a crucial role in increasing the threshold shear stresses required for sediment motion on very steep slopes.

  20. Development of ultrawide-angle compact camera using free-form optics

    NASA Astrophysics Data System (ADS)

    Takahashi, Koichi

    2011-01-01

    Digital imaging enables us to easily obtain, store, process and display images as digital data, and it is used not only for digital cameras but also for surveillance and in-vehicle cameras, measuring devices, and so on. On the basis of our studies, a free-form optic is not suitable for a high-definition zooming capability; however, it makes optical devices smaller, thinner and lighter. Therefore, it is worth considering the applications of free-form optics other than camera modules for cellular phones. We have investigated the possibilities of miniaturization, which is the most significant feature of free-form optics, and developed a practical application. In this paper, we describe the results of design, prototyping, and evaluation of our ultrawide-angle compact imaging system using free-form optics.

  1. The first demonstration of the concept of "narrow-FOV Si/CdTe semiconductor Compton camera"

    NASA Astrophysics Data System (ADS)

    Ichinohe, Yuto; Uchida, Yuusuke; Watanabe, Shin; Edahiro, Ikumi; Hayashi, Katsuhiro; Kawano, Takafumi; Ohno, Masanori; Ohta, Masayuki; Takeda, Shin`ichiro; Fukazawa, Yasushi; Katsuragawa, Miho; Nakazawa, Kazuhiro; Odaka, Hirokazu; Tajima, Hiroyasu; Takahashi, Hiromitsu; Takahashi, Tadayuki; Yuasa, Takayuki

    2016-01-01

    The Soft Gamma-ray Detector (SGD), to be deployed on board the ASTRO-H satellite, has been developed to provide the highest-sensitivity observations of celestial sources in the energy band of 60-600 keV by employing a detector concept which uses a Compton camera whose field-of-view is restricted by a BGO shield to a few degrees (narrow-FOV Compton camera). In this concept, the background from outside the FOV can be heavily suppressed by constraining the incident direction of the gamma ray reconstructed by the Compton camera to be consistent with the narrow FOV. We, for the first time, demonstrate the validity of the concept using background data taken during the thermal vacuum test and the low-temperature environment test of the flight model of SGD on the ground. We show that the measured background level is suppressed to less than 10% by combining event rejection using the anti-coincidence trigger of the active BGO shield with Compton event reconstruction techniques. More than 75% of the signals from the field-of-view are retained under this background rejection, which clearly demonstrates the improvement of the signal-to-noise ratio. The estimated effective area of 22.8 cm2 meets the mission requirement even though not all of the operational parameters of the instrument have been fully optimized yet.
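
    The FOV-consistency constraint rests on standard two-site Compton kinematics: the scattering angle follows from the two energy deposits, and events whose Compton cone cannot intersect the narrow FOV are rejected. A simplified sketch with a notional FOV half-angle (illustrative, not the SGD flight parameters):

    ```python
    import numpy as np

    ME_C2 = 511.0  # electron rest energy, keV

    def compton_angle_deg(e1_keV, e2_keV):
        """Scattering angle from the two energy deposits of a two-site Compton
        event (scatter deposit e1, absorption deposit e2), using the standard
        Compton relation: cos(theta) = 1 - m_e c^2 (1/e2 - 1/(e1+e2))."""
        cos_t = 1.0 - ME_C2 * (1.0 / e2_keV - 1.0 / (e1_keV + e2_keV))
        return np.degrees(np.arccos(np.clip(cos_t, -1.0, 1.0)))

    def keep_event(e1, e2, axis_offaxis_deg, fov_half_angle_deg=5.0):
        """Retain an event only if its Compton cone can intersect the narrow FOV.
        axis_offaxis_deg is the angle between the scatter axis (first-to-second
        interaction direction) and the FOV center; the FOV half-angle here is a
        notional value, not the SGD specification."""
        theta = compton_angle_deg(e1, e2)
        return abs(theta - axis_offaxis_deg) <= fov_half_angle_deg

    print(compton_angle_deg(40.0, 160.0))   # ~69 degrees for this toy event
    ```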

  2. Narrowed Aortoseptal Angle Is Related to Increased Central Blood Pressure and Aortic Pressure Wave Reflection.

    PubMed

    Olafiranye, Oladipupo; Ibrahim, Mediha; Kamran, Haroon; Venner-Jones, Kinda; McFarlane, Samy I; Salciccioli, Louis; Lazar, Jason M

    2012-08-01

    The left ventricular (LV) aortoseptal angle (ASA) decreases with age, and is associated with basal septal hypertrophy (septal bulge). Enhanced arterial pressure wave reflection is known to impact LV hypertrophy. We assessed whether ASA is related to central blood pressure (BP) and augmentation index (AI), a measure of the reflected pressure wave. We studied 75 subjects (age 62 ± 16 years; 66% female) who were referred for transthoracic echocardiography and had radial artery applanation tonometry within 24 h. Peripheral systolic BP (P-SBP), peripheral diastolic BP (P-DBP), and peripheral pulse pressure (P-PP) were obtained by sphygmomanometry. Central BPs (C-SBP, C-DBP, C-PP) and AI were derived from applanation tonometry. AI was corrected for heart rate (AI75). The basal septal wall thickness (SWT), mid SWT and ASA were measured using the parasternal long axis echocardiographic view. Mean ASA and AI75 were 117 ± 11° and 22 ± 11%, respectively. ASA correlated with AI75 (r = -0.31, p ≤ 0.01), C-SBP (r = -0.24, p = 0.04), C-PP (r = -0.29, p = 0.01), but only showed a trend towards significance with P-SBP (r = -0.2, p = 0.09) and P-PP (r = -0.21, p = 0.08). Interestingly, C-PP was correlated with basal SWT (r = 0.27, p = 0.02) but not with mid SWT (r = 0.19, p = 0.11). On multivariate linear regression analysis, adjusted for age, gender, weight, and mean arterial pressure, AI75 was an independent predictor of ASA (p = 0.02). Our results suggest that a narrowed ASA is related to increased pressure wave reflection and higher central BP. Further studies are needed to determine whether narrowed LV ASA is a cause or consequence of enhanced wave reflection and whether other factors are involved.

  3. Phase-Referenced Interferometry and Narrow-Angle Astrometry with SUSI

    NASA Astrophysics Data System (ADS)

    Kok, Y.; Ireland, M. J.; Tuthill, P. G.; Robertson, J. G.; Warrington, B. A.; Rizzuto, A. C.; Tango, W. J.

    The Sydney University Stellar Interferometer (SUSI) now incorporates a new beam combiner, called the Microarc-second University of Sydney Companion Astrometry instrument (MUSCA), for the purpose of high precision differential astrometry of bright binary stars. Operating in the visible wavelength regime where photon-counting and post-processing fringe tracking is possible, MUSCA will be used in tandem with SUSI's primary beam combiner, Precision Astronomical Visible Observations (PAVO), to record high spatial resolution fringes and thereby measure the separation of fringe packets of binary stars. In its current phase of development, the dual beam combiner configuration has successfully demonstrated for the first time a dual-star phase-referencing operation in visible wavelengths. This paper describes the beam combiner optics and hardware, the network of metrology systems employed to measure every non-common path between the two beam combiners and also reports on a recent narrow-angle astrometric observation of δ Orionis A (HR 1852) as the project enters its on-sky testing phase.

  4. Synthesizing wide-angle and arbitrary view-point images from a circular camera array

    NASA Astrophysics Data System (ADS)

    Fukushima, Norishige; Yendo, Tomohiro; Fujii, Toshiaki; Tanimoto, Masayuki

    2006-02-01

    We propose a technique of Image-Based Rendering (IBR) using a circular camera array. By recording the scene from cameras arranged all around it, we can synthesize more dynamic arbitrary-viewpoint images and wide-angle images like a panorama. This method is based on Ray-Space, one of the image-based rendering representations, like Light Field. Ray-Space is described by the position (x, y) and the direction (θ, φ) of a ray passing through a reference plane. When the cameras are arranged on a circle, the locus of a point that would form a straight line in an Epipolar Plane Image (EPI) for a linear arrangement instead traces a sine curve. Although this is a very clear description, determining which pixel of which camera to use during rendering becomes complicated. Therefore, as in Light Field, the space is re-described by the camera position (s, t) and the pixel position (u, v). The camera position is expressed in a polar coordinate system (r, θ), bringing the description close to that of Ray-Space. The locus of a point then becomes a complicated periodic function with period 2π, but rendering becomes easier to handle. From such a space, as with the linear arrangement, arbitrary-viewpoint images are synthesized using only the geometric relationship between the cameras. Moreover, taking advantage of the property that the rays converge on points of the circle, we propose a technique for generating a wide-angle, panorama-like image. This is possible because, when synthesizing a viewpoint, rays of all directions at the same position are recorded with overlap. The discussion so far assumes that the cameras are densely arranged and the plenoptic sampling condition is satisfied. We now describe the discrete case, in which the sampling condition is not satisfied. When arranging cameras in a straight line and synthesizing an image, in spite of
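
    The lookup step of circular Ray-Space rendering can be sketched in simplified 2D form: a desired ray from a virtual viewpoint is intersected with the camera circle to find which camera records it and at what local angle; the geometry below is illustrative, not the authors' implementation:

    ```python
    import numpy as np

    def ray_to_camera(p, d, r):
        """For a ray from virtual viewpoint p with unit direction d, find where
        it crosses the camera circle of radius r (centered at the origin) and
        the local viewing angle there -- the lookup step of circular Ray-Space
        rendering, reduced to 2D for illustration."""
        # Solve |p + t d|^2 = r^2 for the forward intersection t > 0.
        b = 2.0 * np.dot(p, d)
        c = np.dot(p, p) - r * r
        disc = b * b - 4.0 * c
        if disc < 0:
            return None                              # ray misses the circle
        t = (-b + np.sqrt(disc)) / 2.0
        hit = p + t * d
        cam_theta = np.arctan2(hit[1], hit[0])       # which camera on the circle
        local_phi = np.arctan2(d[1], d[0]) - cam_theta  # ray angle at that camera
        return cam_theta, local_phi

    print(ray_to_camera(np.array([0.2, 0.0]), np.array([1.0, 0.0]), r=1.0))
    ```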

  5. Axial and total-body bone densitometry using a narrow-angle fan-beam.

    PubMed

    Mazess, R B; Hanson, J A; Payne, R; Nord, R; Wilson, M

    2000-01-01

    We assessed a new dual-energy bone densitometer, the PRODIGY, that uses a narrow-angle fan-beam (4.5 degrees) oriented parallel to the longitudinal axis of the body (i.e., perpendicular to the usual orientation). High-resolution scans across the body can be stepped at 17 mm intervals. The energy-sensitive array detector uses cadmium zinc telluride, which allowed rapid photon counting. Spine and femur scans required 30 s, and total-body scans required 4-5 min; the dose was only 3.7 mrem and 0.04 mrem respectively, or about 5 to 10 times lower than conventional fan-beam densitometry. We found only a small influence of soft-tissue thickness on bone mineral density (BMD) results. There was also a small (+/- 1%) influence of height above the tabletop on BMD results. A software correction for object height allowed a first-order correction for the large magnification effects of position on bone mineral content (BMC) and area. Consequently, the results for BMC and area, as well as BMD, with PRODIGY corresponded closely to those obtained using the predecessor DPX densitometer, both in vitro and in vivo; there was a generally high correlation (r = 0.98-0.99) for BMD values. Spine and femur values for BMC, area and BMD averaged within 0.5% in vivo (n = 122), as did total-body BMC and BMD (n = 46). PRODIGY values for total-body lean tissue and fat also corresponded within 1% to DPX values. Regional and total-body BMD were measured with 0.5% precision in vitro and 1% precision in vivo. The new PRODIGY densitometer appears to combine the low dose and high accuracy of pencil-beam densitometry with the speed of fan-beam densitometers.

  6. Visible-infrared achromatic imaging by wavefront coding with wide-angle automobile camera

    NASA Astrophysics Data System (ADS)

    Ohta, Mitsuhiko; Sakita, Koichi; Shimano, Takeshi; Sugiyama, Takashi; Shibasaki, Susumu

    2016-09-01

    We perform an experiment of achromatic imaging with wavefront coding (WFC) using a wide-angle automobile lens. Our original annular phase mask for WFC was inserted to the lens, for which the difference between the focal positions at 400 nm and at 950 nm is 0.10 mm. We acquired images of objects using a WFC camera with this lens under the conditions of visible and infrared light. As a result, the effect of the removal of the chromatic aberration of the WFC system was successfully determined. Moreover, we fabricated a demonstration set assuming the use of a night vision camera in an automobile and showed the effect of the WFC system.

  8. Development of soft x-ray large solid angle camera onboard WF-MAXI

    NASA Astrophysics Data System (ADS)

    Kimura, Masashi; Tomida, Hiroshi; Ueno, Shiro; Kawai, Nobuyuki; Yatsu, Yoichi; Arimoto, Makoto; Mihara, Tatehiro; Serino, Motoko; Tsunemi, Hiroshi; Yoshida, Atsumasa; Sakamoto, Takanori; Kohmura, Takayoshi; Negoro, Hitoshi

    2014-07-01

    Wide-Field MAXI (WF-MAXI) is planned to be installed on the Exposed Facility of the Japanese Experiment Module "Kibo" on the International Space Station (ISS). WF-MAXI consists of two types of cameras, the Soft X-ray Large Solid Angle Camera (SLC) and the Hard X-ray Monitor (HXM). HXM comprises multi-channel arrays of CsI scintillators coupled with avalanche photodiodes (APDs), covering the energy range of 20 - 200 keV. SLC is an array of CCDs and is an evolved version of MAXI/SSC. Instead of the slit and collimator of SSC, SLC is equipped with a coded mask, extending its field of view to 20% of the sky at any given time and giving a location determination accuracy of a few arcminutes. In order to achieve a larger effective area, the number of CCD chips and the size of each chip will be larger than those of SSC. We are planning to use 59 x 31 mm2 CCD chips provided by Hamamatsu Photonics. Each camera will be equipped with 16 CCDs, and a total of 4 cameras will be installed in WF-MAXI. Since SLC utilizes X-ray CCDs, it must have an active cooling system for the CCDs. Instead of using Peltier coolers, we use mechanical coolers of the type also employed on Astro-H. In this way we can cool the CCDs down to -100C. The ISS orbits the Earth in 90 minutes; therefore a point source moves 4 arcminutes per second. In order to achieve the location determination accuracy, we need fast readout from the CCDs. The pulse heights are stacked into a single row along the vertical direction; charge is transferred continuously, so the spatial information along the vertical direction is lost and replaced with precise arrival time information. Currently we are building an experimental model of the camera body, including the CCD and the electronics for the CCDs. In this paper, we show the development status of SLC.
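
    The readout requirement quoted above follows from simple orbital arithmetic; the sketch below (Python) reproduces the 4 arcmin/s figure and shows how an assumed smear budget (a placeholder value, not a WF-MAXI specification) translates into a maximum row-transfer period for the continuously clocked CCDs.

```python
# Apparent drift of a celestial point source as seen from a ~90-minute orbit.
orbital_period_s = 90 * 60
rate_arcmin_per_s = (360.0 / orbital_period_s) * 60
print(f"source drift: {rate_arcmin_per_s:.1f} arcmin/s")       # ~4 arcmin/s

# If each transferred CCD row may smear by at most ~1 arcmin (assumed budget),
# rows must be clocked out at least this fast:
smear_budget_arcmin = 1.0
print(f"max row period: {smear_budget_arcmin / rate_arcmin_per_s:.2f} s")
```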

  9. O2 atmospheric band measurements with WINDII: Performance of a narrow band filter/wide angle Michelson combination in space

    SciTech Connect

    Ward, W.E.; Hersom, C.H.; Tai, C.C.; Gault, W.A.; Shepherd, G.G.; Solheim, B.H.

    1994-12-31

    Among the emissions viewed by the Wind Imaging Interferometer (WINDII) on the Upper Atmosphere Research Satellite (UARS) are selected lines in the (0-0) transition of the O2 atmospheric band. These lines are viewed simultaneously using a narrow band filter/wide-angle Michelson interferometer combination. The narrow band filter is used to separate the lines on the CCD (spectral-spatial scanning) and the Michelson used to modulate the emissions so that winds and rotational temperatures may be measured from the Doppler shifts and relative intensities of the lines. In this report this technique will be outlined and the on-orbit behavior since launch summarized.
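
    A hedged sketch (Python) of how a wind is obtained from the modulated emission: with four fringe samples stepped by pi/2 in phase, the fringe phase follows from the standard four-point formula, and the Doppler phase (assumed here to be already referenced to a zero-wind calibration) converts to a line-of-sight wind through the optical path difference. The wavelength and OPD below are round illustrative values, not WINDII instrument parameters.

```python
import numpy as np

def los_wind(I1, I2, I3, I4, wavelength_m, opd_m):
    """Line-of-sight wind from four fringe samples stepped by pi/2 in phase."""
    phase = np.arctan2(I4 - I2, I1 - I3)        # fringe (Doppler) phase [rad]
    c = 2.998e8                                 # speed of light [m/s]
    # A Doppler shift d_sigma = sigma*v/c changes the fringe phase by
    # d_phi = 2*pi*D*d_sigma, hence v = c*lambda*d_phi / (2*pi*D).
    return c * wavelength_m * phase / (2 * np.pi * opd_m)

# Sensitivity: wind per radian of fringe phase for a 763 nm O2 line and a
# 4.5 cm OPD (illustrative values) -- roughly 0.8 m/s per milliradian.
print(2.998e8 * 763e-9 / (2 * np.pi * 0.045), "m/s per radian of phase")
```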

  10. A New Approach to Micro-arcsecond Astrometry with SIM Allowing Early Mission Narrow Angle Measurements of Compelling Astronomical Targets

    NASA Technical Reports Server (NTRS)

    Shaklan, Stuart; Pan, Xiaopei

    2004-01-01

    The Space Interferometry Mission (SIM) is capable of detecting and measuring the mass of terrestrial planets around stars other than our own. It can measure the mass of black holes and the visual orbits of radio and x-ray binary sources. SIM makes possible a new level of understanding of complex astrophysical processes. SIM achieves its high precision in the so-called narrow-angle regime. This is defined by a 1 degree diameter field in which the position of a target star is measured with respect to a set of reference stars. The observation is performed in two parts: first, SIM observes a grid of stars that spans the full sky. After a few years, repeated observations of the grid allow one to determine the orientation of the interferometer baseline. Second, throughout the mission, SIM periodically observes in the narrow-angle mode. Every narrow-angle observation is linked to the grid to determine the precise attitude and length of the baseline. The narrow angle process demands patience. It is not until five years after launch that SIM achieves its ultimate accuracy of 1 microarcsecond. The accuracy is degraded by a factor of approx. 2 at mid-mission. Our work proposes a technique for narrow angle astrometry that does not rely on the measurement of grid stars. This technique, called Gridless Narrow Angle Astrometry (GNAA) can obtain microarcsecond accuracy and can detect extra-solar planets and other exciting objects with a few days of observation. It can be applied as early as during the first six months of in-orbit calibration (IOC). The motivations for doing this are strong. First, and obviously, it is an insurance policy against a catastrophic mid-mission failure. Second, at the start of the mission, with several space-based interferometers in the planning or implementation phase, NASA will be eager to capture the public's imagination with interferometric science. Third, early results and a technique that can duplicate those results throughout the mission will

  11. Evaluation of the Quality of Action Cameras with Wide-Angle Lenses in Uav Photogrammetry

    NASA Astrophysics Data System (ADS)

    Hastedt, H.; Ekkel, T.; Luhmann, T.

    2016-06-01

    The application of light-weight cameras in UAV photogrammetry is required due to restrictions in payload. In general, consumer cameras with a normal lens type are applied to a UAV system. The availability of action cameras, like the GoPro Hero4 Black, featuring a wide-angle (fish-eye) lens offers new perspectives in UAV projects. With these investigations, different calibration procedures for fish-eye lenses are evaluated in order to quantify their accuracy potential in UAV photogrammetry. The GoPro Hero4 is evaluated using different acquisition modes. It is investigated to what extent the standard calibration approaches in OpenCV or Agisoft PhotoScan/Lens can be applied to the evaluation processes in UAV photogrammetry. Therefore, different calibration setups and processing procedures are assessed and discussed. Additionally, a pre-correction of the initial distortion by GoPro Studio and its application for photogrammetric purposes is evaluated. An experimental setup with a set of control points and a prospective flight scenario is chosen to evaluate the processing results using Agisoft PhotoScan. It is analysed to what extent a pre-calibration and pre-correction of a GoPro Hero4 reinforce the reliability and accuracy of a flight scenario.
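
    Since the abstract mentions the standard calibration approaches in OpenCV, here is a hedged minimal sketch (Python/OpenCV) of a fisheye calibration of the kind that could be run on GoPro chessboard images; the folder name, board size and flag choice are placeholders, not values from the paper.

```python
import glob
import cv2
import numpy as np

board = (9, 6)                                           # inner chessboard corners (assumed)
objp = np.zeros((1, board[0] * board[1], 3), np.float64)
objp[0, :, :2] = np.mgrid[0:board[0], 0:board[1]].T.reshape(-1, 2)

obj_pts, img_pts, img_size = [], [], None
for fname in glob.glob("gopro_calib/*.jpg"):             # hypothetical image folder
    gray = cv2.cvtColor(cv2.imread(fname), cv2.COLOR_BGR2GRAY)
    ok, corners = cv2.findChessboardCorners(gray, board)
    if ok:
        obj_pts.append(objp)
        img_pts.append(corners.reshape(1, -1, 2).astype(np.float64))
        img_size = gray.shape[::-1]

K = np.zeros((3, 3))
D = np.zeros((4, 1))                                     # equidistant fisheye distortion coefficients
flags = cv2.fisheye.CALIB_RECOMPUTE_EXTRINSIC | cv2.fisheye.CALIB_FIX_SKEW
rms, K, D, rvecs, tvecs = cv2.fisheye.calibrate(obj_pts, img_pts, img_size, K, D, flags=flags)
print("RMS reprojection error [px]:", rms)
```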

  12. Limitations of the narrow-angle convergent pair. [of Viking Orbiter photographs for triangulation and topographic mapping

    NASA Technical Reports Server (NTRS)

    Arthur, D. W. G.

    1977-01-01

    Spatial triangulations and topographies of the Martian surface derived from Viking Orbiter pictures depend on the use of symmetric narrow-angle convergent pairs. The overlap in each pair is close to 100 percent and the ground principal points virtually coincide. The analysis of this paper reveals a high degree of indeterminacy in such pairs and at least in part explains the rather disappointing precision of the associated spatial triangulations.

  13. Detection of microcalcification clusters by 2D-mammography and narrow and wide angle digital breast tomosynthesis

    NASA Astrophysics Data System (ADS)

    Hadjipanteli, Andria; Elangovan, Premkumar; Looney, Padraig T.; Mackenzie, Alistair; Wells, Kevin; Dance, David R.; Young, Kenneth C.

    2016-03-01

    The aim of this study was to compare the detection of microcalcification clusters by human observers in breast images using 2D-mammography and narrow (15°/15 projections) and wide (50°/25 projections) angle digital breast tomosynthesis (DBT). Simulated microcalcification clusters with a range of microcalcification diameters (125 μm-275 μm) were inserted into 6 cm thick simulated compressed breasts. Breast images were produced with and without inserted microcalcification clusters using a set of image modelling tools, which were developed to represent clinical imaging by mammography and tomosynthesis. Commercially available software was used for image processing and image reconstruction. The images were then used in a series of 4-alternative forced choice (4AFC) human observer experiments conducted for signal detection with the microcalcification clusters as targets. The minimum detectable calcification diameter was found for each imaging modality: (i) 2D-mammography: 164 ± 5 μm, (ii) narrow angle DBT: 210 ± 5 μm, (iii) wide angle DBT: 255 ± 4 μm. A statistically significant difference was found between the minimum detectable calcification diameters of the three imaging modalities. Furthermore, there was no statistically significant difference between the results of the five observers that participated in this study. In conclusion, this study presents a method that quantifies the threshold diameter required for microcalcification detection, using high-resolution, realistic images with observers, for the comparison of DBT geometries with 2D-mammography. 2D-mammography can visualise a smaller detail diameter than both DBT imaging modalities, and narrow-angle DBT can visualise a smaller detail diameter than wide-angle DBT.
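
    One way such threshold diameters can be extracted from 4AFC data is to fit a psychometric function with a 25% guessing floor and read off the diameter at a chosen criterion level; the sketch below (Python) shows that idea with a logistic form, a 62.5% criterion and made-up proportions, all of which are assumptions rather than the paper's actual analysis.

```python
import numpy as np
from scipy.optimize import curve_fit

def psychometric(d, d_thr, slope):
    """4AFC proportion correct vs. detail diameter: 25% floor, 100% ceiling."""
    return 0.25 + 0.75 / (1.0 + np.exp(-(d - d_thr) / slope))

diam_um = np.array([125, 150, 175, 200, 225, 250, 275], float)      # stimulus diameters
p_correct = np.array([0.27, 0.35, 0.55, 0.78, 0.90, 0.96, 0.99])     # made-up 4AFC data

(d_thr, slope), _ = curve_fit(psychometric, diam_um, p_correct, p0=(200.0, 20.0))
print(f"62.5%-correct threshold diameter: {d_thr:.0f} um")           # criterion midway up the curve
```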

  14. Small-angle approximation to the transfer of narrow laser beams in anisotropic scattering media

    NASA Technical Reports Server (NTRS)

    Box, M. A.; Deepak, A.

    1981-01-01

    The broadening and the detected signal power of a laser beam traversing an anisotropic scattering medium were examined using the small-angle approximation to the radiative transfer equation, in which photons suffering large-angle deflections are neglected. To obtain tractable answers, simple Gaussian and non-Gaussian functions are assumed for the scattering phase functions. Two other approximate approaches employed in the field to further simplify the small-angle approximation solutions are described, and the results obtained by one of them are compared with those obtained using the small-angle approximation. An exact method for obtaining the contribution of each higher order of scattering to the radiance field is examined, but no results are presented.

  15. Development of a large-angle pinhole gamma camera with depth-of-interaction capability for small animal imaging

    NASA Astrophysics Data System (ADS)

    Baek, C.-H.; An, S. J.; Kim, H.-I.; Choi, Y.; Chung, Y. H.

    2012-01-01

    A large-angle gamma camera was developed for imaging small animal models used in medical and biological research. The simulation study shows that a large field of view (FOV) system provides higher sensitivity than a typical pinhole gamma camera by reducing the distance between the pinhole and the object. However, such a camera suffers from degradation of the spatial resolution in the periphery due to parallax error from obliquely incident photons. We propose a new method to measure the depth of interaction (DOI) using three layers of monolithic scintillators to reduce the parallax error. The detector module consists of three layers of monolithic CsI(Tl) crystals with dimensions of 50.0 × 50.0 × 2.0 mm3, a Hamamatsu H8500 PSPMT and a large-angle pinhole collimator with an acceptance angle of 120°. The 3-dimensional event positions were determined by the maximum-likelihood position-estimation (MLPE) algorithm and a pre-generated look-up table (LUT). The spatial resolution (FWHM) of a Co-57 point-like source was measured at different source positions with the conventional method (Anger logic) and with DOI information. We proved that high sensitivity can be achieved without degradation of spatial resolution using a large-angle pinhole gamma camera: this system can be used as a small animal imaging tool.
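
    The MLPE step can be pictured as a search over a pre-generated look-up table: for each candidate (x, y, layer) position the LUT stores the expected light distribution over the PSPMT channels, and the event is assigned to the candidate that maximises the Poisson log-likelihood of the measured charges. A minimal sketch (Python, with assumed array shapes; not the authors' code):

```python
import numpy as np

def mlpe(measured, lut, positions):
    """Maximum-likelihood position estimation with a look-up table.

    measured  : (n_channels,) charges of one event
    lut       : (n_candidates, n_channels) expected light distributions
    positions : (n_candidates, 3) candidate (x, y, layer) entries
    """
    measured = np.asarray(measured, float)
    expected = np.clip(np.asarray(lut, float), 1e-9, None)     # avoid log(0)
    # Poisson log-likelihood, dropping terms independent of the candidate.
    loglik = (measured * np.log(expected) - expected).sum(axis=1)
    return positions[np.argmax(loglik)]
```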

  16. A triple axis double crystal multiple reflection camera for ultra small angle X-ray scattering

    NASA Astrophysics Data System (ADS)

    Lambard, Jacques; Lesieur, Pierre; Zemb, Thomas

    1992-06-01

    To extend the domain of small angle X-ray scattering requires multiple-reflection crystals to collimate the beam. A double crystal, triple axis X-ray camera using multiple-reflection channel-cut crystals is described. Procedures for measuring the desmeared scattering cross-section on an absolute scale are described, as well as measurements from several typical samples: fibrils of collagen, 0.3 μm diameter silica spheres, 0.16 μm diameter interacting latex spheres, porous lignite coal, liquid crystals in a surfactant-water system, and a colloidal crystal of 0.32 μm diameter silica spheres.

  17. The measurement and modelling of light scattering by phytoplankton cells at narrow forward angles

    NASA Astrophysics Data System (ADS)

    MacCallum, Iain; Cunningham, Alex; McKee, David

    2004-07-01

    A procedure has been devised for measuring the angular dependence of light scattering from suspensions of phytoplankton cells at forward angles from 0.25° to 8°. The cells were illuminated with a spatially-filtered laser beam and the angular distribution of scattered light measured by tracking a photodetector across the Fourier plane of a collecting lens using a stepper-motor driven stage. The procedure was calibrated by measuring scattering from latex bead suspensions with known size distributions. It was then used to examine the scattering from cultures of the unicellular algae Isochrysis galbana (4 µm × 5 µm), Dunaliella primolecta (6 µm × 7 µm) and Rhinomonas reticulata (5 µm × 11 µm). The results were compared with the predictions of Mie theory. Excellent agreement was obtained for spherical particles. A suitable choice of spherical-equivalent scattering parameters was required to enable reasonable agreement within the first diffraction lobe for ellipsoidal particles.
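
    The Fourier-plane geometry mentioned above gives a simple detector-position-to-angle mapping: light scattered at angle theta is focused a distance f*tan(theta) off-axis in the back focal plane of the collecting lens, independent of where the beam crosses the sample. The focal length below is an assumed value used only to show the scale of the required detector travel.

```python
import numpy as np

f_mm = 200.0                                   # assumed focal length of the collecting lens
for theta_deg in (0.25, 1.0, 4.0, 8.0):        # the angular range quoted above
    x_mm = f_mm * np.tan(np.radians(theta_deg))
    print(f"{theta_deg:5.2f} deg -> detector {x_mm:6.2f} mm off-axis")
```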

  18. Wide angle and narrow-band asymmetric absorption in visible and near-infrared regime through lossy Bragg stacks

    PubMed Central

    Shu, Shiwei; Zhan, Yawen; Lee, Chris; Lu, Jian; Li, Yang Yang

    2016-01-01

    Absorbers are important components in various optical devices. Here we report a novel type of asymmetric absorber in the visible and near-infrared spectrum based on lossy Bragg stacks. The lossy Bragg stacks can achieve near-perfect absorption at one side and high reflection at the other within narrow bands (several nm) around the resonance wavelengths, whereas they display almost identical absorption/reflection responses over the rest of the spectrum. This wavelength-selective asymmetric absorption behavior persists over wide angles, does not depend on polarization, and can be ascribed to the lossy characteristics of the Bragg stacks. Moreover, interesting Fano resonances with easily tailorable peak profiles can be realized using the lossy Bragg stacks. PMID:27251768

  19. Numerical simulations of the bending of narrow-angle-tail radio jets by ram pressure or pressure gradients

    NASA Technical Reports Server (NTRS)

    Soker, Noam; Sarazin, Craig L.; O'Dea, Christopher P.

    1988-01-01

    Three-dimensional numerical hydrodynamic simulations are used to study the bending of radio jets. The simulations are compared with observations of jets in narrow-angle-tail radio sources. Two mechanisms for the observed bending are considered: direct bending of quasi-continuous jets by ram pressure from intergalactic gas and bending by pressure gradients in the interstellar gas of the host galaxy, the pressure gradients themselves being the result of ram pressure by intergalactic gas. It is shown that the pressure gradients are much less effective in bending jets, implying that the jets have roughly 30 times lower momentum fluxes if they are bent by this mechanism. Ram-pressure bending produces jets with 'kidney-shaped' cross sections; when observed from the side, these jets appear to have diffuse extensions on the downstream side. On the other hand, pressure-gradient bending causes the jets to be densest near their upstream side.

  20. Early direct-injection, low-temperature combustion of diesel fuel in an optical engine utilizing a 15-hole, dual-row, narrow-included-angle nozzle.

    SciTech Connect

    Gehrke, Christopher R.; Radovanovic, Michael S.; Milam, David M.; Martin, Glen C.; Mueller, Charles J.

    2008-04-01

    Low-temperature combustion of diesel fuel was studied in a heavy-duty, single-cylinder optical engine employing a 15-hole, dual-row, narrow-included-angle nozzle (10 holes × 70° and 5 holes × 35°) with 103-μm-diameter orifices. This nozzle configuration provided the spray targeting necessary to contain the direct-injected diesel fuel within the piston bowl for injection timings as early as 70° before top dead center. Spray-visualization movies, acquired using a high-speed camera, show that impingement of liquid fuel on the piston surface can result when the in-cylinder temperature and density at the time of injection are sufficiently low. Seven single- and two-parameter sweeps around a 4.82-bar gross indicated mean effective pressure load point were performed to map the sensitivity of the combustion and emissions to variations in injection timing, injection pressure, equivalence ratio, simulated exhaust-gas recirculation, intake temperature, intake boost pressure, and load. High-speed movies of natural luminosity were acquired by viewing through a window in the cylinder wall and through a window in the piston to provide quasi-3D information about the combustion process. These movies revealed that advanced combustion phasing resulted in intense pool fires within the piston bowl, after the end of significant heat release. These pool fires are a result of fuel films created when the injected fuel impinged on the piston surface. The emissions results showed a strong correlation with pool-fire activity. Smoke and NOx emissions rose steadily as pool-fire intensity increased, whereas HC and CO showed a dramatic increase with near-zero pool-fire activity.

  1. Wide-angle and ultrathin camera module using a curved hexagonal microlens array and all spherical surfaces.

    PubMed

    Liang, Wei-Lun; Su, Guo-Dung J

    2014-10-10

    In this paper, we propose a wide-angle and thin camera module integrating the principles of an insect's compound eye and the human eye, mimicking them with a curved hexagonal microlens array and a hemispherical lens, respectively. Compared to typical mobile phone cameras with more than four lenses and a limited full field of view (FFOV), the proposed system uses only two lenses to achieve a wide FFOV. Furthermore, the thickness of our proposed system is only 2.7 mm. It has an f-number of 2.07, an image diameter of 4.032 mm, and a diagonal FFOV of 136°. The results showed good image quality with a modulation transfer function above 0.3 at a Nyquist frequency of 166  cycles/mm. PMID:25322408

  2. On an assessment of surface roughness estimates from lunar laser altimetry pulse-widths for the Moon from LOLA using LROC narrow-angle stereo DTMs.

    NASA Astrophysics Data System (ADS)

    Muller, Jan-Peter; Poole, William

    2013-04-01

    Neumann et al. [1] proposed that laser altimetry pulse-widths could be employed to derive "within-footprint" surface roughness, as opposed to surface roughness estimated between laser altimetry pierce-points, such as the example for Mars [2] and more recently from the 4-pointed star-shaped LOLA (Lunar Orbiter Laser Altimeter) onboard NASA-LRO [3]. Since 2009, LOLA has been collecting extensive global laser altimetry data with a 5 m footprint and ~25 m between the 5 points in a star shape. In order to assess how accurately surface roughness (defined as simple RMS after slope correction) derived from LROC matches surface roughness derived from LOLA footprints, publicly released LROC-NA (LRO Camera Narrow Angle) 1 m Digital Terrain Models (DTMs) were employed to measure the surface roughness directly within each 5 m footprint. A set of 20 LROC-NA DTMs were examined. Initially the match-up between the LOLA and LROC-NA orthorectified images (ORIs) is assessed visually to ensure that the co-registration is better than the LOLA footprint resolution. For each LOLA footprint, the pulse-width geolocation is then retrieved and used to "cookie-cut" the surface roughness and slopes derived from the LROC-NA DTMs. The investigation, which includes data from a variety of different landforms, shows little, if any, correlation between surface roughness estimated from DTMs and LOLA pulse-widths at sub-footprint scale. In fact, a perceptible correlation between LOLA and LROC-DTMs appears only at baselines of 40-60 m for surface roughness and 20 m for slopes. [1] Neumann et al. Mars Orbiter Laser Altimeter pulse width measurements and footprint-scale roughness. Geophysical Research Letters (2003) vol. 30 (11), paper 1561. DOI: 10.1029/2003GL017048 [2] Kreslavsky and Head. Kilometer-scale roughness of Mars: results from MOLA data analysis. J Geophys Res (2000) vol. 105 (E11) pp. 26695-26711. [3] Rosenburg et al. Global surface slopes and roughness of the
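
    "Simple RMS after slope correction" within a footprint can be computed by fitting and removing a plane from the DTM posts that fall inside the footprint and taking the RMS of the residuals; a minimal sketch (Python, assuming the 1 m DTM heights have already been "cookie-cut" to one ~5 m footprint):

```python
import numpy as np

def footprint_roughness(x, y, z):
    """RMS roughness after slope correction, plus the fitted slope [deg]."""
    A = np.column_stack([x, y, np.ones_like(x)])
    coeff, *_ = np.linalg.lstsq(A, z, rcond=None)        # plane z = a*x + b*y + c
    residuals = z - A @ coeff
    slope_deg = np.degrees(np.arctan(np.hypot(coeff[0], coeff[1])))
    return np.sqrt(np.mean(residuals ** 2)), slope_deg
```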

  3. Lunar Reconnaissance Orbiter Camera (LROC) instrument overview

    USGS Publications Warehouse

    Robinson, M.S.; Brylow, S.M.; Tschimmel, M.; Humm, D.; Lawrence, S.J.; Thomas, P.C.; Denevi, B.W.; Bowman-Cisneros, E.; Zerr, J.; Ravine, M.A.; Caplinger, M.A.; Ghaemi, F.T.; Schaffner, J.A.; Malin, M.C.; Mahanti, P.; Bartels, A.; Anderson, J.; Tran, T.N.; Eliason, E.M.; McEwen, A.S.; Turtle, E.; Jolliff, B.L.; Hiesinger, H.

    2010-01-01

    The Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) and Narrow Angle Cameras (NACs) are on the NASA Lunar Reconnaissance Orbiter (LRO). The WAC is a 7-color push-frame camera (100 and 400 m/pixel visible and UV, respectively), while the two NACs are monochrome narrow-angle linescan imagers (0.5 m/pixel). The primary mission of LRO is to obtain measurements of the Moon that will enable future lunar human exploration. The overarching goals of the LROC investigation include landing site identification and certification, mapping of permanently polar shadowed and sunlit regions, meter-scale mapping of polar regions, global multispectral imaging, a global morphology base map, characterization of regolith properties, and determination of current impact hazards.

  4. Matching the best viewing angle in depth cameras for biomass estimation based on poplar seedling geometry.

    PubMed

    Andújar, Dionisio; Fernández-Quintanilla, César; Dorado, José

    2015-06-04

    In energy crops for biomass production a proper plant structure is important to optimize wood yields. A precise crop characterization in early stages may contribute to the choice of proper cropping techniques. This study assesses the potential of the Microsoft Kinect for Windows v.1 sensor to determine the best viewing angle of the sensor to estimate the plant biomass based on poplar seedling geometry. Kinect Fusion algorithms were used to generate a 3D point cloud from the depth video stream. The sensor was mounted in different positions facing the tree in order to obtain depth (RGB-D) images from different angles. Individuals of two different ages, i.e., one month and one year old, were scanned. Four different viewing angles were compared: top view (0°), 45° downwards view, front view (90°) and ground upwards view (-45°). The ground-truth used to validate the sensor readings consisted of a destructive sampling in which the height, leaf area and biomass (dry weight basis) were measured in each individual plant. The depth image models agreed well with 45°, 90° and -45° measurements in one-year poplar trees. Good correlations (0.88 to 0.92) between dry biomass and the area measured with the Kinect were found. In addition, plant height was accurately estimated with an error of a few centimeters. The comparison between different viewing angles revealed that top views showed poorer results because the top leaves occluded the rest of the tree. However, the other views led to good results. Conversely, small poplars showed better correlations with actual parameters from the top view (0°). Therefore, although the Microsoft Kinect for Windows v.1 sensor provides good opportunities for biomass estimation, the viewing angle must be chosen taking into account the developmental stage of the crop and the desired parameters. The results of this study indicate that Kinect is a promising tool for a rapid canopy characterization, i.e., for estimating crop biomass

  6. HAWC+: A Detector, Polarimetry, and Narrow-Band Imaging Upgrade to SOFIA's Far-Infrared Facility Camera

    NASA Astrophysics Data System (ADS)

    Dowell, C. D.; Staguhn, J.; Harper, D. A.; Ames, T. J.; Benford, D. J.; Berthoud, M.; Chapman, N. L.; Chuss, D. T.; Dotson, J. L.; Irwin, K. D.; Jhabvala, C. A.; Kovacs, A.; Looney, L.; Novak, G.; Stacey, G. J.; Vaillancourt, J. E.; HAWC+ Science Collaboration

    2013-01-01

    HAWC, the High-resolution Airborne Widebandwidth Camera, is the facility far-infrared camera for SOFIA, providing continuum imaging from 50 to 250 microns wavelength. As a result of NASA selection as a SOFIA Second Generation Instruments upgrade investigation, HAWC will be upgraded with enhanced capability for addressing current problems in star formation and interstellar medium physics prior to commissioning in early 2015. We describe the capabilities of the upgraded HAWC+, as well as our initial science program. The mapping speed of HAWC is increased by a factor of 9, accomplished by using NASA/Goddard's Backshort-Under-Grid bolometer detectors in a 64x40 format. Two arrays are used in a dual-beam polarimeter format, and the full complement of 5120 transition-edge detectors is read using NIST SQUID multiplexers and U.B.C. Multi-Channel Electronics. A multi-band polarimeter is added to the HAWC opto-mechanical system, at the cryogenic pupil image, employing rotating quartz half-wave plates. Six new filters are added to HAWC+, bringing the full set to 53, 63, 89, 155, and 216 microns at R = 5 resolution and 52, 63, 88, 158, and 205 microns at R = 300 resolution. The latter filters are fixed-tuned to key fine-structure emission lines from [OIII], [OI], [CII], and [NII]. Polarimetry can be performed in any of the filter bands. The first-light science program with HAWC+ emphasizes polarimetry for the purpose of mapping magnetic fields in Galactic clouds. The strength and character of magnetic fields in molecular clouds before, during, and after the star formation phase are largely unknown, despite pioneering efforts on the KAO and ground-based telescopes. SOFIA and HAWC+ provide significant new capability: sensitivity to extended dust emission (to A_V ~ 1) which is unmatched, ~10 arcsec angular resolution combined with wide-field mapping which allows statistical estimates of magnetic field strength, and wavelength coverage spanning the peak of the far

  7. The wavelength dependence of the lunar phase curve as seen by the Lunar Reconnaissance Orbiter wide-angle camera

    NASA Astrophysics Data System (ADS)

    Hapke, Bruce; Denevi, Brett; Sato, Hiroyuki; Braden, Sarah; Robinson, Mark

    2012-03-01

    The Lunar Reconnaissance Orbiter wide-angle camera measured the bidirectional reflectances of two areas on the Moon at seven wavelengths between 321 and 689 nm and at phase angles between 0° and 120°. It is not possible to account for the phase curves unless both coherent backscatter and shadow hiding contribute to the opposition effect. For the analyzed highlands area, coherent backscatter contributes nearly 40% in the UV, increasing to over 60% in the red. This conclusion is supported by laboratory measurements of the circular polarization ratios of Apollo regolith samples, which also indicate that the Moon's opposition effect contains a large component of coherent backscatter. The angular width of the lunar opposition effect is almost independent of wavelength, contrary to theories of the coherent backscatter which, for the Moon, predict that the width should be proportional to the square of the wavelength. When added to the large body of other experimental evidence, this lack of wavelength dependence reinforces the argument that our current understanding of the coherent backscatter opposition effect is incomplete or perhaps incorrect. It is shown that phase reddening is caused by the increased contribution of interparticle multiple scattering as the wavelength and albedo increase. Hence, multiple scattering cannot be neglected in lunar photometric analyses. A simplified semiempirical bidirectional reflectance function is proposed for the Moon that contains four free parameters and that is mathematically simple and straightforward to invert. This function should be valid everywhere on the Moon for phase angles less than about 120°, except at large viewing and incidence angles close to the limb, terminator, and poles.

  8. Study of Flow and Heat Transfer Characteristics of non-periodical attack angle in Narrow Rectangular Channel with Longitudinal Vortex generators

    NASA Astrophysics Data System (ADS)

    Wang, L.; Huang, J.

    2010-03-01

    Heat transfer enhancement by longitudinal vortices (LVs) is a technology with good efficiency and low flow resistance. LVs are produced by longitudinal vortex generators (LVGs) mounted on the heated surface. With their relatively long influence distance and simple structure, LVGs can be used in narrow channels with flat surfaces. The narrow rectangular channel measures 600 mm (length) × 40 mm (width) × 3 mm (gap width), and single rectangular-block LVGs are laid out on one heated plate. The dimensions of the LVGs are as follows: height 1.8 mm, width 2.2 mm, length 14 mm, transverse spacing 4 mm, and longitudinal spacing 150 mm. The attack angle of the LVGs is very important for extending this technology to narrow rectangular channels with water as the working medium. In a previous study, the attack angle of periodically mounted LVGs was discussed and the optimal value was 44°. In this paper, the attack angles of the first and second LVGs are varied while the others are kept at 44°, and the flow and heat transfer characteristics of this non-periodic attack-angle arrangement are studied. The results show that changing the attack angles of the first and second LVGs is advantageous for heat transfer enhancement with water as the working medium; this conclusion should be extended to the case where the working medium is a vapor-liquid two-phase flow. The calculated results are compared with experimental results from a thermal infrared imager and a phase Doppler particle analyzer, and they are in reasonable agreement. FLUENT 6.2 is used for the simulations, and the three velocity components of the water flow are used to define the residual intensity ratio of the LV.

  9. Comparing Laser Peripheral Iridotomy to Cataract Extraction in Narrow Angle Eyes Using Anterior Segment Optical Coherence Tomography

    PubMed Central

    Melese, Ephrem; Peterson, Jeffrey R.; Feldman, Robert M.; Baker, Laura A.; Bell, Nicholas P.; Chuang, Alice Z.

    2016-01-01

    Purpose To evaluate the changes in anterior chamber angle (ACA) parameters in primary angle closure (PAC) spectrum eyes before and after cataract extraction (CE) and compare to the changes after laser peripheral iridotomy (LPI) using anterior segment optical coherence tomography (ASOCT). Methods Twenty-eight PAC spectrum eyes of 18 participants who underwent CE and 34 PAC spectrum eyes of 21 participants who underwent LPI were included. ASOCT images with 3-dimensional mode angle analysis scans were taken with the CASIA SS-1000 (Tomey Corp., Nagoya, Japan) before and after CE or LPI. Mixed-effect model analysis was used to 1) compare best-corrected visual acuity, intraocular pressure, and ACA parameters before and after CE; 2) identify and estimate the effects of potential contributing factors affecting changes in ACA parameters; and 3) compare CE and LPI treatment groups. Results The increase in average angle parameters (TISA750 and TICV750) was significantly greater after CE than LPI. TICV750 increased by 102% (2.114 [±1.203] μL) after LPI and by 174% (4.546 [± 1.582] μL) after CE (P < 0.001). Change of TICV750 in the CE group was significantly affected by age (P = 0.002), race (P = 0.006), and intraocular lens power (P = 0.037). Conclusions CE results in greater anatomic changes in the ACA than LPI in PAC spectrum eyes. ASOCT may be used to follow anatomic changes in the angle after intervention. PMID:27606482

  10. Post-trial anatomical frame alignment procedure for comparison of 3D joint angle measurement from magnetic/inertial measurement units and camera-based systems.

    PubMed

    Li, Qingguo; Zhang, Jun-Tian

    2014-11-01

    Magnetic and inertial measurement units (MIMUs) have been widely used as an alternative to traditional camera-based motion capture systems for 3D joint kinematics measurement. Since these sensors do not directly measure position, a pre-trial anatomical calibration, either with the assistance of a special protocol/apparatus or with another motion capture system is required to establish the transformation matrices between the local sensor frame and the anatomical frame (AF) of each body segment on which the sensors are attached. Because the axes of AFs are often used as the rotational axes in the joint angle calculation, any difference in the AF determination will cause discrepancies in the calculated joint angles. Therefore, a direct comparison of joint angles between MIMU systems and camera-based systems is less meaningful because the calculated joint angles contain a systemic error due to the differences in the AF determination. To solve this problem a new post-trial AF alignment procedure is proposed. By correcting the AF misalignments, the joint angle differences caused by the difference in AF determination are eliminated and the remaining discrepancies are mainly from the measurement accuracy of the systems themselves. Lower limb joint angles from 30 walking trials were used to validate the effectiveness of the proposed AF alignment procedure. This technique could serve as a new means for calibrating magnetic/inertial sensor-based motion capture systems and correcting for AF misalignment in scenarios where joint angles are compared directly.
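
    One plausible implementation of such a post-trial alignment (a sketch, not necessarily the authors' exact procedure) is to treat the AF misalignment as a rotation that is constant in the segment frame, estimate it as the average per-frame rotation between the camera-derived and MIMU-derived AFs, and then remove it before recomputing joint angles. The convention below (orientation matrices expressed in a common global frame) is an assumption.

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def af_alignment(R_cam, R_mimu):
    """Constant misalignment M such that R_mimu ~ R_cam @ M.

    R_cam, R_mimu : (N, 3, 3) segment AF orientations in a common global frame.
    """
    rel = np.einsum('nji,njk->nik', R_cam, R_mimu)   # R_cam^T @ R_mimu, per frame
    return R.from_matrix(rel).mean().as_matrix()     # rotation average over the trial

def corrected_mimu(R_mimu, M):
    """Re-express the MIMU AFs with the misalignment removed: R_mimu @ M^T."""
    return np.einsum('nij,kj->nik', R_mimu, M)
```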

  11. Ultra-narrow angle-tunable Fabry-Perot bandpass interference filter for use as tuning element in infrared lasers

    NASA Astrophysics Data System (ADS)

    Kischkat, Jan; Peters, Sven; Semtsiv, Mykhaylo P.; Wegner, Tristan; Elagin, Mikaela; Monastyrskyi, Grygorii; Flores, Yuri; Kurlov, Sergii; Masselink, W. Ted

    2014-11-01

    We have developed a bandpass infrared interference filter with sufficiently narrow bandwidth to be potentially suitable for tuning a self-stabilizing external-cavity quantum-cascade laser (ECQCL) in single-mode operation and describe the process parameters for fabrication of such filters with central wavelengths in the 3-12 μm range. The filter has a passband width of 6 nm or 0.14% with peak transmission of 55% and a central wavelength of approximately 4.0 μm. It can be tuned through over 4% by tilting with respect to the incident beam and offers orders of magnitude larger angular dispersion than diffraction gratings. We compare filters with single-cavity and coupled-cavity Fabry-Perot designs.
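
    The angle tuning follows the usual interference-filter relation lambda(theta) = lambda0 * sqrt(1 - (sin(theta)/n_eff)^2); the quick check below (Python) uses an assumed effective index, not a value from the paper, and shows that tilts of a few tens of degrees give the few-percent tuning range quoted above.

```python
import numpy as np

lam0 = 4.0e-6                    # centre wavelength at normal incidence [m]
n_eff = 2.0                      # assumed effective index of the spacer
for theta_deg in (0, 10, 20, 30):
    lam = lam0 * np.sqrt(1 - (np.sin(np.radians(theta_deg)) / n_eff) ** 2)
    print(f"{theta_deg:3d} deg tilt -> {lam*1e9:7.1f} nm ({(lam/lam0 - 1)*100:+.2f} %)")
```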

  12. Inner jet kinematics and the viewing angle towards the γ-ray narrow-line Seyfert 1 galaxy 1H 0323+342

    NASA Astrophysics Data System (ADS)

    Fuhrmann, Lars; Karamanavis, Vassilis; Komossa, Stefanie; Angelakis, Emmanouil; Krichbaum, Thomas P.; Schulz, Robert; Kreikenbohm, Annika; Kadler, Matthias; Myserlis, Ioannis; Ros, Eduardo; Nestoras, Ioannis; Zensus, J. Anton

    2016-11-01

    Near-Eddington accretion rates onto low-mass black holes are thought to be a prime driver of the multi-wavelength properties of the narrow-line Seyfert 1 (NLS1) population of active galactic nuclei (AGNs). Orientation effects have repeatedly been considered as another important factor involved, but detailed studies have been hampered by the lack of measured viewing angles towards this type of AGN. Here we present multi-epoch, 15 GHz VLBA images (MOJAVE program) of the radio-loud and Fermi/LAT-detected NLS1 galaxy 1H 0323+342. These are combined with single-dish, multi-frequency radio monitoring of the source's variability, obtained with the Effelsberg 100-m and IRAM 30-m telescopes, in the course of the F-GAMMA program. The VLBA images reveal six components with apparent speeds of ∼ 1–7 c, and one quasi-stationary feature. Combining the obtained apparent jet speed (β_app) and variability Doppler factor (D_var) estimates together with other methods, we constrain the viewing angle θ towards 1H 0323+342 to θ ≤ 4°–13°. Using literature values of β_app and D_var, we also deduce a viewing angle of ≤ 8°–9° towards another radio- and γ-ray-loud NLS1, namely SBS 0846+513.
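
    The viewing-angle constraint uses the standard beaming relations between the apparent speed, the Doppler factor, the bulk Lorentz factor and the angle to the line of sight; the sketch below (Python) applies them to round illustrative numbers, not to the measured values for 1H 0323+342.

```python
import numpy as np

def jet_geometry(beta_app, doppler):
    """Bulk Lorentz factor and viewing angle from beta_app and the Doppler factor."""
    gamma = (beta_app**2 + doppler**2 + 1) / (2 * doppler)
    theta_deg = np.degrees(np.arctan2(2 * beta_app, beta_app**2 + doppler**2 - 1))
    return gamma, theta_deg

gamma, theta = jet_geometry(beta_app=7.0, doppler=5.0)   # assumed example values
print(f"Gamma ~ {gamma:.1f}, viewing angle ~ {theta:.1f} deg")
```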

  13. 2. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH CAMERA STATION ABOVE LOOKING WEST TAKEN FROM RESERVOIR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  14. 6. VAL CAMERA CAR, DETAIL OF COMMUNICATION EQUIPMENT INSIDE CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    6. VAL CAMERA CAR, DETAIL OF COMMUNICATION EQUIPMENT INSIDE CAMERA CAR WITH CAMERA MOUNT IN FOREGROUND. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  15. 7. VAL CAMERA CAR, DETAIL OF 'FLARE' OR TRAJECTORY CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. VAL CAMERA CAR, DETAIL OF 'FLARE' OR TRAJECTORY CAMERA INSIDE CAMERA CAR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  16. Combined ab interno trabeculotomy and lens extraction: a novel management option for combined uveitic and chronic narrow angle raised intraocular pressure.

    PubMed

    Lin, Siying; Gupta, Bhaskar; Rossiter, Jonathan

    2016-02-01

    Minimally invasive glaucoma surgery is a developing area that has the potential to replace traditional glaucoma surgery, with its known risk profile, but at present there are no randomised controlled data to validate its use. We report on a case where sequential bilateral combined ab interno trabeculotomy and lens extraction surgery was performed on a 45-year-old woman with combined uveitic and chronic narrow angle raised intraocular pressure. Maximal medical management alone could not control the intraocular pressure. At 12-month follow-up, the patient had achieved stable intraocular pressure in both eyes on a combination of topical ocular antiglaucomatous and steroid therapies. This case demonstrates the effectiveness of trabecular meshwork ablation via ab interno trabeculotomy in a case of complex mixed mechanism glaucoma.

  17. Use of Multiple-Angle Snow Camera (MASC) Observations as a Constraint on Radar-Based Retrievals of Snowfall Rate

    NASA Astrophysics Data System (ADS)

    Cooper, S.; Garrett, T. J.; Wood, N.; L'Ecuyer, T. S.

    2015-12-01

    We use a combination of Ka-band Zenith Radar (KaZR) and Multiple-Angle Snow Camera (MASC) observations at the ARM North Slope Alaska Climate Facility Site at Barrow to quantify snowfall. The optimal-estimation framework is used to combine information from the KaZR and MASC into a common retrieval scheme, where retrieved estimates of snowfall are compared to observations at a nearby NWS measurement site for evaluation. Modified from the operational CloudSat algorithm, the retrieval scheme returns estimates of the vertical profile of exponential PSD slope parameter with a constant number density. These values, in turn, can be used to calculate surface snowrate (liquid equivalent) given knowledge of snowflake microphysical properties and fallspeeds. We exploit scattering models for a variety of ice crystal shapes including aggregates developed specifically from observations of snowfall properties at high-latitudes, as well as more pristine crystal shapes involving sector plates, bullet rosettes, and hexagonal columns. As expected, initial retrievals suggest large differences (300% for some events) in estimated snowfall accumulations given the use of the different ice crystal assumptions. The complex problem of how we can more quantitatively link MASC snowflake images to specific radar scattering properties is an ongoing line of research. Here, however, we do quantify the use of MASC observations of fallspeed and PSD parameters as constraint on our optimal-estimation retrieval approach. In terms of fallspeed, we find differences in estimated snowfall of nearly 50% arising from the use of MASC observed fallspeeds relative to those derived from traditional fallspeed parameterizations. In terms of snowflake PSD, we find differences of nearly 25% arising from the use of MASC observed slope parameters relative to those derived from field campaign observations of high-altitude snow events. Of course, these different sources of error conspire to make the estimate of snowfall

  18. The rate and causes of lunar space weathering: Insights from Lunar Reconnaissance Orbiter Wide Angle Camera ultraviolet observations

    NASA Astrophysics Data System (ADS)

    Denevi, B. W.; Robinson, M. S.; Sato, H.; Hapke, B. W.; McEwen, A. S.; Hawke, B. R.

    2011-12-01

    Lunar Reconnaissance Orbiter Wide Angle Camera global ultraviolet and visible imaging provides a unique opportunity to examine the rate and causes of space weathering on the Moon. Silicates typically have a strong decrease in reflectance toward UV wavelengths (<~450 nm) due to strong bands at 250 nm and in the far UV. Metallic iron is relatively spectrally neutral, and laboratory spectra suggest that its addition to mature soils in the form of submicroscopic iron (also known as nanophase iron) flattens silicate spectra, significantly reducing spectral slope in the ultraviolet. Reflectance at ultraviolet wavelengths may be especially sensitive to the surface coatings that form due to exposure to space weathering because scattering from the surfaces of grains contributes a larger fraction to the reflectance spectrum at short wavelengths. We find that the UV slope (as measured by the 320/415 nm ratio) is a more sensitive measure of maturity than indexes based on visible and near-infrared wavelengths. Only the youngest features (less than ~100 Ma) retain a UV slope that is distinct from mature soils of the same composition. No craters >20 km have UV slopes that approach those observed in laboratory spectra of fresh lunar materials (powdered lunar rocks). While the 320/415 nm ratio increases by ~18% from powdered rocks to mature soils in laboratory samples, Giordano Bruno, the freshest large crater, only shows a 3% difference between fresh and mature materials. At the resolution of our UV data (400 m/pixel), we observe some small (<5 km) craters that show a ~14% difference in 320/415 nm ratio from their mature surroundings. UV observations show that Reiner Gamma has had significantly lower levels of space weathering than any of the Copernican craters we examined, and was the only region we found with a UV slope that approached laboratory values for fresh powdered rock samples. This is consistent with the hypothesis that its high albedo is due to magnetic shielding from

  19. Why do I sometimes see bright speckles in an image of the Terrain product, particularly at the oblique camera angles?

    Atmospheric Science Data Center

    2014-12-08

    MISR Level 1B2 data products use various high data values to signify fill, and one of the fill values (16377) in the 14 ... more oblique angles, the prevalence of this particular fill value increases as the view angle increases. This particular fill value does ...

  20. Reliability of sagittal plane hip, knee, and ankle joint angles from a single frame of video data using the GAITRite camera system.

    PubMed

    Ross, Sandy A; Rice, Clinton; Von Behren, Kristyn; Meyer, April; Alexander, Rachel; Murfin, Scott

    2015-01-01

    The purpose of this study was to establish intra-rater, intra-session, and inter-rater, reliability of sagittal plane hip, knee, and ankle angles with and without reflective markers using the GAITRite walkway and single video camera between student physical therapists and an experienced physical therapist. This study included thirty-two healthy participants age 20-59, stratified by age and gender. Participants performed three successful walks with and without markers applied to anatomical landmarks. GAITRite software was used to digitize sagittal hip, knee, and ankle angles at two phases of gait: (1) initial contact; and (2) mid-stance. Intra-rater reliability was more consistent for the experienced physical therapist, regardless of joint or phase of gait. Intra-session reliability was variable, the experienced physical therapist showed moderate to high reliability (intra-class correlation coefficient (ICC) = 0.50-0.89) and the student physical therapist showed very poor to high reliability (ICC = 0.07-0.85). Inter-rater reliability was highest during mid-stance at the knee with markers (ICC = 0.86) and lowest during mid-stance at the hip without markers (ICC = 0.25). Reliability of a single camera system, especially at the knee joint shows promise. Depending on the specific type of reliability, error can be attributed to the testers (e.g. lack of digitization practice and marker placement), participants (e.g. loose fitting clothing) and camera systems (e.g. frame rate and resolution). However, until the camera technology can be upgraded to a higher frame rate and resolution, and the software can be linked to the GAITRite walkway, the clinical utility for pre/post measures is limited.
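
    For reference, a single-measure two-way random-effects ICC of the kind reported above (ICC(2,1) in the Shrout-Fleiss scheme; whether that is the exact form the authors used is an assumption) can be computed directly from the subject-by-rater score matrix:

```python
import numpy as np

def icc_2_1(scores):
    """ICC(2,1): two-way random effects, absolute agreement, single measure.

    scores : (n_subjects, k_raters) array of joint-angle measurements.
    """
    n, k = scores.shape
    grand = scores.mean()
    ss_rows = k * ((scores.mean(axis=1) - grand) ** 2).sum()   # subjects
    ss_cols = n * ((scores.mean(axis=0) - grand) ** 2).sum()   # raters
    ss_err = ((scores - grand) ** 2).sum() - ss_rows - ss_cols
    msr = ss_rows / (n - 1)
    msc = ss_cols / (k - 1)
    mse = ss_err / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)
```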

  1. Quasi-null lens optical system for the fabrication of an oblate convex ellipsoidal mirror: application to the Wide Angle Camera of the Rosetta space mission.

    PubMed

    Pelizzo, Maria-Guglielmina; Da Deppo, Vania; Naletto, Giampiero; Ragazzoni, Roberto; Novi, Andrea

    2006-08-20

    The design of a quasi-null lens system for the fabrication of an aspheric oblate convex ellipsoidal mirror is presented. The performance and tolerances of the system have been analyzed. The system has been applied successfully to the fabrication of the primary mirror of the Wide Angle Camera (WAC), the imaging system onboard Rosetta, the European Space Agency cornerstone mission dedicated to the exploration of a comet. The WAC is based on an off-axis two-mirror configuration, in which the primary mirror is an oblate convex ellipsoid with a significant conic constant.

  2. Angle-of-arrival anemometry by means of a large-aperture Schmidt-Cassegrain telescope equipped with a CCD camera.

    PubMed

    Cheon, Yonghun; Hohreiter, Vincent; Behn, Mario; Muschinski, Andreas

    2007-11-01

    The frequency spectrum of angle-of-arrival (AOA) fluctuations of optical waves propagating through atmospheric turbulence carries information on the wind speed transverse to the propagation path. We present retrievals of the transverse wind speed, v_b, from AOA spectra measured with a Schmidt-Cassegrain telescope equipped with a CCD camera, by estimating the "knee frequency", the intersection of the two power laws of the AOA spectrum. The rms difference between 30 s estimates of v_b retrieved from the measured AOA spectra and 30 s averages of the transverse horizontal wind speed measured with an ultrasonic anemometer was 11 cm s^-1 for a 1 h period, during which the transverse horizontal wind speed varied between 0 and 80 cm s^-1. The potential and limitations of angle-of-arrival anemometry are discussed.
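
    The knee frequency can be found by fitting one power law to the low-frequency part of the AOA spectrum and another to the high-frequency part and intersecting them; the sketch below (Python) does exactly that in log-log space. The split frequency is a placeholder, and the constant relating f_knee and the aperture size to v_b comes from the spectral model in the paper and is not reproduced here.

```python
import numpy as np

def knee_frequency(freq, spec, f_split=1.0):
    """Intersection of low- and high-frequency power-law fits to an AOA spectrum."""
    lo = freq < f_split
    hi = ~lo
    # log S = a + b * log f on each side
    b1, a1 = np.polyfit(np.log(freq[lo]), np.log(spec[lo]), 1)
    b2, a2 = np.polyfit(np.log(freq[hi]), np.log(spec[hi]), 1)
    return np.exp((a2 - a1) / (b1 - b2))          # frequency where the two fits cross

# v_b is then proportional to f_knee times the telescope aperture diameter,
# with the proportionality constant taken from the spectral model.
```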

  3. 3. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH THE VAL TO THE RIGHT, LOOKING NORTHEAST. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  4. 7. VAL CAMERA STATION, INTERIOR VIEW OF CAMERA MOUNT, COMMUNICATION ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. VAL CAMERA STATION, INTERIOR VIEW OF CAMERA MOUNT, COMMUNICATION EQUIPMENT AND STORAGE CABINET. - Variable Angle Launcher Complex, Camera Stations, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  5. 1. VARIABLE-ANGLE LAUNCHER CAMERA CAR, VIEW OF CAMERA CAR AND ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    1. VARIABLE-ANGLE LAUNCHER CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH CAMERA STATION ABOVE LOOKING NORTH TAKEN FROM RESERVOIR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  6. A point-focusing small angle x-ray scattering camera using a doubly curved monochromator of a W/Si multilayer

    NASA Astrophysics Data System (ADS)

    Sasanuma, Yuji; Law, Robert V.; Kobayashi, Yuji

    1996-03-01

    A point-focusing small angle x-ray scattering (SAXS) camera using a doubly curved monochromator made of a W/Si multilayer has been designed, constructed, and tested. The two radii of curvature of the monochromator are 20 400 and 7.6 mm. The reflectivity of its first-order Bragg reflection for Cu Kα radiation was calculated to be 0.82, comparable to that of its total reflection (0.81). With only a 10 s x-ray exposure, scattering from a high-density polyethylene film was detected on an imaging plate (IP), using a rotating-anode x-ray generator operated at 40 kV and 30 mA. Diffraction from rat-tail collagen has shown that the optical arrangement resolves Bragg spacings up to at least 30 nm for Cu Kα radiation. Combined with IPs, the camera may permit time-resolved SAXS measurements of the phase behavior of liquid crystals, lipids, polymer alloys, etc., on conventional x-ray generators available in laboratories.
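
    A quick consistency check (Python) of the quoted resolution: reaching a 30 nm Bragg spacing with Cu Kα radiation requires resolving scattering down to only a fraction of a degree.

```python
import numpy as np

lam_nm, d_nm = 0.154, 30.0                      # Cu K-alpha wavelength, target spacing
theta = np.degrees(np.arcsin(lam_nm / (2 * d_nm)))
print(f"2-theta ~ {2*theta:.2f} deg, q_min ~ {2*np.pi/d_nm:.3f} nm^-1")
```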

  7. 9. VIEW OF CAMERA STATIONS UNDER CONSTRUCTION INCLUDING CAMERA CAR ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    9. VIEW OF CAMERA STATIONS UNDER CONSTRUCTION INCLUDING CAMERA CAR ON RAILROAD TRACK AND FIXED CAMERA STATION 1400 (BUILDING NO. 42021) ABOVE, ADJACENT TO STATE HIGHWAY 39, LOOKING WEST, March 23, 1948. (Original photograph in possession of Dave Willis, San Diego, California.) - Variable Angle Launcher Complex, Camera Stations, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  8. The limits of narrow and wide-angle AVA inversions for high Vp/Vs ratios: An application to elastic seabed characterization

    NASA Astrophysics Data System (ADS)

    Aleardi, Mattia; Tognarelli, Andrea

    2016-08-01

    Since its introduction in the oil and gas industry, amplitude versus angle (AVA) inversion has become a standard tool in deep hydrocarbon exploration. However, with the intensification of offshore construction activity, applications of this method have been extended to evaluate the elastic properties of seabed sediments and of the shallowest part of the subsurface. These regions are often characterized by undercompacted sediments with very low S-wave velocities (Vs) and high P-wave velocity to S-wave velocity (Vp/Vs) ratios. However, the importance of the Vp/Vs ratio is usually underrated in AVA inversion. In this study, we analyse the limits of the AVA method in cases of high Vp/Vs ratios and the benefits introduced by wide-angle reflections in constraining the inversion results. A simplified seabed model that is characterized by a high Vp/Vs ratio is used to study the influence of the elastic and viscoelastic parameters on the P-wave reflection coefficients and to compute the error function of the AVA inversion. In addition, a synthetic AVA inversion is performed on this simplified model, which enables us to apply the sensitivity analysis tools to the inversion kernel. These theoretical analyses show that in the case of high Vp/Vs ratios, the Vs contrast at the reflecting interface plays a very minor role in determining the P-wave reflection coefficients and that the viscoelastic effects can be neglected when only pre-critical angles are considered in the inversion. In addition, wide-angle reflections are essential to reducing both the cross-talk between the inverted elastic parameters and the uncertainties in the Vp and density estimations, but they are not sufficient to better constrain the Vs estimation. As an application to field data, we derive the elastic properties of the seabed interface by applying AVA inversion to a 2D seismic dataset from a well-site survey acquisition. The limited water depth, the maximum available source-to-receiver offset, and the
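
    To see why the Vs contrast carries so little weight when the background Vs is small, the sketch below evaluates the linearized Aki-Richards approximation of the P-wave reflection coefficient as a function of incidence angle. The elastic values are arbitrary soft-sediment placeholders with high Vp/Vs ratios; the study itself uses exact (visco)elastic formulations, so this is only an illustration of the scaling.

        import numpy as np

        def aki_richards(theta_deg, vp1, vs1, rho1, vp2, vs2, rho2):
            """Linearized Aki-Richards P-wave reflection coefficient versus incidence angle,
            using interface averages (a, b, r) and contrasts (da, db, dr):
            R = 0.5*(1 - 4*(b/a)^2*sin^2 t)*dr/r + da/(2*a*cos^2 t) - 4*(b/a)^2*sin^2 t*db/b"""
            t = np.radians(theta_deg)
            a, b, r = (vp1 + vp2) / 2.0, (vs1 + vs2) / 2.0, (rho1 + rho2) / 2.0
            da, db, dr = vp2 - vp1, vs2 - vs1, rho2 - rho1
            return (0.5 * (1.0 - 4.0 * (b / a)**2 * np.sin(t)**2) * dr / r
                    + da / (2.0 * a * np.cos(t)**2)
                    - 4.0 * (b / a)**2 * np.sin(t)**2 * db / b)

        # Placeholder soft-sediment interface with high Vp/Vs ratios (16 above, 7.2 below)
        theta = np.array([0.0, 10.0, 20.0, 30.0, 40.0])
        print(aki_richards(theta, vp1=1600, vs1=100, rho1=1500,
                                  vp2=1800, vs2=250, rho2=1800))

    With these values the Vs-dependent term is scaled by 4*(beta/alpha)^2, which is only about 0.04, so the Vs contrast barely changes the pre-critical reflection coefficients, consistent with the sensitivity analysis described above.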

  9. Is Perceptual Narrowing Too Narrow?

    ERIC Educational Resources Information Center

    Cashon, Cara H.; Denicola, Christopher A.

    2011-01-01

    There is a growing list of examples illustrating that infants are transitioning from having earlier abilities that appear more "universal," "broadly tuned," or "unconstrained" to having later abilities that appear more "specialized," "narrowly tuned," or "constrained." Perceptual narrowing, a well-known phenomenon related to face, speech, and…

  10. Wide Angle Movie

    NASA Technical Reports Server (NTRS)

    1999-01-01

    This brief movie illustrates the passage of the Moon through the Saturn-bound Cassini spacecraft's wide-angle camera field of view as the spacecraft passed by the Moon on the way to its closest approach with Earth on August 17, 1999. From beginning to end of the sequence, 25 wide-angle images (with a spatial image scale of about 14 miles, or about 23 kilometers, per pixel) were taken over the course of 7 and 1/2 minutes through a series of narrow and broadband spectral filters and polarizers, ranging from the violet to the near-infrared regions of the spectrum, to calibrate the spectral response of the wide-angle camera. The exposure times range from 5 milliseconds to 1.5 seconds. Two of the exposures were smeared and have been discarded and replaced with nearby images to make a smooth movie sequence. All images were scaled so that the brightness of Crisium basin, the dark circular region in the upper right, is approximately the same in every image. The imaging data were processed and released by the Cassini Imaging Central Laboratory for Operations (CICLOPS) at the University of Arizona's Lunar and Planetary Laboratory, Tucson, AZ.

    Photo Credit: NASA/JPL/Cassini Imaging Team/University of Arizona

    Cassini, launched in 1997, is a joint mission of NASA, the European Space Agency and Italian Space Agency. The mission is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Space Science, Washington DC. JPL is a division of the California Institute of Technology, Pasadena, CA.

  11. HIGH SPEED CAMERA

    DOEpatents

    Rogers, B.T. Jr.; Davis, W.C.

    1957-12-17

    This patent relates to high speed cameras having resolution times of less than one-tenth of a microsecond, suitable for filming distinct sequences of a very fast event such as an explosion. This camera consists of a rotating mirror with reflecting surfaces on both sides, a narrow mirror acting as a slit in a focal plane shutter, various other mirror and lens systems, as well as an image recording surface. The combination of the rotating mirrors and the slit mirror causes discrete, narrow, separate pictures to fall upon the film plane, thereby forming a moving image increment of the photographed event. Placing a reflecting surface on each side of the rotating mirror cancels the image velocity that one side of the rotating mirror would impart, so that a camera with such a short resolution time is possible.

  12. 6. VAL CAMERA STATION, VIEW FROM INTERIOR OUT OF WINDOW ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    6. VAL CAMERA STATION, VIEW FROM INTERIOR OUT OF WINDOW OPENING TOWARD VAL FIRING RANGE LOOKING SOUTHEAST WITH CAMERA MOUNT IN FOREGROUND. - Variable Angle Launcher Complex, Camera Stations, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  13. 8. VAL CAMERA CAR, CLOSE-UP VIEW OF 'FLARE' OR TRAJECTORY ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    8. VAL CAMERA CAR, CLOSE-UP VIEW OF 'FLARE' OR TRAJECTORY CAMERA ON SLIDING MOUNT. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  14. Camera Optics.

    ERIC Educational Resources Information Center

    Ruiz, Michael J.

    1982-01-01

    The camera presents an excellent way to illustrate principles of geometrical optics. Basic camera optics of the single-lens reflex camera are discussed, including interchangeable lenses and accessories available to most owners. Several experiments are described and results compared with theoretical predictions or manufacturer specifications.…

  15. OSIRIS The Scientific Camera System Onboard Rosetta

    NASA Astrophysics Data System (ADS)

    Keller, H. U.; Barbieri, C.; Lamy, P.; Rickman, H.; Rodrigo, R.; Wenzel, K.-P.; Sierks, H.; A'Hearn, M. F.; Angrilli, F.; Angulo, M.; Bailey, M. E.; Barthol, P.; Barucci, M. A.; Bertaux, J.-L.; Bianchini, G.; Boit, J.-L.; Brown, V.; Burns, J. A.; Büttner, I.; Castro, J. M.; Cremonese, G.; Curdt, W.; da Deppo, V.; Debei, S.; de Cecco, M.; Dohlen, K.; Fornasier, S.; Fulle, M.; Germerott, D.; Gliem, F.; Guizzo, G. P.; Hviid, S. F.; Ip, W.-H.; Jorda, L.; Koschny, D.; Kramm, J. R.; Kührt, E.; Küppers, M.; Lara, L. M.; Llebaria, A.; López, A.; López-Jimenez, A.; López-Moreno, J.; Meller, R.; Michalik, H.; Michelena, M. D.; Müller, R.; Naletto, G.; Origné, A.; Parzianello, G.; Pertile, M.; Quintana, C.; Ragazzoni, R.; Ramous, P.; Reiche, K.-U.; Reina, M.; Rodríguez, J.; Rousset, G.; Sabau, L.; Sanz, A.; Sivan, J.-P.; Stöckner, K.; Tabero, J.; Telljohann, U.; Thomas, N.; Timon, V.; Tomasch, G.; Wittrock, T.; Zaccariotto, M.

    2007-02-01

    The Optical, Spectroscopic, and Infrared Remote Imaging System OSIRIS is the scientific camera system onboard the Rosetta spacecraft (Figure 1). The advanced high performance imaging system will be pivotal for the success of the Rosetta mission. OSIRIS will detect 67P/Churyumov-Gerasimenko from a distance of more than 10^6 km, characterise the comet shape and volume, its rotational state and find a suitable landing spot for Philae, the Rosetta lander. OSIRIS will observe the nucleus, its activity and surroundings down to a scale of ~2 cm px^-1. The observations will begin well before the onset of cometary activity and will extend over months until the comet reaches perihelion. During the rendezvous episode of the Rosetta mission, OSIRIS will provide key information about the nature of cometary nuclei and reveal the physics of cometary activity that leads to the gas and dust coma. OSIRIS comprises a high resolution Narrow Angle Camera (NAC) unit and a Wide Angle Camera (WAC) unit accompanied by three electronics boxes. The NAC is designed to obtain high resolution images of the surface of comet 67P/Churyumov-Gerasimenko through 12 discrete filters over the wavelength range 250-1000 nm at an angular resolution of 18.6 μrad px^-1. The WAC is optimised to provide images of the near-nucleus environment in 14 discrete filters at an angular resolution of 101 μrad px^-1. The two units use identical shutter, filter wheel, front door, and detector systems. They are operated by a common Data Processing Unit. The OSIRIS instrument has a total mass of 35 kg and is provided by institutes from six European countries.
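
    The quoted angular resolutions translate directly into spatial scales through scale ~ distance x IFOV, which is the relation behind the ~2 cm px^-1 figure. The distances in the sketch below are illustrative only.

        NAC_IFOV = 18.6e-6   # rad per pixel, from the instrument description above
        WAC_IFOV = 101e-6    # rad per pixel

        def pixel_scale_m(distance_m, ifov_rad):
            """Approximate surface scale per pixel for a small IFOV: scale ~ distance * IFOV."""
            return distance_m * ifov_rad

        for d_km in (100.0, 10.0, 1.0):      # illustrative ranges to the nucleus
            print(d_km, "km ->", round(100.0 * pixel_scale_m(d_km * 1e3, NAC_IFOV), 1), "cm/px (NAC)")

    At a range of roughly 1 km the NAC value comes out near 2 cm per pixel, matching the nucleus scale quoted above.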

  16. LROC - Lunar Reconnaissance Orbiter Camera

    NASA Astrophysics Data System (ADS)

    Robinson, M. S.; Eliason, E.; Hiesinger, H.; Jolliff, B. L.; McEwen, A.; Malin, M. C.; Ravine, M. A.; Thomas, P. C.; Turtle, E. P.

    2009-12-01

    The Lunar Reconnaissance Orbiter (LRO) went into lunar orbit on 23 June 2009. The LRO Camera (LROC) acquired its first lunar images on June 30 and commenced full scale testing and commissioning on July 10. The LROC consists of two narrow-angle cameras (NACs) that provide 0.5 m scale panchromatic images over a combined 5 km swath, and a wide-angle camera (WAC) to provide images at a scale of 100 m per pixel in five visible wavelength bands (415, 566, 604, 643, and 689 nm) and 400 m per pixel in two ultraviolet bands (321 nm and 360 nm) from the nominal 50 km orbit. Early operations were designed to test the performance of the cameras under all nominal operating conditions and provided a baseline for future calibrations. Test sequences included off-nadir slews to image stars and the Earth, 90° yaw sequences to collect flat field calibration data, night imaging for background characterization, and systematic mapping to test performance. LRO initially was placed into a terminator orbit resulting in images acquired under low signal conditions. Over the next three months the incidence angle at the spacecraft’s equator crossing gradually decreased towards high noon, providing a range of illumination conditions. Several hundred south polar images were collected in support of impact site selection for the LCROSS mission; details can be seen in many of the shadows. Commissioning phase images not only proved the instruments’ overall performance was nominal, but also that many geologic features of the lunar surface are well preserved at the meter-scale. Of particular note is the variety of impact-induced morphologies preserved in a near pristine state in and around kilometer-scale and larger young Copernican age impact craters that include: abundant evidence of impact melt of a variety of rheological properties, including coherent flows with surface textures and planimetric properties reflecting supersolidus (e.g., liquid melt) emplacement, blocks delicately perched on

  17. 1. VARIABLE-ANGLE LAUNCHER (VAL) CONCRETE 'A' FRAME STRUCTURE SHOWING CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    1. VARIABLE-ANGLE LAUNCHER (VAL) CONCRETE 'A' FRAME STRUCTURE SHOWING CAMERA TOWER STRUCTURE LOOKING SOUTH AND ARCHED OPENING FOR ROADWAY. - Variable Angle Launcher Complex, Variable Angle Launcher, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  18. Space Camera

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Nikon's F3 35mm camera was specially modified for use by Space Shuttle astronauts. The modification work produced a spinoff lubricant. Because lubricants in space have a tendency to migrate within the camera, Nikon conducted extensive development to produce nonmigratory lubricants; variations of these lubricants are used in the commercial F3, giving it better performance than conventional lubricants. Another spinoff is the coreless motor which allows the F3 to shoot 140 rolls of film on one set of batteries.

  19. Infrared Camera

    NASA Technical Reports Server (NTRS)

    1997-01-01

    A sensitive infrared camera that observes the blazing plumes from the Space Shuttle or expendable rocket lift-offs is capable of scanning for fires, monitoring the environment and providing medical imaging. The hand-held camera uses highly sensitive arrays of infrared photodetectors known as quantum well infrared photodetectors (QWIPs). QWIPs were developed by the Jet Propulsion Laboratory's Center for Space Microelectronics Technology in partnership with Amber, a Raytheon company. In October 1996, QWIP detectors pointed out hot spots of the destructive fires speeding through Malibu, California. Night vision, early warning systems, navigation, flight control systems, weather monitoring, security and surveillance are among the duties for which the camera is suited. Medical applications are also expected.

  20. Electronic Still Camera

    NASA Technical Reports Server (NTRS)

    Holland, S. Douglas (Inventor)

    1992-01-01

    A handheld, programmable, digital camera is disclosed that supports a variety of sensors and has program control over the system components to provide versatility. The camera uses a high performance design which produces near film quality images from an electronic system. The optical system of the camera incorporates a conventional camera body that was slightly modified, thus permitting the use of conventional camera accessories, such as telephoto lenses, wide-angle lenses, auto-focusing circuitry, auto-exposure circuitry, flash units, and the like. An image sensor, such as a charge coupled device ('CCD') collects the photons that pass through the camera aperture when the shutter is opened, and produces an analog electrical signal indicative of the image. The analog image signal is read out of the CCD and is processed by preamplifier circuitry, a correlated double sampler, and a sample and hold circuit before it is converted to a digital signal. The analog-to-digital converter has an accuracy of eight bits to insure accuracy during the conversion. Two types of data ports are included for two different data transfer needs. One data port comprises a general purpose industrial standard port and the other a high speed/high performance application specific port. The system uses removable hard disks as its permanent storage media. The hard disk receives the digital image signal from the memory buffer and correlates the image signal with other sensed parameters, such as longitudinal or other information. When the storage capacity of the hard disk has been filled, the disk can be replaced with a new disk.

  1. 5. VAL CAMERA CAR, DETAIL OF HOIST AT SIDE OF ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    5. VAL CAMERA CAR, DETAIL OF HOIST AT SIDE OF BRIDGE AND ENGINE CAR ON TRACKS, LOOKING NORTHEAST. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  2. CCD Camera

    DOEpatents

    Roth, R.R.

    1983-08-02

    A CCD camera capable of observing a moving object which has varying intensities of radiation emanating therefrom and which may move at varying speeds is shown wherein there is substantially no overlapping of successive images and wherein the exposure times and scan times may be varied independently of each other. 7 figs.

  3. CCD Camera

    DOEpatents

    Roth, Roger R.

    1983-01-01

    A CCD camera capable of observing a moving object which has varying intensities of radiation emanating therefrom and which may move at varying speeds is shown wherein there is substantially no overlapping of successive images and wherein the exposure times and scan times may be varied independently of each other.

  4. Nikon Camera

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The Nikon FM compact has a simplification feature derived from cameras designed for easy yet accurate use in a weightless environment. The innovation is a plastic-cushioned advance lever which advances the film and simultaneously switches on a built-in light meter. With a turn of the lens aperture ring, a glowing signal in the viewfinder confirms correct exposure.

  5. Changes in local energy spectra with SPECT rotation for two Anger cameras

    SciTech Connect

    Koral, K.F.; Luo, J.Q.; Ahmad, W.; Buchbinder, S.; Ficaro, E.

    1995-08-01

    The authors investigated the shift of local energy spectra with SPECT rotation for the GE 400 AT and the Picker Prism 3000 tomographs. A Co-57 flood source was taped to the parallel-beam collimator of the GE 400 AT; a Tc-99m line source was placed at the focus of the fan-beam collimator of one head of the Picker Prism. The count-based method, which employs a narrow window (about 4 keV) on the maximum slope of the photopeak, was used with both systems. Non-linear polynomial spectral fitting was applied to x-y-E data acquisitions with the GE camera. The fitting yielded either shifts or shifts and width changes. Results show (1) the shifts are pseudo-sinusoidal with angle and similar for different spatial locations, (2) the average of their absolute value is 0.71 keV and 0.13 keV for the GE and Picker cameras, respectively, (3) width changes for the GE camera are small and appear random, (4) the calculated shifts from the count-based method for the central part of the GE camera are correlated with those from the spectral fitting method. They are 12% smaller. The conclusion is that energy shifts with angle may be present with many rotating cameras, although they may be smaller with newer cameras. It might be necessary to account for them in schemes designed for high-accuracy compensation of Compton-scattered gamma rays, although they possibly could be ignored for newer cameras.
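
    The pseudo-sinusoidal dependence of the shift on rotation angle can be summarized by a simple least-squares fit of sine and cosine terms. The sketch below does this on synthetic data with a 0.7 keV amplitude (of the same order as the shifts reported for the GE camera); it illustrates how such a dependence could be quantified and is not the count-based or spectral-fitting method used by the authors.

        import numpy as np

        def fit_sinusoid(angles_deg, shifts_keV):
            """Least-squares fit shifts ~ A*sin(angle) + B*cos(angle) + C.
            Returns (amplitude, phase in degrees, offset) of the fitted sinusoid."""
            a = np.radians(angles_deg)
            M = np.column_stack((np.sin(a), np.cos(a), np.ones_like(a)))
            (A, B, C), *_ = np.linalg.lstsq(M, shifts_keV, rcond=None)
            return np.hypot(A, B), np.degrees(np.arctan2(B, A)), C

        # Synthetic shifts: 0.7 keV pseudo-sinusoid plus measurement noise
        rng = np.random.default_rng(0)
        angles = np.arange(0, 360, 6)
        shifts = 0.7 * np.sin(np.radians(angles) + 0.4) + 0.05 * rng.standard_normal(angles.size)
        print(fit_sinusoid(angles, shifts))   # amplitude ~0.7 keV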

  6. Cassini Camera Contamination Anomaly: Experiences and Lessons Learned

    NASA Technical Reports Server (NTRS)

    Haemmerle, Vance R.; Gerhard, James H.

    2006-01-01

    We discuss the contamination 'Haze' anomaly for the Cassini Narrow Angle Camera (NAC), one of two optical telescopes that comprise the Imaging Science Subsystem (ISS). Cassini is a Saturn Orbiter with a 4-year nominal mission. The incident occurred in 2001, five months after Jupiter encounter during the Cruise phase and ironically at the resumption of planned maintenance decontamination cycles. The degraded optical performance was first identified by the Instrument Operations Team with the first ISS Saturn imaging six weeks later. A distinct haze of varying size from image to image marred the images of Saturn. A photometric star calibration of the Pleiades, 4 days after the incident, showed stars with halos. Analysis showed that while the halo's intensity was only 1 - 2% of the intensity of the central peak of a star, the halo contained 30 - 70% of its integrated flux. This condition would impact science return. In a review of our experiences, we examine the contamination control plan, discuss the analysis of the limited data available and describe the one-year campaign to remove the haze from the camera. After several long conservative heating activities and interim analysis of their results, the contamination problem as measured by the camera's point spread function was essentially back to preanomaly size and at a point where there would be more risk to continue. We stress the importance of the flexibility of operations and instrument design, the need to do early in-flight instrument calibration and continual monitoring of instrument performance.

  7. 2. VAL CONCRETE 'A' FRAME STRUCTURE SHOWING CAMERA TOWER, PROJECTILE ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. VAL CONCRETE 'A' FRAME STRUCTURE SHOWING CAMERA TOWER, PROJECTILE LOADING DECK AND BREECH END OF LAUNCHER BRIDGE LOOKING SOUTH. - Variable Angle Launcher Complex, Variable Angle Launcher, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  8. 43. VAL, DETAIL OF CAMERA STATION ON CONNECTING BRIDGE SHOWING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    43. VAL, DETAIL OF CAMERA STATION ON CONNECTING BRIDGE SHOWING SIDE THAT FACED THE VAL. - Variable Angle Launcher Complex, Variable Angle Launcher, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  9. 7. VAL CONCRETE 'A' FRAME STRUCTURE SHOWING CAMERA TOWER ARCHED ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. VAL CONCRETE 'A' FRAME STRUCTURE SHOWING CAMERA TOWER ARCHED OPENING FOR ROADWAY AND COUNTERWEIGHT SLOPE TAKEN FROM RESERVOIR LOOKING WEST. - Variable Angle Launcher Complex, Variable Angle Launcher, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  10. 11. VAL, DETAIL OF CAMERA TOWER AND THE TOP OF ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    11. VAL, DETAIL OF CAMERA TOWER AND THE TOP OF CONCRETE 'A' FRAME STRUCTURE LOOKING NORTH. - Variable Angle Launcher Complex, Variable Angle Launcher, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  11. 8. VAL COUNTERWEIGHT CAR ON COUNTERWEIGHT SLAB AND CAMERA TOWER ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    8. VAL COUNTERWEIGHT CAR ON COUNTERWEIGHT SLAB AND CAMERA TOWER TAKEN FROM RESERVOIR LOOKING SOUTHWEST. - Variable Angle Launcher Complex, Variable Angle Launcher, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  12. Calibration of the Lunar Reconnaissance Orbiter Camera

    NASA Astrophysics Data System (ADS)

    Tschimmel, M.; Robinson, M. S.; Humm, D. C.; Denevi, B. W.; Lawrence, S. J.; Brylow, S.; Ravine, M.; Ghaemi, T.

    2008-12-01

    The Lunar Reconnaissance Orbiter Camera (LROC) onboard the NASA Lunar Reconnaissance Orbiter (LRO) spacecraft consists of three cameras: the Wide-Angle Camera (WAC) and two identical Narrow Angle Cameras (NAC-L, NAC-R). The WAC is a push-frame imager with 5 visible wavelength filters (415 to 680 nm) at a spatial resolution of 100 m/pixel and 2 UV filters (315 and 360 nm) with a resolution of 400 m/pixel. In addition to the multicolor imaging, the WAC can operate in monochrome mode to provide a global large-incidence-angle basemap and a time-lapse movie of the illumination conditions at both poles. The WAC has a highly linear response, a read noise of 72 e- and a full well capacity of 47,200 e-. The signal-to-noise ratio in each band is 140 in the worst case. There are no out-of-band leaks and the spectral response of each filter is well characterized. Each NAC is a monochrome pushbroom scanner, providing images with a resolution of 50 cm/pixel from a 50-km orbit. A single NAC image has a swath width of 2.5 km and a length of up to 26 km. The NACs are mounted to acquire side-by-side imaging for a combined swath width of 5 km. The NAC is designed to fully characterize future human and robotic landing sites in terms of topography and hazard risks. The North and South poles will be mapped at 1-meter scale poleward of 85.5° latitude. Stereo coverage can be provided by pointing the NACs off-nadir. The NACs are also highly linear. Read noise is 71 e- for NAC-L and 74 e- for NAC-R and the full well capacity is 248,500 e- for NAC-L and 262,500 e- for NAC-R. The focal lengths are 699.6 mm for NAC-L and 701.6 mm for NAC-R; the system MTF is 28% for NAC-L and 26% for NAC-R. The signal-to-noise ratio is at least 46 (terminator scene) and can be higher than 200 (high sun scene). Both NACs exhibit a straylight feature, which is caused by out-of-field sources and is of a magnitude of 1-3%. However, as this feature is well understood, it can be greatly reduced during ground
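
    The read-noise and signal figures above combine through the usual shot-noise-plus-read-noise model, SNR = S / sqrt(S + sigma_read^2). The signal levels in the sketch below are illustrative, chosen only to bracket the terminator and high-sun regimes mentioned in the abstract.

        import math

        def snr(signal_e, read_noise_e):
            """Signal-to-noise ratio assuming Poisson shot noise plus Gaussian read noise."""
            return signal_e / math.sqrt(signal_e + read_noise_e**2)

        read_noise = {"NAC-L": 71.0, "NAC-R": 74.0}    # electrons, from the abstract
        for name, rn in read_noise.items():
            for signal in (5000.0, 50000.0):           # illustrative signal levels in electrons
                print(name, int(signal), "e- ->", round(snr(signal, rn), 1))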

  13. Pinhole Cameras: For Science, Art, and Fun!

    ERIC Educational Resources Information Center

    Button, Clare

    2007-01-01

    A pinhole camera is a camera without a lens. A tiny hole replaces the lens, and light is allowed to come in for a short amount of time by means of a hand-operated shutter. The pinhole allows only a very narrow beam of light to enter, which reduces confusion due to scattered light on the film. This results in an image that is focused, reversed, and…

  14. 11. COMPLETED FIXED CAMERA STATION 1700 (BUILDING NO. 42022) LOOKING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    11. COMPLETED FIXED CAMERA STATION 1700 (BUILDING NO. 42022) LOOKING WEST SHOWING WINDOW OPENING FOR CAMERA, March 31, 1948. (Original photograph in possession of Dave Willis, San Diego, California.) - Variable Angle Launcher Complex, Camera Stations, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  15. Omnidirectional underwater camera design and calibration.

    PubMed

    Bosch, Josep; Gracias, Nuno; Ridao, Pere; Ribas, David

    2015-01-01

    This paper presents the development of an underwater omnidirectional multi-camera system (OMS) based on a commercially available six-camera system, originally designed for land applications. A full calibration method is presented for the estimation of both the intrinsic and extrinsic parameters, which is able to cope with wide-angle lenses and non-overlapping cameras simultaneously. This method is valid for any OMS in both land and water applications. For underwater use, a customized housing is required, which often leads to strong image distortion due to refraction among the different media. This phenomenon makes the basic pinhole camera model invalid for underwater cameras, especially when using wide-angle lenses, and requires the explicit modeling of the individual optical rays. To address this problem, a ray tracing approach has been adopted to create a field-of-view (FOV) simulator for underwater cameras. The simulator allows for the testing of different housing geometries and optics for the cameras to ensure complete hemisphere coverage in underwater operation. This paper describes the design and testing of a compact custom housing for a commercial off-the-shelf OMS camera (Ladybug 3) and presents the first results of its use. A proposed three-stage calibration process allows for the estimation of all of the relevant camera parameters. Experimental results are presented, which illustrate the performance of the calibration method and validate the approach. PMID:25774707
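
    The refraction effect that invalidates the single-viewpoint pinhole model can be seen with Snell's law alone: a ray crossing a flat air-glass-water port is bent more strongly the further it is from normal incidence. The indices and angles below are generic values for an acrylic port in seawater, not the parameters of the housing described in the paper.

        import math

        def refract_angle(theta_in_deg, n_in, n_out):
            """Refraction angle from Snell's law: n_in * sin(theta_in) = n_out * sin(theta_out)."""
            s = n_in * math.sin(math.radians(theta_in_deg)) / n_out
            if abs(s) > 1.0:
                raise ValueError("total internal reflection")
            return math.degrees(math.asin(s))

        N_AIR, N_GLASS, N_WATER = 1.000, 1.49, 1.333     # generic acrylic and seawater indices

        for theta_air in (10.0, 30.0, 50.0):
            t_glass = refract_angle(theta_air, N_AIR, N_GLASS)
            t_water = refract_angle(t_glass, N_GLASS, N_WATER)
            print(theta_air, "deg in air ->", round(t_water, 2), "deg in water")

    The bending grows rapidly with field angle, which is why wide-angle lenses behind flat ports need the explicit ray modeling described above rather than a corrected pinhole model.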

  16. Omnidirectional underwater camera design and calibration.

    PubMed

    Bosch, Josep; Gracias, Nuno; Ridao, Pere; Ribas, David

    2015-03-12

    This paper presents the development of an underwater omnidirectional multi-camera system (OMS) based on a commercially available six-camera system, originally designed for land applications. A full calibration method is presented for the estimation of both the intrinsic and extrinsic parameters, which is able to cope with wide-angle lenses and non-overlapping cameras simultaneously. This method is valid for any OMS in both land and water applications. For underwater use, a customized housing is required, which often leads to strong image distortion due to refraction among the different media. This phenomenon makes the basic pinhole camera model invalid for underwater cameras, especially when using wide-angle lenses, and requires the explicit modeling of the individual optical rays. To address this problem, a ray tracing approach has been adopted to create a field-of-view (FOV) simulator for underwater cameras. The simulator allows for the testing of different housing geometries and optics for the cameras to ensure complete hemisphere coverage in underwater operation. This paper describes the design and testing of a compact custom housing for a commercial off-the-shelf OMS camera (Ladybug 3) and presents the first results of its use. A proposed three-stage calibration process allows for the estimation of all of the relevant camera parameters. Experimental results are presented, which illustrate the performance of the calibration method and validate the approach.

  17. Omnidirectional Underwater Camera Design and Calibration

    PubMed Central

    Bosch, Josep; Gracias, Nuno; Ridao, Pere; Ribas, David

    2015-01-01

    This paper presents the development of an underwater omnidirectional multi-camera system (OMS) based on a commercially available six-camera system, originally designed for land applications. A full calibration method is presented for the estimation of both the intrinsic and extrinsic parameters, which is able to cope with wide-angle lenses and non-overlapping cameras simultaneously. This method is valid for any OMS in both land and water applications. For underwater use, a customized housing is required, which often leads to strong image distortion due to refraction among the different media. This phenomenon makes the basic pinhole camera model invalid for underwater cameras, especially when using wide-angle lenses, and requires the explicit modeling of the individual optical rays. To address this problem, a ray tracing approach has been adopted to create a field-of-view (FOV) simulator for underwater cameras. The simulator allows for the testing of different housing geometries and optics for the cameras to ensure complete hemisphere coverage in underwater operation. This paper describes the design and testing of a compact custom housing for a commercial off-the-shelf OMS camera (Ladybug 3) and presents the first results of its use. A proposed three-stage calibration process allows for the estimation of all of the relevant camera parameters. Experimental results are presented, which illustrate the performance of the calibration method and validate the approach. PMID:25774707

  18. 5. VAL CAMERA STATION, VIEW FROM INTERIOR OUT OF WINDOW ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    5. VAL CAMERA STATION, VIEW FROM INTERIOR OUT OF WINDOW OPENING TOWARD VAL FIRING RANGE LOOKING EAST WITH VARIABLE ANGLE LAUNCHER IN BACKGROUND. - Variable Angle Launcher Complex, Camera Stations, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  19. Laboratory calibration and characterization of video cameras

    NASA Technical Reports Server (NTRS)

    Burner, A. W.; Snow, W. L.; Shortis, M. R.; Goad, W. K.

    1990-01-01

    Some techniques for laboratory calibration and characterization of video cameras used with frame grabber boards are presented. A laser-illuminated displaced reticle technique (with camera lens removed) is used to determine the camera/grabber effective horizontal and vertical pixel spacing as well as the angle of nonperpendicularity of the axes. The principal point of autocollimation and point of symmetry are found by illuminating the camera with an unexpanded laser beam, either aligned with the sensor or lens. Lens distortion and the principal distance are determined from images of a calibration plate suitably aligned with the camera. Calibration and characterization results for several video cameras are presented. Differences between these laboratory techniques and test range and plumb line calibration are noted.
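
    Lens distortion of the kind determined here is usually summarized by a low-order radial polynomial. The sketch below applies the common two-coefficient model x_d = x * (1 + k1*r^2 + k2*r^4) in normalized image coordinates; the coefficients are placeholders, not values from the paper.

        import numpy as np

        def apply_radial_distortion(xy, k1, k2):
            """Map ideal (undistorted) image points to distorted ones with the
            two-coefficient radial model, r measured from the principal point."""
            r2 = np.sum(xy**2, axis=1, keepdims=True)
            return xy * (1.0 + k1 * r2 + k2 * r2**2)

        # Placeholder coefficients and normalized image coordinates, for illustration only
        pts = np.array([[0.0, 0.0], [0.2, 0.1], [0.4, 0.3], [0.6, 0.6]])
        print(apply_radial_distortion(pts, k1=-0.12, k2=0.02))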

  20. Laboratory Calibration and Characterization of Video Cameras

    NASA Technical Reports Server (NTRS)

    Burner, A. W.; Snow, W. L.; Shortis, M. R.; Goad, W. K.

    1989-01-01

    Some techniques for laboratory calibration and characterization of video cameras used with frame grabber boards are presented. A laser-illuminated displaced reticle technique (with camera lens removed) is used to determine the camera/grabber effective horizontal and vertical pixel spacing as well as the angle of non-perpendicularity of the axes. The principal point of autocollimation and point of symmetry are found by illuminating the camera with an unexpanded laser beam, either aligned with the sensor or lens. Lens distortion and the principal distance are determined from images of a calibration plate suitably aligned with the camera. Calibration and characterization results for several video cameras are presented. Differences between these laboratory techniques and test range and plumb line calibration are noted.

  1. Caught on Camera.

    ERIC Educational Resources Information Center

    Milshtein, Amy

    2002-01-01

    Describes the benefits of and rules to be followed when using surveillance cameras for school security. Discusses various camera models, including indoor and outdoor fixed position cameras, pan-tilt zoom cameras, and pinhole-lens cameras for covert surveillance. (EV)

  2. Narrowness and Liberality

    ERIC Educational Resources Information Center

    Agresto, John

    2003-01-01

    John Agresto, whose task has been to rebuild the war-ravaged infrastructure of a Middle-Eastern university system, is discouraged to see that narrow expertise is the only goal of education there, to the utter exclusion of intellectual breadth. He comments that, although it is not that bad in the U.S., he feels that doctoral programs as currently…

  3. Dynamics of narrow rings

    NASA Technical Reports Server (NTRS)

    Dermott, S. F.

    1984-01-01

    The ring models described here were developed to account for the dynamical problems posed by the narrow rings of Uranus. Some of these rings are now known to be eccentric, inclined, nonuniform in width, optically thick, and narrow, with very sharp edges. The eccentric rings have common pericenters and large, positive eccentricity gradients. The theory of shepherding satellites successfully accounts for most of these features and can also account for some features of the narrow Saturnian rings, in particular, waves, kinks, and periodic variations in brightness. Outstanding problems include the putative relation between eccentricity and inclination displayed by eight of the nine Uranian rings, and the magnitudes of the tidal torques acting on the shepherding satellites. The horseshoe-orbit model, although viable, probably has more application to the narrow rings from which the Saturnian coorbital satellites formed. The angular momentum flow rate due to particle collisions is a minimum at the Lagrangian equilibrium points L(4) and L(5), and one can expect accretion to be rapid at these points.

  4. System Synchronizes Recordings from Separated Video Cameras

    NASA Technical Reports Server (NTRS)

    Nail, William; Nail, William L.; Nail, Jasper M.; Le, Doung T.

    2009-01-01

    A system of electronic hardware and software for synchronizing recordings from multiple, physically separated video cameras is being developed, primarily for use in multiple-look-angle video production. The system, the time code used in the system, and the underlying method of synchronization upon which the design of the system is based are denoted generally by the term "Geo-TimeCode(TradeMark)." The system is embodied mostly in compact, lightweight, portable units (see figure) denoted video time-code units (VTUs) - one VTU for each video camera. The system is scalable in that any number of camera recordings can be synchronized. The estimated retail price per unit would be about $350 (in 2006 dollars). The need for this or another synchronization system external to video cameras arises because most video cameras do not include internal means for maintaining synchronization with other video cameras. Unlike prior video-camera-synchronization systems, this system does not depend on continuous cable or radio links between cameras (however, it does depend on occasional cable links lasting a few seconds). Also, whereas the time codes used in prior video-camera-synchronization systems typically repeat after 24 hours, the time code used in this system does not repeat for slightly more than 136 years; hence, this system is much better suited for long-term deployment of multiple cameras.
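
    The "slightly more than 136 years" figure is what one gets from a counter of seconds that wraps after 2^32 ticks; treating the Geo-TimeCode as such a counter is an assumption here, since the format itself is not spelled out in this summary.

        SECONDS_PER_YEAR = 365.25 * 24 * 3600          # Julian year in seconds
        assumed_wrap_ticks = 2 ** 32                   # hypothetical 32-bit seconds counter
        print(assumed_wrap_ticks / SECONDS_PER_YEAR)   # ~136.1 years before the code repeats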

  5. How vessels narrow.

    PubMed

    Schwartz, S M

    1995-01-01

    Vascular narrowing, the clinical dilatation of narrowed vessels, and the restenosis of those vessels are central topics in modern cardiology. This review discusses the cellular basis both for the spontaneous narrowing of vessels and for the restenotic process that occurs after angioplasty. The central issue, as discussed in this review, is likely to be remodeling of the vessel wall rather than simple accretion of lipid mass in atherosclerosis or simple physical dilatation following angioplasty. While it is true that the atherosclerotic lesion grows by accretion of lipid mass, this by itself does not narrow vessels. As we will discuss, the vessel has a phenomenal ability to accommodate changes of this sort. Narrowing must occur, at least in part, because of a failure of this normal ability to accommodate. In a similar manner, one might expect the restenotic vessel to simply remodel itself down to its preangioplasty size. The issue for cell and molecular biologists is what "remodeling" means. Until recently, the assertion has been that remodeling occurred as the result of the formation of new intimal mass; that is, the atherosclerotic vessel was seen as returning to its original dimensions following angioplasty as a result of forming a new intimal mass that filled in the dilated space. Recent studies using cell kinetic methods as well as intravascular ultrasound, however, have cast doubt upon this hypothesis. It now appears that the loss of gain following angioplasty is likely to be due to the formation of new tissues which remodel the vessel wall without necessarily adding mass to it. This is the same sort of process that is well described in wound healing. The nature of this new tissue is of great interest. Studies in this laboratory and others have identified genes which may be unique to this tissue and explain the remodeling response. PMID:8585265

  6. Characterization of previously unidentified lunar pyroclastic deposits using Lunar Reconnaissance Orbiter Camera (LROC) data

    USGS Publications Warehouse

    Gustafson, J. Olaf; Bell, James F.; Gaddis, Lisa R.; Hawke, B. Ray; Giguere, Thomas A.

    2012-01-01

    We used a Lunar Reconnaissance Orbiter Camera (LROC) global monochrome Wide-angle Camera (WAC) mosaic to conduct a survey of the Moon to search for previously unidentified pyroclastic deposits. Promising locations were examined in detail using LROC multispectral WAC mosaics, high-resolution LROC Narrow Angle Camera (NAC) images, and Clementine multispectral (ultraviolet-visible or UVVIS) data. Out of 47 potential deposits chosen for closer examination, 12 were selected as probable newly identified pyroclastic deposits. Potential pyroclastic deposits were generally found in settings similar to previously identified deposits, including areas within or near mare deposits adjacent to highlands, within floor-fractured craters, and along fissures in mare deposits. However, a significant new finding is the discovery of localized pyroclastic deposits within floor-fractured craters Anderson E and F on the lunar farside, isolated from other known similar deposits. Our search confirms that most major regional and localized low-albedo pyroclastic deposits have been identified on the Moon down to ~100 m/pix resolution, and that additional newly identified deposits are likely to be either isolated small deposits or additional portions of discontinuous, patchy deposits.

  7. Pre-hibernation performances of the OSIRIS cameras onboard the Rosetta spacecraft

    NASA Astrophysics Data System (ADS)

    Magrin, S.; La Forgia, F.; Da Deppo, V.; Lazzarin, M.; Bertini, I.; Ferri, F.; Pajola, M.; Barbieri, M.; Naletto, G.; Barbieri, C.; Tubiana, C.; Küppers, M.; Fornasier, S.; Jorda, L.; Sierks, H.

    2015-02-01

    Context. The ESA cometary mission Rosetta was launched in 2004. In the past years and until the spacecraft hibernation in June 2011, the two cameras of the OSIRIS imaging system (Narrow Angle and Wide Angle Camera, NAC and WAC) observed many different sources. On 20 January 2014 the spacecraft successfully exited hibernation to start observing the primary scientific target of the mission, comet 67P/Churyumov-Gerasimenko. Aims: A study of the past performance of the cameras is now necessary to determine whether the system has remained stable over time and to derive, if necessary, additional analysis methods for the future precise calibration of the cometary data. Methods: The instrumental responses and filter passbands were used to estimate the efficiency of the system. A comparison with acquired images of specific calibration stars was made, and a refined photometric calibration was computed, both for the absolute flux and for the reflectivity of small bodies of the solar system. Results: We found the instrumental performance to be stable within ±1.5% from 2007 to 2010, with no evidence of an aging effect on the optics or detectors. The efficiency of the instrumentation is found to be as expected in the visible range, but lower than expected in the UV and IR range. A photometric calibration implementation was discussed for the two cameras. Conclusions: The calibration derived from pre-hibernation phases of the mission will be checked as soon as possible after the awakening of OSIRIS and will be continuously monitored until the end of the mission in December 2015. A list of additional calibration sources has been determined that are to be observed during the forthcoming phases of the mission to ensure a better coverage across the wavelength range of the cameras and to study the possible dust contamination of the optics.

  8. Multi-cameras calibration from spherical targets

    NASA Astrophysics Data System (ADS)

    Zhao, Chengyun; Zhang, Jin; Deng, Huaxia; Yu, Liandong

    2016-01-01

    Multi-camera calibration using spheres is more convenient than using a planar target because spheres image well from different angles, and the internal and external parameters of multiple cameras can be obtained through a single calibration. In this paper, a novel multi-camera calibration method based on multiple spheres is proposed. A calibration target with multiple fixed balls is used, and the geometric properties of the sphere projection model are analyzed. During the experiment, the spherical target is placed in the common field of view of the multi-camera system, and the corresponding data are stored when the cameras are triggered by a signal generator. The contours of the balls are detected by the Hough transform and the center coordinates are determined with sub-pixel accuracy. The center coordinates are then used as input for calibration, and the internal and external parameters are calculated by Zhang's method. When multiple cameras are calibrated simultaneously from different angles using multiple spheres, the center coordinates of each sphere can be determined accurately even when the target images are out of focus, so this method can improve calibration precision. Zhang's plane-template method is also included in a comparison calibration experiment, and the error sources of the experiment are analyzed. The results indicate that the method proposed in this paper is suitable for multi-camera calibration.
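
    One way to obtain sphere-projection centers with sub-pixel precision is a Hough circle detection followed by an intensity-weighted centroid, as sketched below with OpenCV (opencv-python) and NumPy. The blur size, Hough thresholds, and the assumption of bright balls on a darker background are placeholders, and this is not the authors' exact pipeline; the detected centers would then feed a Zhang-style calibration (for example cv2.calibrateCamera).

        import cv2
        import numpy as np

        def sphere_centers(gray):
            """Detect circular ball projections and refine their centers.
            Assumes bright balls on a darker background; returns an (N, 2) array of (x, y)."""
            blurred = cv2.GaussianBlur(gray, (9, 9), 2)
            circles = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT, 1, 50,
                                       param1=100, param2=30, minRadius=10, maxRadius=120)
            if circles is None:
                return np.empty((0, 2))
            centers = []
            for x, y, r in circles[0]:
                x0, y0 = max(int(x - r), 0), max(int(y - r), 0)
                patch = gray[y0:int(y + r) + 1, x0:int(x + r) + 1].astype(float)
                ys, xs = np.mgrid[y0:y0 + patch.shape[0], x0:x0 + patch.shape[1]]
                w = patch.sum()
                if w > 0:   # intensity-weighted centroid refines the Hough estimate
                    centers.append(((xs * patch).sum() / w, (ys * patch).sum() / w))
            return np.array(centers)

        # Usage sketch with a hypothetical frame from one camera of the rig:
        # gray = cv2.imread("cam0_frame.png", cv2.IMREAD_GRAYSCALE)
        # print(sphere_centers(gray))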

  9. The narrow pentaquark

    NASA Astrophysics Data System (ADS)

    Diakonov, Dmitri

    2007-02-01

    The experimental status of the pentaquark searches is briefly reviewed. Recent null results by the CLAS collaboration are commented on, and new strong evidence of a very narrow Θ+ resonance by the DIANA collaboration is presented. On the theory side, I revisit the argument against the existence of the pentaquark, that of Callan and Klebanov, and show that a strong resonance is actually predicted in that approach; however, its width is grossly overestimated. A recent calculation gives 2 MeV for the pentaquark width, and this number is probably still an upper bound.

  10. Determining Camera Gain in Room Temperature Cameras

    SciTech Connect

    Joshua Cogliati

    2010-12-01

    James R. Janesick provides a method for determining the amplification of a CCD or CMOS camera when only the raw images are available. However, the equation he gives ignores the contribution of dark current. For CCD or CMOS cameras that are cooled well below room temperature this is not a problem, but the technique needs adjustment for use with room-temperature cameras. This article describes the adjustment made to the equation and a test of the method.
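
    A common form of the dark-corrected photon-transfer estimate uses two flat frames and two matched dark frames, so that dark current and read noise cancel from both the signal and the variance terms. The sketch below implements that textbook two-frame estimate and checks it on simulated frames; it is written in the spirit of the adjustment described above, but the article's exact equation may differ.

        import numpy as np

        def camera_gain(flat1, flat2, dark1, dark2):
            """Photon-transfer gain estimate in electrons per DN:
            gain = ((mean(F1) + mean(F2)) - (mean(D1) + mean(D2)))
                   / (var(F1 - F2) - var(D1 - D2))
            The dark pair removes the dark-current and read-noise contributions."""
            signal = (flat1.mean() + flat2.mean()) - (dark1.mean() + dark2.mean())
            noise = np.var(flat1.astype(float) - flat2) - np.var(dark1.astype(float) - dark2)
            return signal / noise

        # Synthetic check: simulate a camera with gain 2.0 e-/DN, dark current and read noise
        rng = np.random.default_rng(1)
        gain, signal_e, dark_e, read_e = 2.0, 20000.0, 500.0, 15.0

        def frame(mean_e):
            electrons = rng.poisson(mean_e, (256, 256)) + rng.normal(0.0, read_e, (256, 256))
            return electrons / gain   # convert electrons to DN

        print(camera_gain(frame(signal_e + dark_e), frame(signal_e + dark_e),
                          frame(dark_e), frame(dark_e)))   # ~2.0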

  11. Two-Camera Acquisition and Tracking of a Flying Target

    NASA Technical Reports Server (NTRS)

    Biswas, Abhijit; Assad, Christopher; Kovalik, Joseph M.; Pain, Bedabrata; Wrigley, Chris J.; Twiss, Peter

    2008-01-01

    A method and apparatus have been developed to solve the problem of automated acquisition and tracking, from a location on the ground, of a luminous moving target in the sky. The method involves the use of two electronic cameras: (1) a stationary camera having a wide field of view, positioned and oriented to image the entire sky; and (2) a camera that has a much narrower field of view (a few degrees wide) and is mounted on a two-axis gimbal. The wide-field-of-view stationary camera is used to initially identify the target against the background sky. So that the approximate position of the target can be determined, pixel locations on the image-detector plane in the stationary camera are calibrated with respect to azimuth and elevation. The approximate target position is used to initially aim the gimballed narrow-field-of-view camera in the approximate direction of the target. Next, the narrow-field-of-view camera locks onto the target image, and thereafter the gimbals are actuated as needed to maintain lock and thereby track the target with precision greater than that attainable by use of the stationary camera.
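
    The hand-off between the two cameras amounts to a calibrated pixel-to-pointing map for the fixed all-sky camera, followed by closed-loop corrections from the narrow-field camera. The sketch below uses a simple affine pixel-to-azimuth/elevation map and a proportional tracking update; the mapping, gains, and numbers are placeholders, not the actual system's calibration.

        def pixel_to_azel(px, py, cx, cy, deg_per_px):
            """Placeholder wide-field calibration: pixel offset from the optical
            center (cx, cy) to azimuth/elevation in degrees. A real all-sky camera
            would use a fisheye model fit to star positions."""
            return (px - cx) * deg_per_px, (cy - py) * deg_per_px

        def track_step(az_cmd, el_cmd, err_px_x, err_px_y, narrow_deg_per_px, gain=0.5):
            """One closed-loop update: nudge the gimbal command using the pixel
            error of the target centroid measured by the narrow-field camera."""
            return (az_cmd + gain * err_px_x * narrow_deg_per_px,
                    el_cmd - gain * err_px_y * narrow_deg_per_px)

        # Acquisition: rough pointing from the wide-field detection (invented numbers)
        az, el = pixel_to_azel(px=812, py=377, cx=640, cy=512, deg_per_px=0.1)
        print("initial gimbal command:", az, el)

        # Tracking: successive centroid errors shrink as the gimbal locks on
        for err in [(40, -25), (12, -8), (3, -2)]:
            az, el = track_step(az, el, *err, narrow_deg_per_px=0.005)
            print("updated command:", round(az, 3), round(el, 3))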

  12. Vacuum Camera Cooler

    NASA Technical Reports Server (NTRS)

    Laugen, Geoffrey A.

    2011-01-01

    Acquiring cheap, moving video was impossible in a vacuum environment, due to camera overheating. This overheating is brought on by the lack of cooling media in vacuum. A water-jacketed camera cooler enclosure machined and assembled from copper plate and tube has been developed. The camera cooler (see figure) is cup-shaped and cooled by circulating water or nitrogen gas through copper tubing. The camera, a store-bought "spy type," is not designed to work in a vacuum. With some modifications the unit can be thermally connected when mounted in the cup portion of the camera cooler. The thermal conductivity is provided by copper tape between parts of the camera and the cooled enclosure. During initial testing of the demonstration unit, the camera cooler kept the CPU (central processing unit) of this video camera at operating temperature. This development allowed video recording of an in-progress test, within a vacuum environment.

  13. Making Ceramic Cameras

    ERIC Educational Resources Information Center

    Squibb, Matt

    2009-01-01

    This article describes how to make a clay camera. This idea of creating functional cameras from clay allows students to experience ceramics, photography, and painting all in one unit. (Contains 1 resource and 3 online resources.)

  14. Constrained space camera assembly

    DOEpatents

    Heckendorn, Frank M.; Anderson, Erin K.; Robinson, Casandra W.; Haynes, Harriet B.

    1999-01-01

    A constrained space camera assembly which is intended to be lowered through a hole into a tank, a borehole or another cavity. The assembly includes a generally cylindrical chamber comprising a head and a body and a wiring-carrying conduit extending from the chamber. Means are included in the chamber for rotating the body about the head without breaking an airtight seal formed therebetween. The assembly may be pressurized and accompanied with a pressure sensing means for sensing if a breach has occurred in the assembly. In one embodiment, two cameras, separated from their respective lenses, are installed on a mounting apparatus disposed in the chamber. The mounting apparatus includes means allowing both longitudinal and lateral movement of the cameras. Moving the cameras longitudinally focuses the cameras, and moving the cameras laterally away from one another effectively converges the cameras so that close objects can be viewed. The assembly further includes means for moving lenses of different magnification forward of the cameras.

  15. Study on airflow characteristics in the semi-closed irregular narrow flow channel

    NASA Astrophysics Data System (ADS)

    Jin, Yuzhen; Hu, Xiaodong; Zhu, Linhang; Hu, Xudong; Jin, Yingzi

    2016-04-01

    The air-jet loom is widely used in the textile industry, but the interaction mechanism of airflow and yarn in such a narrow flow channel is not well understood: the gas consumption is relatively large, the yarn motion is unstable, and the weft insertion is often interrupted during operation. In order to study the characteristics of the semi-closed flow field in profiled dents, the momentum conservation equation is modified and the model parameters and boundary conditions are set. Comparing different values of r, the ratio of the profiled dent's thickness to its gap, the results show that the smaller r is, the smaller the velocity fluctuations of the airflow are. When the angle of the profiled dents α is close to zero, the diffusion of the airflow is reduced. An experiment with a high-speed camera and a pressure sensor in the profiled dents is also conducted to verify the simulation results. The airflow characteristics in the semi-closed irregular narrow flow channel presented in this paper provide a theoretical basis for optimizing the weft insertion process of the air-jet loom.

  16. Nanosecond frame cameras

    SciTech Connect

    Frank, A M; Wilkins, P R

    2001-01-05

    The advent of CCD cameras and computerized data recording has spurred the development of several new cameras and techniques for recording nanosecond images. We have made a side-by-side comparison of three nanosecond frame cameras, examining them for both performance and operational characteristics. The cameras comprise Micro-Channel Plate/CCD, Image Diode/CCD, and Image Diode/Film combinations of gating and data recording. The advantages and disadvantages of each device will be discussed.

  17. Harpicon camera for HDTV

    NASA Astrophysics Data System (ADS)

    Tanada, Jun

    1992-08-01

    Ikegami has been involved in broadcast equipment ever since it was established as a company. In conjunction with NHK it has brought forth countless television cameras, from black-and-white cameras to color cameras, HDTV cameras, and special-purpose cameras. In the early days of HDTV (high-definition television, also known as "High Vision") the specifications differed from those of the present-day system, and cameras using many kinds of components, with different arrangements and appearances, were developed into products, with much time spent on experimentation, design, fabrication, adjustment, and inspection. Recently, however, the know-how built up in components, printed circuit boards, and wiring methods has been incorporated into camera fabrication, making it possible to build HDTV cameras by methods similar to those of the present system. In addition, more efficient production, lower costs, and better after-sales service are being achieved by using the same circuits, components, mechanical parts, and software for both HDTV cameras and cameras that operate in the present system.

  18. Digital Pinhole Camera

    ERIC Educational Resources Information Center

    Lancor, Rachael; Lancor, Brian

    2014-01-01

    In this article we describe how the classic pinhole camera demonstration can be adapted for use with digital cameras. Students can easily explore the effects of the size of the pinhole and its distance from the sensor on exposure time, magnification, and image quality. Instructions for constructing a digital pinhole camera and our method for…

  19. Ultrabroadband absorber using a deep metallic grating with narrow slits

    NASA Astrophysics Data System (ADS)

    Liao, Yan-Lin; Zhao, Yan

    2015-01-01

    We report an ultrabroadband absorber using a deep metallic grating with narrow slits in the infrared regime. In this absorber, the ultrabroadband electromagnetic wave is perfectly transmitted through the vacuum-grating interface due to the plasmonic Brewster angle effect, and its energy is attenuated effectively in the slits because of the enhanced electric field inside them. In addition, simulation results show that this ultrabroadband absorber works over a narrow angle range, which is a very important feature for a directional thermal source.

  20. Prediction of Viking lander camera image quality

    NASA Technical Reports Server (NTRS)

    Huck, F. O.; Burcher, E. E.; Jobson, D. J.; Wall, S. D.

    1976-01-01

    Formulations are presented that permit prediction of image quality as a function of camera performance, surface radiance properties, and lighting and viewing geometry. Predictions made for a wide range of surface radiance properties reveal that image quality depends strongly on proper camera dynamic range command and on favorable lighting and viewing geometry. Proper camera dynamic range commands depend mostly on the surface albedo that will be encountered. Favorable lighting and viewing geometries depend mostly on lander orientation with respect to the diurnal sun path over the landing site, and tend to be independent of surface albedo and illumination scattering function. Side lighting with low sun elevation angles (10 to 30 deg) is generally favorable for imaging spatial details and slopes, whereas high sun elevation angles are favorable for measuring spectral reflectances.

  1. 10. CONSTRUCTION OF FIXED CAMERA STATION 1100 (BUILDING NO. 42020) ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    10. CONSTRUCTION OF FIXED CAMERA STATION 1100 (BUILDING NO. 42020) LOOKING NORTHEAST SHOWING CONCRETE FOUNDATION, WOOD FORMWORK AND STEEL REINFORCING, March 26, 1948. (Original photograph in possession of Dave Willis, San Diego, California.) - Variable Angle Launcher Complex, Camera Stations, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  2. 9. COMPLETED ROLLING CAMERA CAR ON RAILROAD TRACK AND BRIDGE ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    9. COMPLETED ROLLING CAMERA CAR ON RAILROAD TRACK AND BRIDGE LOOKING WEST, APRIL 26, 1948. (ORIGINAL PHOTOGRAPH IN POSSESSION OF DAVE WILLIS, SAN DIEGO, CALIFORNIA.) - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  3. Contrail study with ground-based cameras

    NASA Astrophysics Data System (ADS)

    Schumann, U.; Hempel, R.; Flentje, H.; Garhammer, M.; Graf, K.; Kox, S.; Lösslein, H.; Mayer, B.

    2013-08-01

    Photogrammetric methods and analysis results for contrails observed with wide-angle cameras are described. Four cameras of two different types (view angle < 90° or whole-sky imager) at the ground at various positions are used to track contrails and to derive their altitude, width, and horizontal speed. Camera models for both types are described to derive the observation angles for given image coordinates and their inverse. The models are calibrated with sightings of the Sun, the Moon and a few bright stars. The methods are applied and tested in a case study. Four persistent contrails crossing each other together with a short-lived one are observed with the cameras. Vertical and horizontal positions of the contrails are determined from the camera images to an accuracy of better than 200 m and horizontal speed to 0.2 m s-1. With this information, the aircraft causing the contrails are identified by comparison to traffic waypoint data. The observations are compared with synthetic camera pictures of contrails simulated with the contrail prediction model CoCiP, a Lagrangian model using air traffic movement data and numerical weather prediction (NWP) data as input. The results provide tests for the NWP and contrail models. The cameras show spreading and thickening contrails suggesting ice-supersaturation in the ambient air. The ice-supersaturated layer is found thicker and more humid in this case than predicted by the NWP model used. The simulated and observed contrail positions agree up to differences caused by uncertain wind data. The contrail widths, which depend on wake vortex spreading, ambient shear and turbulence, were partly wider than simulated.

  4. Contrail study with ground-based cameras

    NASA Astrophysics Data System (ADS)

    Schumann, U.; Hempel, R.; Flentje, H.; Garhammer, M.; Graf, K.; Kox, S.; Lösslein, H.; Mayer, B.

    2013-12-01

    Photogrammetric methods and analysis results for contrails observed with wide-angle cameras are described. Four cameras of two different types (view angle < 90° or whole-sky imager) at the ground at various positions are used to track contrails and to derive their altitude, width, and horizontal speed. Camera models for both types are described to derive the observation angles for given image coordinates and their inverse. The models are calibrated with sightings of the Sun, the Moon and a few bright stars. The methods are applied and tested in a case study. Four persistent contrails crossing each other, together with a short-lived one, are observed with the cameras. Vertical and horizontal positions of the contrails are determined from the camera images to an accuracy of better than 230 m and horizontal speed to 0.2 m s-1. With this information, the aircraft causing the contrails are identified by comparison to traffic waypoint data. The observations are compared with synthetic camera pictures of contrails simulated with the contrail prediction model CoCiP, a Lagrangian model using air traffic movement data and numerical weather prediction (NWP) data as input. The results provide tests for the NWP and contrail models. The cameras show spreading and thickening contrails, suggesting ice-supersaturation in the ambient air. The ice-supersaturated layer is found thicker and more humid in this case than predicted by the NWP model used. The simulated and observed contrail positions agree up to differences caused by uncertain wind data. The contrail widths, which depend on wake vortex spreading, ambient shear and turbulence, were partly wider than simulated.

  5. Ultraviolet Spectroscopy of Narrow Coronal Mass Ejections

    NASA Astrophysics Data System (ADS)

    Dobrzycka, D.; Raymond, J. C.; Biesecker, D. A.; Li, J.; Ciaravella, A.

    2003-05-01

    We present Ultraviolet Coronagraph Spectrometer (UVCS) observations of five narrow coronal mass ejections (CMEs) that were among 15 narrow CMEs originally selected by Gilbert and coworkers. Two events (1999 March 27, April 15) were ``structured,'' i.e., in white-light data they exhibited well-defined interior features, and three (1999 May 9, May 21, June 3) were ``unstructured,'' i.e., appeared featureless. In UVCS data the events were seen as 4°-13° wide enhancements of the strongest coronal lines H I Lyα and O VI λλ1032, 1037. We derived electron densities for several of the events from the Large Angle and Spectrometric Coronagraph Experiment (LASCO) C2 white-light observations. They are comparable to or smaller than densities inferred for other CMEs. We modeled the observable properties of examples of the structured (1999 April 15) and unstructured (1999 May 9) narrow CMEs at different heights in the corona between 1.5 and 2 Rsolar. The derived electron temperatures, densities, and outflow speeds are similar for those two types of ejections. They were compared with properties of polar coronal jets and other CMEs. We discuss different scenarios of narrow CME formation as either a jet formed by reconnection onto open field lines or a CME ejected by expansion of closed field structures. Overall, we conclude that the existing observations do not definitively place the narrow CMEs into the jet or the CME picture, but the acceleration of the 1999 April 15 event resembles acceleration seen in many CMEs, rather than constant speeds or deceleration observed in jets.

  6. Compact stereo endoscopic camera using microprism arrays.

    PubMed

    Yang, Sung-Pyo; Kim, Jae-Jun; Jang, Kyung-Won; Song, Weon-Kook; Jeong, Ki-Hun

    2016-03-15

    This work reports a microprism array (MPA) based compact stereo endoscopic camera with a single image sensor. The MPAs were monolithically fabricated by using two-step photolithography and geometry-guided resist reflow to form an appropriate prism angle for stereo image pair formation. The fabricated MPAs were transferred onto a glass substrate with a UV curable resin replica by using polydimethylsiloxane (PDMS) replica molding and then successfully integrated in front of a single camera module. The stereo endoscopic camera with MPA splits an image into two stereo images and successfully demonstrates the binocular disparities between the stereo image pairs for objects with different distances. This stereo endoscopic camera can serve as a compact and 3D imaging platform for medical, industrial, or military uses.
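
    For context, the binocular disparity demonstrated by such a stereo pair follows the standard relation d = f*B/Z; the baseline, focal length, and pixel pitch below are illustrative values, not the device's specifications.

```python
def disparity_pixels(distance_mm, baseline_mm=2.0, focal_mm=3.0, pixel_pitch_mm=0.003):
    """Binocular disparity (in pixels) for a fronto-parallel point.

    Standard stereo relation d = f * B / Z, converted to pixels; baseline,
    focal length, and pixel pitch are assumed, illustrative values.
    """
    return (focal_mm * baseline_mm / distance_mm) / pixel_pitch_mm

for z in (20, 50, 100, 200):   # object distances in mm
    print(f"distance {z:3d} mm -> disparity {disparity_pixels(z):6.1f} px")
```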

  7. Tower Camera Handbook

    SciTech Connect

    Moudry, D

    2005-01-01

    The tower camera in Barrow provides hourly images of the ground surrounding the tower. These images may be used to determine fractional snow cover as winter arrives, for comparison with the albedo that can be calculated from downward-looking radiometers, as well as to give some indication of present weather. Similarly, during springtime, the camera images show the changes in the ground albedo as the snow melts. The tower images are saved at hourly intervals. In addition, two other cameras, the skydeck camera in Barrow and the piling camera in Atqasuk, show the current conditions at those sites.

  8. Automated Camera Calibration

    NASA Technical Reports Server (NTRS)

    Chen, Siqi; Cheng, Yang; Willson, Reg

    2006-01-01

    Automated Camera Calibration (ACAL) is a computer program that automates the generation of calibration data for camera models used in machine vision systems. Machine vision camera models describe the mapping between points in three-dimensional (3D) space in front of the camera and the corresponding points in two-dimensional (2D) space in the camera's image. Calibrating a camera model requires a set of calibration data containing known 3D-to-2D point correspondences for the given camera system. Generating calibration data typically involves taking images of a calibration target where the 3D locations of the target's fiducial marks are known, and then measuring the 2D locations of the fiducial marks in the images. ACAL automates the analysis of calibration target images and greatly speeds the overall calibration process.
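
    ACAL itself is not reproduced here; as a rough analogue of the calibration step it automates, the sketch below builds synthetic 3D-to-2D point correspondences for a flat fiducial target and hands them to OpenCV's calibrateCamera. The grid geometry, poses, and "true" camera are all made-up values.

```python
import numpy as np
import cv2

# Synthetic stand-in for calibration-target data: a flat 7x5 grid of fiducials
# (30 mm spacing) projected into three poses with a known "true" camera, then
# fed to cv2.calibrateCamera as if the 2D locations had been measured in images.
grid = np.array([[x * 30.0, y * 30.0, 0.0] for y in range(5) for x in range(7)],
                dtype=np.float32)
true_K = np.array([[800.0, 0.0, 320.0],
                   [0.0, 800.0, 240.0],
                   [0.0, 0.0, 1.0]])
true_dist = np.zeros(5)

object_points, image_points = [], []
for rx, tz in [(0.0, 600.0), (0.15, 700.0), (-0.2, 800.0)]:   # three assumed poses
    rvec = np.array([rx, 0.05, 0.0])
    tvec = np.array([-90.0, -60.0, tz])
    proj, _ = cv2.projectPoints(grid, rvec, tvec, true_K, true_dist)
    object_points.append(grid)
    image_points.append(proj.astype(np.float32))

rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    object_points, image_points, (640, 480), None, None)
print("reprojection RMS:", rms)
print("recovered camera matrix:\n", K)
```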

  9. Automatic camera tracking for remote manipulators

    SciTech Connect

    Stoughton, R.S.; Martin, H.L.; Bentz, R.R.

    1984-07-01

    The problem of automatic camera tracking of mobile objects is addressed with specific reference to remote manipulators and using either fixed or mobile cameras. The technique uses a kinematic approach employing 4 x 4 coordinate transformation matrices to solve for the needed camera PAN and TILT angles. No vision feedback systems are used, as the required input data are obtained entirely from position sensors from the manipulator and the camera-positioning system. All hardware requirements are generally satisfied by currently available remote manipulator systems with a supervisory computer. The system discussed here implements linear plus on/off (bang-bang) closed-loop control with a ±2-deg deadband. The deadband area is desirable to avoid operator seasickness caused by continuous camera movement. Programming considerations for camera control, including operator interface options, are discussed. The example problem presented is based on an actual implementation using a PDP 11/34 computer, a TeleOperator Systems SM-229 manipulator, and an Oak Ridge National Laboratory (ORNL) camera-positioning system. 3 references, 6 figures, 2 tables.
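
    The kinematic idea can be sketched as follows: express the target position (known from the manipulator's joint sensors, not from vision) in the camera-mount frame via a 4x4 homogeneous transform, convert it to pan/tilt angles, and only issue a command when the error exceeds the ±2-deg deadband. The transform and positions below are made-up values, not the ORNL implementation.

```python
import numpy as np

DEADBAND_DEG = 2.0   # matches the ±2-deg deadband described above

def required_pan_tilt(target_world, world_to_mount):
    """Pan/tilt angles that point the camera at a target.

    target_world   : (x, y, z) of the tracked point in world coordinates.
    world_to_mount : 4x4 homogeneous transform of the camera mount.
    """
    p = world_to_mount @ np.append(target_world, 1.0)        # target in mount frame
    pan = np.degrees(np.arctan2(p[1], p[0]))                  # rotation about mount z
    tilt = np.degrees(np.arctan2(p[2], np.hypot(p[0], p[1])))
    return pan, tilt

def command(current_deg, required_deg):
    """Bang-bang style command with a deadband to avoid continuous small motions."""
    error = required_deg - current_deg
    return 0.0 if abs(error) <= DEADBAND_DEG else np.sign(error)

# Example with an identity mount transform and a target reported by the arm sensors.
T = np.eye(4)
pan_req, tilt_req = required_pan_tilt(np.array([2.0, 0.5, -0.3]), T)
print("required pan/tilt:", pan_req, tilt_req)
print("pan cmd:", command(10.0, pan_req), "tilt cmd:", command(-8.0, tilt_req))
```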

  10. Automatic camera tracking for remote manipulators

    SciTech Connect

    Stoughton, R.S.; Martin, H.L.; Bentz, R.R.

    1984-04-01

    The problem of automatic camera tracking of mobile objects is addressed with specific reference to remote manipulators and using either fixed or mobile cameras. The technique uses a kinematic approach employing 4 x 4 coordinate transformation matrices to solve for the needed camera PAN and TILT angles. No vision feedback systems are used, as the required input data are obtained entirely from position sensors from the manipulator and the camera-positioning system. All hardware requirements are generally satisfied by currently available remote manipulator systems with a supervisory computer. The system discussed here implements linear plus on/off (bang-bang) closed-loop control with a ±2° deadband. The deadband area is desirable to avoid operator seasickness caused by continuous camera movement. Programming considerations for camera control, including operator interface options, are discussed. The example problem presented is based on an actual implementation using a PDP 11/34 computer, a TeleOperator Systems SM-229 manipulator, and an Oak Ridge National Laboratory (ORNL) camera-positioning system. 3 references, 6 figures, 2 tables.

  11. Sky camera geometric calibration using solar observations

    NASA Astrophysics Data System (ADS)

    Urquhart, Bryan; Kurtz, Ben; Kleissl, Jan

    2016-09-01

    A camera model and associated automated calibration procedure for stationary daytime sky imaging cameras is presented. The specific modeling and calibration needs are motivated by remotely deployed cameras used to forecast solar power production where cameras point skyward and use 180° fisheye lenses. Sun position in the sky and on the image plane provides a simple and automated approach to calibration; special equipment or calibration patterns are not required. Sun position in the sky is modeled using a solar position algorithm (requiring latitude, longitude, altitude and time as inputs). Sun position on the image plane is detected using a simple image processing algorithm. The performance evaluation focuses on the calibration of a camera employing a fisheye lens with an equisolid angle projection, but the camera model is general enough to treat most fixed focal length, central, dioptric camera systems with a photo objective lens. Calibration errors scale with the noise level of the sun position measurement in the image plane, but the calibration is robust across a large range of noise in the sun position. Calibration performance on clear days ranged from 0.94 to 1.24 pixels root mean square error.
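
    A minimal sketch of the equisolid-angle projection mentioned above (r = 2*f*sin(theta/2)) and of the residual between a modeled and a detected sun position; the focal length, principal point, and sun coordinates are assumed values, not the paper's calibration results.

```python
import numpy as np

def equisolid_project(zenith_deg, azimuth_deg, f_px=500.0, cx=960.0, cy=960.0):
    """Project a sky direction onto the image plane with an equisolid-angle lens.

    Radial distance from the principal point follows r = 2 f sin(theta/2),
    where theta is the angle from the optical axis (here: zenith angle for a
    zenith-pointing camera). f, cx, cy are assumed calibration values.
    """
    theta = np.radians(zenith_deg)
    r = 2.0 * f_px * np.sin(theta / 2.0)
    phi = np.radians(azimuth_deg)
    return cx + r * np.sin(phi), cy - r * np.cos(phi)

# Calibration residual: modeled sun position (from a solar position algorithm)
# versus the sun centroid detected in the image (both values assumed here).
u_model, v_model = equisolid_project(zenith_deg=42.0, azimuth_deg=155.0)
u_detect, v_detect = 1112.0, 1285.3
print("residual [px]:", np.hypot(u_model - u_detect, v_model - v_detect))
```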

  12. Single-Camera Panoramic-Imaging Systems

    NASA Technical Reports Server (NTRS)

    Lindner, Jeffrey L.; Gilbert, John

    2007-01-01

    Panoramic detection systems (PDSs) are developmental video monitoring and image-data processing systems that, as their name indicates, acquire panoramic views. More specifically, a PDS acquires images from an approximately cylindrical field of view that surrounds an observation platform. The main subsystems and components of a basic PDS are a charge-coupled-device (CCD) video camera and lens, transfer optics, a panoramic imaging optic, a mounting cylinder, and an image-data-processing computer. The panoramic imaging optic is what makes it possible for the single video camera to image the complete cylindrical field of view; in order to image the same scene without the benefit of the panoramic imaging optic, it would be necessary to use multiple conventional video cameras, which have relatively narrow fields of view.

  13. Ultra-fast framing camera tube

    DOEpatents

    Kalibjian, Ralph

    1981-01-01

    An electronic framing camera tube features focal plane image dissection and synchronized restoration of the dissected electron line images to form two-dimensional framed images. Ultra-fast framing is performed by first streaking a two-dimensional electron image across a narrow slit, thereby dissecting the two-dimensional electron image into sequential electron line images. The dissected electron line images are then restored into a framed image by a restorer deflector operated synchronously with the dissector deflector. The number of framed images on the tube's viewing screen is equal to the number of dissecting slits in the tube. The distinguishing features of this ultra-fast framing camera tube are the focal plane dissecting slits, and the synchronously-operated restorer deflector which restores the dissected electron line images into a two-dimensional framed image. The framing camera tube can produce image frames having high spatial resolution of optical events in the sub-100 picosecond range.

  14. Angle detector

    NASA Technical Reports Server (NTRS)

    Parra, G. T. (Inventor)

    1978-01-01

    An angle detector for determining a transducer's angular disposition to a capacitive pickup element is described. The transducer comprises a pendulum mounted inductive element moving past the capacitive pickup element. The capacitive pickup element divides the inductive element into two parts, L1 and L2, which form the arms of one side of an a-c bridge. Two networks, R1 and R2, having a plurality of binary weighted resistors and an equal number of digitally controlled switches for removing resistors from the networks, form the arms of the other side of the a-c bridge. A binary counter, controlled by a phase detector, balances the bridge by adjusting the resistance of R1 and R2. The binary output of the counter is representative of the angle.

  15. Bilateral acute angle-closure glaucoma after dexfenfluramine treatment.

    PubMed

    Denis, P; Charpentier, D; Berros, P; Touameur, S

    1995-01-01

    We report the case of a patient with narrow angles who had an attack of bilateral acute angle-closure glaucoma precipitated by dexfenfluramine, a serotoninergic drug developed for appetite suppression. Although the exact mechanism remains uncertain, the pupillary block observed in our case may be the result of the serotoninergic or indirect parasympatholytic properties of the drug on the iris sphincter muscle. Serotonergic psychoactive drugs should be prescribed cautiously in patients with known narrow angles and should be monitored by an ophthalmologist.

  16. Microchannel plate streak camera

    DOEpatents

    Wang, Ching L.

    1989-01-01

    An improved streak camera in which a microchannel plate electron multiplier is used in place of or in combination with the photocathode used in prior streak cameras. The improved streak camera is far more sensitive to photons (UV to gamma-rays) than the conventional x-ray streak camera which uses a photocathode. The improved streak camera offers gamma-ray detection with high temporal resolution. It also offers low-energy x-ray detection without attenuation inside the cathode. Using the microchannel plate in the improved camera has resulted in a time resolution of about 150 ps, and has provided a sensitivity sufficient for 1000 keV x-rays.

  17. Microchannel plate streak camera

    DOEpatents

    Wang, C.L.

    1984-09-28

    An improved streak camera in which a microchannel plate electron multiplier is used in place of or in combination with the photocathode used in prior streak cameras. The improved streak camera is far more sensitive to photons (uv to gamma-rays) than the conventional x-ray streak camera which uses a photocathode. The improved streak camera offers gamma-ray detection with high temporal resolution. It also offers low-energy x-ray detection without attenuation inside the cathode. Using the microchannel plate in the improved camera has resulted in a time resolution of about 150 ps, and has provided a sensitivity sufficient for 1000 keV x-rays.

  18. Microchannel plate streak camera

    DOEpatents

    Wang, C.L.

    1989-03-21

    An improved streak camera in which a microchannel plate electron multiplier is used in place of or in combination with the photocathode used in prior streak cameras is disclosed. The improved streak camera is far more sensitive to photons (UV to gamma-rays) than the conventional x-ray streak camera which uses a photocathode. The improved streak camera offers gamma-ray detection with high temporal resolution. It also offers low-energy x-ray detection without attenuation inside the cathode. Using the microchannel plate in the improved camera has resulted in a time resolution of about 150 ps, and has provided a sensitivity sufficient for 1,000 keV x-rays. 3 figs.

  19. GRACE star camera noise

    NASA Astrophysics Data System (ADS)

    Harvey, Nate

    2016-08-01

    Extending results from previous work by Bandikova et al. (2012) and Inacio et al. (2015), this paper analyzes Gravity Recovery and Climate Experiment (GRACE) star camera attitude measurement noise by processing inter-camera quaternions from 2003 to 2015. We describe a correction to star camera data, which will eliminate a several-arcsec twice-per-rev error with daily modulation, currently visible in the auto-covariance function of the inter-camera quaternion, from future GRACE Level-1B product releases. We also present evidence supporting the argument that thermal conditions/settings affect long-term inter-camera attitude biases by at least tens-of-arcsecs, and that several-to-tens-of-arcsecs per-rev star camera errors depend largely on field-of-view.

  20. Analytical multicollimator camera calibration

    USGS Publications Warehouse

    Tayman, W.P.

    1978-01-01

    Calibration with the U.S. Geological Survey multicollimator determines the calibrated focal length, the point of symmetry, the radial distortion referred to the point of symmetry, and the asymmetric characteristics of the camera lens. For this project, two cameras were calibrated, a Zeiss RMK A 15/23 and a Wild RC 8. Four test exposures were made with each camera. Results are tabulated for each exposure and averaged for each set. Copies of the standard USGS calibration reports are included. © 1978.

  1. Polarization encoded color camera.

    PubMed

    Schonbrun, Ethan; Möller, Guðfríður; Di Caprio, Giuseppe

    2014-03-15

    Digital cameras would be colorblind if they did not have pixelated color filters integrated into their image sensors. Integration of conventional fixed filters, however, comes at the expense of an inability to modify the camera's spectral properties. Instead, we demonstrate a micropolarizer-based camera that can reconfigure its spectral response. Color is encoded into a linear polarization state by a chiral dispersive element and then read out in a single exposure. The polarization encoded color camera is capable of capturing three-color images at wavelengths spanning the visible to the near infrared. PMID:24690806

  2. Ringfield lithographic camera

    DOEpatents

    Sweatt, William C.

    1998-01-01

    A projection lithography camera is presented with a wide ringfield optimized so as to make efficient use of extreme ultraviolet radiation from a large area radiation source (e.g., D_source ≈ 0.5 mm). The camera comprises four aspheric mirrors optically arranged on a common axis of symmetry with an increased etendue for the camera system. The camera includes an aperture stop that is accessible through a plurality of partial aperture stops to synthesize the theoretical aperture stop. Radiation from a mask is focused to form a reduced image on a wafer, relative to the mask, by reflection from the four aspheric mirrors.

  3. LSST Camera Optics Design

    SciTech Connect

    Riot, V J; Olivier, S; Bauman, B; Pratuch, S; Seppala, L; Gilmore, D; Ku, J; Nordby, M; Foss, M; Antilogus, P; Morgado, N

    2012-05-24

    The Large Synoptic Survey Telescope (LSST) uses a novel three-mirror telescope design feeding a camera system that includes a set of broad-band filters and three refractive corrector lenses to produce a flat field at the focal plane with a wide field of view. Optical design of the camera lenses and filters is integrated with the optical design of the telescope mirrors to optimize performance. We discuss the rationale for the LSST camera optics design, describe the methodology for fabricating, coating, mounting and testing the lenses and filters, and present the results of detailed analyses demonstrating that the camera optics will meet their performance goals.

  4. Ultraviolet Spectroscopy of Narrow CMEs

    NASA Astrophysics Data System (ADS)

    Dobrzycka, D.; Raymond, J. C.; Biesecker, D. A.; Li, J.; Ciaravella, A.

    2002-12-01

    Coronal mass ejections (CMEs) are commonly described as new, discrete, bright features appearing in the field of view of a white light coronagraph and moving outward over a period of minutes to hours. Apparent angular widths of the CMEs cover a wide range, from a few to 360°. The very narrow structures (narrower than ~15-20°) form only a small subset of all the observed CMEs and are usually referred to as rays, spikes, fans, etc. Recently, Gilbert et al. (2001, ApJ, 550, 1093) reported LASCO white light observations of 15 selected narrow CMEs. We extended the study and analyzed ultraviolet spectroscopy of narrow ejections, including several events listed by Gilbert et al. The data were obtained by the Ultraviolet Coronagraph Spectrometer (UVCS/SOHO). We present a comparison of narrow and large CMEs and discuss the relation of the narrow CMEs to coronal jets and/or other narrow transient events. This work is supported by NASA under Grant NAG5-11420 to the Smithsonian Astrophysical Observatory, by the Italian Space Agency and by PRODEX (Swiss contribution).

  5. 71. COMPLETED 'A' FRAME STRUCTURE LOOKING SOUTH SHOWING CAMERA TOWER, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    71. COMPLETED 'A' FRAME STRUCTURE LOOKING SOUTH SHOWING CAMERA TOWER, DRIVE GEARS, COUNTERWEIGHT CAR AND CANTILEVERED WALKWAYS, July 28, 1948. (Original photograph in possession of Dave Willis, San Diego, California.) - Variable Angle Launcher Complex, Variable Angle Launcher, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  6. Calibration of cameras with radially symmetric distortion.

    PubMed

    Tardif, Jean-Philippe; Sturm, Peter; Trudeau, Martin; Roy, Sébastien

    2009-09-01

    We present algorithms for plane-based calibration of general radially distorted cameras. By this, we understand cameras that have a distortion center and an optical axis such that the projection rays of pixels lying on a circle centered on the distortion center form a right viewing cone centered on the optical axis. The camera is said to have a single viewpoint (SVP) if all such viewing cones have the same apex (the optical center); otherwise, we speak of NSVP cases. This model encompasses the classical radial distortion model [5], fisheyes, and most central or noncentral catadioptric cameras. Calibration consists in the estimation of the distortion center, the opening angles of all viewing cones, and their optical centers. We present two approaches of computing a full calibration from dense correspondences of a single or multiple planes with known Euclidean structure. The first one is based on a geometric constraint linking viewing cones and their intersections with the calibration plane (conic sections). The second approach is a homography-based method. Experiments using simulated and a broad variety of real cameras show great stability. Furthermore, we provide a comparison with Hartley-Kang's algorithm [12], which, however, cannot handle such a broad variety of camera configurations, showing similar performance.

  7. New developments to improve SO2 cameras

    NASA Astrophysics Data System (ADS)

    Luebcke, P.; Bobrowski, N.; Hoermann, C.; Kern, C.; Klein, A.; Kuhn, J.; Vogel, L.; Platt, U.

    2012-12-01

    The SO2 camera is a remote sensing instrument that measures the two-dimensional distribution of SO2 (column densities) in volcanic plumes using scattered solar radiation as a light source. From these data SO2 fluxes can be derived. The high time resolution, of the order of 1 Hz, allows correlating SO2 flux measurements with other traditional volcanological measurement techniques, e.g., seismology. In recent years the application of SO2 cameras has increased; however, there is still potential to improve the instrumentation. First of all, the influence of aerosols and ash in the volcanic plume can lead to large errors in the calculated SO2 flux if not accounted for. We present two different concepts to deal with the influence of ash and aerosols. The first approach uses a co-axial DOAS system that was added to a two-filter SO2 camera. The camera uses Filter A (peak transmission centred around 315 nm) to measure the optical density of SO2 and Filter B (centred around 330 nm) to correct for the influence of ash and aerosol. The DOAS system simultaneously performs spectroscopic measurements in a small area of the camera's field of view and gives additional information to correct for these effects. Comparing the optical densities for the two filters with the SO2 column density from the DOAS allows not only a much more precise calibration, but also conclusions to be drawn about the influence of ash and aerosol scattering. Measurement examples from Popocatépetl, Mexico in 2011 are shown and interpreted. Another approach combines the SO2 camera measurement principle with the extremely narrow and periodic transmission of a Fabry-Pérot interferometer. The narrow transmission window allows individual SO2 absorption bands (or series of bands) to be selected as a substitute for Filter A. Measurements are therefore more selective to SO2. Instead of Filter B, as in classical SO2 cameras, the correction for aerosol can be performed by shifting the transmission window of the Fabry
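
    A sketch of the two-filter principle described above: apparent absorbances tau = -ln(I/I0) are formed for the on-band (Filter A) and off-band (Filter B) images, their difference removes broadband aerosol extinction to first order, and a calibration factor (here standing in for the co-located DOAS calibration) converts the result to an SO2 column density. All pixel values and the calibration factor are assumed.

```python
import numpy as np

def apparent_absorbance(plume, background):
    """tau = -ln(I_plume / I_background), evaluated per pixel."""
    return -np.log(plume / background)

# Assumed pixel intensities (plume vs plume-free background) for both filters.
tau_a = apparent_absorbance(plume=np.array([740.0, 610.0]),   # Filter A, ~315 nm (SO2 + aerosol)
                            background=np.array([900.0, 900.0]))
tau_b = apparent_absorbance(plume=np.array([870.0, 845.0]),   # Filter B, ~330 nm (aerosol only)
                            background=np.array([920.0, 920.0]))

tau_so2 = tau_a - tau_b                      # aerosol-corrected SO2 optical density
cal_factor = 2.6e18                          # molecules/cm^2 per unit tau, assumed calibration
print("SO2 column density [molec/cm^2]:", tau_so2 * cal_factor)
```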

  8. Improved Tracking of Targets by Cameras on a Mars Rover

    NASA Technical Reports Server (NTRS)

    Kim, Won; Ansar, Adnan; Steele, Robert

    2007-01-01

    A paper describes a method devised to increase the robustness and accuracy of tracking of targets by means of three stereoscopic pairs of video cameras on a Mars-rover-type exploratory robotic vehicle. Two of the camera pairs are mounted on a mast that can be adjusted in pan and tilt; the third camera pair is mounted on the main vehicle body. Elements of the method include a mast calibration, a camera-pointing algorithm, and a purely geometric technique for handing off tracking between different camera pairs at critical distances as the rover approaches a target of interest. The mast calibration is an extension of camera calibration in which the camera images of calibration targets at known positions are collected at various pan and tilt angles. In the camera-pointing algorithm, pan and tilt angles are computed by a closed-form, non-iterative solution of inverse kinematics of the mast combined with mathematical models of the cameras. The purely geometric camera-handoff technique involves the use of stereoscopic views of a target of interest in conjunction with the mast calibration.

  9. Integrated mobile radar-camera system in airport perimeter security

    NASA Astrophysics Data System (ADS)

    Zyczkowski, M.; Szustakowski, M.; Ciurapinski, W.; Dulski, R.; Kastek, M.; Trzaskawka, P.

    2011-11-01

    The paper presents the test results of a mobile system for the protection of large-area objects, which consists of a radar and thermal and visual cameras. Radar is used for early detection and localization of an intruder and the cameras with narrow field of view are used for identification and tracking of a moving object. The range evaluation of an integrated system is presented as well as the probability of human detection as a function of the distance from radar-camera unit.

  10. Integrated radar-camera security system: range test

    NASA Astrophysics Data System (ADS)

    Zyczkowski, M.; Szustakowski, M.; Ciurapinski, W.; Karol, M.; Markowski, P.

    2012-06-01

    The paper presents the test results of a mobile system for the protection of large-area objects, which consists of a radar and thermal and visual cameras. Radar is used for early detection and localization of an intruder and the cameras with narrow field of view are used for identification and tracking of a moving object. The range evaluation of an integrated system is presented as well as the probability of human detection as a function of the distance from radar-camera unit.

  11. Camera Operator and Videographer

    ERIC Educational Resources Information Center

    Moore, Pam

    2007-01-01

    Television, video, and motion picture camera operators produce images that tell a story, inform or entertain an audience, or record an event. They use various cameras to shoot a wide range of material, including television series, news and sporting events, music videos, motion pictures, documentaries, and training sessions. Those who film or…

  12. The Camera Cook Book.

    ERIC Educational Resources Information Center

    Education Development Center, Inc., Newton, MA.

    Intended for use with the photographic materials available from the Workshop for Learning Things, Inc., this "camera cookbook" describes procedures that have been tried in classrooms and workshops and proven to be the most functional and inexpensive. Explicit starting off instructions--directions for exploring and loading the camera and for taking…

  13. Constrained space camera assembly

    DOEpatents

    Heckendorn, F.M.; Anderson, E.K.; Robinson, C.W.; Haynes, H.B.

    1999-05-11

    A constrained space camera assembly which is intended to be lowered through a hole into a tank, a borehole or another cavity is disclosed. The assembly includes a generally cylindrical chamber comprising a head and a body and a wiring-carrying conduit extending from the chamber. Means are included in the chamber for rotating the body about the head without breaking an airtight seal formed therebetween. The assembly may be pressurized and accompanied with a pressure sensing means for sensing if a breach has occurred in the assembly. In one embodiment, two cameras, separated from their respective lenses, are installed on a mounting apparatus disposed in the chamber. The mounting apparatus includes means allowing both longitudinal and lateral movement of the cameras. Moving the cameras longitudinally focuses the cameras, and moving the cameras laterally away from one another effectively converges the cameras so that close objects can be viewed. The assembly further includes means for moving lenses of different magnification forward of the cameras. 17 figs.

  14. CCD Luminescence Camera

    NASA Technical Reports Server (NTRS)

    Janesick, James R.; Elliott, Tom

    1987-01-01

    New diagnostic tool used to understand performance and failures of microelectronic devices. Microscope integrated with low-noise charge-coupled-device (CCD) camera to produce new instrument for analyzing performance and failures of microelectronics devices that emit infrared light during operation. CCD camera also used to identify very clearly parts that have failed where luminescence typically found.

  15. Dry imaging cameras

    PubMed Central

    Indrajit, IK; Alam, Aftab; Sahni, Hirdesh; Bhatia, Mukul; Sahu, Samaresh

    2011-01-01

    Dry imaging cameras are important hard copy devices in radiology. Using a dry imaging camera, multiformat images of digital modalities in radiology are created from a sealed unit of unexposed films. The functioning of a modern dry camera involves a blend of concurrent processes in areas of diverse sciences like computing, mechanics, thermal science, optics, electricity and radiography. Broadly, hard copy devices are classified as laser-based and non-laser-based technology. When compared with the working knowledge and technical awareness of different modalities in radiology, the understanding of a dry imaging camera is often superficial and neglected. To fill this void, this article outlines the key features of a modern dry camera and its important issues that impact radiology workflow. PMID:21799589

  16. Impact of CCD camera SNR on polarimetric accuracy.

    PubMed

    Chen, Zhenyue; Wang, Xia; Pacheco, Shaun; Liang, Rongguang

    2014-11-10

    A comprehensive charge-coupled device (CCD) camera noise model is employed to study the impact of CCD camera signal-to-noise ratio (SNR) on polarimetric accuracy. The study shows that the standard deviations of the measured degree of linear polarization (DoLP) and angle of linear polarization (AoLP) are mainly dependent on the camera SNR. With increase in the camera SNR, both the measurement errors and the standard deviations caused by the CCD camera noise decrease. When the DoLP of the incident light is smaller than 0.1, the camera SNR should be at least 75 to achieve a measurement error of less than 0.01. When the input DoLP is larger than 0.5, a SNR of 15 is sufficient to achieve the same measurement accuracy. An experiment is carried out to verify the simulation results.
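
    For reference, DoLP and AoLP follow from the linear Stokes parameters, which a polarimeter can estimate from intensity images taken behind a polarizer at 0°, 45°, 90°, and 135°; the intensities below are made-up, and camera noise would add an SNR-dependent spread to both outputs.

```python
import numpy as np

def dolp_aolp(i0, i45, i90, i135):
    """Degree and angle of linear polarization from four polarizer orientations."""
    s0 = 0.5 * (i0 + i45 + i90 + i135)        # total intensity
    s1 = i0 - i90                             # linear Stokes parameters
    s2 = i45 - i135
    dolp = np.sqrt(s1**2 + s2**2) / s0
    aolp_deg = 0.5 * np.degrees(np.arctan2(s2, s1))
    return dolp, aolp_deg

# Noise-free example; in practice each intensity carries shot and read noise.
print(dolp_aolp(i0=120.0, i45=100.0, i90=80.0, i135=100.0))
```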

  17. Detection of pointing errors with CMOS-based camera in intersatellite optical communications

    NASA Astrophysics Data System (ADS)

    Yu, Si-yuan; Ma, Jing; Tan, Li-ying

    2005-01-01

    For very high data rates, intersatellite optical communications hold a potential performance edge over microwave communications. The acquisition and tracking problem is critical because of the narrow transmit beam. A single array detector in some systems performs both spatial acquisition and tracking functions to detect pointing errors, so both a wide field of view and a high update rate are required. Past systems tended to employ CCD-based cameras with complex readout arrangements, but the additional complexity reduces the applicability of the array-based tracking concept. With the development of CMOS arrays, CMOS-based cameras can employ the single-array-detector concept. The area-of-interest feature of a CMOS-based camera allows a PAT system to read out only a specified portion of the array. The maximum allowed frame rate increases as the size of the area of interest decreases, under certain conditions. A commercially available CMOS camera with 105 fps @ 640×480 is employed in our PAT simulation system, in which only a subset of the pixels is actually used. Beam angles varying across the field of view can be detected after passing through a Cassegrain telescope and a focusing optical system. Spot pixel values (8 bits per pixel) read out from the CMOS sensor are transmitted to a DSP subsystem via the IEEE 1394 bus, and pointing errors can be computed by the centroid equation. It was shown in tests that: (1) 500 fps @ 100×100 is available in acquisition when the field of view is 1 mrad; (2) 3k fps @ 10×10 is available in tracking when the field of view is 0.1 mrad.
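
    The centroid equation referred to above is the intensity-weighted mean of pixel coordinates over the spot's region of interest; a minimal sketch with a made-up 5x5 spot follows.

```python
import numpy as np

def spot_centroid(roi):
    """Intensity-weighted centroid (x, y) of a detector region of interest."""
    total = roi.sum()
    ys, xs = np.indices(roi.shape)            # row (y) and column (x) index grids
    return (xs * roi).sum() / total, (ys * roi).sum() / total

# 5x5 region of interest with a slightly off-centre spot (8-bit pixel values).
roi = np.array([[ 0,  1,  2,  1, 0],
                [ 1, 20, 60, 18, 1],
                [ 2, 58, 255, 55, 2],
                [ 1, 17, 52, 15, 1],
                [ 0,  1,  2,  1, 0]], dtype=float)
cx, cy = spot_centroid(roi)
print(f"centroid offset from ROI centre: ({cx - 2:+.3f}, {cy - 2:+.3f}) px")
```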

  18. Slit-Drum Camera For Projectile Studies

    NASA Astrophysics Data System (ADS)

    Liangyi, Chen; Shaoxiang, Zhou; Guanhua, Cha; Yuxi, Hu

    1983-03-01

    The model XF-70 slit-drum camera has been developed to record projectiles in flight for observation and data acquisition. It has two operation modes: (1) synchro-ballistic photography and (2) streak recording. The film is located on the inner surface of a rotating drum, which provides the film transport. A folding mirror is arranged to reflect the light beam 90 degrees onto the film. The folding-mirror and slit-aperture assembly can be rotated together about the optical axis of the objective, so that by pre-rotating the assembly through an appropriate angle the camera can record a projectile at any launching angle in either synchro-ballistic or streak mode. A mechanical-electric shutter close to the slit aperture prevents the film from being re-exposed. The loading mechanism is designed for use in daylight. LED fiducial marks and timing marks are printed at the edges of the frame for accurate measurements.

  19. Night Vision Camera

    NASA Technical Reports Server (NTRS)

    1996-01-01

    PixelVision, Inc. developed the Night Video NV652 Back-illuminated CCD Camera, based on the expertise of a former Jet Propulsion Laboratory employee and a former employee of Scientific Imaging Technologies, Inc. The camera operates without an image intensifier, using back-illuminated and thinned CCD technology to achieve extremely low light level imaging performance. The advantages of PixelVision's system over conventional cameras include greater resolution and better target identification under low light conditions, lower cost and a longer lifetime. It is used commercially for research and aviation.

  20. Gamma camera purchasing.

    PubMed

    Wells, C P; Buxton-Thomas, M

    1995-03-01

    The purchase of a new gamma camera is a major undertaking and represents a long-term commitment for most nuclear medicine departments. The purpose of tendering for gamma cameras is to assess the best match between the requirements of the clinical department and the equipment available and not necessarily to buy the 'best camera' [1-3]. After many years of drawing up tender specifications, this paper tries to outline some of the traps and pitfalls of this potentially perilous, although largely rewarding, exercise. PMID:7770241

  1. Ringfield lithographic camera

    DOEpatents

    Sweatt, W.C.

    1998-09-08

    A projection lithography camera is presented with a wide ringfield optimized so as to make efficient use of extreme ultraviolet radiation from a large area radiation source (e.g., D_source ≈ 0.5 mm). The camera comprises four aspheric mirrors optically arranged on a common axis of symmetry. The camera includes an aperture stop that is accessible through a plurality of partial aperture stops to synthesize the theoretical aperture stop. Radiation from a mask is focused to form a reduced image on a wafer, relative to the mask, by reflection from the four aspheric mirrors. 11 figs.

  2. Kitt Peak speckle camera

    NASA Technical Reports Server (NTRS)

    Breckinridge, J. B.; Mcalister, H. A.; Robinson, W. G.

    1979-01-01

    The speckle camera in regular use at Kitt Peak National Observatory since 1974 is described in detail. The design of the atmospheric dispersion compensation prisms, the use of film as a recording medium, the accuracy of double star measurements, and the next generation speckle camera are discussed. Photographs of double star speckle patterns with separations from 1.4 sec of arc to 4.7 sec of arc are shown to illustrate the quality of image formation with this camera, the effects of seeing on the patterns, and to illustrate the isoplanatic patch of the atmosphere.

  3. Status of the Los Alamos Anger camera

    SciTech Connect

    Seeger, P.A.; Nutter, M.J.

    1985-01-01

    Results of preliminary tests of the neutron Anger camera being developed at Los Alamos are presented. This detector uses a unique encoding scheme involving parallel processing of multiple receptive fields. Design goals have not yet been met, but the results are very encouraging and improvements in the test procedures are expected to show that the detector will be ready for use on a small-angle scattering instrument next year. 3 refs., 4 figs.

  4. Advanced CCD camera developments

    SciTech Connect

    Condor, A.

    1994-11-15

    Two charge coupled device (CCD) camera systems are introduced and discussed, briefly describing the hardware involved and the data obtained in their various applications. The Advanced Development Group, Defense Sciences Engineering Division, has been actively designing, manufacturing, and fielding state-of-the-art CCD camera systems for over a decade. These systems were originally developed for the nuclear test program to record data from underground nuclear tests. Today, new and interesting applications for these systems have surfaced, and development is continuing in the area of advanced CCD camera systems, including a new CCD camera that will allow experimenters to replace film for x-ray imaging at the JANUS, USP, and NOVA laser facilities.

  5. The MKID Camera

    NASA Astrophysics Data System (ADS)

    Maloney, P. R.; Czakon, N. G.; Day, P. K.; Duan, R.; Gao, J.; Glenn, J.; Golwala, S.; Hollister, M.; LeDuc, H. G.; Mazin, B.; Noroozian, O.; Nguyen, H. T.; Sayers, J.; Schlaerth, J.; Vaillancourt, J. E.; Vayonakis, A.; Wilson, P.; Zmuidzinas, J.

    2009-12-01

    The MKID Camera project is a collaborative effort of Caltech, JPL, the University of Colorado, and UC Santa Barbara to develop a large-format, multi-color millimeter and submillimeter-wavelength camera for astronomy using microwave kinetic inductance detectors (MKIDs). These are superconducting, micro-resonators fabricated from thin aluminum and niobium films. We couple the MKIDs to multi-slot antennas and measure the change in surface impedance produced by photon-induced breaking of Cooper pairs. The readout is almost entirely at room temperature and can be highly multiplexed; in principle hundreds or even thousands of resonators could be read out on a single feedline. The camera will have 576 spatial pixels that image simultaneously in four bands at 750, 850, 1100 and 1300 microns. It is scheduled for deployment at the Caltech Submillimeter Observatory in the summer of 2010. We present an overview of the camera design and readout and describe the current status of testing and fabrication.

  6. The Complementary Pinhole Camera.

    ERIC Educational Resources Information Center

    Bissonnette, D.; And Others

    1991-01-01

    Presents an experiment based on the principles of rectilinear motion of light operating in a pinhole camera that projects the image of an illuminated object through a small hole in a sheet to an image screen. (MDH)

  7. Miniature TV Camera

    NASA Technical Reports Server (NTRS)

    2004-01-01

    Originally devised to observe Saturn stage separation during Apollo flights, Marshall Space Flight Center's Miniature Television Camera, measuring only 4 x 3 x 1 1/2 inches, quickly made its way to the commercial telecommunications market.

  8. Early Experience & Multisensory Perceptual Narrowing

    PubMed Central

    Lewkowicz, David J.

    2014-01-01

    Perceptual narrowing is a reflection of early experience and contributes in key ways to perceptual and cognitive development. In general, findings have shown that unisensory perceptual sensitivity in early infancy is broadly tuned such that young infants respond to, and discriminate, native as well as non-native sensory inputs, whereas older infants only respond to native inputs. Recently, my colleagues and I discovered that perceptual narrowing occurs at the multisensory processing level as well. The present article reviews this new evidence and puts it in the larger context of multisensory perceptual development and the role that perceptual experience plays in it. Together, the evidence on unisensory and multisensory narrowing shows that early experience shapes the emergence of perceptual specialization and expertise. PMID:24435505

  9. Gamma ray camera

    SciTech Connect

    Robbins, C.D.; Wang, S.

    1980-09-09

    An Anger gamma ray camera is improved by the substitution of a gamma-ray-sensitive, proximity-type image intensifier tube for the scintillator screen in the Anger camera, the image intensifier tube having a negatively charged flat scintillator screen, a flat photocathode layer and a grounded, flat output phosphor display screen, all of the same dimension (unity image magnification), all within a grounded metallic tube envelope, and having a metallic, inwardly concave input window between the scintillator screen and the collimator.

  10. Calibration of Action Cameras for Photogrammetric Purposes

    PubMed Central

    Balletti, Caterina; Guerra, Francesco; Tsioukas, Vassilios; Vernier, Paolo

    2014-01-01

    The use of action cameras for photogrammetry purposes is not widespread due to the fact that until recently the images provided by the sensors, using either still or video capture mode, were not big enough to perform and provide the appropriate analysis with the necessary photogrammetric accuracy. However, several manufacturers have recently produced and released new lightweight devices which are: (a) easy to handle, (b) capable of performing under extreme conditions and more importantly (c) able to provide both still images and video sequences of high resolution. In order to be able to use the sensor of action cameras we must apply a careful and reliable self-calibration prior to the use of any photogrammetric procedure, a relatively difficult scenario because of the short focal length of the camera and its wide angle lens that is used to obtain the maximum possible resolution of images. Special software, using functions of the OpenCV library, has been created to perform both the calibration and the production of undistorted scenes for each one of the still and video image capturing mode of a novel action camera, the GoPro Hero 3 camera that can provide still images up to 12 Mp and video up to 8 Mp resolution. PMID:25237898

  11. Calibration of action cameras for photogrammetric purposes.

    PubMed

    Balletti, Caterina; Guerra, Francesco; Tsioukas, Vassilios; Vernier, Paolo

    2014-09-18

    The use of action cameras for photogrammetry purposes is not widespread due to the fact that until recently the images provided by the sensors, using either still or video capture mode, were not big enough to perform and provide the appropriate analysis with the necessary photogrammetric accuracy. However, several manufacturers have recently produced and released new lightweight devices which are: (a) easy to handle, (b) capable of performing under extreme conditions and more importantly (c) able to provide both still images and video sequences of high resolution. In order to be able to use the sensor of action cameras we must apply a careful and reliable self-calibration prior to the use of any photogrammetric procedure, a relatively difficult scenario because of the short focal length of the camera and its wide angle lens that is used to obtain the maximum possible resolution of images. Special software, using functions of the OpenCV library, has been created to perform both the calibration and the production of undistorted scenes for each one of the still and video image capturing mode of a novel action camera, the GoPro Hero 3 camera that can provide still images up to 12 Mp and video up to 8 Mp resolution.

  12. Calibration of action cameras for photogrammetric purposes.

    PubMed

    Balletti, Caterina; Guerra, Francesco; Tsioukas, Vassilios; Vernier, Paolo

    2014-01-01

    The use of action cameras for photogrammetry purposes is not widespread due to the fact that until recently the images provided by the sensors, using either still or video capture mode, were not big enough to perform and provide the appropriate analysis with the necessary photogrammetric accuracy. However, several manufacturers have recently produced and released new lightweight devices which are: (a) easy to handle, (b) capable of performing under extreme conditions and more importantly (c) able to provide both still images and video sequences of high resolution. In order to be able to use the sensor of action cameras we must apply a careful and reliable self-calibration prior to the use of any photogrammetric procedure, a relatively difficult scenario because of the short focal length of the camera and its wide angle lens that is used to obtain the maximum possible resolution of images. Special software, using functions of the OpenCV library, has been created to perform both the calibration and the production of undistorted scenes for each one of the still and video image capturing mode of a novel action camera, the GoPro Hero 3 camera that can provide still images up to 12 Mp and video up to 8 Mp resolution. PMID:25237898

  13. Calibration Procedures on Oblique Camera Setups

    NASA Astrophysics Data System (ADS)

    Kemper, G.; Melykuti, B.; Yu, C.

    2016-06-01

    the nadir camera and the GPS/IMU data, an initial orientation correction and radial correction were calculated. With this approach, the whole project was calculated and calibrated in one step. During the iteration process the radial and tangential parameters were switched on individually for the camera heads, and after that the camera constants and principal point positions were checked and finally calibrated. Besides that, the boresight calibration can be performed either on the basis of the nadir camera and its offsets, or independently for each camera without correlation to the others. This must be performed over a complete mission anyway to get stability between the single camera heads. Determining the lever arms from the nodal points to the IMU centre needs more caution than for a single camera, especially due to the strong tilt angle. With all these previous steps prepared, the result is a highly accurate sensor that enables fully automated data extraction and rapid updating of existing data. Frequent monitoring of urban dynamics is then possible in a fully 3D environment.

  14. Narrow rings - Observations and theory

    NASA Astrophysics Data System (ADS)

    Porco, C. C.

    Voyager 1 and 2 observations have revealed that within the rings of Saturn lies a set of narrow, eccentric rings resembling those of Uranus. Voyager 2 observations have proven crucial in refining the Uranian ring orbit models to a remarkable level of precision. All these rings share some common structural and kinematical characteristics, such as spatially variable radial widths and uniform precession; however, interesting differences exist which provoke attention and may be related to the differing dynamical environments in which these rings dwell. The current state of the knowledge of the shape, behavior, and confinement of narrow rings is discussed.

  15. Dual-view angle backlight module design.

    PubMed

    Chen, Bo-Tsuen; Pan, Jui-Wen

    2015-10-01

    We propose a bilayer light guide plate (BLGP) with specially designed microstructures and two light source modules to achieve an adjustable viewing angle backlight for ecofriendly displays. The dual viewing angle backlight module has a thin, simple structure and a high optical efficiency. Comparison with the conventional edge-lit backlight module shows an improvement in the on-axis luminance of the narrow viewing angle mode of 4.3 times and a decrease in the half-luminance angle of 7° in the horizontal direction. When using the wide viewing angle mode, there is an improvement in the on-axis luminance of 1.8 times and an improvement in the half-luminance angle of 54° in the horizontal direction. The uniformity of illuminance can reach 80% in each mode. The backlight optical sheet number is reduced from 5 to 1. PMID:26479670

  16. Seasonal and vertical changes in leaf angle distribution for selected deciduous broadleaf tree species common to Europe

    NASA Astrophysics Data System (ADS)

    Raabe, Kairi; Pisek, Jan; Sonnentag, Oliver; Annuk, Kalju

    2014-05-01

    Leaf inclination angle distribution is a key parameter in determining the transmission and reflection of radiation by vegetation canopies. It has been previously observed that leaf inclination angle might change gradually from more vertical in the upper canopy and in high light habitats to more horizontal in the lower canopy and in low light habitats [1]. Despite its importance, relatively few measurements of actual leaf angle distributions have been reported for different tree species. An even smaller number of studies have dealt with the possible seasonal changes in leaf angle distribution [2]. In this study the variation of leaf inclination angle distributions was examined both temporally throughout the growing season and vertically at different heights of trees. We report on leaf inclination angle distributions for five deciduous broadleaf species found commonly in several parts of Europe: grey alder (Alnus incana), silver birch (Betula pendula Roth), chestnut (Castanea), Norway maple (Acer platanoides), and aspen (Populus tremula). The angles were measured using the leveled camera method [3], with the data collected at several separate heights and four times during the period of May-September 2013. The results generally indicate the greatest change in leaf inclination angles in spring, with the changes usually being the most pronounced at the top of the canopy. It should also be noted, however, that whereas the temporal variation proved to be rather consistent for different species, the vertical variation differed more between species. The leveled camera method was additionally tested in terms of sensitivity to different users. Ten people were asked to measure the leaf angles for four different species. The results indicate the method is quite robust in providing coinciding distributions irrespective of the user and level of previous experience with the method. However, certain caution must be exercised when measuring long narrow leaves. References [1] G.G. Mc

  17. Aircraft Altitude Estimation Using Un-calibrated Onboard Cameras

    NASA Astrophysics Data System (ADS)

    Naidu, V. P. S.; Mukherjee, J.

    2012-10-01

    In the present study, aircraft altitude estimation using an uncalibrated onboard camera is implemented and evaluated. A camera model has been implemented to simulate the test data. From the results, it was observed that the rounding of pixel coordinates creates fluctuations in the vanishing point (VP) angle and height computations around their true values. These fluctuations were smoothed using a Kalman-filter-based state estimator. The effects of camera tilt and focal length on the VP angle and height computations were also studied. It is concluded that the camera should be perpendicular to the runway for the focal length to have no effect on the height computation. It is planned to apply this algorithm to real-time imaging data along with Integrated Enhanced Synthetic Vision (IESVS) on the HANSA aircraft.
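
    A minimal scalar Kalman filter of the kind that could smooth the quantization-induced fluctuations in the VP angle; the process and measurement variances and the measurement sequence are assumed, not taken from the study.

```python
# Minimal scalar Kalman filter smoothing a noisy vanishing-point angle estimate.
def kalman_smooth(measurements, q=1e-4, r=0.25, x0=0.0, p0=1.0):
    """q: process variance, r: measurement variance (both assumed), angles in degrees."""
    x, p, out = x0, p0, []
    for z in measurements:
        p = p + q                    # predict (constant-angle process model)
        k = p / (p + r)              # Kalman gain
        x = x + k * (z - x)          # update with measurement z
        p = (1.0 - k) * p
        out.append(x)
    return out

noisy_vp_angle = [3.1, 2.6, 3.4, 2.9, 3.2, 2.8, 3.0, 3.3]   # deg, made-up samples
print(kalman_smooth(noisy_vp_angle, x0=3.0))
```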

  18. Stellar Occultations in the Coma of Comet 67P/Churyumov-Gerasimenko Observed by the OSIRIS Camera System

    NASA Astrophysics Data System (ADS)

    Moissl, Richard; Kueppers, Michael

    2016-10-01

    In this paper we present the results of an analysis of a large part of the existing image data from the OSIRIS camera system onboard the Rosetta spacecraft, in which stars of sufficient brightness (down to a limiting magnitude of 6) have been observed through the coma of Comet 67P/Churyumov-Gerasimenko ("C-G"). Over the course of the Rosetta main mission the coma of the comet underwent large changes in density and structure, owing to the changing insolation along the orbit of C-G. We report on the changes of the stellar signals in the wavelength ranges covered by the filters of the OSIRIS Narrow-Angle (NAC) and Wide-Angle (WAC) cameras.Acknowledgements: OSIRIS was built by a consortium led by the Max-Planck-Institut für Sonnensystemforschung, Göttingen, Germany, in collaboration with CISAS, University of Padova, Italy, the Laboratoire d'Astrophysique de Marseille, France, the Instituto de Astrofísica de Andalucia, CSIC, Granada, Spain, the Scientific Support Office of the European Space Agency, Noordwijk, The Netherlands, the Instituto Nacional de Técnica Aeroespacial, Madrid, Spain, the Universidad Politéchnica de Madrid, Spain, the Department of Physics and Astronomy of Uppsala University, Sweden, and the Institut für Datentechnik und Kommunikationsnetze der Technischen Universität Braunschweig, Germany.

  19. Calibration Procedures in Mid Format Camera Setups

    NASA Astrophysics Data System (ADS)

    Pivnicka, F.; Kemper, G.; Geissler, S.

    2012-07-01

    A growing number of mid-format cameras are used for aerial surveying projects. To achieve a reliable and geometrically precise result in the photogrammetric workflow, awareness of the sensitive parts is important. The use of direct referencing systems (GPS/IMU), the mounting on a stabilizing camera platform and the specific characteristics of mid-format cameras make a professional setup with various calibration and misalignment operations necessary. An important part is to have a proper camera calibration. Using aerial images over a well-designed test field with 3D structures and/or different flight altitudes enables the determination of calibration values in the Bingo software. It will be demonstrated how such a calibration can be performed. The direct referencing device must be mounted to the camera in a solid and reliable way. Besides the mechanical work, especially in mounting the camera beside the IMU, two lever arms have to be measured to mm accuracy: the lever arm from the GPS antenna to the IMU's calibrated centre, and the lever arm from the IMU centre to the camera projection centre. In fact, the measurement with a total station is not a difficult task, but the definition of the right centres and the need to use rotation matrices can cause serious accuracy problems. The benefit of small and medium format cameras is that smaller aircraft can also be used. For that, a gyro-based stabilized platform is recommended. This means that the IMU must be mounted beside the camera on the stabilizer. The advantage is that the IMU can be used to control the platform; the problematic aspect is that the IMU-to-GPS-antenna lever arm is floating. In fact, an additional data stream has to be handled: the stabilizer movements, which are used to correct the floating lever arm distances. If the post-processing of the GPS-IMU data, taking the floating lever arms into account, delivers the expected result, the lever arms between IMU and camera can be applied
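
    The lever-arm bookkeeping described above can be sketched as follows: a position measured at the GPS antenna is transferred to the camera projection centre through the IMU attitude (body-to-navigation rotation) and the two surveyed lever arms. All vectors, angles, and the rotation convention below are placeholders, not values from the paper.

```python
import numpy as np

def body_to_nav(roll_deg, pitch_deg, yaw_deg):
    """Body-to-navigation rotation matrix from IMU attitude (z-y-x convention assumed)."""
    r, p, y = np.radians([roll_deg, pitch_deg, yaw_deg])
    rx = np.array([[1, 0, 0], [0, np.cos(r), -np.sin(r)], [0, np.sin(r), np.cos(r)]])
    ry = np.array([[np.cos(p), 0, np.sin(p)], [0, 1, 0], [-np.sin(p), 0, np.cos(p)]])
    rz = np.array([[np.cos(y), -np.sin(y), 0], [np.sin(y), np.cos(y), 0], [0, 0, 1]])
    return rz @ ry @ rx

# Lever arms surveyed in the IMU body frame (metres, placeholder values).
imu_to_antenna = np.array([0.10, -0.35, 1.20])
imu_to_camera = np.array([0.02, 0.25, -0.15])

antenna_pos_nav = np.array([1000.0, 2000.0, 350.0])   # GPS solution in a local frame
R = body_to_nav(roll_deg=1.5, pitch_deg=-0.8, yaw_deg=92.0)

# Transfer the GPS antenna position to the camera projection centre via the IMU.
camera_pos_nav = antenna_pos_nav + R @ (imu_to_camera - imu_to_antenna)
print("camera projection centre:", camera_pos_nav)
```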

  20. Deployable Wireless Camera Penetrators

    NASA Technical Reports Server (NTRS)

    Badescu, Mircea; Jones, Jack; Sherrit, Stewart; Wu, Jiunn Jeng

    2008-01-01

    A lightweight, low-power camera dart has been designed and tested for context imaging of sampling sites and ground surveys from an aerobot or an orbiting spacecraft in a microgravity environment. The camera penetrators also can be used to image any line-of-sight surface, such as cliff walls, that is difficult to access. Tethered cameras to inspect the surfaces of planetary bodies use both power and signal transmission lines to operate. A tether adds the possibility of inadvertently anchoring the aerobot, and requires some form of station-keeping capability of the aerobot if extended examination time is required. The new camera penetrators are deployed without a tether, weigh less than 30 grams, and are disposable. They are designed to drop from any altitude, with the boost in transmitting power currently demonstrated at approximately 100-m line-of-sight. The penetrators also can be deployed to monitor lander or rover operations from a distance, and can be used for surface surveys or for context information gathering from a touch-and-go sampling site. Thanks to wireless operation, the complexity of the sampling or survey mechanisms may be reduced. The penetrators may be battery powered for short-duration missions, or have solar panels for longer or intermittent duration missions. The imaging device is embedded in the penetrator, which is dropped or projected at the surface of a study site at 90° to the surface. Mirrors can be used in the design to image the ground or the horizon. Some of the camera features were tested using commercial "nanny" or "spy" camera components with the charge-coupled device (CCD) looking in a direction parallel to the ground. Figure 1 shows components of one camera that weighs less than 8 g and occupies a volume of 11 cm³. This camera could transmit a standard television signal, including sound, up to 100 m. Figure 2 shows the CAD models of a version of the penetrator. A low-volume array of such penetrator cameras could be deployed from an

  1. Auto-converging stereo cameras for 3D robotic tele-operation

    NASA Astrophysics Data System (ADS)

    Edmondson, Richard; Aycock, Todd; Chenault, David

    2012-06-01

    Polaris Sensor Technologies has developed a Stereovision Upgrade Kit for the TALON robot to provide enhanced depth perception to the operator. This kit previously required the TALON Operator Control Unit to be equipped with the optional touchscreen interface to allow for operator control of the camera convergence angle adjustment. This adjustment allowed for optimal camera convergence independent of the distance from the camera to the object being viewed. Polaris has recently improved the performance of the stereo camera by implementing an Automatic Convergence algorithm in a field programmable gate array in the camera assembly. This algorithm uses scene content to automatically adjust the camera convergence angle, freeing the operator to focus on the task rather than adjustment of the vision system. The autoconvergence capability has been demonstrated on both visible zoom cameras and longwave infrared microbolometer stereo pairs.
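
    The geometry behind the convergence adjustment can be sketched briefly. This is not the FPGA auto-convergence algorithm of the paper (which derives the angle from scene content); it only shows, for an assumed baseline and target range, the toe-in angle at which the two optical axes intersect on the target.

        import math

        def convergence_angles(baseline_m, target_range_m):
            """Per-camera toe-in angle and total convergence angle (degrees) for two
            cameras separated by baseline_m, converging on a point target_range_m
            straight ahead of the baseline midpoint."""
            half_angle = math.atan((baseline_m / 2.0) / target_range_m)
            return math.degrees(half_angle), math.degrees(2.0 * half_angle)

        # Hypothetical stereo rig: 12 cm baseline, object 3 m away.
        per_camera, total = convergence_angles(0.12, 3.0)
        print(f"toe-in per camera: {per_camera:.2f} deg, total convergence: {total:.2f} deg")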

  2. Sensing driver awareness by combining fisheye camera and Kinect

    NASA Astrophysics Data System (ADS)

    Wuhe, Z.; Lei, Z.; Ning, D.

    2014-11-01

    In this paper, we propose a Driver's Awareness Catching System to sense the driver's awareness. The system consists of a fisheye camera and a Kinect. The Kinect, mounted inside the vehicle, is used to recognize and locate the 3D face of the driver. The fisheye camera, mounted outside the vehicle, is used to monitor the road. The relative pose between the two cameras is calibrated via a state-of-the-art method for calibrating cameras with non-overlapping fields of view. The camera system works in this way: First, the SDK of the Kinect released by Microsoft is used to track the driver's face and capture the eye locations together with the sight direction. Secondly, the eye location and the sight direction are transformed into the coordinate system of the fisheye camera. Thirdly, the corresponding view field is extracted from the fisheye image. As there is a small displacement between the driver's eyes and the optical center of the fisheye camera, this leads to a view angle deviation. Finally, we performed a systematic analysis of the error distribution by numerical simulation and proved the feasibility of our camera system. We also realized this camera system and achieved the desired effect in a real-world experiment.
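
    The coordinate transfer in the second step amounts to applying the calibrated relative pose to a 3D point. The sketch below is only an illustration; the rotation, translation and eye position are placeholder values, since the actual extrinsics come from the non-overlapping-field-of-view calibration cited in the paper.

        import numpy as np

        # Relative pose of the fisheye camera with respect to the Kinect
        # (placeholder extrinsics standing in for the calibrated values).
        R_fisheye_from_kinect = np.eye(3)                      # rotation
        t_fisheye_from_kinect = np.array([0.45, -0.20, 0.90])  # translation in metres

        # 3D eye position reported in the Kinect coordinate system (placeholder).
        eye_in_kinect = np.array([0.05, 0.10, 0.60])

        # Express the eye position in the fisheye camera coordinate system.
        eye_in_fisheye = R_fisheye_from_kinect @ eye_in_kinect + t_fisheye_from_kinect
        print(eye_in_fisheye)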

  3. THE DARK ENERGY CAMERA

    SciTech Connect

    Flaugher, B.; Diehl, H. T.; Alvarez, O.; Angstadt, R.; Annis, J. T.; Buckley-Geer, E. J.; Honscheid, K.; Abbott, T. M. C.; Bonati, M.; Antonik, M.; Brooks, D.; Ballester, O.; Cardiel-Sas, L.; Beaufore, L.; Bernstein, G. M.; Bernstein, R. A.; Bigelow, B.; Boprie, D.; Campa, J.; Castander, F. J.; Collaboration: DES Collaboration; and others

    2015-11-15

    The Dark Energy Camera is a new imager with a 2.°2 diameter field of view mounted at the prime focus of the Victor M. Blanco 4 m telescope on Cerro Tololo near La Serena, Chile. The camera was designed and constructed by the Dark Energy Survey Collaboration and meets or exceeds the stringent requirements designed for the wide-field and supernova surveys for which the collaboration uses it. The camera consists of a five-element optical corrector, seven filters, a shutter with a 60 cm aperture, and a charge-coupled device (CCD) focal plane of 250 μm thick fully depleted CCDs cooled inside a vacuum Dewar. The 570 megapixel focal plane comprises 62 2k × 4k CCDs for imaging and 12 2k × 2k CCDs for guiding and focus. The CCDs have 15 μm × 15 μm pixels with a plate scale of 0.″263 pixel{sup −1}. A hexapod system provides state-of-the-art focus and alignment capability. The camera is read out in 20 s with 6–9 electron readout noise. This paper provides a technical description of the camera's engineering, construction, installation, and current status.

  4. The CAMCAO infrared camera

    NASA Astrophysics Data System (ADS)

    Amorim, Antonio; Melo, Antonio; Alves, Joao; Rebordao, Jose; Pinhao, Jose; Bonfait, Gregoire; Lima, Jorge; Barros, Rui; Fernandes, Rui; Catarino, Isabel; Carvalho, Marta; Marques, Rui; Poncet, Jean-Marc; Duarte Santos, Filipe; Finger, Gert; Hubin, Norbert; Huster, Gotthard; Koch, Franz; Lizon, Jean-Louis; Marchetti, Enrico

    2004-09-01

    The CAMCAO instrument is a high resolution near infrared (NIR) camera conceived to operate together with the new ESO Multi-conjugate Adaptive optics Demonstrator (MAD) with the goal of evaluating the feasibility of Multi-Conjugate Adaptive Optics techniques (MCAO) on the sky. It is a high-resolution wide field of view (FoV) camera that is optimized to use the extended correction of the atmospheric turbulence provided by MCAO. While the first purpose of this camera is sky observation in the MAD setup, to validate the MCAO technology, in a second phase the CAMCAO camera is planned to attach directly to the VLT for scientific astrophysical studies. The camera is based on the 2kx2k HAWAII2 infrared detector controlled by an ESO external IRACE system and includes standard IR band filters mounted on a positional filter wheel. The CAMCAO design requires that the optical components and the IR detector be kept at low temperatures in order to avoid emitting radiation and to lower detector noise in the spectral region under analysis. The cryogenic system includes a LN2 tank and a specially developed pulse tube cryocooler. Field and pupil cold stops are implemented to reduce the infrared background and the stray-light. The CAMCAO optics provide diffraction limited performance down to J band, but the detector sampling fulfills the Nyquist criterion for the K band (2.2 μm).

  5. CAOS-CMOS camera.

    PubMed

    Riza, Nabeel A; La Torre, Juan Pablo; Amin, M Junaid

    2016-06-13

    Proposed and experimentally demonstrated is the CAOS-CMOS camera design that combines the coded access optical sensor (CAOS) imager platform with the CMOS multi-pixel optical sensor. The unique CAOS-CMOS camera engages the classic CMOS sensor light staring mode with the time-frequency-space agile pixel CAOS imager mode within one programmable optical unit to realize a high dynamic range imager for extreme light contrast conditions. The experimentally demonstrated CAOS-CMOS camera is built using a digital micromirror device, a silicon point-photo-detector with a variable gain amplifier, and a silicon CMOS sensor with a maximum rated 51.3 dB dynamic range. White light imaging of three simultaneously viewed targets of different brightness, which is not possible with the CMOS sensor alone, is achieved by the CAOS-CMOS camera, demonstrating an 82.06 dB dynamic range. Applications for the camera include industrial machine vision, welding, laser analysis, automotive, night vision, surveillance and multispectral military systems.
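
    For reference, the dynamic-range figures quoted above follow the usual 20·log10 convention for optical signal ratios. The sketch below simply converts a maximum-to-minimum signal ratio to decibels; the numeric ratio is an assumption chosen to reproduce the ~82 dB value, not a measurement from the paper.

        import math

        def dynamic_range_db(max_signal, min_detectable_signal):
            # 20*log10 of the linear signal ratio, the convention used for imager dynamic range.
            return 20.0 * math.log10(max_signal / min_detectable_signal)

        # A ratio of roughly 12,670:1 corresponds to about 82 dB.
        print(dynamic_range_db(12670.0, 1.0))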

  6. The Dark Energy Camera

    SciTech Connect

    Flaugher, B.

    2015-04-11

    The Dark Energy Camera is a new imager with a 2.2-degree diameter field of view mounted at the prime focus of the Victor M. Blanco 4-meter telescope on Cerro Tololo near La Serena, Chile. The camera was designed and constructed by the Dark Energy Survey Collaboration, and meets or exceeds the stringent requirements designed for the wide-field and supernova surveys for which the collaboration uses it. The camera consists of a five element optical corrector, seven filters, a shutter with a 60 cm aperture, and a CCD focal plane of 250-μm thick fully depleted CCDs cooled inside a vacuum Dewar. The 570 Mpixel focal plane comprises 62 2k x 4k CCDs for imaging and 12 2k x 2k CCDs for guiding and focus. The CCDs have 15μm x 15μm pixels with a plate scale of 0.263" per pixel. A hexapod system provides state-of-the-art focus and alignment capability. The camera is read out in 20 seconds with 6-9 electrons readout noise. This paper provides a technical description of the camera's engineering, construction, installation, and current status.

  7. CAOS-CMOS camera.

    PubMed

    Riza, Nabeel A; La Torre, Juan Pablo; Amin, M Junaid

    2016-06-13

    Proposed and experimentally demonstrated is the CAOS-CMOS camera design that combines the coded access optical sensor (CAOS) imager platform with the CMOS multi-pixel optical sensor. The unique CAOS-CMOS camera engages the classic CMOS sensor light staring mode with the time-frequency-space agile pixel CAOS imager mode within one programmable optical unit to realize a high dynamic range imager for extreme light contrast conditions. The experimentally demonstrated CAOS-CMOS camera is built using a digital micromirror device, a silicon point-photo-detector with a variable gain amplifier, and a silicon CMOS sensor with a maximum rated 51.3 dB dynamic range. White light imaging of three simultaneously viewed targets of different brightness, which is not possible with the CMOS sensor alone, is achieved by the CAOS-CMOS camera, demonstrating an 82.06 dB dynamic range. Applications for the camera include industrial machine vision, welding, laser analysis, automotive, night vision, surveillance and multispectral military systems. PMID:27410361

  8. Satellite camera image navigation

    NASA Technical Reports Server (NTRS)

    Kamel, Ahmed A. (Inventor); Graul, Donald W. (Inventor); Savides, John (Inventor); Hanson, Charles W. (Inventor)

    1987-01-01

    Pixels within a satellite camera (1, 2) image are precisely located in terms of latitude and longitude on a celestial body, such as the earth, being imaged. A computer (60) on the earth generates models (40, 50) of the satellite's orbit and attitude, respectively. The orbit model (40) is generated from measurements of stars and landmarks taken by the camera (1, 2), and by range data. The orbit model (40) is an expression of the satellite's latitude and longitude at the subsatellite point, and of the altitude of the satellite, as a function of time, using as coefficients (K) the six Keplerian elements at epoch. The attitude model (50) is based upon star measurements taken by each camera (1, 2). The attitude model (50) is a set of expressions for the deviations in a set of mutually orthogonal reference optical axes (x, y, z) as a function of time, for each camera (1, 2). Measured data is fit into the models (40, 50) using a walking least squares fit algorithm. A transformation computer (66) transforms pixel coordinates as telemetered by the camera (1, 2) into earth latitude and longitude coordinates, using the orbit and attitude models (40, 50).

  9. The Dark Energy Camera

    NASA Astrophysics Data System (ADS)

    Flaugher, B.; Diehl, H. T.; Honscheid, K.; Abbott, T. M. C.; Alvarez, O.; Angstadt, R.; Annis, J. T.; Antonik, M.; Ballester, O.; Beaufore, L.; Bernstein, G. M.; Bernstein, R. A.; Bigelow, B.; Bonati, M.; Boprie, D.; Brooks, D.; Buckley-Geer, E. J.; Campa, J.; Cardiel-Sas, L.; Castander, F. J.; Castilla, J.; Cease, H.; Cela-Ruiz, J. M.; Chappa, S.; Chi, E.; Cooper, C.; da Costa, L. N.; Dede, E.; Derylo, G.; DePoy, D. L.; de Vicente, J.; Doel, P.; Drlica-Wagner, A.; Eiting, J.; Elliott, A. E.; Emes, J.; Estrada, J.; Fausti Neto, A.; Finley, D. A.; Flores, R.; Frieman, J.; Gerdes, D.; Gladders, M. D.; Gregory, B.; Gutierrez, G. R.; Hao, J.; Holland, S. E.; Holm, S.; Huffman, D.; Jackson, C.; James, D. J.; Jonas, M.; Karcher, A.; Karliner, I.; Kent, S.; Kessler, R.; Kozlovsky, M.; Kron, R. G.; Kubik, D.; Kuehn, K.; Kuhlmann, S.; Kuk, K.; Lahav, O.; Lathrop, A.; Lee, J.; Levi, M. E.; Lewis, P.; Li, T. S.; Mandrichenko, I.; Marshall, J. L.; Martinez, G.; Merritt, K. W.; Miquel, R.; Muñoz, F.; Neilsen, E. H.; Nichol, R. C.; Nord, B.; Ogando, R.; Olsen, J.; Palaio, N.; Patton, K.; Peoples, J.; Plazas, A. A.; Rauch, J.; Reil, K.; Rheault, J.-P.; Roe, N. A.; Rogers, H.; Roodman, A.; Sanchez, E.; Scarpine, V.; Schindler, R. H.; Schmidt, R.; Schmitt, R.; Schubnell, M.; Schultz, K.; Schurter, P.; Scott, L.; Serrano, S.; Shaw, T. M.; Smith, R. C.; Soares-Santos, M.; Stefanik, A.; Stuermer, W.; Suchyta, E.; Sypniewski, A.; Tarle, G.; Thaler, J.; Tighe, R.; Tran, C.; Tucker, D.; Walker, A. R.; Wang, G.; Watson, M.; Weaverdyck, C.; Wester, W.; Woods, R.; Yanny, B.; DES Collaboration

    2015-11-01

    The Dark Energy Camera is a new imager with a 2.°2 diameter field of view mounted at the prime focus of the Victor M. Blanco 4 m telescope on Cerro Tololo near La Serena, Chile. The camera was designed and constructed by the Dark Energy Survey Collaboration and meets or exceeds the stringent requirements designed for the wide-field and supernova surveys for which the collaboration uses it. The camera consists of a five-element optical corrector, seven filters, a shutter with a 60 cm aperture, and a charge-coupled device (CCD) focal plane of 250 μm thick fully depleted CCDs cooled inside a vacuum Dewar. The 570 megapixel focal plane comprises 62 2k × 4k CCDs for imaging and 12 2k × 2k CCDs for guiding and focus. The CCDs have 15 μm × 15 μm pixels with a plate scale of 0.″263 pixel-1. A hexapod system provides state-of-the-art focus and alignment capability. The camera is read out in 20 s with 6-9 electron readout noise. This paper provides a technical description of the camera's engineering, construction, installation, and current status.

  10. Binocular Camera for cockpit visibility of general aviation aircraft

    NASA Astrophysics Data System (ADS)

    Barile, A. J.

    1981-04-01

    A history of cockpit visibility studies and requirements with regard to aircraft safety, human factors, collision avoidance, and accident investigations is presented. The Federal Aviation Administration's development of the Binocular Camera is reviewed, and the technical details of a new and improved camera are discussed. The Binocular Camera uses two 65 mm wide angle F6.8 lenses and covers an 88 1/2 deg field of vision. The camera produces images, representative of what the human eyes see before the brain integrates them into one, thus making it possible to analyze the effect of obstruction to vision. The improvements, applications, and uses of the camera in the research, development, and operations of general aviation aircraft are discussed.

  11. Limits on neutrino oscillations in the Fermilab narrow band beam

    SciTech Connect

    Brucker, E.B.; Jacques, P.F.; Kalelkar, M.; Koller, E.L.; Plano, R.J.; Stamer, P.E.; Baker, N.J.; Connolly, P.L.; Kahn, S.A.; Murtagh, M.J.

    1986-01-01

    A search for neutrino oscillations was made using the Fermilab narrow-band neutrino beam and the 15 ft. bubble chamber. No positive signal for neutrino oscillations was observed. Limits were obtained for mixing angles and neutrino mass differences for ν_μ → ν_e, ν_μ → ν_τ, and ν_e → ν_e. 5 refs.

  12. Solid state television camera

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The design, fabrication, and tests of a solid state television camera using a new charge-coupled imaging device are reported. An RCA charge-coupled device arranged in a 512 by 320 format and directly compatible with EIA format standards was the sensor selected. This is a three-phase, sealed surface-channel array that has 163,840 sensor elements, which employs a vertical frame transfer system for image readout. Included are test results of the complete camera system, circuit description and changes to such circuits as a result of integration and test, maintenance and operation section, recommendations to improve the camera system, and a complete set of electrical and mechanical drawing sketches.

  13. Selective-imaging camera

    NASA Astrophysics Data System (ADS)

    Szu, Harold; Hsu, Charles; Landa, Joseph; Cha, Jae H.; Krapels, Keith A.

    2015-05-01

    How can we design cameras that image selectively in Full Electro-Magnetic (FEM) spectra? Without selective imaging, we cannot use, for example, ordinary tourist cameras to see through fire, smoke, or other obscurants contributing to creating a Visually Degraded Environment (VDE). This paper addresses a possible new design of selective-imaging cameras at firmware level. The design is consistent with physics of the irreversible thermodynamics of Boltzmann's molecular entropy. It enables imaging in appropriate FEM spectra for sensing through the VDE, and displaying in color spectra for Human Visual System (HVS). We sense within the spectra the largest entropy value of obscurants such as fire, smoke, etc. Then we apply a smart firmware implementation of Blind Sources Separation (BSS) to separate all entropy sources associated with specific Kelvin temperatures. Finally, we recompose the scene using specific RGB colors constrained by the HVS, by up/down shifting Planck spectra at each pixel and time.

  14. Study on the diagnostic system of scoliosis by using infrared camera.

    PubMed

    Jeong, Jin-hyoung; Park, Eun-jeong; Cho, Chang-ok; Kim, Yoon-jeong; Lee, Sang-sik

    2015-01-01

    In this study, to avoid the radiation exposure involved in the conventional diagnosis of scoliosis, we developed a system that can diagnose scoliosis using an infrared camera and optical markers. In the developed system, the infrared camera recognizes optical markers attached along the spinal curvature and measures the angle between two optical markers. For the angle measurement, we used the Cobb's angle method used in the diagnosis of spinal scoliosis. We developed software to output the diagnosis to the screen using the infrared camera. The software consists of a camera output unit implemented in LabVIEW, an angle measurement unit, and a Cobb's angle measurement unit. In the future, the diagnostic system is expected to be applied to other orthopedic disorders that require angle measurement, such as kyphosis and hallux valgus.
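
    As a rough illustration (not the authors' LabVIEW implementation), a Cobb-style angle between two marker-defined line segments can be computed as the angle between their direction vectors, as in the sketch below with made-up marker coordinates.

        import math

        def angle_between_segments(p1, p2, q1, q2):
            """Angle in degrees between the line through p1-p2 and the line through q1-q2 (2D points)."""
            v = (p2[0] - p1[0], p2[1] - p1[1])
            w = (q2[0] - q1[0], q2[1] - q1[1])
            cos_a = (v[0] * w[0] + v[1] * w[1]) / (math.hypot(*v) * math.hypot(*w))
            cos_a = max(-1.0, min(1.0, cos_a))  # guard against rounding error
            return math.degrees(math.acos(cos_a))

        # Hypothetical image coordinates (pixels) of the upper and lower marker pairs.
        upper = ((100, 50), (180, 60))
        lower = ((110, 300), (185, 270))
        print(angle_between_segments(*upper, *lower))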

  15. Study on the diagnostic system of scoliosis by using infrared camera.

    PubMed

    Jeong, Jin-hyoung; Park, Eun-jeong; Cho, Chang-ok; Kim, Yoon-jeong; Lee, Sang-sik

    2015-01-01

    In this study, to avoid the radiation exposure involved in the conventional diagnosis of scoliosis, we developed a system that can diagnose scoliosis using an infrared camera and optical markers. In the developed system, the infrared camera recognizes optical markers attached along the spinal curvature and measures the angle between two optical markers. For the angle measurement, we used the Cobb's angle method used in the diagnosis of spinal scoliosis. We developed software to output the diagnosis to the screen using the infrared camera. The software consists of a camera output unit implemented in LabVIEW, an angle measurement unit, and a Cobb's angle measurement unit. In the future, the diagnostic system is expected to be applied to other orthopedic disorders that require angle measurement, such as kyphosis and hallux valgus. PMID:26405878

  16. What convention is used for the illumination and view angles?

    Atmospheric Science Data Center

    2014-12-08

    ... Azimuth angles are measured clockwise from the direction of travel to local north. For both the Sun and cameras, azimuth describes the ... to the equator, because of its morning equator crossing time. Additionally, the difference in view and solar azimuth angle will be near ...

  17. GROT in NICMOS Cameras

    NASA Astrophysics Data System (ADS)

    Sosey, M.; Bergeron, E.

    1999-09-01

    Grot is exhibited as small areas of reduced sensitivity, most likely due to flecks of antireflective paint scraped off the optical baffles as they were forced against each other. This paper characterizes grot associated with all three cameras. Flat field images taken from March 1997 through January 1999 have been investigated for changes in the grot, including possible wavelength dependency and throughput characteristics. The main products of this analysis are grot masks for each of the cameras which may also contain any new cold or dead pixels not specified in the data quality arrays.

  18. Artificial human vision camera

    NASA Astrophysics Data System (ADS)

    Goudou, J.-F.; Maggio, S.; Fagno, M.

    2014-10-01

    In this paper we present a real-time vision system modeling the human vision system. Our purpose is to draw inspiration from the bio-mechanics of human vision to improve robotic capabilities for tasks such as object detection and tracking. This work first describes the bio-mechanical discrepancies between human vision and classic cameras, and the retinal processing stage that takes place in the eye before the optic nerve. The second part describes our implementation of these principles on a 3-camera optical, mechanical and software model of the human eyes and the associated bio-inspired attention model.

  19. Lightweight, Compact, Long Range Camera Design

    NASA Astrophysics Data System (ADS)

    Shafer, Donald V.

    1983-08-01

    The model 700 camera is the latest in a 30-year series of LOROP cameras developed by McDonnell Douglas Astronautics Company (MDAC) and their predecessor companies. The design achieves minimum size and weight and is optimized for low-contrast performance. The optical system includes a 66-inch focal length, f/5.6, apochromatic lens and three folding mirrors imaging on a 4.5-inch square format. A three-axis active stabilization system provides the capability for long exposure time and, hence, fine grain films can be used. The optical path forms a figure "4" behind the lens. In front of the lens is a 45° pointing mirror. This folded configuration contributed greatly to the lightweight and compact design. This sequential autocycle frame camera has three modes of operation with one, two, and three step positions to provide a choice of swath widths within the range of lateral coverage. The magazine/shutter assembly rotates in relationship with the pointing mirror and aircraft drift angle to maintain film format alignment with the flight path. The entire camera is angular rate stabilized in roll, pitch, and yaw. It also employs a lightweight, electro-magnetically damped, low-natural-frequency spring suspension for passive isolation from aircraft vibration inputs. The combined film transport and forward motion compensation (FMC) mechanism, which is operated by a single motor, is contained in a magazine that can, depending on accessibility which is installation dependent, be changed in flight. The design also stresses thermal control, focus control, structural stiffness, and maintainability. The camera is operated from a remote control panel. This paper describes the leading particulars and features of the camera as related to weight and configuration.

  20. Kinetic narrowing of size distribution

    NASA Astrophysics Data System (ADS)

    Dubrovskii, V. G.

    2016-05-01

    We present a model that reveals an interesting possibility for narrowing the size distribution of nanostructures when the deterministic growth rate changes its sign from positive to negative at a certain stationary size. Such a behavior occurs in self-catalyzed one-dimensional III-V nanowires and more generally whenever a negative "adsorption-desorption" term in the growth rate is compensated by a positive "diffusion flux." By asymptotically solving the Fokker-Planck equation, we derive an explicit representation for the size distribution that describes either Poissonian broadening or self-regulated narrowing depending on the parameters. We show how the fluctuation-induced spreading of the size distribution can be completely suppressed in systems with size self-stabilization. These results can be used for obtaining size-uniform ensembles of different nanostructures.

  1. Camera Calibration Based on Perspective Geometry and Its Application in LDWS

    NASA Astrophysics Data System (ADS)

    Xu, Huarong; Wang, Xiaodong

    In this paper, we present a novel algorithm to calibrate cameras for a lane departure warning system (LDWS). The algorithm only needs a set of parallel lane markings and parallel lines perpendicular to the ground plane to determine camera parameters such as the roll angle, the tilt angle, the pan angle and the focal length. Then, given the camera height, the positions of objects in world space can be easily obtained from the image. We apply the proposed method to our lane departure warning system, which monitors the distance between the car and the road boundaries. Experiments show that the proposed method is easy to operate and can achieve accurate results.
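
    The final step, recovering road-plane positions from the image once the camera height and angles are known, can be sketched with a simplified flat-road pinhole model. This is only an illustration under assumed values (zero roll and pan, made-up pixel focal length and tilt), not the calibration algorithm of the paper.

        import numpy as np

        def pixel_to_ground(u, v, fx, fy, cx, cy, cam_height, tilt):
            """Project image pixel (u, v) onto a flat road plane.

            Assumes a pinhole camera at cam_height metres above the road, pitched
            down by tilt radians, with zero roll and pan. Returns (lateral_offset,
            forward_distance) in metres, or None if the pixel is at or above the horizon.
            """
            # Ray direction in camera coordinates (x right, y down, z forward).
            d_cam = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
            # Rotate into a world frame whose Y axis points straight down.
            ct, st = np.cos(tilt), np.sin(tilt)
            R = np.array([[1, 0, 0],
                          [0, ct, st],
                          [0, -st, ct]])
            d_world = R @ d_cam
            if d_world[1] <= 0:          # ray never reaches the road
                return None
            t = cam_height / d_world[1]  # scale so the ray hits the road plane
            return t * d_world[0], t * d_world[2]

        # Hypothetical calibration: 800-pixel focal length, 640x480 image,
        # camera 1.3 m above the road, tilted down by 5 degrees.
        print(pixel_to_ground(400, 350, 800, 800, 320, 240, 1.3, np.deg2rad(5.0)))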

  2. Snapshot polarimeter fundus camera.

    PubMed

    DeHoog, Edward; Luo, Haitao; Oka, Kazuhiko; Dereniak, Eustace; Schwiegerling, James

    2009-03-20

    A snapshot imaging polarimeter utilizing Savart plates is integrated into a fundus camera for retinal imaging. Acquired retinal images can be processed to reconstruct Stokes vector images, giving insight into the polarization properties of the retina. Results for images from a normal healthy retina and retinas with pathology are examined and compared. PMID:19305463

  3. Spas color camera

    NASA Technical Reports Server (NTRS)

    Toffales, C.

    1983-01-01

    The procedures to be followed in assessing the performance of the MOS color camera are defined. Aspects considered include: horizontal and vertical resolution; value of the video signal; gray scale rendition; environmental (vibration and temperature) tests; signal to noise ratios; and white balance correction.

  4. The LSST Camera Overview

    SciTech Connect

    Gilmore, Kirk; Kahn, Steven A.; Nordby, Martin; Burke, David; O'Connor, Paul; Oliver, John; Radeka, Veljko; Schalk, Terry; Schindler, Rafe; /SLAC

    2007-01-10

    The LSST camera is a wide-field optical (0.35-1 μm) imager designed to provide a 3.5 degree FOV with better than 0.2 arcsecond sampling. The detector format will be a circular mosaic providing approximately 3.2 Gigapixels per image. The camera includes a filter mechanism and shuttering capability. It is positioned in the middle of the telescope, where cross-sectional area is constrained by optical vignetting and heat dissipation must be controlled to limit thermal gradients in the optical beam. The fast, f/1.2 beam will require tight tolerances on the focal plane mechanical assembly. The focal plane array operates at a temperature of approximately -100 C to achieve desired detector performance. The focal plane array is contained within an evacuated cryostat, which incorporates detector front-end electronics and thermal control. The cryostat lens serves as an entrance window and vacuum seal for the cryostat. Similarly, the camera body lens serves as an entrance window and gas seal for the camera housing, which is filled with a suitable gas to provide the operating environment for the shutter and filter change mechanisms. The filter carousel can accommodate 5 filters, each 75 cm in diameter, for rapid exchange without external intervention.

  5. Jack & the Video Camera

    ERIC Educational Resources Information Center

    Charlan, Nathan

    2010-01-01

    This article narrates how the use of video camera has transformed the life of Jack Williams, a 10-year-old boy from Colorado Springs, Colorado, who has autism. The way autism affected Jack was unique. For the first nine years of his life, Jack remained in his world, alone. Functionally non-verbal and with motor skill problems that affected his…

  6. Photogrammetric camera calibration

    USGS Publications Warehouse

    Tayman, W.P.; Ziemann, H.

    1984-01-01

    Section 2 (Calibration) of the document "Recommended Procedures for Calibrating Photogrammetric Cameras and Related Optical Tests" from the International Archives of Photogrammetry, Vol. XIII, Part 4, is reviewed in the light of recent practical work, and suggestions for changes are made. These suggestions are intended as a basis for a further discussion. ?? 1984.

  7. Communities, Cameras, and Conservation

    ERIC Educational Resources Information Center

    Patterson, Barbara

    2012-01-01

    Communities, Cameras, and Conservation (CCC) is the most exciting and valuable program the author has seen in her 30 years of teaching field science courses. In this citizen science project, students and community volunteers collect data on mountain lions ("Puma concolor") at four natural areas and public parks along the Front Range of Colorado.…

  8. Anger Camera Firmware

    2010-11-19

    The firmware is responsible for the operation of the Anger Camera Electronics, calculation of position, time of flight and digital communications. It provides a first-stage analysis of 48 analog signals that have been converted to digital values using A/D converters.

  9. Make a Pinhole Camera

    ERIC Educational Resources Information Center

    Fisher, Diane K.; Novati, Alexander

    2009-01-01

    On Earth, using ordinary visible light, one can create a single image of light recorded over time. Of course a movie or video is light recorded over time, but it is a series of instantaneous snapshots, rather than light and time both recorded on the same medium. A pinhole camera, which is simple to make out of ordinary materials and using ordinary…

  10. Advanced Virgo phase cameras

    NASA Astrophysics Data System (ADS)

    van der Schaaf, L.; Agatsuma, K.; van Beuzekom, M.; Gebyehu, M.; van den Brand, J.

    2016-05-01

    A century after the prediction of gravitational waves, detectors have reached the sensitivity needed to prove their existence. One of them, the Virgo interferometer in Pisa, is presently being upgraded to Advanced Virgo (AdV) and will come into operation in 2016. The power stored in the interferometer arms rises from 20 to 700 kW. This increase is expected to introduce higher order modes in the beam, which could reduce the circulating power in the interferometer, limiting the sensitivity of the instrument. To suppress these higher-order modes, the core optics of Advanced Virgo is equipped with a thermal compensation system. Phase cameras, monitoring the real-time status of the beam, constitute a critical component of this compensation system. These cameras measure the phases and amplitudes of the laser-light fields at the frequencies selected to control the interferometer. The measurement combines heterodyne detection with a scan of the wave front over a photodetector with pin-hole aperture. Three cameras observe the phase front of these laser sidebands. Two of them monitor the input and output of the interferometer arms and the third one is used in the control of the aberrations introduced by the power recycling cavity. In this paper the working principle of the phase cameras is explained and some characteristic parameters are described.

  11. Imaging phoswich anger camera

    NASA Astrophysics Data System (ADS)

    Manchanda, R. K.; Sood, R. K.

    1991-08-01

    High angular resolution and low background are the primary requisites for detectors for future astronomy experiments in the low energy gamma-ray region. Scintillation counters are still the only available large area detectors for studies in this energy range. Preliminary details of a large area phoswich Anger camera designed for coded aperture imaging are described, and its background and position characteristics are discussed.

  12. Millisecond readout CCD camera

    NASA Astrophysics Data System (ADS)

    Prokop, Mark; McCurnin, Thomas W.; Stradling, Gary L.

    1993-01-01

    We have developed a prototype of a fast-scanning CCD readout system to record a 1024 X 256 pixel image and transport the image to a recording station within 1 ms of the experimental event. The system is designed to have a dynamic range of greater than 1000 with adequate sensitivity to read single-electron excitations of a CRT phosphor when amplified by a microchannel plate image intensifier. This readout camera is intended for recording images from oscilloscopes, streak, and framing cameras. The sensor is a custom CCD chip, designed by LORAL Aeroneutronics. This CCD chip is designed with 16 parallel output ports to supply the necessary image transfer speed. The CCD is designed as an interline structure to allow fast clearing of the image and on-chip fast shuttering. Special antiblooming provisions are also included. The camera is designed to be modular and to allow CCD chips of other sizes to be used with minimal reengineering of the camera head.

  13. Millisecond readout CCD camera

    NASA Astrophysics Data System (ADS)

    Prokop, M.; McCurnin, T. W.; Stradling, G.

    We have developed a prototype of a fast-scanning CCD readout system to record a 1024 x 256 pixel image and transport the image to a recording station within 1 ms of the experimental event. The system is designed to have a dynamic range of greater than 1000 with adequate sensitivity to read single-electron excitations of a CRT phosphor when amplified by a microchannel plate image intensifier. This readout camera is intended for recording images from oscilloscopes, streak, and framing cameras. The sensor is a custom CCD chip, designed by LORAL Aeroneutronics. This CCD chip is designed with 16 parallel output ports to supply the necessary image transfer speed. The CCD is designed as an interline structure to allow fast clearing of the image and on-chip fast shuttering. Special antiblooming provisions are also included. The camera is designed to be modular and to allow CCD chips of other sizes to be used with minimal reengineering of the camera head.

  14. Image Sensors Enhance Camera Technologies

    NASA Technical Reports Server (NTRS)

    2010-01-01

    In the 1990s, a Jet Propulsion Laboratory team led by Eric Fossum researched ways of improving complementary metal-oxide semiconductor (CMOS) image sensors in order to miniaturize cameras on spacecraft while maintaining scientific image quality. Fossum s team founded a company to commercialize the resulting CMOS active pixel sensor. Now called the Aptina Imaging Corporation, based in San Jose, California, the company has shipped over 1 billion sensors for use in applications such as digital cameras, camera phones, Web cameras, and automotive cameras. Today, one of every three cell phone cameras on the planet feature Aptina s sensor technology.

  15. The Martian Atmosphere as seen by the OSIRIS camera

    NASA Astrophysics Data System (ADS)

    Moissl, R.; Pajola, M.; Määttänen, A.; Küppers, M.

    2013-09-01

    Despite the long time that has passed since the observations, only few studies based on the data from the wide-angle (WAC) and narrow-angle (NAC) camera systems of OSIRIS have been published to date. In this paper we present results on the observations of the Martian limb acquired by the OSIRIS [1] instrument on board the ESA mission Rosetta during its swing-by maneuver around February 25th, 2007, on the way to Comet 67P/Churyumov-Gerasimenko, during the onset of the very active dust storm season of Mars year 28 (at Ls ~190). Although OSIRIS captured the planet only during a relatively short time interval of several hours, the obtained global view and the spectral coverage, from the UV (245 nm) over the full visible range to the near IR (1000 nm), allow for a valuable global overview of the state of the Martian atmosphere. The image acquisition started on February 24 around 18:00 UTC from a distance of about 260,000 km and continued until 04:51 UTC on February 25 at a distance of 105,000 km; the closest approach to the planet occurred at 01:54 UTC on February 25 at a distance of 250 km. All images have been manually co-registered with the help of SPICE data, and vertical profiles have been extracted over the limb in intervals of ~0.5 degrees (see Figures 1 and 2).

  16. Quality criterion for digital still camera

    NASA Astrophysics Data System (ADS)

    Bezryadin, Sergey

    2007-02-01

    The main quality requirements for a digital still camera are color capturing accuracy, low noise level, and quantum efficiency. Different consumers assign different priorities to the listed parameters, and camera designers need clearly formulated methods for their evaluation. While there are procedures providing noise level and quantum efficiency estimation, there are no effective means for estimating color capturing accuracy. The criterion introduced in this paper fills this gap. The Luther-Ives condition for a correct color reproduction system became known at the beginning of the last century. However, since no detector system satisfies the Luther-Ives condition, there are always stimuli that are distinctly different for an observer, but which the detectors are unable to distinguish. To estimate the conformity of a detector set with the Luther-Ives condition and calculate a measure of discrepancy, the angle between the detector sensor sensitivities and Cohen's Fundamental Color Space may be used. In this paper, the divergence angle is calculated for some typical CCD sensors and it is demonstrated how this angle might be reduced with a corrective filter. In addition, it is shown that with a specific corrective filter Foveon sensors turn into a detector system with good Luther-Ives condition compliance.
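
    The divergence-angle idea can be sketched generically: given a matrix whose columns span a fundamental color subspace, the angle between a detector sensitivity curve and that subspace follows from its orthogonal projection. The basis and sensitivity below are random placeholders, not real Cohen or CCD data.

        import numpy as np

        def angle_to_subspace(sensitivity, basis):
            """Angle (degrees) between a sensitivity vector and the subspace spanned
            by the columns of basis, via orthogonal projection onto that subspace."""
            Q, _ = np.linalg.qr(basis)       # orthonormal basis of the subspace
            proj = Q @ (Q.T @ sensitivity)   # projection of the sensitivity onto it
            cos_a = np.linalg.norm(proj) / np.linalg.norm(sensitivity)
            return np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))

        rng = np.random.default_rng(0)
        n_wavelengths = 31                                      # e.g. 400-700 nm in 10 nm steps
        cohen_like_basis = rng.normal(size=(n_wavelengths, 3))  # placeholder 3D subspace
        detector_sensitivity = rng.normal(size=n_wavelengths)   # placeholder sensor curve
        print(angle_to_subspace(detector_sensitivity, cohen_like_basis))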

  17. Dual cameras acquisition and display system of retina-like sensor camera and rectangular sensor camera

    NASA Astrophysics Data System (ADS)

    Cao, Nan; Cao, Fengmei; Lin, Yabin; Bai, Tingzhu; Song, Shengyu

    2015-04-01

    For a new kind of retina-like sensor camera and a traditional rectangular sensor camera, a dual-camera acquisition and display system needs to be built. We introduce the principle and the development of the retina-like sensor. Image coordinate transformation and interpolation based on sub-pixel interpolation need to be realized for the retina-like sensor's special pixel distribution. The hardware platform is composed of the retina-like sensor camera, a rectangular sensor camera, an image grabber and a PC. Combining the MIL and OpenCV libraries, the software program is written in VC++ on VS 2010. Experimental results show that the system realizes acquisition and display for both cameras.
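
    A minimal sketch of the sub-pixel interpolation step is given below; the bilinear scheme and the sample coordinates are assumptions made for illustration, since the abstract does not state the exact interpolation formula used.

        import numpy as np

        def bilinear_sample(image, x, y):
            """Sample a grayscale image at non-integer coordinates (x, y) by
            bilinear interpolation of the four surrounding pixels."""
            x0, y0 = int(np.floor(x)), int(np.floor(y))
            dx, dy = x - x0, y - y0
            p00 = image[y0, x0]
            p10 = image[y0, x0 + 1]
            p01 = image[y0 + 1, x0]
            p11 = image[y0 + 1, x0 + 1]
            return (p00 * (1 - dx) * (1 - dy) + p10 * dx * (1 - dy)
                    + p01 * (1 - dx) * dy + p11 * dx * dy)

        img = np.arange(16, dtype=float).reshape(4, 4)  # toy 4x4 image
        print(bilinear_sample(img, 1.3, 2.6))           # value between the grid points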

  18. Depth estimation and camera calibration of a focused plenoptic camera for visual odometry

    NASA Astrophysics Data System (ADS)

    Zeller, Niclas; Quint, Franz; Stilla, Uwe

    2016-08-01

    This paper presents new and improved methods of depth estimation and camera calibration for visual odometry with a focused plenoptic camera. For depth estimation we adapt an algorithm previously used in structure-from-motion approaches to work with images of a focused plenoptic camera. In the raw image of a plenoptic camera, scene patches are recorded in several micro-images under slightly different angles. This leads to a multi-view stereo problem. To reduce the complexity, we divide this into multiple binocular stereo problems. For each pixel with sufficient gradient we estimate a virtual (uncalibrated) depth based on local intensity error minimization. The estimated depth is characterized by the variance of the estimate and is subsequently updated with the estimates from other micro-images. Updating is performed in a Kalman-like fashion. The result of depth estimation in a single image of the plenoptic camera is a probabilistic depth map, where each depth pixel consists of an estimated virtual depth and a corresponding variance. Since the resulting image of the plenoptic camera contains two planes, the optical image and the depth map, camera calibration is divided into two separate sub-problems. The optical path is calibrated based on a traditional calibration method. For calibrating the depth map we introduce two novel model based methods, which define the relation of the virtual depth, which has been estimated based on the light-field image, and the metric object distance. These two methods are compared to a well known curve fitting approach. Both model based methods show significant advantages compared to the curve fitting method. For visual odometry we fuse the probabilistic depth map gained from one shot of the plenoptic camera with the depth data gained by finding stereo correspondences between subsequent synthesized intensity images of the plenoptic camera. These images can be synthesized totally focused and thus finding stereo correspondences is enhanced
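
    The "Kalman-like" update of a virtual depth estimate can be illustrated by the standard inverse-variance fusion of two Gaussian estimates; this sketch uses made-up numbers and is not taken from the authors' implementation.

        def fuse_depth(d1, var1, d2, var2):
            """Fuse two depth estimates with variances var1, var2 by inverse-variance
            weighting (a Kalman-style measurement update)."""
            k = var1 / (var1 + var2)      # gain applied to the second estimate
            depth = d1 + k * (d2 - d1)
            variance = (1.0 - k) * var1   # equals var1*var2/(var1+var2)
            return depth, variance

        # Hypothetical virtual depths from two micro-images observing the same patch.
        print(fuse_depth(2.4, 0.09, 2.6, 0.04))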

  19. Streak camera receiver definition study

    NASA Technical Reports Server (NTRS)

    Johnson, C. B.; Hunkler, L. T., Sr.; Letzring, S. A.; Jaanimagi, P.

    1990-01-01

    Detailed streak camera definition studies were made as a first step toward full flight qualification of a dual channel picosecond resolution streak camera receiver for the Geoscience Laser Altimeter and Ranging System (GLRS). The streak camera receiver requirements are discussed as they pertain specifically to the GLRS system, and estimates of the characteristics of the streak camera are given, based upon existing and near-term technological capabilities. Important problem areas are highlighted, and possible corresponding solutions are discussed.

  20. Automated Camera Array Fine Calibration

    NASA Technical Reports Server (NTRS)

    Clouse, Daniel; Padgett, Curtis; Ansar, Adnan; Cheng, Yang

    2008-01-01

    Using aerial imagery, the JPL FineCalibration (JPL FineCal) software automatically tunes a set of existing CAHVOR camera models for an array of cameras. The software finds matching features in the overlap region between images from adjacent cameras, and uses these features to refine the camera models. It is not necessary to take special imagery of a known target and no surveying is required. JPL FineCal was developed for use with an aerial, persistent surveillance platform.

  1. Combustion pinhole camera system

    DOEpatents

    Witte, A.B.

    1984-02-21

    A pinhole camera system is described utilizing a sealed optical-purge assembly which provides optical access into a coal combustor or other energy conversion reactors. The camera system basically consists of a focused-purge pinhole optical port assembly, a conventional TV vidicon receiver, an external, variable density light filter which is coupled electronically to the vidicon automatic gain control (agc). The key component of this system is the focused-purge pinhole optical port assembly which utilizes a purging inert gas to keep debris from entering the port and a lens arrangement which transfers the pinhole to the outside of the port assembly. One additional feature of the port assembly is that it is not flush with the interior of the combustor. 2 figs.

  2. Combustion pinhole camera system

    DOEpatents

    Witte, Arvel B.

    1984-02-21

    A pinhole camera system utilizing a sealed optical-purge assembly which provides optical access into a coal combustor or other energy conversion reactors. The camera system basically consists of a focused-purge pinhole optical port assembly, a conventional TV vidicon receiver, an external, variable density light filter which is coupled electronically to the vidicon automatic gain control (agc). The key component of this system is the focused-purge pinhole optical port assembly which utilizes a purging inert gas to keep debris from entering the port and a lens arrangement which transfers the pinhole to the outside of the port assembly. One additional feature of the port assembly is that it is not flush with the interior of the combustor.

  3. LSST Camera Optics

    SciTech Connect

    Olivier, S S; Seppala, L; Gilmore, K; Hale, L; Whistler, W

    2006-06-05

    The Large Synoptic Survey Telescope (LSST) is a unique, three-mirror, modified Paul-Baker design with an 8.4m primary, a 3.4m secondary, and a 5.0m tertiary feeding a camera system that includes corrector optics to produce a 3.5 degree field of view with excellent image quality (<0.3 arcsecond 80% encircled diffracted energy) over the entire field from blue to near infra-red wavelengths. We describe the design of the LSST camera optics, consisting of three refractive lenses with diameters of 1.6m, 1.0m and 0.7m, along with a set of interchangeable, broad-band, interference filters with diameters of 0.75m. We also describe current plans for fabricating, coating, mounting and testing these lenses and filters.

  4. NSTX Tangential Divertor Camera

    SciTech Connect

    A.L. Roquemore; Ted Biewer; D. Johnson; S.J. Zweben; Nobuhiro Nishino; V.A. Soukhanovskii

    2004-07-16

    Strong magnetic field shear around the divertor x-point is numerically predicted to lead to strong spatial asymmetries in turbulence driven particle fluxes. To visualize the turbulence and associated impurity line emission near the lower x-point region, a new tangential observation port has been recently installed on NSTX. A reentrant sapphire window with a moveable in-vessel mirror images the divertor region from the center stack out to R 80 cm and views the x-point for most plasma configurations. A coherent fiber optic bundle transmits the image through a remotely selected filter to a fast camera, for example a 40500 frames/sec Photron CCD camera. A gas puffer located in the lower inboard divertor will localize the turbulence in the region near the x-point. Edge fluid and turbulent codes UEDGE and BOUT will be used to interpret impurity and deuterium emission fluctuation measurements in the divertor.

  5. Spectral Narrowing in Semiconductor Microcavities

    NASA Astrophysics Data System (ADS)

    La Rocca, G. C.; Bassani, F.; Agranovich, V. M.

    1998-03-01

    The notion of in-plane motional narrowing of cavity polariton (CP) lines has been recently considered (D.M. Whittaker et al., Phys. Rev. Lett. 77, 4792 (1996); V. Savona et al., Phys. Rev. Lett. 78, 4470 (1997)). We point out that, in the presence of N>1 resonating quantum wells (QWs), the exciton component in a CP is coherently delocalized over all the individual QWs. Besides the two CP branches, also a dark exciton branch is present given by N-1 states similarly delocalized, but orthogonal to the cavity photon mode. If the QW disorder potential is weak compared to the Rabi splitting, it is seen by a CP as reduced by a factor 1/√N because of averaging along the cavity axis (G.C. La Rocca, F. Bassani, V.M. Agranovich, JOSA B 15 (1998)). As for the in-plane motional narrowing, a simple scaling procedure shows that it would imply that the inhomogeneous linewidth of a CP be reduced by about four orders of magnitude compared to a QW exciton, which is incompatible with the experimental observations. The physical reason of such a shortcoming is that the disorder introduces localized exciton states which can resonantly scatter CPs, mixing them with states having a large k vector as well as with dark exciton states.

  6. Hemispherical Laue camera

    DOEpatents

    Li, James C. M.; Chu, Sungnee G.

    1980-01-01

    A hemispherical Laue camera comprises a crystal sample mount for positioning a sample to be analyzed at the center of the sphere of a hemispherical, X-radiation sensitive film cassette, a collimator, a stationary or rotating sample mount and a set of standard spherical projection spheres. X-radiation generated from an external source is directed through the collimator to impinge onto the single crystal sample on the stationary mount. The diffracted beam is recorded on the hemispherical X-radiation sensitive film mounted inside the hemispherical film cassette in either transmission or back-reflection geometry. The distances travelled by X-radiation diffracted from the crystal to the hemispherical film are the same for all crystal planes which satisfy Bragg's Law. The recorded diffraction spots or Laue spots on the film thereby preserve both the symmetry information of the crystal structure and the relative intensities which are directly related to the relative structure factors of the crystal orientations. The diffraction pattern on the exposed film is compared with the known diffraction pattern on one of the standard spherical projection spheres for a specific crystal structure to determine the orientation of the crystal sample. By replacing the stationary sample support with a rotating sample mount, the hemispherical Laue camera can be used for crystal structure determination in a manner previously provided in conventional Debye-Scherrer cameras.
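
    For reference, the Bragg condition mentioned above relates wavelength, lattice spacing and diffraction angle. The short sketch below, with illustrative Cu K-alpha values, is not part of the patent; a Laue camera actually uses polychromatic radiation, so each crystal plane selects the wavelength that satisfies this condition.

        import math

        def bragg_angle_deg(wavelength_angstrom, d_spacing_angstrom, order=1):
            """Bragg angle theta (degrees) from n*lambda = 2*d*sin(theta),
            or None if no diffraction is possible for this order."""
            s = order * wavelength_angstrom / (2.0 * d_spacing_angstrom)
            if s > 1.0:
                return None
            return math.degrees(math.asin(s))

        # Example: Cu K-alpha radiation (1.5406 Angstrom) on a 2.0 Angstrom lattice spacing.
        print(bragg_angle_deg(1.5406, 2.0))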

  7. Gamma ray camera

    DOEpatents

    Perez-Mendez, V.

    1997-01-21

    A gamma ray camera is disclosed for detecting rays emanating from a radiation source such as an isotope. The gamma ray camera includes a sensor array formed of a visible light crystal for converting incident gamma rays to a plurality of corresponding visible light photons, and a photosensor array responsive to the visible light photons in order to form an electronic image of the radiation therefrom. The photosensor array is adapted to record an integrated amount of charge proportional to the incident gamma rays closest to it, and includes a transparent metallic layer, photodiode consisting of a p-i-n structure formed on one side of the transparent metallic layer, and comprising an upper p-type layer, an intermediate layer and a lower n-type layer. In the preferred mode, the scintillator crystal is composed essentially of a cesium iodide (CsI) crystal preferably doped with a predetermined amount of impurity, and the p-type upper layer, the intermediate layer and said n-type layer are essentially composed of hydrogenated amorphous silicon (a-Si:H). The gamma ray camera further includes a collimator interposed between the radiation source and the sensor array, and a readout circuit formed on one side of the photosensor array. 6 figs.

  8. Gamma ray camera

    DOEpatents

    Perez-Mendez, Victor

    1997-01-01

    A gamma ray camera for detecting rays emanating from a radiation source such as an isotope. The gamma ray camera includes a sensor array formed of a visible light crystal for converting incident gamma rays to a plurality of corresponding visible light photons, and a photosensor array responsive to the visible light photons in order to form an electronic image of the radiation therefrom. The photosensor array is adapted to record an integrated amount of charge proportional to the incident gamma rays closest to it, and includes a transparent metallic layer, photodiode consisting of a p-i-n structure formed on one side of the transparent metallic layer, and comprising an upper p-type layer, an intermediate layer and a lower n-type layer. In the preferred mode, the scintillator crystal is composed essentially of a cesium iodide (CsI) crystal preferably doped with a predetermined amount of impurity, and the p-type upper layer, the intermediate layer and said n-type layer are essentially composed of hydrogenated amorphous silicon (a-Si:H). The gamma ray camera further includes a collimator interposed between the radiation source and the sensor array, and a readout circuit formed on one side of the photosensor array.

  9. Orbiter Camera Payload System

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Components for an orbiting camera payload system (OCPS) include the large format camera (LFC), a gas supply assembly, and ground test, handling, and calibration hardware. The LFC, a high resolution large format photogrammetric camera for use in the cargo bay of the space transport system, is also adaptable to use on an RB-57 aircraft or on a free flyer satellite. Carrying 4000 feet of film, the LFC is usable over the visible to near IR, at V/h rates of from 11 to 41 milliradians per second, overlap of 10, 60, 70 or 80 percent and exposure times of from 4 to 32 milliseconds. With a 12 inch focal length it produces a 9 by 18 inch format (long dimension in line of flight) with full format low contrast resolution of 88 lines per millimeter (AWAR), full format distortion of less than 14 microns and a complement of 45 Reseau marks and 12 fiducial marks. Weight of the OCPS as supplied, fully loaded is 944 pounds and power dissipation is 273 watts average when in operation, 95 watts in standby. The LFC contains an internal exposure sensor, or will respond to external command. It is able to photograph starfields for inflight calibration upon command.

  10. Visual Feedback Stabilization of Balancing Tasks with Camera Misalignment

    NASA Astrophysics Data System (ADS)

    Hirata, Kentaro; Mizuno, Takashi

    In this paper, we consider visual feedback stabilization which tolerates small camera misalignment. Specifically, a balancing task with a cart-pendulum system using camera image is examined. Such a task is known to rely heavily on the detection of the vertical direction and the angle measurement error due to the camera misalignment could be fatal for stabilization. From a mathematical model of the measurement error, the effect of the misalignment is naturally represented by affine perturbation to the coefficient matrix of the output equation. Motivated by this fact, a special type of robust dynamic output feedback stabilization against polytopic uncertainty is investigated. By solving the related BMI, one can design a controller which tolerates the camera misalignment to some extent. The result is verified via experiments.

  11. An aerial composite imaging method with multiple upright cameras based on axis-shift theory

    NASA Astrophysics Data System (ADS)

    Fang, Junyong; Liu, Xue; Xue, Yongqi; Tong, Qingxi

    2010-11-01

    Several composite camera systems have been built for wide coverage by using 3 or 4 oblique cameras. A virtual projection center and virtual image are used for geometric correction and mosaicking of images with different projection angles and different spatial resolutions caused by the oblique cameras. Here, an imaging method based on axis-shift theory is proposed to acquire wide coverage images with several upright cameras. Four upright camera lenses have the same wide angle of view. The optical axis of each lens is not at the center of its CCD, and each CCD covers only one part of the whole focal plane. The oblique deformation caused by oblique cameras is avoided by this axis-shift imaging method. The principle and parameters are given and discussed. A prototype camera system was constructed from common DSLR (digital single-lens reflex) cameras. The angle of view can exceed 80 degrees along the flight direction when the focal length is 24 mm, and the ratio of base line to height can exceed 0.7 when the longitudinal overlap is 60%. Original and mosaicked images captured by this prototype system in ground and airborne experiments are presented. Experimental results show that the upright imaging method can effectively avoid oblique deformation and meet the geometric precision requirements of image mosaicking.
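
    The quoted angle-of-view and base-to-height figures are consistent with the basic pinhole relations. In the sketch below, the combined image extent along the flight direction is an assumed value chosen only to reproduce the order of magnitude; it is not a parameter taken from the paper.

        import math

        def field_of_view_deg(extent_mm, focal_mm):
            """Full angle of view for a given image extent seen through a lens of the given focal length."""
            return math.degrees(2.0 * math.atan(extent_mm / (2.0 * focal_mm)))

        def base_to_height_ratio(extent_mm, focal_mm, overlap):
            """Stereo base-to-height ratio for vertical imagery with the given forward overlap."""
            return (extent_mm / focal_mm) * (1.0 - overlap)

        # Assumed combined extent of ~40 mm along the flight direction for the
        # composite image, with the 24 mm focal length quoted in the paper.
        extent, focal = 40.0, 24.0
        print(field_of_view_deg(extent, focal))           # ~79.6 deg, close to the ~80 deg quoted
        print(base_to_height_ratio(extent, focal, 0.60))  # ~0.67, close to the ~0.7 quoted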

  12. Multispectral Photometry of the Moon and Absolute Calibration of the Clementine UV/Vis Camera

    NASA Astrophysics Data System (ADS)

    Hillier, John K.; Buratti, Bonnie J.; Hill, Kathryn

    1999-10-01

    We present a multispectral photometric study of the Moon between solar phase angles of 0° and 85°. Using Clementine images obtained between 0.4 and 1.0 μm, we produce a comprehensive study of the lunar surface containing the following results: (1) empirical photometric functions for the spectral range and viewing and illumination geometries mentioned, (2) photometric modeling that derives the physical properties of the upper regolith and includes a detailed study of the causes of the lunar opposition surge, and (3) an absolute calibration of the Clementine UV/Vis camera. The calibration procedure given on the Clementine calibration web site produces reflectances relative to a halon standard that furthermore appear significantly higher than those seen in ground-based observations. By comparing Clementine observations with prior ground-based observations of 15 sites on the Moon, we have determined a good absolute calibration of the Clementine UV/Vis camera. A correction factor of 0.532 has been determined to convert the web site (www.planetary.brown.edu/clementine/calibration.html) reflectances to absolute values. From the calibrated data, we calculate empirical phase functions useful for performing photometric corrections to observations of the Moon between solar phase angles of 0° and 85° and in the spectral range 0.4 to 1.0 μm. Finally, the calibrated data are used to fit a version of Hapke's photometric model modified to incorporate a new formulation, developed in this paper, of the lunar opposition surge that includes coherent backscatter. Recent studies of the lunar opposition effect have yielded contradictory results as to the mechanism responsible: shadow hiding, coherent backscatter, or both. We find that most of the surge can be explained by shadow hiding with a halfwidth of ~8°. However, for the brightest regions (the highlands at 0.75-1.0 μm) a small additional narrow component (halfwidth of <2°) of total amplitude ~1/6 to 1/4 that of the shadow hiding surge is
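
    The absolute-calibration step reduces to a single multiplicative factor. A minimal sketch, using only the 0.532 factor quoted above (the function name and sample values are illustrative):

    ```python
    # Convert Clementine pipeline (halon-relative) reflectances to absolute values
    # using the correction factor derived in the study.
    import numpy as np

    CORRECTION_FACTOR = 0.532   # from the abstract

    def to_absolute_reflectance(pipeline_reflectance):
        """Scale web-site (halon-relative) reflectance to absolute reflectance."""
        return CORRECTION_FACTOR * np.asarray(pipeline_reflectance, dtype=float)

    print(to_absolute_reflectance([0.20, 0.35]))   # illustrative input values
    ```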

  13. Role of Optical Coherence Tomography in Assessing Anterior Chamber Angles

    PubMed Central

    Kochupurakal, Reema Thomas; Jha, Kirti Nath; Rajalakshmi, A.R.; Nagarajan, Swathi; Ezhumalai, G.

    2016-01-01

    Introduction: Gonioscopy is the gold standard for assessing anterior chamber angles. However, interobserver variations are common and there is a need for a reliable objective method of assessment. Aim: To compare the anterior chamber angle by gonioscopy and Spectral Domain Optical Coherence Tomography (SD-OCT) in individuals with shallow anterior chambers. Materials and Methods: This comparative observational study was conducted in a rural tertiary multi-speciality teaching hospital. A total of 101 eyes of 54 patients with shallow anterior chambers on slit lamp evaluation were included. The anterior chamber angle was graded by gonioscopy using the Shaffer grading system. Angles were also assessed by SD-OCT with the Trabecular Iris Angle (TIA) and Angle Opening Distance (AOD). The chi-square test, sensitivity, specificity, and positive and negative predictive values were used to relate the OCT parameters to the gonioscopy grading. Results: Females represented 72.7%. The mean age was 53.93 ± 8.24 years and the mean anterior chamber depth was 2.47 ± 0.152 mm. Shaffer grades ≤ 2 were identified in 95 (94%) superior, 42 (41.5%) inferior, 65 (64.3%) nasal and 57 (56.4%) temporal quadrants. Cut-off values of TIA ≤ 22° and AOD ≤ 290 μm were taken as narrow angles on SD-OCT. TIA ≤ 22° was found in 88 (92.6%) nasal and 87 (87%) temporal angles. AOD ≤ 290 μm was found in 73 (76.8%) nasal and 83 (83%) temporal quadrants. Sensitivity in detecting narrow angles was 90.7% and 82.2% for TIA and AOD, while specificity was 11.7% and 23.4%, respectively. Conclusion: More individuals were found to have narrow angles with SD-OCT. Sensitivity was high and specificity was low in detecting narrow angles compared with gonioscopy, making SD-OCT an unreliable tool for screening. PMID:27190851
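
    The reported sensitivity, specificity and predictive values follow from a standard 2x2 comparison against the gonioscopy reference. The sketch below shows that calculation; the counts are placeholders chosen only so that the outputs land near the reported TIA figures, not the study's raw data.

    ```python
    # Diagnostic metrics of an OCT cut-off (e.g. TIA <= 22 deg) against gonioscopy
    # (Shaffer grade <= 2) as the reference standard.
    def diagnostic_metrics(tp, fp, fn, tn):
        sensitivity = tp / (tp + fn)
        specificity = tn / (tn + fp)
        ppv = tp / (tp + fp)   # positive predictive value
        npv = tn / (tn + fn)   # negative predictive value
        return sensitivity, specificity, ppv, npv

    # placeholder counts (true/false positives and negatives per quadrant)
    sens, spec, ppv, npv = diagnostic_metrics(tp=49, fp=30, fn=5, tn=4)
    print(f"sensitivity={sens:.1%}  specificity={spec:.1%}  PPV={ppv:.1%}  NPV={npv:.1%}")
    ```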

  14. Wide-angle imaging system with fiberoptic components providing angle-dependent virtual material stops

    NASA Technical Reports Server (NTRS)

    Vaughan, Arthur H. (Inventor)

    1993-01-01

    A strip imaging wide angle optical system is provided. The optical system is provided with a 'virtual' material stop to avoid aberrational effects inherent in wide angle optical systems. The optical system includes a spherical mirror section for receiving light from a 180 deg strip or arc of a target image. Light received by the spherical mirror section is reflected to a frustoconical mirror section for subsequent rereflection to a row of optical fibers. Each optical fiber transmits a portion of the received light to a detector. The optical system exploits the narrow cone of acceptance associated with optical fibers to substantially eliminate vignetting effects inherent in wide angle systems. Further, the optical system exploits the narrow cone of acceptance of the optical fibers to substantially limit spherical aberration. The optical system is ideally suited for any application wherein a 180 deg strip image need be detected, and is particularly well adapted for use in hostile environments such as in planetary exploration.

  15. 980 nm narrow linewidth Yb-doped phosphate fiber laser

    NASA Astrophysics Data System (ADS)

    Li, Pingxue; Yao, Yifei; Hu, Haowei; Chi, Junjie; Yang, Chun; Zhao, Ziqiang; Zhang, Guangju

    2014-12-01

    A narrow-linewidth ytterbium (Yb)-doped phosphate fiber laser based on fiber Bragg gratings (FBGs) operating around 980 nm is reported. Two different cavity types are used to obtain the 980 nm narrow-linewidth output. One cavity consists of a broadband (0.35 nm linewidth) high-reflection FBG and the Yb-doped phosphate fiber end cleaved at 0°, which generates a maximum output power of 25 mW. The other resonator is composed of a single-mode Yb-doped phosphate fiber and a pair of FBGs. Over 10.7 mW of stable continuous-wave output is obtained with two longitudinal modes at 980 nm. A detailed analysis and discussion of the results is given.

  16. DEVICE CONTROLLER, CAMERA CONTROL

    1998-07-20

    This is a C++ application that is the server for the camera control system. Devserv drives serial devices, such as cameras and videoswitchers used in a videoconference, upon request from a client such as the ccint program. Devserv listens on UDP ports for clients to make network connections. After a client connects and sends a request to control a device (such as to pan, tilt, or zoom a camera, or do picture-in-picture with a videoswitcher), devserv formats the request into an RS232 message appropriate for the device and sends this message over the serial port to which the device is connected. Devserv then reads the reply from the device from the serial port and formats and sends a status message via multicast. In addition, devserv periodically multicasts status or description messages so that all clients connected to the multicast channel know what devices are supported, their ranges of motion, and the current position. The software design employs a class hierarchy in which an abstract base class for devices is subclassed into classes for various device categories, which are further subclassed into classes for specific devices (e.g., sonyevid30, canonvcc4, panasonicwjmx50). The devices currently supported are the Sony EVI-D30, Canon VCC1, Canon VCC3, and Canon VCC4 cameras and the Panasonic WJ-MX50 videoswitcher. However, developers can extend the class hierarchy to support other devices.

  17. Adaptive compressive sensing camera

    NASA Astrophysics Data System (ADS)

    Hsu, Charles; Hsu, Ming K.; Cha, Jae; Iwamura, Tomo; Landa, Joseph; Nguyen, Charles; Szu, Harold

    2013-05-01

    We have embedded an Adaptive Compressive Sensing (ACS) algorithm on a Charge-Coupled-Device (CCD) camera, based on the simple observation that each pixel is a charge bucket whose charge comes from the Einstein photoelectric conversion effect. Following the manufacturing design principle, we allow each working component to be altered by at most one step. We then simulated what such a camera could do for real-world persistent surveillance, taking into account diurnal, all-weather, and seasonal variations. Data storage is reduced immensely, and the order of magnitude of the saving is inversely proportional to the target angular speed. We designed two new CCD camera components. Owing to mature CMOS (complementary metal-oxide-semiconductor) technology, the on-chip Sample and Hold (SAH) circuitry can be designed as dual Photon Detector (PD) analog circuitry for change detection that decides whether to skip a frame or go forward at a sufficient sampling frame rate. For an admitted frame, a purely random sparse matrix [Φ] is implemented at the bucket (pixel) level by biasing the charge transport toward neighboring buckets or, if not, toward ground drainage. Since the snapshot image is not a video, we could not apply the usual MPEG video compression and Huffman entropy codec, nor a powerful WaveNet wrapper, at the sensor level. We compare (i) pre-processing by FFT, thresholding of the significant Fourier mode components, and inverse FFT to check PSNR; and (ii) post-processing image recovery done selectively by a CDT&D adaptive version of linear programming with L1 minimization and L2 similarity. For (ii), the SAH circuitry must determine, in new-frame selection, the degree of information (d.o.i.) K(t), which dictates the purely random linear sparse combination of measurement data via the M x N matrix [Φ], with M(t) = K(t) log N(t).
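
    A simplified sketch of the measurement step described above: a purely random sparse matrix [Φ] of size M x N maps the N-pixel frame to M measurements, with M chosen adaptively as M(t) = K(t) log N(t). The density, seed and sizes are assumptions, and the L1-minimization recovery step (which needs a dedicated solver) is not shown.

    ```python
    # Adaptive compressive measurement: y = Phi x with M = K * log(N).
    import numpy as np

    rng = np.random.default_rng(0)

    def adaptive_measurements(x, K, density=0.1):
        N = x.size
        M = max(1, int(np.ceil(K * np.log(N))))   # M(t) = K(t) log N(t)
        # sparse random +/-1 entries, zero elsewhere; the abstract describes
        # realizing this in hardware by biasing charge transport between buckets
        Phi = rng.choice([-1.0, 0.0, 1.0], size=(M, N),
                         p=[density / 2, 1.0 - density, density / 2])
        return Phi, Phi @ x

    x = rng.random(1024)                 # toy 32x32 frame, flattened
    Phi, y = adaptive_measurements(x, K=8)
    print(Phi.shape, y.shape)            # e.g. (56, 1024) measurements for K = 8
    ```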

  18. DEVICE CONTROLLER, CAMERA CONTROL

    SciTech Connect

    Perry, Marcia

    1998-07-20

    This is a C++ application that is the server for the camera control system. Devserv drives serial devices, such as cameras and videoswitchers used in a videoconference, upon request from a client such as the ccint program. Devserv listens on UDP ports for clients to make network connections. After a client connects and sends a request to control a device (such as to pan, tilt, or zoom a camera, or do picture-in-picture with a videoswitcher), devserv formats the request into an RS232 message appropriate for the device and sends this message over the serial port to which the device is connected. Devserv then reads the reply from the device from the serial port and formats and sends a status message via multicast. In addition, devserv periodically multicasts status or description messages so that all clients connected to the multicast channel know what devices are supported, their ranges of motion, and the current position. The software design employs a class hierarchy in which an abstract base class for devices is subclassed into classes for various device categories, which are further subclassed into classes for specific devices (e.g., sonyevid30, canonvcc4, panasonicwjmx50). The devices currently supported are the Sony EVI-D30, Canon VCC1, Canon VCC3, and Canon VCC4 cameras and the Panasonic WJ-MX50 videoswitcher. However, developers can extend the class hierarchy to support other devices.
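
    The class-hierarchy idea described above can be illustrated compactly. The sketch below is Python rather than devserv's actual C++, and the class and method names are invented for illustration; it only shows the pattern of an abstract device base class whose subclasses translate generic requests into device-specific serial commands.

    ```python
    # Illustrative (not devserv's real API): abstract device base class with
    # category subclasses that format generic requests as RS-232 command strings.
    from abc import ABC, abstractmethod

    class SerialDevice(ABC):
        """Base class: turn a generic request into a device-specific serial message."""

        @abstractmethod
        def format_command(self, request: str, **params) -> bytes:
            ...

    class PanTiltZoomCamera(SerialDevice):
        """Device-category subclass; a concrete driver (e.g. for a Sony EVI-D30)
        would override format_command with the vendor's actual protocol."""

        def format_command(self, request: str, **params) -> bytes:
            if request == "pan":
                return f"PAN {params['degrees']}\r".encode()
            if request == "zoom":
                return f"ZOOM {params['factor']}\r".encode()
            raise ValueError(f"unsupported request: {request}")

    cam = PanTiltZoomCamera()
    print(cam.format_command("pan", degrees=15))   # b'PAN 15\r' (made-up protocol)
    ```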

  19. Neutron imaging camera

    NASA Astrophysics Data System (ADS)

    Hunter, S. D.; de Nolfo, G. A.; Barbier, L. M.; Link, J. T.; Son, S.; Floyd, S. R.; Guardala, N.; Skopec, M.; Stark, B.

    2008-04-01

    The Neutron Imaging Camera (NIC) is based on the Three-dimensional Track Imager (3-DTI) technology developed at GSFC for gamma-ray astrophysics applications. The 3-DTI, a large volume time-projection chamber, provides accurate, ~0.4 mm resolution, 3-D tracking of charged particles. The incident direction of fast neutrons, En > 0.5 MeV, is reconstructed from the momenta and energies of the proton and triton fragments resulting from 3He(n,p)3H interactions in the 3-DTI volume. The performance of the NIC in laboratory tests is presented.

  20. Neutron Imaging Camera

    NASA Technical Reports Server (NTRS)

    Hunter, Stanley; deNolfo, G. A.; Barbier, L. M.; Link, J. T.; Son, S.; Floyd, S. R.; Guardala, N.; Skopec, M.; Stark, B.

    2008-01-01

    The Neutron Imaging Camera (NIC) is based on the Three-dimensional Track Imager (3-DTI) technology developed at GSFC for gamma-ray astrophysics applications. The 3-DTI, a large volume time-projection chamber, provides accurate, approximately 0.4 mm resolution, 3-D tracking of charged particles. The incident direction of fast neutrons, En > 0.5 MeV, is reconstructed from the momenta and energies of the proton and triton fragments resulting from 3He(n,p)3H interactions in the 3-DTI volume. The performance of the NIC from laboratory and accelerator tests is presented.

  1. Cryogenic Detectors (Narrow Field Instruments)

    NASA Astrophysics Data System (ADS)

    Hoevers, H.; Verhoeve, P.

    Two cryogenic imaging spectrometer arrays are currently considered as focal plane instruments for XEUS. The narrow field imager 1 (NFI 1) will cover the energy range from 0.05 to 3 keV with an energy resolution of 2 eV, or better, at 500 eV. A second narrow field imager (NFI 2) covers the energy range from 1 to 15 keV with an energy resolution of 2 eV (at 1 keV) and 5 eV (at 7 keV), creating some overlap with part of the NFI 1 energy window. Both narrow field imagers have a 0.5 arcmin field of view. Their imaging capabilities are matched to the XEUS optics of 2 to 5 arcsec, leading to 1 arcsec pixels. The detector arrays will be cooled by a closed cycle system comprising a mechanical cooler with a base temperature of 2.5 K and either a low temperature 3He sorption pump providing the very low temperature stage and/or an Adiabatic Demagnetization Refrigerator (ADR). The ADR cooler is explicitly needed to cool the NFI 2 array. For the narrow field imager 1, a 48 × 48 element array of superconducting tunnel junctions (STJ) is currently envisaged. Its operating temperature is in the range between 30 and 350 mK. Small, single Ta STJs (20-50 μm on a side) have shown 3.5 eV (FWHM) resolution at E = 525 eV, and small arrays have been successfully demonstrated (6 × 6 pixels) or are currently being tested (10 × 12 pixels). Alternatively, a prototype Distributed Read-Out Imaging Device (DROID), consisting of a linear superconducting Ta absorber of 20 × 100 μm², including a 20 × 20 μm STJ for readout at either end, has shown a measured energy resolution of 2.4 eV (FWHM) at E = 500 eV. Simulations involving the diffusion properties as well as loss and tunnel rates have shown that the performance can be further improved by slight modifications in the geometry, and that the size of the DROIDs can be increased to 0.5-1.0 mm without loss in energy resolution. The relatively large areas and good energy resolution compared to single STJs make DROIDs good candidates for the

  2. 91. 22'X34' original blueprint, VariableAngle Launcher, 'CONNECTING BRIDGE, REAR VIEW ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    91. 22'X34' original blueprint, Variable-Angle Launcher, 'CONNECTING BRIDGE, REAR VIEW CAMERA HOUSE ASSEMBLY' drawn at 3/8=1'-0', 3'=1'-0'. (BUORD Sketch # 209042). - Variable Angle Launcher Complex, Variable Angle Launcher, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  3. Mars Science Laboratory Engineering Cameras

    NASA Technical Reports Server (NTRS)

    Maki, Justin N.; Thiessen, David L.; Pourangi, Ali M.; Kobzeff, Peter A.; Lee, Steven W.; Dingizian, Arsham; Schwochert, Mark A.

    2012-01-01

    NASA's Mars Science Laboratory (MSL) Rover, which launched to Mars in 2011, is equipped with a set of 12 engineering cameras. These cameras are build-to-print copies of the Mars Exploration Rover (MER) cameras, which were sent to Mars in 2003. The engineering cameras weigh less than 300 grams each and use less than 3 W of power. Images returned from the engineering cameras are used to navigate the rover on the Martian surface, deploy the rover robotic arm, and ingest samples into the rover sample processing system. The navigation cameras (Navcams) are mounted to a pan/tilt mast and have a 45-degree square field of view (FOV) with a pixel scale of 0.82 mrad/pixel. The hazard avoidance cameras (Hazcams) are body-mounted to the rover chassis in the front and rear of the vehicle and have a 124-degree square FOV with a pixel scale of 2.1 mrad/pixel. All of the cameras utilize a frame-transfer CCD (charge-coupled device) with a 1024x1024 imaging region and red/near IR bandpass filters centered at 650 nm. The MSL engineering cameras are grouped into two sets of six: one set of cameras is connected to rover computer A and the other set is connected to rover computer B. The MSL rover carries 8 Hazcams and 4 Navcams.

  4. Angled Layers in Super Resolution

    NASA Technical Reports Server (NTRS)

    2004-01-01

    Researchers used a special imaging technique with the panoramic camera on NASA's Mars Exploration Rover Opportunity to get as detailed a look as possible at a target region near the eastern foot of 'Burns Cliff.' The intervening terrain was too difficult for driving the rover closer. The target is the boundary between two sections of layered rock. The layers in the lower section (left) run at a marked angle to the layers in the next higher section (right).

    This view is the product of a technique called super resolution. It was generated from data acquired on sol 288 of Opportunity's mission (Nov. 14, 2004) from a position along the southeast wall of 'Endurance Crater.' Resolution slightly higher than normal for the panoramic camera was synthesized for this view by combining 17 separate images of this scene, each one 'dithered' or pointed slightly differently from the previous one. Computer manipulation of the individual images was then used to generate a new synthetic view of the scene in a process known mathematically as iterative deconvolution, but referred to informally as super resolution. Similar methods have been used to enhance the resolution of images from the Mars Pathfinder mission and the Hubble Space Telescope.
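
    The dither-and-combine idea can be illustrated with a much simpler shift-and-add scheme than the iterative deconvolution actually used for the Pancam product. The sketch below is only that simplified stand-in: frames with known sub-pixel offsets are accumulated on a finer grid; all sizes, offsets and data are synthetic.

    ```python
    # Naive shift-and-add "super resolution" of dithered frames (a simplified
    # stand-in for iterative deconvolution).
    import numpy as np

    def shift_and_add(frames, offsets, upscale=2):
        """frames: list of 2-D arrays; offsets: (dy, dx) per frame, in input pixels."""
        h, w = frames[0].shape
        acc = np.zeros((h * upscale, w * upscale))
        for frame, (dy, dx) in zip(frames, offsets):
            up = np.kron(frame, np.ones((upscale, upscale)))   # naive upsampling
            shift = (int(round(dy * upscale)), int(round(dx * upscale)))
            acc += np.roll(up, shift, axis=(0, 1))
        return acc / len(frames)

    rng = np.random.default_rng(1)
    truth = rng.random((32, 32))
    offsets = [(0.0, 0.0), (0.5, 0.0), (0.0, 0.5), (0.5, 0.5)]   # synthetic dithers
    frames = [truth for _ in offsets]    # stand-in for the 17 dithered Pancam frames
    print(shift_and_add(frames, offsets).shape)                  # (64, 64)
    ```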

  5. LRO Camera Imaging of the Moon: Apollo 17 and other Sites for Ground Truth

    NASA Astrophysics Data System (ADS)

    Jolliff, B. L.; Wiseman, S. M.; Robinson, M. S.; Lawrence, S.; Denevi, B. W.; Bell, J. F.

    2009-12-01

    One of the fundamental goals of the Lunar Reconnaissance Orbiter (LRO) is the determination of mineralogic and compositional distributions and their relation to geologic features on the Moon’s surface. Through a combination of imaging with the LRO narrow-angle cameras and wide-angle camera (NAC, WAC), very fine-scale geologic features are resolved with better than meter-per-pixel resolution (NAC) and correlated to spectral variations mapped with the lower resolution, 7-band WAC (400-m/pix, ultraviolet bands centered at 321 and 360 nm; 100-m/pix, visible bands centered at 415, 566, 604, 643, and 689 nm). Keys to understanding spectral variations in terms of composition, and relationships between compositional variations and surface geology, are ground-truth sites where surface compositions and mineralogy, as well as geology and geologic history, are well known. The Apollo 17 site is especially useful because the site geology includes a range of features from high-Ti mare basalts to Serenitatis-Basin-related massifs containing basin impact-melt breccia and feldspathic highlands materials, and a regional black and orange pyroclastic deposit. Moreover, relative and absolute ages of these features are known. In addition to rock samples, astronauts collected well-documented soil samples at 22 different sample locations across this diverse area. Many of these sample sites can be located in the multispectral data using the co-registered NAC images. Digital elevation data are used to normalize illumination geometry and thus fully exploit the multispectral data and compare derived compositional parameters for different geologic units. Regolith characteristics that are known in detail from the Apollo 17 samples, such as maturity and petrography of mineral, glass, and lithic components, contribute to spectral variations and are considered in the assessment of spectral variability at the landing site. In this work, we focus on variations associated with the ilmenite content

  6. HONEY -- The Honeywell Camera

    NASA Astrophysics Data System (ADS)

    Clayton, C. A.; Wilkins, T. N.

    The Honeywell model 3000 colour graphic recorder system (hereafter referred to simply as Honeywell) has been bought by Starlink for producing publishable quality photographic hardcopy from the IKON image displays. Full colour and black & white images can be recorded on positive or negative 35mm film. The Honeywell consists of a built-in high resolution flat-faced monochrome video monitor, a red/green/blue colour filter mechanism and a 35mm camera. The device works on the direct video signals from the IKON. This means that changing the brightness or contrast on the IKON monitor will not affect any photographs that you take. The video signals from the IKON consist of separate red, green and blue signals. When you take a picture, the Honeywell takes the red, green and blue signals in turn and displays three pictures consecutively on its internal monitor. It takes an exposure through each of three filters (red, green and blue) onto the film in the camera. This builds up the complete colour picture on the film. Honeywell systems are installed at nine Starlink sites, namely Belfast (locally funded), Birmingham, Cambridge, Durham, Leicester, Manchester, Rutherford, ROE and UCL.

  7. WFOV star tracker camera

    SciTech Connect

    Lewis, I.T. ); Ledebuhr, A.G.; Axelrod, T.S.; Kordas, J.F.; Hills, R.F. )

    1991-04-01

    A prototype wide-field-of-view (WFOV) star tracker camera has been fabricated and tested for use in spacecraft navigation. The most unique feature of this device is its 28° × 44° FOV, which views a large enough sector of the sky to ensure the existence of at least 5 stars of m_v = 4.5 or brighter in all viewing directions. The WFOV requirement and the need to maximize both collection aperture (F/1.28) and spectral input band (0.4 to 1.1 μm) to meet the light gathering needs for the dimmest star have dictated the use of a novel concentric optical design, which employs a fiber optic faceplate field flattener. The main advantage of the WFOV configuration is the smaller star map required for position processing, which results in less processing power and faster matching. Additionally, a size and mass benefit is seen with a larger FOV/smaller effective focal length (efl) sensor. Prototype hardware versions have included both image intensified and un-intensified CCD cameras. Integration times of ≤ 50 msec have been demonstrated with both the intensified and un-intensified versions. 3 refs., 16 figs.

  8. PAU camera: detectors characterization

    NASA Astrophysics Data System (ADS)

    Casas, Ricard; Ballester, Otger; Cardiel-Sas, Laia; Castilla, Javier; Jiménez, Jorge; Maiorino, Marino; Pío, Cristóbal; Sevilla, Ignacio; de Vicente, Juan

    2012-07-01

    The PAU Camera (PAUCam) [1,2] is a wide field camera that will be mounted at the corrected prime focus of the William Herschel Telescope (Observatorio del Roque de los Muchachos, Canary Islands, Spain) in the coming months. The focal plane of PAUCam is composed of a mosaic of 18 CCD detectors of 2,048 x 4,176 pixels, each with a pixel size of 15 microns, manufactured by Hamamatsu Photonics K. K. This mosaic covers a field of view (FoV) of 60 arcmin (minutes of arc), of which 40 are unvignetted. The behaviour of these 18 devices, plus four spares, and their electronic response must be characterized and optimized for use in PAUCam. This work is being carried out in the laboratories of the ICE/IFAE and the CIEMAT. The electronic optimization of the CCD detectors is performed by means of an OG (Output Gate) scan, maximizing the CTE (Charge Transfer Efficiency) while minimizing the read-out noise. The device characterization itself is obtained with different tests: the photon transfer curve (PTC), which yields the electronic gain; the linearity vs. light stimulus; the full-well capacity; the cosmetic defects; the read-out noise; the dark current; the stability vs. temperature; and the light remanence.
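
    The photon transfer curve mentioned above yields the electronic gain from the mean-variance relation of flat-field pairs. The sketch below shows that standard calculation on synthetic data; the illumination level and true gain are assumptions for the demonstration, not PAUCam values.

    ```python
    # Gain [e-/ADU] from a pair of flats: gain ~ mean_signal / variance in the
    # shot-noise-limited regime; differencing the flats removes fixed-pattern noise.
    import numpy as np

    def ptc_gain(flat1, flat2, bias_level=0.0):
        mean_signal = 0.5 * (flat1.mean() + flat2.mean()) - bias_level
        diff_var = np.var(flat1.astype(float) - flat2.astype(float)) / 2.0
        return mean_signal / diff_var

    rng = np.random.default_rng(2)
    true_gain = 1.5                                            # e-/ADU, assumed
    f1 = rng.poisson(30000.0, size=(512, 512)) / true_gain     # synthetic flats (ADU)
    f2 = rng.poisson(30000.0, size=(512, 512)) / true_gain
    print(f"estimated gain ~ {ptc_gain(f1, f2):.2f} e-/ADU")   # ~1.5
    ```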

  9. MEMS digital camera

    NASA Astrophysics Data System (ADS)

    Gutierrez, R. C.; Tang, T. K.; Calvet, R.; Fossum, E. R.

    2007-02-01

    MEMS technology uses photolithography and etching of silicon wafers to enable mechanical structures with less than 1 μm tolerance, important for the miniaturization of imaging systems. In this paper, we present the first silicon MEMS digital auto-focus camera for use in cell phones with a focus range of 10 cm to infinity. At the heart of the new silicon MEMS digital camera, a simple and low-cost electromagnetic actuator impels a silicon MEMS motion control stage on which a lens is mounted. The silicon stage ensures precise alignment of the lens with respect to the imager, and enables precision motion of the lens over a range of 300 μm with < 5 μm hysteresis and < 2 μm repeatability. Settling time is < 15 ms for a 200 μm step, and < 5 ms for a 20 μm step, enabling AF within 0.36 sec at 30 fps. The precise motion allows COTS optics to maintain MTF > 0.8 at 20 cy/mm up to 80% field over the full range of motion. Accelerated lifetime testing has shown that the alignment and precision of motion is maintained after 8,000 g shocks, thermal cycling from -40 °C to 85 °C, and operation over 20 million cycles.

  10. Stereoscopic camera design

    NASA Astrophysics Data System (ADS)

    Montgomery, David J.; Jones, Christopher K.; Stewart, James N.; Smith, Alan

    2002-05-01

    It is clear from the literature that the majority of work in stereoscopic imaging is directed towards the development of modern stereoscopic displays. As costs come down, wider public interest in this technology is expected to increase. This new technology would require new methods of image formation. Advances in stereo computer graphics will of course lead to the creation of new stereo computer games, graphics in films etc. However, the consumer would also like to see real-world stereoscopic images, pictures of family, holiday snaps etc. Such scenery would have wide ranges of depth to accommodate and would need also to cope with moving objects, such as cars, and in particular other people. Thus, the consumer acceptance of auto/stereoscopic displays and 3D in general would be greatly enhanced by the existence of a quality stereoscopic camera. This paper will cover an analysis of existing stereoscopic camera designs and show that they can be categorized into four different types, with inherent advantages and disadvantages. A recommendation is then made with regard to 3D consumer still and video photography. The paper will go on to discuss this recommendation and describe its advantages and how it can be realized in practice.

  11. Perception of Perspective Angles.

    PubMed

    Erkelens, Casper J

    2015-06-01

    We perceive perspective angles, that is, angles that have an orientation in depth, differently from what they are in physical space. Extreme examples are angles between rails of a railway line or between lane dividers of a long and straight road. In this study, subjects judged perspective angles between bars lying on the floor of the laboratory. Perspective angles were also estimated from pictures taken from the same point of view. Converging and diverging angles were judged to test three models of visual space. Four subjects evaluated the perspective angles by matching them to nonperspective angles, that is, angles between the legs of a compass oriented in the frontal plane. All subjects judged both converging and diverging angles larger than the physical angle and smaller than the angles in the proximal stimuli. A model of shallow visual space describes the results. According to the model, lines parallel to visual lines, vanishing at infinity in physical space, converge to visual lines in visual space. The perceived shape of perspective angles is incompatible with the perceived length and width of the bars. The results have significance for models of visual perception and practical implications for driving and flying in poor visibility conditions. PMID:27433312

  12. Transmission electron microscope CCD camera

    DOEpatents

    Downing, Kenneth H.

    1999-01-01

    In order to improve the performance of a CCD camera on a high voltage electron microscope, an electron decelerator is inserted between the microscope column and the CCD. This arrangement optimizes the interaction of the electron beam with the scintillator of the CCD camera while retaining optimization of the microscope optics and of the interaction of the beam with the specimen. Changing the electron beam energy between the specimen and camera allows both to be optimized.

  13. Finding success in ACA narrow networks.

    PubMed

    Daly, Rich

    2015-12-01

    Health systems should carefully consider the specific details of their local market before deciding to launch a narrow network plan or to join an existing insurer's narrow network. Key steps to take in the evaluation process include: Determining insurer interest in forming a true partnership. Assessing capability for greater efficiency. Assessing insurer priorities. Obtaining access to enrollee data. Identifying capabilities that differentiate it from other narrow networks. PMID:26793944

  14. Narrow gap electronegative capacitive discharges

    NASA Astrophysics Data System (ADS)

    Kawamura, E.; Lieberman, M. A.; Lichtenberg, A. J.

    2013-10-01

    Narrow gap electronegative (EN) capacitive discharges are widely used in industry and have unique features not found in conventional discharges. In this paper, plasma parameters are determined over a range of decreasing gap length L from values for which an electropositive (EP) edge exists (2-region case) to smaller L-values for which the EN region connects directly to the sheath (1-region case). Parametric studies are performed at applied voltage Vrf=500 V for pressures of 10, 25, 50, and 100 mTorr, and additionally at 50 mTorr for 1000 and 2000 V. Numerical results are given for a parallel plate oxygen discharge using a planar 1D3v (1 spatial dimension, 3 velocity components) particle-in-cell (PIC) code. New interesting phenomena are found for the case in which an EP edge does not exist. This 1-region case has not previously been investigated in detail, either numerically or analytically. In particular, attachment in the sheaths is important, and the central electron density ne0 is depressed below the density nesh at the sheath edge. The sheath oscillations also extend into the EN core, creating an edge region lying within the sheath and not characterized by the standard diffusion in an EN plasma. An analytical model is developed using minimal inputs from the PIC results, and compared to the PIC results for a base case at Vrf=500 V and 50 mTorr, showing good agreement. Selected comparisons are made at the other voltages and pressures. A self-consistent model is also developed and compared to the PIC results, giving reasonable agreement.

  15. Narrow gap electronegative capacitive discharges

    SciTech Connect

    Kawamura, E.; Lieberman, M. A.; Lichtenberg, A. J.

    2013-10-15

    Narrow gap electronegative (EN) capacitive discharges are widely used in industry and have unique features not found in conventional discharges. In this paper, plasma parameters are determined over a range of decreasing gap length L from values for which an electropositive (EP) edge exists (2-region case) to smaller L-values for which the EN region connects directly to the sheath (1-region case). Parametric studies are performed at applied voltage Vrf=500 V for pressures of 10, 25, 50, and 100 mTorr, and additionally at 50 mTorr for 1000 and 2000 V. Numerical results are given for a parallel plate oxygen discharge using a planar 1D3v (1 spatial dimension, 3 velocity components) particle-in-cell (PIC) code. New interesting phenomena are found for the case in which an EP edge does not exist. This 1-region case has not previously been investigated in detail, either numerically or analytically. In particular, attachment in the sheaths is important, and the central electron density ne0 is depressed below the density nesh at the sheath edge. The sheath oscillations also extend into the EN core, creating an edge region lying within the sheath and not characterized by the standard diffusion in an EN plasma. An analytical model is developed using minimal inputs from the PIC results, and compared to the PIC results for a base case at Vrf=500 V and 50 mTorr, showing good agreement. Selected comparisons are made at the other voltages and pressures. A self-consistent model is also developed and compared to the PIC results, giving reasonable agreement.

  16. Neutron Imaging Camera

    NASA Technical Reports Server (NTRS)

    Hunter, Stanley D.; DeNolfo, Georgia; Floyd, Sam; Krizmanic, John; Link, Jason; Son, Seunghee; Guardala, Noel; Skopec, Marlene; Stark, Robert

    2008-01-01

    We describe the Neutron Imaging Camera (NIC) being developed for DTRA applications by NASA/GSFC and NSWC/Carderock. The NIC is based on the Three-dimensional Track Imager (3-DTI) technology developed at GSFC for gamma-ray astrophysics applications. The 3-DTI, a large volume time-projection chamber, provides accurate, approximately 0.4 mm resolution, 3-D tracking of charged particles. The incident direction of fast neutrons, En > 0.5 MeV, is reconstructed from the momenta and energies of the proton and triton fragments resulting from 3He(n,p)3H interactions in the 3-DTI volume. We present the angular and energy resolution performance of the NIC derived from accelerator tests.

  17. A Motionless Camera

    NASA Technical Reports Server (NTRS)

    1994-01-01

    Omniview, a motionless, noiseless, exceptionally versatile camera was developed for NASA as a receiving device for guiding space robots. The system can see in one direction and provide as many as four views simultaneously. Developed by Omniview, Inc. (formerly TRI) under a NASA Small Business Innovation Research (SBIR) grant, the system's image transformation electronics produce a real-time image from anywhere within a hemispherical field. Lens distortion is removed, and a corrected "flat" view appears on a monitor. Key elements are a high resolution charge coupled device (CCD), image correction circuitry and a microcomputer for image processing. The system can be adapted to existing installations. Applications include security and surveillance, teleconferencing, imaging, virtual reality, broadcast video and military operations. Omniview technology is now called IPIX. The company was founded in 1986 as TeleRobotics International, became Omniview in 1995, and changed its name to Interactive Pictures Corporation in 1997.

  18. Reflectance characteristics of the Viking lander camera reference test charts

    NASA Technical Reports Server (NTRS)

    Wall, S. D.; Burcher, E. E.; Jabson, D. J.

    1975-01-01

    Reference test charts provide radiometric, colorimetric, and spatial resolution references for the Viking lander cameras on Mars. Reflectance measurements of these references are described, including the absolute bidirectional reflectance of the radiometric references and the relative spectral reflectance of both radiometric and colorimetric references. Results show that the bidirectional reflectance of the radiometric references is Lambertian to within +/- 7% for incidence angles between 20 deg and 60 deg, and that their spectral reflectance is constant with wavelength to within +/- 5% over the spectral range of the cameras. Estimated accuracy of the measurements is +/- 0.05 in relative spectral reflectance.

  19. Anger perceptually and conceptually narrows cognitive scope.

    PubMed

    Gable, Philip A; Poole, Bryan D; Harmon-Jones, Eddie

    2015-07-01

    For the last 50 years, research investigating the effect of emotions on the scope of cognitive processing was based on models proposing that affective valence determined cognitive scope. More recently, our motivational intensity model suggests that this past work had confounded valence with motivational intensity. Research derived from this model supports the idea that motivational intensity, rather than affective valence, explains much of the variance in the effect emotions have on cognitive scope. However, the motivational intensity model is limited in that the empirical work has examined only positive affects high in approach and negative affects high in avoidance motivation. Thus, perhaps only approach-positive and avoidance-negative states narrow cognitive scope. The present research was designed to clarify these conceptual issues by examining the effect of anger, a negatively valenced approach-motivated state, on cognitive scope. Results revealed that anger narrowed attentional scope relative to a neutral state and that the attentional narrowing to anger was similar to the attentional narrowing caused by high approach-motivated positive affects (Study 1). This narrowing of attention was related to trait approach motivation (Studies 2 and 3). Anger also narrowed conceptual cognitive categorization (Study 4). Narrowing of categorization was related to participants' approach motivation toward anger stimuli. Together, these results suggest that anger, an approach-motivated negative affect, narrows perceptual and conceptual cognitive scope. More broadly, these results support the conceptual model that motivational intensity per se, rather than approach-positive and avoidance-negative states, causes a narrowing of cognitive scope.

  20. Narrow band 3 × 3 Mueller polarimetric endoscopy

    PubMed Central

    Qi, Ji; Ye, Menglong; Singh, Mohan; Clancy, Neil T.; Elson, Daniel S.

    2013-01-01

    Mueller matrix polarimetric imaging has shown potential in tissue diagnosis but is challenging to implement endoscopically. In this work, a narrow band 3 × 3 Mueller matrix polarimetric endoscope was designed by rotating the endoscope to generate 0°, 45° and 90° linearly polarized illumination and positioning a rotating filter wheel in front of the camera containing three polarisers to permit polarization state analysis for backscattered light. The system was validated with a rotating linear polarizer and a diffuse reflection target. Initial measurements of 3 × 3 Mueller matrices on a rat are demonstrated, followed by matrix decomposition into the depolarization and retardance matrices for further analysis. Our work shows the feasibility of implementing polarimetric imaging in a rigid endoscope conveniently and economically in order to reveal diagnostic information. PMID:24298405
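
    With linearly polarized illumination and analysis at 0°, 45° and 90°, the nine intensity measurements determine the 3 x 3 (linear) Mueller matrix by a direct matrix inversion. The sketch below shows that generic reconstruction on synthetic data; it is not the authors' exact calibration or processing pipeline.

    ```python
    # Recover a 3x3 linear Mueller matrix from nine polarized intensity readings.
    import numpy as np

    angles = np.deg2rad([0.0, 45.0, 90.0])
    # reduced (linear-only) Stokes vectors [1, cos 2a, sin 2a]
    S = np.stack([np.ones_like(angles), np.cos(2 * angles), np.sin(2 * angles)])  # generators (columns)
    A = S.T                                                                       # analyzers (rows)

    def mueller_3x3(I):
        """I[i, j]: intensity with analyzer angle i and generator angle j."""
        # forward model I = 0.5 * A @ M @ S, hence M = 2 * A^-1 @ I @ S^-1
        return 2.0 * np.linalg.inv(A) @ I @ np.linalg.inv(S)

    M_true = np.diag([1.0, 0.6, 0.6])          # e.g. a partial depolarizer (synthetic)
    I_meas = 0.5 * A @ M_true @ S              # simulated measurements
    print(np.round(mueller_3x3(I_meas), 3))    # recovers M_true
    ```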

  1. Comparison and evaluation of datasets for off-angle iris recognition

    NASA Astrophysics Data System (ADS)

    Kurtuncu, Osman M.; Cerme, Gamze N.; Karakaya, Mahmut

    2016-05-01

    In this paper, we investigated the publicly available iris recognition datasets and their data capture procedures in order to determine whether they are suitable for stand-off iris recognition research. The majority of iris recognition datasets include only frontal iris images. Even when a dataset includes off-angle iris images, the frontal and off-angle iris images are not captured at the same time. Comparison of the frontal and off-angle iris images then shows not only differences in gaze angle but also changes in pupil dilation and accommodation. In order to isolate the effect of the gaze angle from other challenging issues, including dilation and accommodation, the frontal and off-angle iris images should be captured at the same time by two different cameras. Therefore, we developed an iris image acquisition platform using two cameras, where one camera captures a frontal iris image and the other captures the iris from off-angle. Based on the comparison of Hamming distances between frontal and off-angle iris images captured with the two-camera setup and the one-camera setup, we observed that the Hamming distance in the two-camera setup is less than in the one-camera setup, with differences ranging from 0.001 to 0.05. These results show that, in order to obtain accurate results in off-angle iris recognition research, a two-camera setup is necessary to distinguish the challenging issues from each other.
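
    The comparison above rests on the normalized Hamming distance conventionally used for binary iris codes. The sketch below shows that metric with occlusion masks on synthetic codes; the code length, mask and bit-flip rate are illustrative, and the authors' actual feature encoding is not reproduced.

    ```python
    # Masked, normalized Hamming distance between two binary iris codes.
    import numpy as np

    def hamming_distance(code_a, code_b, mask_a, mask_b):
        """Fraction of disagreeing bits among bits valid in both codes."""
        valid = mask_a & mask_b
        if valid.sum() == 0:
            return 1.0
        return np.count_nonzero((code_a ^ code_b) & valid) / valid.sum()

    rng = np.random.default_rng(3)
    frontal = rng.integers(0, 2, 2048, dtype=np.uint8).astype(bool)
    off_angle = frontal.copy()
    off_angle[rng.choice(2048, 80, replace=False)] ^= True   # simulate degradation
    mask = np.ones(2048, dtype=bool)
    print(f"HD = {hamming_distance(frontal, off_angle, mask, mask):.3f}")   # ~0.04
    ```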

  2. Camera Calibration for Uav Application Using Sensor of Mobile Camera

    NASA Astrophysics Data System (ADS)

    Takahashi, Y.; Chikatsu, H.

    2015-05-01

    Recently, 3D measurements using small unmanned aerial vehicles (UAVs) have increased in Japan, because small UAVs are easily available at low cost and analysis software can easily create 3D models. However, small UAVs have a problem: they have very short flight times and a small payload. In particular, as the payload of a small UAV increases, its flight time decreases. Therefore, it is advantageous to use lightweight sensors on small UAVs. A mobile camera is lightweight and has many sensors, such as an accelerometer, a magnetic field sensor, and a gyroscope, and these sensors can be used simultaneously. Therefore, the authors think that the problems of small UAVs can be solved using a mobile camera. The authors performed camera calibration using a test target in order to evaluate the sensor values measured with a mobile camera. Consequently, the authors confirmed the same accuracy as normal camera calibration.
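
    Test-target calibration of this kind is commonly done with a planar chessboard and OpenCV; the sketch below shows that generic workflow, not the authors' specific target or procedure. The folder name and board size are assumptions.

    ```python
    # Generic chessboard camera calibration with OpenCV.
    import glob
    import cv2
    import numpy as np

    board = (9, 6)   # inner-corner count of the assumed chessboard
    objp = np.zeros((board[0] * board[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:board[0], 0:board[1]].T.reshape(-1, 2)   # unit squares

    obj_points, img_points = [], []
    for fname in glob.glob("calib_images/*.jpg"):   # assumed image folder
        gray = cv2.cvtColor(cv2.imread(fname), cv2.COLOR_BGR2GRAY)
        found, corners = cv2.findChessboardCorners(gray, board)
        if found:
            obj_points.append(objp)
            img_points.append(corners)

    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
        obj_points, img_points, gray.shape[::-1], None, None)
    print("RMS reprojection error:", rms)
    print("camera matrix:\n", K)
    ```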

  3. Bundle Adjustment for Multi-Camera Systems with Points at Infinity

    NASA Astrophysics Data System (ADS)

    Schneider, J.; Schindler, F.; Läbe, T.; Förstner, W.

    2012-07-01

    We present a novel approach for a rigorous bundle adjustment for omnidirectional and multi-view cameras, which enables an efficient maximum-likelihood estimation with image and scene points at infinity. Multi-camera systems are used to increase the resolution, to combine cameras with different spectral sensitivities (Z/I DMC, Vexcel Ultracam) or, like omnidirectional cameras, to augment the effective aperture angle (Blom Pictometry, Rollei Panoscan Mark III). Additionally, multi-camera systems gain in importance for the acquisition of complex 3D structures. For stabilizing camera orientations, especially rotations, one should generally use points at the horizon over long periods of time within the bundle adjustment, which classical bundle adjustment programs are not capable of. We use a minimal representation of homogeneous coordinates for image and scene points. Instead of eliminating the scale factor of the homogeneous vectors by Euclidean normalization, we normalize the homogeneous coordinates spherically. This way we can use images of omnidirectional cameras with a single viewpoint, such as fisheye cameras, and scene points which are far away or at infinity. We demonstrate the feasibility and the potential of our approach on real data taken with a single camera, the stereo camera FinePix Real 3D W3 from Fujifilm, and the multi-camera system Ladybug 3 from Point Grey.
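
    The key numerical idea, spherical rather than Euclidean normalization of homogeneous coordinates, is easy to see in isolation. The sketch below contrasts the two normalizations for a scene point whose homogeneous part tends to zero; it is a toy illustration of the representation, not the authors' bundle-adjustment code.

    ```python
    # Euclidean vs. spherical normalization of a homogeneous 3-D point.
    import numpy as np

    def normalize_euclidean(X):
        """Divides by the last coordinate; blows up for points at or near infinity."""
        return X / X[-1]

    def normalize_spherical(X):
        """Unit-norm scaling; well defined for every non-zero homogeneous vector."""
        return X / np.linalg.norm(X)

    far_point = np.array([1.0, 2.0, 5.0, 1e-12])   # nearly at infinity
    print(normalize_spherical(far_point))           # stable direction vector
    print(normalize_euclidean(far_point))           # huge, numerically useless values
    ```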

  4. A view of the ET camera on STS-112

    NASA Technical Reports Server (NTRS)

    2002-01-01

    KENNEDY SPACE CENTER, FLA. - A closeup view of the camera mounted on the external tank of Space Shuttle Atlantis. The color video camera mounted to the top of Atlantis' external tank will provide a view of the front and belly of the orbiter and a portion of the solid rocket boosters (SRBs) and external tank during the launch of Atlantis on mission STS-112. It will offer the STS-112 team an opportunity to monitor the shuttle's performance from a new angle. The camera will be turned on fifteen minutes prior to launch and will show the orbiter and solid rocket boosters on the launch pad. The video will be downlinked from the external tank during flight to several NASA data-receiving sites and then relayed to the live television broadcast. The camera is expected to operate for about 15 minutes following liftoff. At liftoff, viewers will see the shuttle clearing the launch tower and, at two minutes after liftoff, see the right SRB separate from the external tank. When the external tank separates from Atlantis about eight minutes into the flight, the camera is expected to continue its live feed for about six more minutes although NASA may be unable to pick up the camera's signal because the tank may have moved out of range.

  5. A view of the ET camera on STS-112

    NASA Technical Reports Server (NTRS)

    2002-01-01

    KENNEDY SPACE CENTER, FLA. - A view of the camera mounted on the external tank of Space Shuttle Atlantis. The color video camera mounted to the top of Atlantis' external tank will provide a view of the front and belly of the orbiter and a portion of the solid rocket boosters (SRBs) and external tank during the launch of Atlantis on mission STS-112. It will offer the STS-112 team an opportunity to monitor the shuttle's performance from a new angle. The camera will be turned on fifteen minutes prior to launch and will show the orbiter and solid rocket boosters on the launch pad. The video will be downlinked from the external tank during flight to several NASA data-receiving sites and then relayed to the live television broadcast. The camera is expected to operate for about 15 minutes following liftoff. At liftoff, viewers will see the shuttle clearing the launch tower and, at two minutes after liftoff, see the right SRB separate from the external tank. When the external tank separates from Atlantis about eight minutes into the flight, the camera is expected to continue its live feed for about six more minutes although NASA may be unable to pick up the camera's signal because the tank may have moved out of range.

  6. New optical receiving system design for portable camera lidar

    NASA Astrophysics Data System (ADS)

    Qin, Laian; He, Feng; Jing, Xu; Tan, Fengfu

    2015-10-01

    Because of their better spectral response, higher quantum efficiency, and higher signal-to-noise ratio, cameras are increasingly used in lidars that measure atmospheric parameters. Camera lidars retrieve atmospheric parameters by analyzing the light-column images acquired by the cameras and objectives, which gather the backscattered light of the laser beam. Lidars of this kind usually have higher spatial resolution and better real-time performance. However, because of the limited depth of field (DOF), the measurement accuracy of the area outside the DOF is affected by optical defocus to varying degrees. Meanwhile, the use of a receiving objective with a small relative aperture also makes such systems unsuitable for portable equipment. Based on an improved design of the receiving objective, a new design scheme for the optical receiving system of the camera lidar is proposed in this paper. This scheme improves the measurement accuracy of the area outside the DOF of the traditional structure by using a large-DOF, large-relative-aperture off-axis objective and a special operating mode of the camera. An optical receiving system designed according to this scheme is more compact and is especially suitable for portable instruments. Furthermore, the relation among the focal length, the distance between the laser and the objective, and the installation angle is also analyzed, and the corresponding formula is given. The scheme has been implemented in a camera lidar system in the laboratory, and the results are satisfactory.

  7. Narrow-band radiation wavelength measurement by processing digital photographs in RAW format

    SciTech Connect

    Kraiskii, A V; Mironova, T V; Sultanov, T T

    2012-12-31

    A technique for measuring the mean wavelength of narrow-band radiation in the 455-625 nm range from an image of the emitting surface is presented. The data from the camera array unprocessed by the built-in processor (RAW format) are used. The method is applied to determining the response parameters of holographic sensors. Depending on the wavelength and brightness of the image fragment, the mean square deviation of the wavelength amounts to 0.3-3 nm.
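
    One plausible way to picture such a measurement is that, in RAW data, the ratio of colour-channel responses varies monotonically with wavelength over much of the 455-625 nm range, so a pre-measured calibration curve can be inverted by interpolation. The sketch below is only that hypothetical picture: the calibration table, channel choice and pixel values are invented placeholders, not the authors' method or data.

    ```python
    # Hypothetical wavelength estimate from the mean RAW G/R channel ratio of an
    # image fragment, via an assumed monotonic calibration table.
    import numpy as np

    calib_wavelength_nm = np.array([455, 500, 550, 600, 625])    # placeholder
    calib_log_ratio_gr = np.array([1.2, 0.8, 0.1, -0.9, -1.4])   # placeholder log(G/R)

    def mean_wavelength(raw_r, raw_g):
        log_ratio = np.log(raw_g.mean() / raw_r.mean())
        # np.interp needs an increasing abscissa, hence the reversal
        return np.interp(log_ratio, calib_log_ratio_gr[::-1], calib_wavelength_nm[::-1])

    rng = np.random.default_rng(4)
    raw_r = rng.normal(900.0, 10.0, 1000)    # synthetic fragment pixel values
    raw_g = rng.normal(1000.0, 10.0, 1000)
    print(f"estimated mean wavelength ~ {mean_wavelength(raw_r, raw_g):.0f} nm")
    ```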

  8. Digital synchroballistic schlieren camera for high-speed photography of bullets and rocket sleds

    NASA Astrophysics Data System (ADS)

    Buckner, Benjamin D.; L'Esperance, Drew

    2013-08-01

    A high-speed digital streak camera designed for simultaneous high-resolution color photography and focusing schlieren imaging is described. The camera uses a computer-controlled galvanometer scanner to achieve synchroballistic imaging through a narrow slit. Full color 20 megapixel images of a rocket sled moving at 480 m/s and of projectiles fired at around 400 m/s were captured, with high-resolution schlieren imaging in the latter cases, using conventional photographic flash illumination. The streak camera can achieve a line rate for streak imaging of up to 2.4 million lines/s.

  9. An Educational PET Camera Model

    ERIC Educational Resources Information Center

    Johansson, K. E.; Nilsson, Ch.; Tegner, P. E.

    2006-01-01

    Positron emission tomography (PET) cameras are now in widespread use in hospitals. A model of a PET camera has been installed in Stockholm House of Science and is used to explain the principles of PET to school pupils as described here.

  10. Radiation camera motion correction system

    DOEpatents

    Hoffer, P.B.

    1973-12-18

    The device determines the ratio of the intensity of radiation received by a radiation camera from two separate portions of the object. A correction signal is developed to maintain this ratio at a substantially constant value and this correction signal is combined with the camera signal to correct for object motion. (Official Gazette)

  11. Airborne ballistic camera tracking systems

    NASA Technical Reports Server (NTRS)

    Redish, W. L.

    1976-01-01

    An operational airborne ballistic camera tracking system was tested for operational and data reduction feasibility. The acquisition and data processing requirements of the system are discussed. Suggestions for future improvements are also noted. A description of the data reduction mathematics is outlined. Results from a successful reentry test mission are tabulated. The test mission indicated that airborne ballistic camera tracking systems are feasible.

  12. Camera artifacts in IUE spectra

    NASA Technical Reports Server (NTRS)

    Bruegman, O. W.; Crenshaw, D. M.

    1994-01-01

    This study of emission-line-mimicking features in the IUE cameras has produced an atlas of artifacts in high-dispersion images, with an accompanying table of prominent artifacts, a table of prominent artifacts in the raw images, and a median image of the sky background for each IUE camera.

  13. Multi-PSPMT scintillation camera

    SciTech Connect

    Pani, R.; Pellegrini, R.; Trotta, G.; Scopinaro, F.; Soluri, A.; Vincentis, G. de; Scafe, R.; Pergola, A.

    1999-06-01

    Gamma ray imaging is usually accomplished by the use of a relatively large scintillating crystal coupled to either a number of photomultipliers (PMTs) (Anger Camera) or to a single large Position Sensitive PMT (PSPMT). Recently the development of new diagnostic techniques, such as scintimammography and radio-guided surgery, have highlighted a number of significant limitations of the Anger camera in such imaging procedures. In this paper a dedicated gamma camera is proposed for clinical applications with the aim of improving image quality by utilizing detectors with an appropriate size and shape for the part of the body under examination. This novel scintillation camera is based upon an array of PSPMTs (Hamamatsu R5900-C8). The basic concept of this camera is identical to the Anger Camera with the exception of the substitution of PSPMTs for the PMTs. In this configuration it is possible to use the high resolution of the PSPMTs and still correctly position events lying between PSPMTs. In this work the test configuration is a 2 by 2 array of PSPMTs. Some advantages of this camera are: spatial resolution less than 2 mm FWHM, good linearity, thickness less than 3 cm, light weight, lower cost than equivalent area PSPMT, large detection area when coupled to scintillating arrays, small dead boundary zone (< 3 mm) and flexibility in the shape of the camera.
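
    The positioning principle shared with the classical Anger camera is a signal-weighted centroid over the readout channels. The sketch below shows that calculation for a toy 2 x 2 PSPMT layout; the coordinates and signal values are illustrative only, not the instrument's actual readout.

    ```python
    # Anger-style centroid positioning: event position as the signal-weighted
    # mean of the readout-channel coordinates.
    import numpy as np

    def anger_position(signals, x_coords, y_coords):
        total = signals.sum()
        return (signals @ x_coords) / total, (signals @ y_coords) / total

    # toy 2x2 PSPMT layout (centre coordinates in mm), illustrative signals
    x = np.array([-13.0, 13.0, -13.0, 13.0])
    y = np.array([-13.0, -13.0, 13.0, 13.0])
    s = np.array([120.0, 300.0, 60.0, 150.0])
    print(anger_position(s, x, y))   # estimated event position (mm)
    ```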

  14. Mars Exploration Rover engineering cameras

    USGS Publications Warehouse

    Maki, J.N.; Bell, J.F.; Herkenhoff, K. E.; Squyres, S. W.; Kiely, A.; Klimesh, M.; Schwochert, M.; Litwin, T.; Willson, R.; Johnson, Aaron H.; Maimone, M.; Baumgartner, E.; Collins, A.; Wadsworth, M.; Elliot, S.T.; Dingizian, A.; Brown, D.; Hagerott, E.C.; Scherr, L.; Deen, R.; Alexander, D.; Lorre, J.

    2003-01-01

    NASA's Mars Exploration Rover (MER) Mission will place a total of 20 cameras (10 per rover) onto the surface of Mars in early 2004. Fourteen of the 20 cameras are designated as engineering cameras and will support the operation of the vehicles on the Martian surface. Images returned from the engineering cameras will also be of significant importance to the scientific community for investigative studies of rock and soil morphology. The Navigation cameras (Navcams, two per rover) are a mast-mounted stereo pair each with a 45° square field of view (FOV) and an angular resolution of 0.82 milliradians per pixel (mrad/pixel). The Hazard Avoidance cameras (Hazcams, four per rover) are a body-mounted, front- and rear-facing set of stereo pairs, each with a 124° square FOV and an angular resolution of 2.1 mrad/pixel. The Descent camera (one per rover), mounted to the lander, has a 45° square FOV and will return images with spatial resolutions of ~4 m/pixel. All of the engineering cameras utilize broadband visible filters and 1024 x 1024 pixel detectors. Copyright 2003 by the American Geophysical Union.

  15. The "All Sky Camera Network"

    ERIC Educational Resources Information Center

    Caldwell, Andy

    2005-01-01

    In 2001, the "All Sky Camera Network" came to life as an outreach program to connect the Denver Museum of Nature and Science (DMNS) exhibit "Space Odyssey" with Colorado schools. The network is comprised of cameras placed strategically at schools throughout Colorado to capture fireballs--rare events that produce meteorites. Meteorites have great…

  16. Sublimation of icy aggregates in the coma of comet 67P/Churyumov-Gerasimenko detected with the OSIRIS cameras on board Rosetta

    NASA Astrophysics Data System (ADS)

    Gicquel, A.; Vincent, J.-B.; Agarwal, J.; A'Hearn, M. F.; Bertini, I.; Bodewits, D.; Sierks, H.; Lin, Z.-Y.; Barbieri, C.; Lamy, P. L.; Rodrigo, R.; Koschny, D.; Rickman, H.; Keller, H. U.; Barucci, M. A.; Bertaux, J.-L.; Besse, S.; Cremonese, G.; Da Deppo, V.; Davidsson, B.; Debei, S.; Deller, J.; De Cecco, M.; Frattin, E.; El-Maarry, M. R.; Fornasier, S.; Fulle, M.; Groussin, O.; Gutiérrez, P. J.; Gutiérrez-Marquez, P.; Güttler, C.; Höfner, S.; Hofmann, M.; Hu, X.; Hviid, S. F.; Ip, W.-H.; Jorda, L.; Knollenberg, J.; Kovacs, G.; Kramm, J.-R.; Kührt, E.; Küppers, M.; Lara, L. M.; Lazzarin, M.; Moreno, J. J. Lopez; Lowry, S.; Marzari, F.; Masoumzadeh, N.; Massironi, M.; Moreno, F.; Mottola, S.; Naletto, G.; Oklay, N.; Pajola, M.; Pommerol, A.; Preusker, F.; Scholten, F.; Shi, X.; Thomas, N.; Toth, I.; Tubiana, C.

    2016-11-01

    Beginning in March 2014, the OSIRIS (Optical, Spectroscopic, and Infrared Remote Imaging System) cameras began capturing images of the nucleus and coma (gas and dust) of comet 67P/Churyumov-Gerasimenko using both the wide angle camera (WAC) and the narrow angle camera (NAC). The many observations taken since July of 2014 have been used to study the morphology, location, and temporal variation of the comet's dust jets. We analyzed the dust monitoring observations shortly after the southern vernal equinox on May 30 and 31, 2015 with the WAC at the heliocentric distance Rh = 1.53 AU, where it is possible to observe that the jet rotates with the nucleus. We found that the decline of brightness as a function of the distance of the jet is much steeper than the background coma, which is a first indication of sublimation. We adapted a model of sublimation of icy aggregates and studied the effect as a function of the physical properties of the aggregates (composition and size). The major finding of this article was that through the sublimation of the aggregates of dirty grains (radius a between 5 μm and 50 μm) we were able to completely reproduce the radial brightness profile of a jet beyond 4 km from the nucleus. To reproduce the data we needed to inject a number of aggregates between 8.5 × 10^13 and 8.5 × 10^10 for a = 5 μm and 50 μm respectively, or an initial mass of H2O ice around 22 kg.

  17. Sublimation of icy aggregates in the coma of comet 67P/Churyumov-Gerasimenko detected with the OSIRIS cameras onboard Rosetta.

    NASA Astrophysics Data System (ADS)

    Gicquel, A.; Vincent, J.-B.; Agarwal, J.; A'Hearn, M. F.; Bertini, I.; Bodewits, D.; Sierks, H.; Lin, Z.-Y.; Barbieri, C.; Lamy, P. L.; Rodrigo, R.; Koschny, D.; Rickman, H.; Keller, H. U.; Barucci, M. A.; Bertaux, J.-L.; Besse, S.; Cremonese, G.; Da Deppo, V.; Davidsson, B.; Debei, S.; Deller, J.; De Cecco, M.; Frattin, E.; El-Maarry, M. R.; Fornasier, S.; Fulle, M.; Groussin, O.; Gutiérrez, P. J.; Gutiérrez-Marquez, P.; Güttler, C.; Höfner, S.; Hofmann, M.; Hu, X.; Hviid, S. F.; Ip, W.-H.; Jorda, L.; Knollenberg, J.; Kovacs, G.; Kramm, J.-R.; Kührt, E.; Küppers, M.; Lara, L. M.; Lazzarin, M.; Moreno, J. J. Lopez; Lowry, S.; Marzari, F.; Masoumzadeh, N.; Massironi, M.; Moreno, F.; Mottola, S.; Naletto, G.; Oklay, N.; Pajola, M.; Pommerol, A.; Preusker, F.; Scholten, F.; Shi, X.; Thomas, N.; Toth, I.; Tubiana, C.

    2016-08-01

    Beginning in March 2014, the OSIRIS (Optical, Spectroscopic, and Infrared Remote Imaging System) cameras began capturing images of the nucleus and coma (gas and dust) of comet 67P/Churyumov-Gerasimenko using both the wide angle camera (WAC) and the narrow angle camera (NAC). The many observations taken since July of 2014 have been used to study the morphology, location, and temporal variation of the comet's dust jets. We analyzed the dust monitoring observations shortly after the southern vernal equinox on May 30 and 31, 2015 with the WAC at the heliocentric distance R_h = 1.53 AU, where it is possible to observe that the jet rotates with the nucleus. We found that the decline of brightness as a function of the distance of the jet is much steeper than the background coma, which is a first indication of sublimation. We adapted a model of sublimation of icy aggregates and studied the effect as a function of the physical properties of the aggregates (composition and size). The major finding of this article was that through the sublimation of the aggregates of dirty grains (radius a between 5 μm and 50 μm) we were able to completely reproduce the radial brightness profile of a jet beyond 4 km from the nucleus. To reproduce the data we needed to inject a number of aggregates between 8.5 × 10^13 and 8.5 × 10^10 for a = 5 μm and 50 μm respectively, or an initial mass of H2O ice of around 22 kg.

  18. 31. TACOMA NARROWS BRIDGE, LOOKING WEST ACROSS TOLL PLAZA, 29 ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    31. TACOMA NARROWS BRIDGE, LOOKING WEST ACROSS TOLL PLAZA, 29 AUGUST 1940. (ELDRIDGE, CLARK M. TACOMA NARROWS BRIDGE, TACOMA, WASHINGTON, FINAL REPORT ON DESIGN AND CONSTRUCTION, 1941) - Tacoma Narrows Bridge, Spanning Narrows at State Route 16, Tacoma, Pierce County, WA

  19. 30. TACOMA NARROWS BRIDGE, LOOKING EAST THROUGH TOLL LANES, 29 ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    30. TACOMA NARROWS BRIDGE, LOOKING EAST THROUGH TOLL LANES, 29 AUGUST 1940. (ELDRIDGE, CLARK H. TACOMA NARROWS BRIDGE, TACOMA, WASHINGTON, FINAL REPORT ON DESIGN AND CONSTRUCTION, 1941) - Tacoma Narrows Bridge, Spanning Narrows at State Route 16, Tacoma, Pierce County, WA

  20. IMAX camera (12-IML-1)

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The IMAX camera system is used to record on-orbit activities of interest to the public. Because of the extremely high resolution of the IMAX camera, projector, and audio systems, the audience is afforded a motion picture experience unlike any other. IMAX and OMNIMAX motion picture systems were designed to create motion picture images of superior quality and audience impact. The IMAX camera is a 65 mm, single lens, reflex viewing design with a 15 perforation per frame horizontal pull across. The frame size is 2.06 x 2.77 inches. Film travels through the camera at a rate of 336 feet per minute when the camera is running at the standard 24 frames/sec.
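
    The quoted film transport rate is consistent with the pull-down geometry. A quick check, assuming the standard 65/70 mm perforation pitch of 0.187 inches (an assumption; the pitch is not given in the record):

        PERF_PITCH_IN = 0.187        # assumed standard 65/70 mm perforation pitch, inches
        perfs_per_frame = 15
        frames_per_sec = 24

        inches_per_sec = PERF_PITCH_IN * perfs_per_frame * frames_per_sec
        feet_per_minute = inches_per_sec * 60 / 12
        print(round(feet_per_minute))   # ~337 ft/min, consistent with the stated 336 feet per minute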

  1. Infants Experience Perceptual Narrowing for Nonprimate Faces

    ERIC Educational Resources Information Center

    Simpson, Elizabeth A.; Varga, Krisztina; Frick, Janet E.; Fragaszy, Dorothy

    2011-01-01

    Perceptual narrowing--a phenomenon in which perception is broad from birth, but narrows as a function of experience--has previously been tested with primate faces. In the first 6 months of life, infants can discriminate among individual human and monkey faces. Though the ability to discriminate monkey faces is lost after about 9 months, infants…

  2. Do narrow Σ-hypernuclear states exist?

    SciTech Connect

    Chrien, R.E.

    1995-12-31

    Reports of narrow states in Σ-hypernucleus production have appeared from time to time. The present experiment is a repeat of the first and seemingly most definitive such experiment, that on a target of ⁹Be, but with much better statistics. No narrow states were observed.

  3. Narrow band gap amorphous silicon semiconductors

    DOEpatents

    Madan, A.; Mahan, A.H.

    1985-01-10

    Disclosed is a narrow band gap amorphous silicon semiconductor comprising an alloy of amorphous silicon and a band gap narrowing element selected from the group consisting of Sn, Ge, and Pb, with an electron donor dopant selected from the group consisting of P, As, Sb, Bi and N. The process for producing the narrow band gap amorphous silicon semiconductor comprises the steps of forming an alloy comprising amorphous silicon and at least one of the aforesaid band gap narrowing elements in amount sufficient to narrow the band gap of the silicon semiconductor alloy below that of amorphous silicon, and also utilizing sufficient amounts of the aforesaid electron donor dopant to maintain the amorphous silicon alloy as an n-type semiconductor.

  4. Lights, Camera, Courtroom? Should Trials Be Televised?

    ERIC Educational Resources Information Center

    Kirtley, Jane E.; Brothers, Thomas W.; Veal, Harlan K.

    1999-01-01

    Presents three differing perspectives from American Bar Association members on whether television cameras should be allowed in the courtroom. Contends that cameras should be allowed with differing degrees of certainty: cameras truly open the courts to the public; cameras must be strategically placed; and cameras should be used only with the…

  5. Proportional counter radiation camera

    DOEpatents

    Borkowski, C.J.; Kopp, M.K.

    1974-01-15

    A gas-filled proportional counter camera that images photon emitting sources is described. A two-dimensional, position-sensitive proportional multiwire counter is provided as the detector. The counter consists of a high-voltage anode screen sandwiched between orthogonally disposed planar arrays of multiple parallel-strung, resistively coupled cathode wires. Two terminals from each of the cathode arrays are connected to separate timing circuitry to obtain separate X and Y coordinate signal values from pulse shape measurements to define the position of an event within the counter arrays which may be recorded by various means for data display. The counter is further provided with a linear drift field which effectively enlarges the active gas volume of the counter and constrains the recoil electrons produced from ionizing radiation entering the counter to drift perpendicularly toward the planar detection arrays. A collimator is interposed between a subject to be imaged and the counter to transmit only the radiation from the subject which has a perpendicular trajectory with respect to the planar cathode arrays of the detector. (Official Gazette)

  6. Camera sensitivity study

    NASA Astrophysics Data System (ADS)

    Schlueter, Jonathan; Murphey, Yi L.; Miller, John W. V.; Shridhar, Malayappan; Luo, Yun; Khairallah, Farid

    2004-12-01

    As the cost/performance ratio of vision systems improves with time, new classes of applications become feasible. One such area, automotive applications, is currently being investigated. Applications include occupant detection, collision avoidance and lane tracking. Interest in occupant detection has been spurred by federal automotive safety rules in response to injuries and fatalities caused by deployment of occupant-side air bags. In principle, a vision system could control airbag deployment to prevent this type of mishap. Employing vision technology here, however, presents a variety of challenges, which include controlling costs, the inability to control illumination, developing and training a reliable classification system, and loss of performance due to production variations arising from manufacturing tolerances and customer options. This paper describes the measures that have been developed to evaluate the sensitivity of an occupant detection system to these types of variations. Two procedures are described for evaluating how sensitive the classifier is to camera variations. The first procedure is based on classification accuracy while the second evaluates feature differences.

  7. Wide Angle View of Arsia Mons Volcano

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Arsia Mons (above) is one of the largest volcanoes known. This shield volcano is part of an aligned trio known as the Tharsis Montes--the others are Pavonis Mons and Ascraeus Mons. Arsia Mons is rivaled only by Olympus Mons in terms of its volume. The summit of Arsia Mons is more than 9 kilometers (5.6 miles) higher than the surrounding plains. The crater--or caldera--at the volcano summit is approximately 110 km (68 mi) across. This view of Arsia Mons was taken by the red and blue wide angle cameras of the Mars Global Surveyor Mars Orbiter Camera (MOC) system. Bright water ice clouds (the whitish/bluish wisps) hang above the volcano--a common sight every martian afternoon in this region. Arsia Mons is located at 120° west longitude and 9° south latitude. Illumination is from the left.

  8. CCD Camera Observations

    NASA Astrophysics Data System (ADS)

    Buchheim, Bob; Argyle, R. W.

    One night late in 1918, astronomer William Milburn, observing the region of Cassiopeia from Reverend T.H.E.C. Espin's observatory in Tow Law (England), discovered a hitherto unrecorded double star (Wright 1993). He reported it to Rev. Espin, who measured the pair using his 24-in. reflector: the fainter star was 6.0 arcsec from the primary, at position angle 162.4° (i.e. the fainter star was south-by-southeast from the primary) (Espin 1919). Some time later, it was recognized that the astrograph of the Vatican Observatory had taken an image of the same star-field a dozen years earlier, in late 1906. At that earlier epoch, the fainter star had been separated from the brighter one by only 4.8 arcsec, at position angle 186.2° (i.e. almost due south). Were these stars a binary pair, or were they just two unrelated stars sailing past each other? Some additional measurements might have begun to answer this question. If the secondary star was following a curved path, that would be a clue of orbital motion; if it followed a straight-line path, that would be a clue that these are just two stars passing in the night. Unfortunately, nobody took the trouble to re-examine this pair for almost a century, until the 2MASS astrometric/photometric survey recorded it in late 1998. After almost another decade, this amateur astronomer took some CCD images of the field in 2007, and added another data point on the star's trajectory, as shown in Fig. 15.1.
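
    The historical measurements can be compared directly by converting each (separation, position angle) pair into east/north offsets of the secondary relative to the primary; whether later measurements fall on a straight line or a curve is what distinguishes a passing optical pair from an orbiting binary. A minimal sketch using the values quoted above:

        import math

        def offsets(sep_arcsec, pa_deg):
            # Position angle is measured from north through east.
            pa = math.radians(pa_deg)
            return sep_arcsec * math.sin(pa), sep_arcsec * math.cos(pa)

        print(offsets(4.8, 186.2))   # 1906 epoch: roughly (-0.52, -4.77) arcsec (east, north)
        print(offsets(6.0, 162.4))   # 1918 epoch: roughly (+1.81, -5.72) arcsec (east, north)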

  9. Vision Sensors and Cameras

    NASA Astrophysics Data System (ADS)

    Hoefflinger, Bernd

    Silicon charge-coupled-device (CCD) imagers have been and are a specialty market ruled by a few companies for decades. Based on CMOS technologies, active-pixel sensors (APS) began to appear in 1990 at the 1 μm technology node. These pixels allow random access, global shutters, and they are compatible with focal-plane imaging systems combining sensing and first-level image processing. The progress towards smaller features and towards ultra-low leakage currents has provided reduced dark currents and μm-size pixels. All chips offer megapixel resolution, and many have very high sensitivities equivalent to ASA 12,800. As a result, HDTV video cameras will become a commodity. Because charge-integration sensors suffer from a limited dynamic range, significant processing effort is spent on multiple exposure and piece-wise analog-digital conversion to reach ranges >10,000:1. The fundamental alternative is log-converting pixels with an eye-like response. This offers a range of almost a million to 1, constant contrast sensitivity and constant colors, important features in professional, technical and medical applications. 3D retino-morphic stacking of sensing and processing on top of each other is being revisited with sub-100 nm CMOS circuits and with TSV technology. With sensor outputs directly on top of neurons, neural focal-plane processing will regain momentum, and new levels of intelligent vision will be achieved. The industry push towards thinned wafers and TSV enables backside-illuminated and other pixels with a 100% fill-factor. 3D vision, which relies on stereo or on time-of-flight, high-speed circuitry, will also benefit from scaled-down CMOS technologies both because of their size as well as their higher speed.

  10. Observation of Planetary Motion Using a Digital Camera

    ERIC Educational Resources Information Center

    Meyn, Jan-Peter

    2008-01-01

    A digital SLR camera with a standard lens (50 mm focal length, f/1.4) on a fixed tripod is used to obtain photographs of the sky which contain stars up to apparent magnitude 8. The angle of view is large enough to ensure visual identification of the photograph with a large sky region in a stellar map. The resolution is sufficient to…
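
    The angle of view in such a setup follows from the focal length and the sensor dimensions. A minimal sketch, assuming a full-frame 36 mm x 24 mm sensor (the record does not state the sensor size):

        import math

        def angle_of_view_deg(sensor_mm, focal_mm):
            return 2.0 * math.degrees(math.atan(sensor_mm / (2.0 * focal_mm)))

        print(angle_of_view_deg(36.0, 50.0))   # ~39.6 deg horizontal
        print(angle_of_view_deg(24.0, 50.0))   # ~27.0 deg vertical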

  11. Vertical ocean reflectance at low altitudes for narrow laser beams

    NASA Astrophysics Data System (ADS)

    Crittenden, Eugene C., Jr.; Rodeback, G. W.; Milne, Edmund A.; Cooper, Alfred W.

    1991-09-01

    A narrow-beam laser altimeter was used to measure the reflected signal from the ocean surface as represented by the waters beneath the Golden Gate Bridge. This site allowed precise measurements as a function of angle from the vertical not possible from flying platforms. For short-wavelength water waves superimposed on swell, the signal amplitude probability distribution for the reflected signals showed periods of zero reflection, even for vertical incidence, apparently due to tipping of the water surface. The nonzero signals showed a distribution that could be fitted with an antilog-normal distribution, which is skewed toward higher signals than a normal (Gaussian) distribution. With the incidence angle displaced from the vertical, the distribution shape was retained but with more frequent zero reflections. The decrease with angle of the average signal, including the zeroes, is well fitted with a Gram-Charlier distribution, as seen by earlier observers using photographic techniques which masked these details of the structure. For the simpler wave pattern due to a long sustained wind direction, the probability distribution is log-normal with no zero-signal periods. At large angles from the vertical the log-normal distribution shifts toward exponential. For surface states intermediate between the above two extremes the distribution is often normal. The larger return signals resulting from the skew of the log-normal distribution toward higher amplitudes are more favorable for disposable laser altimeters than previously believed. Also, for an altimeter which may be swinging from a parachute or balloon, the return remains high at angles other than vertical. The presence of occasional zero return signals does somewhat degrade the accuracy of altitude measurement for a descending altimeter, but the signal available assures performance at larger altitudes than previously expected.

  12. Focal Plane Metrology for the LSST Camera

    SciTech Connect

    Rasmussen, Andrew P.; Hale, Layton; Kim, Peter; Lee, Eric; Perl, Martin; Schindler, Rafe; Takacs, Peter; Thurston, Timothy

    2007-01-10

    Meeting the science goals for the Large Synoptic Survey Telescope (LSST) translates into a demanding set of imaging performance requirements for the optical system over a wide (3.5°) field of view. In turn, meeting those imaging requirements necessitates maintaining precise control of the focal plane surface (10 μm P-V) over the entire field of view (640 mm diameter) at the operating temperature (T ≈ -100 °C) and over the operational elevation angle range. We briefly describe the hierarchical design approach for the LSST Camera focal plane and the baseline design for assembling the flat focal plane at room temperature. Preliminary results of gravity load and thermal distortion calculations are provided, and early metrological verification of candidate materials under cold thermal conditions is presented. A detailed, generalized method for stitching together sparse metrology data originating from differential, non-contact metrological data acquisition spanning multiple (non-continuous) sensor surfaces making up the focal plane is described and demonstrated. Finally, we describe some in situ alignment verification alternatives, some of which may be integrated into the camera's focal plane.

  13. X-ray Pinhole Camera Measurements

    SciTech Connect

    Nelson, D. S.; Berninger, M. J.; Flores, P. A.; Good, D. E.; Henderson, D. J.; Hogge, K. W.; Huber, S. R.; Lutz, S. S.; Mitchell, S. E.; Howe, R. A.; Mitton, C. V.; Molina, I.; Bozman, D. R.; Cordova, S. R.; Mitchell, D. R.; Oliver, B. V.; Ormond, E. C.

    2013-07-01

    The development of the rod pinch diode [1] has led to high-resolution radiography for dynamic events such as explosive tests. Rod pinch diodes use a small diameter anode rod, which extends through the aperture of a cathode plate. Electrons borne off the aperture surface can self-insulate and pinch onto the tip of the rod, creating an intense, small x-ray source (Primary Pinch). This source has been utilized as the main diagnostic on numerous experiments that include high-value, single-shot events. In such applications there is an emphasis on machine reliability, x-ray reproducibility, and x-ray quality [2]. In tests with the baseline rod pinch diode, we have observed that an additional pinch (Secondary Pinch) occurs at the interface near the anode rod and the rod holder. This suggests that stray electrons exist that are not associated with the Primary Pinch. In this paper we present measurements on both pinches using an x-ray pinhole camera. The camera is placed downstream of the Primary Pinch at an angle of 60° with respect to the diode centerline. This diagnostic will be employed to diagnose x-ray reproducibility and quality. In addition, we will investigate the performance of hybrid diodes relating to the formation of the Primary and Secondary Pinches.

  14. An Inexpensive Digital Infrared Camera

    ERIC Educational Resources Information Center

    Mills, Allan

    2012-01-01

    Details are given for the conversion of an inexpensive webcam to a camera specifically sensitive to the near infrared (700-1000 nm). Some experiments and practical applications are suggested and illustrated. (Contains 9 figures.)

  15. Reading Angles in Maps

    ERIC Educational Resources Information Center

    Izard, Véronique; O'Donnell, Evan; Spelke, Elizabeth S.

    2014-01-01

    Preschool children can navigate by simple geometric maps of the environment, but the nature of the geometric relations they use in map reading remains unclear. Here, children were tested specifically on their sensitivity to angle. Forty-eight children (age 47:15-53:30 months) were presented with fragments of geometric maps, in which angle sections…

  16. Solid State Television Camera (CID)

    NASA Technical Reports Server (NTRS)

    Steele, D. W.; Green, W. T.

    1976-01-01

    The design, development and test are described of a charge injection device (CID) camera using a 244x248 element array. A number of video signal processing functions are included which maximize the output video dynamic range while retaining the inherently good resolution response of the CID. Some of the unique features of the camera are: low light level performance, high S/N ratio, antiblooming, geometric distortion, sequential scanning and AGC.

  17. The future of consumer cameras

    NASA Astrophysics Data System (ADS)

    Battiato, Sebastiano; Moltisanti, Marco

    2015-03-01

    In the last two decades multimedia devices, and in particular imaging devices (camcorders, tablets, mobile phones, etc.), have become dramatically widespread. Moreover, the increase in their computational performance, combined with higher storage capacity, allows them to process large amounts of data. In this paper an overview of the current trends of the consumer camera market and technology will be given, providing also some details about the recent past (from the Digital Still Camera up to today) and forthcoming key issues.

  18. Astronomy and the camera obscura

    NASA Astrophysics Data System (ADS)

    Feist, M.

    2000-02-01

    The camera obscura (from Latin meaning darkened chamber) is a simple optical device with a long history. In the form considered here, it can be traced back to 1550. It had its heyday during the Victorian era when it was to be found at the seaside as a tourist attraction or sideshow. It was also used as an artist's drawing aid and, in 1620, the famous astronomer-mathematician, Johannes Kepler used a small tent camera obscura to trace the scenery.

  19. Picosecond (picoframe) framing camera evaluations.

    PubMed

    Liu, Y; Sibbett, W; Walker, D R

    1992-03-01

    Detailed theoretical evaluations of picoframe-I- and II-type framing cameras are presented, and predicted performance characteristics are compared with experimental results. The methods of theoretical simulations are described, and a suite of computer programs was developed. The theoretical analyses indicate that the existence of fringe fields in the vicinity of the deflectors is the main factor that limits the dynamic spatial resolutions and frame times of these particular designs of framing camera, and possible refinements are outlined. PMID:20720702

  20. Two-Step Camera Calibration Method Developed for Micro UAV'S

    NASA Astrophysics Data System (ADS)

    Gašparović, M.; Gajski, D.

    2016-06-01

    The development of unmanned aerial vehicles (UAVs) and the continuous price reduction of unmanned systems attracted us to this research. Professional measuring systems are dozens of times more expensive and often heavier than "amateur", non-metric UAVs. For this reason, we tested the DJI Phantom 2 Vision Plus UAV. The Phantom's smaller mass and velocity give it less kinetic energy than professional measurement platforms, which makes it potentially less dangerous for use in populated areas. In this research, we wanted to investigate the capability of such a non-metric UAV and find the procedures under which this kind of UAV may be used for photogrammetric survey. It is important to emphasize that the UAV is equipped with an ultra-wide-angle camera with a 14 MP sensor. Calibration of such cameras is a complex process. In this research, a new two-step process is presented and developed, and the results are compared with the standard one-step camera calibration procedure. The two-step process involves initially removing distortion from all images and then using these images in the phototriangulation with self-calibration. The paper presents statistical indicators which prove that the proposed two-step process is a better and more accurate procedure for calibrating these types of cameras than the standard one-step calibration. We therefore suggest the two-step calibration process as the standard for ultra-wide-angle cameras on unmanned aircraft.
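
    The first step of the two-step procedure, removing lens distortion from every image before the self-calibrating phototriangulation, can be approximated with standard tools. A minimal sketch using OpenCV; the camera matrix and distortion coefficients below are placeholders standing in for a prior calibration, not values from the paper:

        import cv2
        import numpy as np

        # Placeholder intrinsics and distortion coefficients (k1, k2, p1, p2, k3)
        K = np.array([[2300.0, 0.0, 2304.0],
                      [0.0, 2300.0, 1536.0],
                      [0.0, 0.0, 1.0]])
        dist = np.array([-0.30, 0.12, 0.0, 0.0, -0.02])

        frame = np.zeros((3072, 4608, 3), dtype=np.uint8)   # stand-in for a 14 MP frame
        undistorted = cv2.undistort(frame, K, dist)         # step 1: remove distortion
        # Step 2 (not shown): run the phototriangulation with self-calibration
        # on the undistorted images, as described above.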

  1. A Three-Line Stereo Camera Concept for Planetary Exploration

    NASA Technical Reports Server (NTRS)

    Sandau, Rainer; Hilbert, Stefan; Venus, Holger; Walter, Ingo; Fang, Wai-Chi; Alkalai, Leon

    1997-01-01

    This paper presents a low-weight stereo camera concept for planetary exploration. The camera uses three CCD lines within the image plane of one single objective. Some of the main features of the camera include: focal length 90 mm, FOV 18.5°, IFOV 78 μrad, convergence angles ±10°, radiometric dynamics 14 bit, weight 2 kg, and power consumption 12.5 W. From an orbit altitude of 250 km the ground pixel size is 20 m x 20 m and the swath width is 82 km. The CCD line data is buffered in the camera's internal mass memory of 1 Gbit. After performing radiometric correction and application-dependent preprocessing, the data is compressed and ready for downlink. Due to the aggressive application of advanced technologies in the area of microelectronics and innovative optics, the low mass and power budgets of 2 kg and 12.5 W are achieved while still maintaining high performance. The design of the proposed light-weight camera is also general purpose enough to be applicable to other planetary missions such as the exploration of Mars, Mercury, and the Moon. Moreover, it is an example of excellent international collaboration on advanced technology concepts developed at DLR, Germany, and NASA's Jet Propulsion Laboratory, USA.
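
    The stated ground pixel size and swath width are consistent with the quoted IFOV, FOV, and orbit altitude. A quick check of the geometry:

        import math

        altitude_m = 250e3
        ifov_rad = 78e-6
        fov_deg = 18.5

        ground_pixel_m = ifov_rad * altitude_m                               # ~19.5 m (stated 20 m)
        swath_km = 2.0 * altitude_m * math.tan(math.radians(fov_deg) / 2.0) / 1e3
        print(ground_pixel_m, swath_km)                                      # swath ~81.4 km (stated 82 km)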

  2. Hot Wax Sweeps Debris From Narrow Passages

    NASA Technical Reports Server (NTRS)

    Ricklefs, Steven K.

    1990-01-01

    A safe and effective technique for the removal of debris and contaminants from narrow passages involves entrainment of the undesired material in a thermoplastic casting material. Semisolid wax slightly below its melting temperature is pushed along the passage by pressurized nitrogen to remove debris. The technique was devised to clean out fuel passages in the main combustion chamber of the Space Shuttle main engine. It has also been applied to narrow, intricate passages in internal-combustion-engine blocks, carburetors, injection molds, and other complicated parts.

  3. Science, conservation, and camera traps

    USGS Publications Warehouse

    Nichols, James D.; Karanth, K. Ullas; O'Connel, Allan F.; O'Connell, Allan F.; Nichols, James D.; Karanth, K. Ullas

    2011-01-01

    Biologists commonly perceive camera traps as a new tool that enables them to enter the hitherto secret world of wild animals. Camera traps are being used in a wide range of studies dealing with animal ecology, behavior, and conservation. Our intention in this volume is not to simply present the various uses of camera traps, but to focus on their use in the conduct of science and conservation. In this chapter, we provide an overview of these two broad classes of endeavor and sketch the manner in which camera traps are likely to be able to contribute to them. Our main point here is that neither photographs of individual animals, nor detection history data, nor parameter estimates generated from detection histories are the ultimate objective of a camera trap study directed at either science or management. Instead, the ultimate objectives are best viewed as either gaining an understanding of how ecological systems work (science) or trying to make wise decisions that move systems from less desirable to more desirable states (conservation, management). Therefore, we briefly describe here basic approaches to science and management, emphasizing the role of field data and associated analyses in these processes. We provide examples of ways in which camera trap data can inform science and management.

  4. Person re-identification over camera networks using multi-task distance metric learning.

    PubMed

    Ma, Lianyang; Yang, Xiaokang; Tao, Dacheng

    2014-08-01

    Person reidentification in a camera network is a valuable yet challenging problem to solve. Existing methods learn a common Mahalanobis distance metric by using the data collected from different cameras and then exploit the learned metric for identifying people in the images. However, the cameras in a camera network have different settings and the recorded images are seriously affected by variability in illumination conditions, camera viewing angles, and background clutter. Using a common metric to conduct person reidentification tasks on different camera pairs overlooks the differences in camera settings; however, it is very time-consuming to label people manually in images from surveillance videos. For example, in most existing person reidentification data sets, only one image of a person is collected from each of only two cameras; therefore, directly learning a unique Mahalanobis distance metric for each camera pair is susceptible to over-fitting by using insufficiently labeled data. In this paper, we reformulate person reidentification in a camera network as a multitask distance metric learning problem. The proposed method designs multiple Mahalanobis distance metrics to cope with the complicated conditions that exist in typical camera networks. We address the fact that these Mahalanobis distance metrics are different but related, and learned by adding joint regularization to alleviate over-fitting. Furthermore, by extending, we present a novel multitask maximally collapsing metric learning (MtMCML) model for person reidentification in a camera network. Experimental results demonstrate that formulating person reidentification over camera networks as multitask distance metric learning problem can improve performance, and our proposed MtMCML works substantially better than other current state-of-the-art person reidentification methods.
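
    At the core of the approach is the Mahalanobis distance between the feature vectors of two images, d(x, y) = sqrt((x - y)^T M (x - y)), where M is the learned positive semidefinite metric for a given camera pair. A minimal sketch of the distance computation only; the multitask learning of the metrics themselves is not reproduced here:

        import numpy as np

        def mahalanobis(x, y, M):
            # Distance between feature vectors x and y under the metric M (PSD matrix).
            d = x - y
            return float(np.sqrt(d @ M @ d))

        rng = np.random.default_rng(0)
        x, y = rng.normal(size=64), rng.normal(size=64)
        A = rng.normal(size=(64, 64))
        M = A.T @ A                 # any matrix of this form is positive semidefinite
        print(mahalanobis(x, y, M))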

  5. Further improvements in the design of a positron camera with dense drift space MWPCs

    NASA Astrophysics Data System (ADS)

    Perez-Mendez, V.; Schwartz, G.; Nelson, W. R.; Bellazzini, R.; Del Guerra, A.; Massai, M. M.; Spandre, G.

    1983-11-01

    We describe the improvements achieved in the last three years towards the construction of a large solid angle positron camera with dense drift space MWPCs. A multiplane three-dimensional tomograph is proposed, made of six MWPC modules (active area 45 × 45 cm² each), arranged to form the lateral surface of a hexagonal prism. Its expected performance is presented and is shown to be very competitive with the multiring scintillator positron camera.

  6. Sub-Camera Calibration of a Penta-Camera

    NASA Astrophysics Data System (ADS)

    Jacobsen, K.; Gerke, M.

    2016-03-01

    Penta cameras, consisting of a nadir and four inclined cameras, are becoming more and more popular, having the advantage of also imaging facades in built-up areas from four directions. Such system cameras require a boresight calibration of the geometric relation of the cameras to each other, but also a calibration of the sub-cameras. Based on data sets of the ISPRS/EuroSDR benchmark for multi-platform photogrammetry, the inner orientation of the IGI Penta DigiCAM used has been analyzed. The required image coordinates of the blocks Dortmund and Zeche Zollern have been determined by Pix4Dmapper and have been independently adjusted and analyzed by the program system BLUH. Dense matching by Pix4Dmapper provided 4.1 million image points in 314 images and 3.9 million image points in 248 images, respectively. With up to 19 and 29 images per object point, respectively, the images are well connected; nevertheless, the high numbers of images per object point are concentrated in the block centres, while the inclined images outside the block centre are satisfactorily, but not very strongly, connected. This leads to very high values for the Student test (T-test) of the finally used additional parameters, or in other words, the additional parameters are highly significant. The estimated radial symmetric distortion of the nadir sub-camera corresponds to the laboratory calibration of IGI, but there are still radial symmetric distortions for the inclined cameras with a size exceeding 5 μm, even if they were described as negligible based on the laboratory calibration. Radial and tangential effects in the image corners are limited but still present. Remarkable angular affine systematic image errors can be seen, especially in the block Zeche Zollern. Such deformations are unusual for digital matrix cameras, but they can be caused by the correlation between inner and exterior orientation if only parallel flight lines are used. With the exception of the angular affinity, the systematic image errors for corresponding…
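
    The radial symmetric distortion discussed above is commonly modelled as a polynomial in the image radius measured from the principal point, e.g. dr = k1*r^3 + k2*r^5. A minimal sketch of removing such a correction from image coordinates; the coefficients are illustrative only, not the calibrated values of the IGI Penta DigiCAM:

        import numpy as np

        def correct_radial(xy_mm, k1, k2):
            # Remove radial symmetric distortion dr = k1*r^3 + k2*r^5 from image
            # coordinates given in mm relative to the principal point.
            r = np.hypot(xy_mm[:, 0], xy_mm[:, 1])
            dr = k1 * r**3 + k2 * r**5
            scale = np.where(r > 0, (r - dr) / r, 1.0)
            return xy_mm * scale[:, None]

        pts = np.array([[10.0, 5.0], [-20.0, 15.0]])   # example image points, mm
        print(correct_radial(pts, k1=1e-5, k2=-1e-9))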

  7. Vibration detection and calibration method used to remote sensing optical camera

    NASA Astrophysics Data System (ADS)

    Li, Qi; Dong, Wende; Xu, Zhihai; Feng, Huajun

    2013-09-01

    In order to obtain sharp remote sensing images, image stabilization technology for space cameras and remote sensing image restoration technology are commonly used. Vibration detection is the key to realizing these technologies: an image stabilization system needs the displacement vector derived from vibration detection to drive the compensation mechanism, and remote sensing image restoration needs the vibration displacement vector to construct the point spread function (PSF). Vibration detection can be used not only to improve the image quality of panchromatic cameras, infrared cameras, and other optical cameras, but is also the basis of motion compensation for satellite radar equipment. In this paper we have constructed a vibration measuring method based on a fiber optic gyro (FOG). A FOG is a device sensitive to angular velocity or angular displacement. A high-precision FOG can be used to measure the jitter angle of the optic axis of a space camera fixed on a satellite platform. From the measured data, the vibration displacement vector of the imaging plane can be calculated. Consequently, the vibration data provide a basis for image stabilization of space cameras and restoration of remote sensing images. We simulated the vibration of a space camera by using a piezoelectric ceramic deflection platform, and calibrated the vibration measurement by using a laser beam and a high-speed linear array camera. We compared the feedback output of the deflection platform, the FOG measured data, and the calibrated data of the linear array camera, and obtained a calibration accuracy better than 1.5 μrad.
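
    For small jitter angles, the displacement of the image on the focal plane is, to first order, the focal length times the angular deviation of the optic axis. A minimal sketch of turning a FOG angle record into an image-plane displacement vector; the focal length and pixel pitch are hypothetical values, not those of the camera described above:

        import numpy as np

        focal_mm = 500.0      # hypothetical focal length
        pixel_um = 10.0       # hypothetical detector pixel pitch

        # Measured jitter of the optic axis (pitch, yaw) in microradians, e.g. from a FOG
        jitter_urad = np.array([[0.0, 0.0], [1.2, -0.8], [2.5, 0.3]])

        disp_um = focal_mm * 1e3 * jitter_urad * 1e-6   # small-angle: displacement = f * theta
        disp_pix = disp_um / pixel_um
        print(disp_pix)   # image-plane motion in pixels, usable to build a PSF or drive stabilization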

  8. Method for pan-tilt camera calibration using single control point.

    PubMed

    Li, Yunting; Zhang, Jun; Hu, Wenwen; Tian, Jinwen

    2015-01-01

    The pan-tilt (PT) camera is widely used in video surveillance systems due to its rotatable property and low cost. The rough output of a PT camera may not satisfy the demands of practical applications; hence an accurate calibration method for a PT camera is desired. However, high-precision camera calibration methods usually require sufficient control points, which are not guaranteed in some practical cases of a PT camera. In this paper, we present a novel method to calibrate the rotation angles of a PT camera online by using only one control point. This is achieved by assuming that the intrinsic parameters and position of the camera are known in advance. More specifically, we first build a nonlinear PT camera model with respect to the two parameters Pan and Tilt. We then convert the nonlinear model into a linear model in the sine and cosine of Tilt, where each element in the augmented coefficient matrix is a function of the single variable Pan. A closed-form solution for Pan and Tilt can then be derived by solving a quadratic equation in the tangent of Pan. Our method is noniterative and does not need feature matching; thus it is more time-efficient. We evaluate our calibration method on various synthetic and real data. The quantitative results demonstrate that the proposed method outperforms other state-of-the-art methods if the intrinsic parameters and position of the camera are known in advance.

  9. The study of dual camera 3D coordinate vision measurement system using a special probe

    NASA Astrophysics Data System (ADS)

    Liu, Shugui; Peng, Kai; Zhang, Xuefei; Zhang, Haifeng; Huang, Fengshan

    2006-11-01

    Due to its high precision and convenient operation, the vision coordinate measurement machine with a single probe has become a research focus in the machine vision industry. In general, such a visual system can be set up conveniently with just one CCD camera and a probe. However, the price of the system surges too high to be acceptable when top-performance hardware, such as the CCD camera and image capture card, has to be used to obtain high axis-oriented measurement precision. In this paper, a new dual CCD camera vision coordinate measurement system based on a redundancy principle is proposed to achieve high precision at a moderate price. The two CCD cameras are placed with their camera axes at an angle of about 90 degrees, so that two sub-systems can be built, each consisting of one CCD camera and the probe. With the help of the probe, the inner and outer parameters of each camera are first calibrated, and the system is then set up using the redundancy technique. By eliminating within the two sub-systems the axis-oriented error, which is large and always exists in a single-camera system, high-precision measurement is obtained. Experimental results compared with those from a CMM show that the proposed system has excellent stability and precision, with an uncertainty of about ±0.1 mm in the x, y, and z directions over a distance of 2 m using two common CCD cameras.

  10. Solder wetting kinetics in narrow V-grooves

    SciTech Connect

    Yost, F.G.; Rye, R.R.; Mann, J.A. Jr.

    1997-12-01

    Experiments are performed to observe capillary flow in grooves cut into copper surfaces. Flow kinetics of two liquids, 1-heptanol and eutectic Sn-Pb solder, are modeled with modified Washburn kinetics and compared to flow data. It is shown that both liquids flow parabolically in narrow V-grooves, and the data scale as predicted by the modified Washburn model. The early portions of the flow kinetics are characterized by curvature in the length vs time relationship which is not accounted for in the modified Washburn model. This effect is interpreted in terms of a dynamic contact angle. It is concluded that under conditions of rapid flow, solder spreading can be understood as a simple fluid flow process. Slower kinetics, e.g. solder droplet spreading on flat surfaces, may be affected by subsidiary chemical processes such as reaction.
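
    Washburn-type capillary kinetics predict a parabolic relation between flow length and time, L(t) ≈ k·sqrt(t), which is the form the groove-flow data are scaled against. A minimal sketch of fitting that form to length-time measurements; the data points here are synthetic, for illustration only:

        import numpy as np
        from scipy.optimize import curve_fit

        def washburn(t, k):
            return k * np.sqrt(t)

        # Synthetic length-vs-time data (s, mm), illustrative only
        t = np.array([1.0, 4.0, 9.0, 16.0, 25.0])
        L = np.array([2.1, 4.0, 6.2, 7.9, 10.1])

        (k_fit,), _ = curve_fit(washburn, t, L)
        print(k_fit)   # ~2 mm per sqrt(s) for these synthetic data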

  11. Motion tracking in narrow spaces: a structured light approach.

    PubMed

    Olesen, Oline Vinter; Paulsen, Rasmus R; Højgaar, Liselotte; Roed, Bjarne; Larsen, Rasmus

    2010-01-01

    We present a novel tracking system for patient head motion inside 3D medical scanners. Currently, the system is targeted at the Siemens High Resolution Research Tomograph (HRRT) PET scanner. Partial face surfaces are reconstructed using a miniaturized structured light system. The reconstructed 3D point clouds are matched to a reference surface using a robust iterative closest point algorithm. A main challenge is the narrow geometry, requiring a compact structured light system and an oblique angle of observation. The system is validated using a mannequin head mounted on a rotary stage. We compare the system to a standard optical motion tracker based on a rigid tracking tool. Our system achieves an angular RMSE of 0.11 degrees, demonstrating its relevance for motion-compensated 3D scan image reconstruction as well as its competitiveness against the standard optical system, which has an RMSE of 0.08 degrees. Finally, we demonstrate qualitative results on real face motion estimation.

  12. Angles, Time, and Proportion

    ERIC Educational Resources Information Center

    Pagni, David L.

    2005-01-01

    This article describes an investigation making connections between the time on an analog clock and the angle between the minute hand and the hour hand. It was posed by a middle school mathematics teacher. (Contains 8 tables and 6 figures.)
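
    The connection can be written in one line: the hour hand advances 30 degrees per hour plus 0.5 degrees per minute, the minute hand 6 degrees per minute, so the separation is |30H + 0.5M - 6M| (taking the smaller of the two possible angles). A small worked example:

        def hand_angle(hour, minute):
            # Angle in degrees between the hour and minute hands of an analog clock.
            angle = abs(30 * (hour % 12) + 0.5 * minute - 6 * minute)
            return min(angle, 360 - angle)

        print(hand_angle(3, 0))    # 90.0
        print(hand_angle(9, 30))   # 105.0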

  13. Reading angles in maps.

    PubMed

    Izard, Véronique; O'Donnell, Evan; Spelke, Elizabeth S

    2014-01-01

    Preschool children can navigate by simple geometric maps of the environment, but the nature of the geometric relations they use in map reading remains unclear. Here, children were tested specifically on their sensitivity to angle. Forty-eight children (age 47:15-53:30 months) were presented with fragments of geometric maps, in which angle sections appeared without any relevant length or distance information. Children were able to read these map fragments and compare two-dimensional to three-dimensional angles. However, this ability appeared both variable and fragile among the youngest children of the sample. These findings suggest that 4-year-old children begin to form an abstract concept of angle that applies both to two-dimensional and three-dimensional displays and that serves to interpret novel spatial symbols. PMID:23647223

  14. Photometric Calibration of Consumer Video Cameras

    NASA Technical Reports Server (NTRS)

    Suggs, Robert; Swift, Wesley, Jr.

    2007-01-01

    Equipment and techniques have been developed to implement a method of photometric calibration of consumer video cameras for imaging of objects that are sufficiently narrow or sufficiently distant to be optically equivalent to point or line sources. Heretofore, it has been difficult to calibrate consumer video cameras, especially in cases of image saturation, because they exhibit nonlinear responses with dynamic ranges much smaller than those of scientific-grade video cameras. The present method not only takes this difficulty in stride but also makes it possible to extend effective dynamic ranges to several powers of ten beyond saturation levels. The method will likely be primarily useful in astronomical photometry. There are also potential commercial applications in medical and industrial imaging of point or line sources in the presence of saturation. This development was prompted by the need to measure brightnesses of debris in amateur video images of the breakup of the Space Shuttle Columbia. The purpose of these measurements is to use the brightness values to estimate relative masses of debris objects. In most of the images, the brightness of the main body of Columbia was found to exceed the dynamic ranges of the cameras. A similar problem arose a few years ago in the analysis of video images of Leonid meteors. The present method is a refined version of the calibration method developed to solve the Leonid calibration problem. In this method, one performs an end-to-end calibration of the entire imaging system, including not only the imaging optics and imaging photodetector array but also analog tape recording and playback equipment (if used) and any frame grabber or other analog-to-digital converter (if used). To automatically incorporate the effects of nonlinearity and any other distortions into the calibration, the calibration images are processed in precisely the same manner as are the images of meteors, space-shuttle debris, or other objects that one seeks to…

  15. CARTOGAM: a portable gamma camera

    NASA Astrophysics Data System (ADS)

    Gal, O.; Izac, C.; Lainé, F.; Nguyen, A.

    1997-02-01

    The gamma camera is designed to map radioactive sources against a visible background in quasi-real time. The device is intended to spot sources from a distance during the preparation of interventions on active areas of nuclear installations, and will make it possible to optimize interventions, especially at the dosimetric level. The camera consists of a double cone collimator, a scintillator and an intensified CCD camera. This chain of detection provides the formation of both gamma images and visible images. Even though it is wrapped in a denal shield, the camera is still portable (mass < 15 kg) and compact (external diameter = 8 cm). The angular resolution is of the order of one degree for gamma rays of 1 MeV. In a few minutes, the device is able to measure a dose rate of 10 μGy/h delivered, for instance, by a 60Co source of 90 mCi located at 10 m from the detector. The first images recorded in the laboratory will be presented and will illustrate the performance obtained with this camera.
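
    The quoted sensitivity is consistent with a simple inverse-square estimate. A quick check, assuming a 60Co air-kerma rate constant of roughly 0.31 μGy·m²/(MBq·h) (an assumed textbook value, not taken from the record):

        GAMMA_CO60 = 0.31           # assumed air-kerma rate constant, uGy*m^2/(MBq*h)
        activity_mbq = 90 * 37.0    # 90 mCi expressed in MBq
        distance_m = 10.0

        dose_rate = GAMMA_CO60 * activity_mbq / distance_m**2
        print(dose_rate)            # ~10 uGy/h, consistent with the sensitivity quoted above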

  16. The Clementine longwave infrared camera

    SciTech Connect

    Priest, R.E.; Lewis, I.T.; Sewall, N.R.; Park, H.S.; Shannon, M.J.; Ledebuhr, A.G.; Pleasance, L.D.; Massie, M.A.; Metschuleit, K.

    1995-04-01

    The Clementine mission provided the first ever complete, systematic surface mapping of the moon from the ultra-violet to the near-infrared regions. More than 1.7 million images of the moon, earth and space were returned from this mission. The longwave-infrared (LWIR) camera supplemented the UV/Visible and near-infrared mapping cameras providing limited strip coverage of the moon, giving insight to the thermal properties of the soils. This camera provided ~100 m spatial resolution at 400 km periselene, and a 7 km across-track swath. This 2.1 kg camera using a 128 x 128 Mercury-Cadmium-Telluride (MCT) FPA viewed thermal emission of the lunar surface and lunar horizon in the 8.0 to 9.5 μm wavelength region. A description of this light-weight, low power LWIR camera along with a summary of lessons learned is presented. Design goals and preliminary on-orbit performance estimates are addressed in terms of meeting the mission's primary objective for flight qualifying the sensors for future Department of Defense flights.

  17. 'Magic Angle Precession'

    SciTech Connect

    Binder, Bernd

    2008-01-21

    An advanced and exact geometric description of nonlinear precession dynamics modeling very accurately natural and artificial couplings showing Lorentz symmetry is derived. In the linear description it is usually ignored that the geometric phase of relativistic motion couples back to the orbital motion providing for a non-linear recursive precession dynamics. The high coupling strength in the nonlinear case is found to be a gravitomagnetic charge proportional to the precession angle and angular velocity generated by geometric phases, which are induced by high-speed relativistic rotations and are relevant to propulsion technologies but also to basic interactions. In the quantum range some magic precession angles indicating strong coupling in a phase-locked chaotic system are identified, emerging from a discrete time dynamical system known as the cosine map showing bifurcations at special precession angles relevant to heavy nuclei stability. The 'Magic Angle Precession' (MAP) dynamics can be simulated and visualized by cones rolling in or on each other, where the apex and precession angles are indexed by spin, charge or precession quantum numbers, and corresponding magic angles. The most extreme relativistic warping and twisting effect is given by the Dirac spinor half spin constellation with 'Hyperdiamond' MAP, which resembles quark confinement.

  18. Medium format cameras used by NASA astronauts

    NASA Technical Reports Server (NTRS)

    Amsbury, David; Bremer, Jeff

    1989-01-01

    The medium format cameras and other hardware used for photographing the earth from the Space Shuttle are discussed. Illustrations and descriptions are given for the two types of cameras used for most earth photography, the NASA-modified Hasselblad 500 EL/M 70-mm cameras and the Linhof AeroTechnika 45 camera. Also, the data recording modules used on Space Shuttle missions and a mounting device to produce simultaneous photography using two cameras are examined.

  19. Atomic momentum patterns with narrower intervals

    NASA Astrophysics Data System (ADS)

    Yang, Baoguo; Jin, Shengjie; Dong, Xiangyu; Liu, Zhe; Yin, Lan; Zhou, Xiaoji

    2016-10-01

    We studied the atomic momentum distribution of a superposition of Bloch states in the lowest band of an optical lattice after the action of a standing-wave pulse. By designing the imposed pulse acting on this superposed state, an atomic momentum pattern appears with a narrow interval between adjacent peaks that can be far less than twice the recoil momentum. The patterns with narrower intervals come from the effect of the designed pulse on the superposition of many Bloch states with quasimomenta throughout the first Brillouin zone. Our experimental result of narrow-interval peaks is consistent with the theoretical simulation. The patterns of multiple modes with different quasimomenta may be helpful for precise measurement and atomic manipulation.

  20. Narrow-bandwidth unstable laser resonator

    SciTech Connect

    Reintjes, J.F.; Tankersley, L.L.; Cooper, D.

    1988-10-21

    The present invention relates to unstable laser resonators, and particularly to an unstable laser resonator that produces optical radiation that simultaneously has the high output power and diffraction-limited divergence characteristic of an unstable laser resonator and also the narrow bandwidth that can usually be obtained only with a stable laser resonator. Some success was achieved in frequency narrowing of the laser radiation from an unstable laser resonator cavity by using a diffraction grating. This technique works best with lasers that have sharp line structure, such as molecular lasers. For example, selection of a single line in a hydrogen-fluoride laser has been reported in several configurations involving the insertion of a diffraction grating into a standard unstable laser resonator cavity. Although currently available unstable laser resonators are the configuration of choice for producing high-power, low-divergence radiation from laser cavities, they are not compatible with a simultaneous requirement of narrow bandwidth.

  1. Discovery of a narrow line quasar

    NASA Technical Reports Server (NTRS)

    Stocke, J.; Liebert, J.; Maccacaro, T.; Griffiths, R. E.; Steiner, J. E.

    1982-01-01

    A stellar object is reported which, while having X-ray and optical luminosities typical of quasars, has narrow permitted and forbidden emission lines over the observed spectral range. The narrow-line spectrum is high-excitation, the Balmer lines seem to be recombinational, and a redder optical spectrum than that of most quasars is exhibited, despite detection as a weak radio source. The object does not conform to the relationships between H-beta parameters and X-ray flux previously claimed for a large sample of the active galactic nuclei. Because reddish quasars with narrow lines, such as the object identified, may not be found by the standard techniques for the discovery of quasars, the object may be a prototype of a new class of quasars analogous to high-luminosity Seyfert type 2 galaxies. It is suggested that these objects cannot comprise more than 10% of all quasars.

  2. The GISMO-2 Bolometer Camera

    NASA Technical Reports Server (NTRS)

    Staguhn, Johannes G.; Benford, Dominic J.; Fixsen, Dale J.; Hilton, Gene; Irwin, Kent D.; Jhabvala, Christine A.; Kovacs, Attila; Leclercq, Samuel; Maher, Stephen F.; Miller, Timothy M.; Moseley, Samuel H.; Sharp, Elemer H.; Wollack, Edward J.

    2012-01-01

    We present the concept for the GISMO-2 bolometer camera, which we are building for background-limited operation at the IRAM 30 m telescope on Pico Veleta, Spain. GISMO-2 will operate simultaneously in the 1 mm and 2 mm atmospheric windows. The 1 mm channel uses a 32 x 40 TES-based Backshort Under Grid (BUG) bolometer array; the 2 mm channel operates with a 16 x 16 BUG array. The camera utilizes almost the entire field of view provided by the telescope. The optical design of GISMO-2 was strongly influenced by our experience with the GISMO 2 mm bolometer camera, which is successfully operating at the 30 m telescope. GISMO is accessible to the astronomical community through the regular IRAM call for proposals.

  3. Perceptual Color Characterization of Cameras

    PubMed Central

    Vazquez-Corral, Javier; Connah, David; Bertalmío, Marcelo

    2014-01-01

    Color camera characterization, mapping outputs from the camera sensors to an independent color space, such as XYZ, is an important step in the camera processing pipeline. Until now, this procedure has been primarily solved by using a 3 × 3 matrix obtained via a least-squares optimization. In this paper, we propose to use the spherical sampling method, recently published by Finlayson et al., to perform a perceptual color characterization. In particular, we search for the 3 × 3 matrix that minimizes three different perceptual errors, one pixel based and two spatially based. For the pixel-based case, we minimize the CIE ΔE error, while for the spatial-based case, we minimize both the S-CIELAB error and the CID error measure. Our results demonstrate an improvement of approximately 3% for the ΔE error, 7% for the S-CIELAB error and 13% for the CID error measures. PMID:25490586
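
    The baseline against which the perceptual optimization is compared is the conventional least-squares fit of a 3 x 3 matrix mapping camera RGB to XYZ. A minimal sketch of that baseline; the training data are synthetic placeholders, and the spherical-sampling perceptual search itself is not reproduced here:

        import numpy as np

        rng = np.random.default_rng(1)
        rgb = rng.uniform(size=(24, 3))                # e.g. camera responses to a 24-patch chart
        M_true = np.array([[0.41, 0.36, 0.18],
                           [0.21, 0.72, 0.07],
                           [0.02, 0.12, 0.95]])
        xyz = rgb @ M_true.T + rng.normal(scale=1e-3, size=(24, 3))   # "measured" tristimulus values

        # Least-squares 3x3 characterization matrix: xyz ~ rgb @ M.T
        M_ls, *_ = np.linalg.lstsq(rgb, xyz, rcond=None)
        print(M_ls.T)                                  # close to M_true for this synthetic example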

  4. Dark Energy Camera for Blanco

    SciTech Connect

    Binder, Gary A.; /Caltech /SLAC

    2010-08-25

    In order to make accurate measurements of dark energy, a system is needed to monitor the focus and alignment of the Dark Energy Camera (DECam) to be located on the Blanco 4m Telescope for the upcoming Dark Energy Survey. One new approach under development is to fit out-of-focus star images to a point spread function from which information about the focus and tilt of the camera can be obtained. As a first test of a new algorithm using this idea, simulated star images produced from a model of DECam in the optics software Zemax were fitted. Then, real images from the Mosaic II imager currently installed on the Blanco telescope were used to investigate the algorithm's capabilities. A number of problems with the algorithm were found, and more work is needed to understand its limitations and improve its capabilities so it can reliably predict camera alignment and focus.

  5. Camera-on-a-Chip

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Jet Propulsion Laboratory's research on a second generation, solid-state image sensor technology has resulted in the Complementary Metal- Oxide Semiconductor Active Pixel Sensor (CMOS), establishing an alternative to the Charged Coupled Device (CCD). Photobit Corporation, the leading supplier of CMOS image sensors, has commercialized two products of their own based on this technology: the PB-100 and PB-300. These devices are cameras on a chip, combining all camera functions. CMOS "active-pixel" digital image sensors offer several advantages over CCDs, a technology used in video and still-camera applications for 30 years. The CMOS sensors draw less energy, they use the same manufacturing platform as most microprocessors and memory chips, and they allow on-chip programming of frame size, exposure, and other parameters.

  6. Dynamic calibration of pan-tilt-zoom cameras for traffic monitoring.

    PubMed

    Song, Kai-Tai; Tai, Jen-Chao

    2006-10-01

    Pan-tilt-zoom (PTZ) cameras have been widely used in recent years for monitoring and surveillance applications. These cameras provide flexible view selection as well as a wider observation range. This makes them suitable for vision-based traffic monitoring and enforcement systems. To employ PTZ cameras for image measurement applications, one first needs to calibrate the camera to obtain meaningful results. For instance, the accuracy of estimating vehicle speed depends on the accuracy of camera calibration and that of vehicle tracking results. This paper presents a novel calibration method for a PTZ camera overlooking a traffic scene. The proposed approach requires no manual operation to select the positions of special features. It automatically uses a set of parallel lane markings and the lane width to compute the camera parameters, namely, focal length, tilt angle, and pan angle. Image processing procedures have been developed for automatically finding parallel lane markings. Interesting experimental results are presented to validate the robustness and accuracy of the proposed method.

  7. Wide field camera observations of Baade's Window

    NASA Technical Reports Server (NTRS)

    Holtzman, Jon A.; Light, R. M.; Baum, William A.; Worthey, Guy; Faber, S. M.; Hunter, Deidre A.; O'Neil, Earl J., Jr.; Kreidl, Tobias J.; Groth, E. J.; Westphal, James A.

    1993-01-01

    We have observed a field in Baade's Window using the Wide Field Camera (WFC) of the Hubble Space Telescope (HST) and obtain V- and I-band photometry down to V approximately 22.5. These data go several magnitudes fainter than previously obtained from the ground. The location of the break in the luminosity function suggests that there are a significant number of intermediate age (less than 10 Gyr) stars in the Galactic bulge. This conclusion rests on the assumptions that the extinction towards our field is similar to that seen in other parts of Baade's Window, that the distance to the bulge is approximately 8 kpc, and that we can determine fairly accurate zero points for the HST photometry. Changes in any one of these assumptions could increase the inferred age, but a conspiracy of lower reddening, a shorter distance to the bulge, and/or photometric zero-point errors would be needed to imply a population entirely older than 10 Gyr. We infer an initial mass function slope for the main-sequence stars, and find that it is consistent with that measured in the solar neighborhood; unfortunately, the slope is poorly constrained because we sample only a narrow range of stellar mass and because of uncertainties in the observed luminosity function at the faint end.

  8. A testbed for wide-field, high-resolution, gigapixel-class cameras.

    PubMed

    Kittle, David S; Marks, Daniel L; Son, Hui S; Kim, Jungsang; Brady, David J

    2013-05-01

    The high resolution and wide field of view (FOV) of the AWARE (Advanced Wide FOV Architectures for Image Reconstruction and Exploitation) gigapixel class cameras present new challenges in calibration, mechanical testing, and optical performance evaluation. The AWARE system integrates an array of micro-cameras in a multiscale design to achieve gigapixel sampling at video rates. Alignment and optical testing of the micro-cameras is vital in compositing engines, which require pixel-level accurate mappings over the entire array of cameras. A testbed has been developed to automatically calibrate and measure the optical performance of the entire camera array. This testbed utilizes translation and rotation stages to project a ray into any micro-camera of the AWARE system. A spatial light modulator is projected through a telescope to form an arbitrary object space pattern at infinity. This collimated source is then reflected by an elevation stage mirror for pointing through the aperture of the objective into the micro-optics and eventually the detector of the micro-camera. Different targets can be projected with the spatial light modulator for measuring the modulation transfer function (MTF) of the system, fiducials in the overlap regions for registration and compositing, distortion mapping, illumination profiles, thermal stability, and focus calibration. The mathematics of the testbed mechanics are derived for finding the positions of the stages to achieve a particular incident angle into the camera, along with calibration steps for alignment of the camera and testbed coordinate axes. Measurement results for the AWARE-2 gigapixel camera are presented for MTF, focus calibration, illumination profile, fiducial mapping across the micro-camera for registration and distortion correction, thermal stability, and alignment of the camera on the testbed.
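
    The testbed above evaluates, among other quantities, the modulation transfer function of each micro-camera. As a generic illustration only (the AWARE testbed projects its own targets with a spatial light modulator and has its own processing chain), the sketch below shows the standard way an MTF estimate is obtained from a measured one-dimensional edge spread function.

      import numpy as np

      def mtf_from_edge(edge_profile, sample_pitch_mm):
          # Differentiate the edge spread function (ESF) to get the line spread
          # function (LSF), window it, and take the magnitude of its FFT.
          # Assumes a uniformly sampled, low-noise profile; production slanted-edge
          # analysis (e.g. ISO 12233) oversamples the edge and handles noise.
          esf = np.asarray(edge_profile, dtype=float)
          lsf = np.gradient(esf)
          lsf = lsf * np.hanning(lsf.size)          # suppress truncation artifacts
          mtf = np.abs(np.fft.rfft(lsf))
          mtf = mtf / mtf[0]                        # normalise to unity at zero frequency
          freqs = np.fft.rfftfreq(lsf.size, d=sample_pitch_mm)  # cycles per mm
          return freqs, mtf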

  9. A testbed for wide-field, high-resolution, gigapixel-class cameras

    NASA Astrophysics Data System (ADS)

    Kittle, David S.; Marks, Daniel L.; Son, Hui S.; Kim, Jungsang; Brady, David J.

    2013-05-01

    The high resolution and wide field of view (FOV) of the AWARE (Advanced Wide FOV Architectures for Image Reconstruction and Exploitation) gigapixel class cameras present new challenges in calibration, mechanical testing, and optical performance evaluation. The AWARE system integrates an array of micro-cameras in a multiscale design to achieve gigapixel sampling at video rates. Alignment and optical testing of the micro-cameras is vital in compositing engines, which require pixel-level accurate mappings over the entire array of cameras. A testbed has been developed to automatically calibrate and measure the optical performance of the entire camera array. This testbed utilizes translation and rotation stages to project a ray into any micro-camera of the AWARE system. A spatial light modulator is projected through a telescope to form an arbitrary object space pattern at infinity. This collimated source is then reflected by an elevation stage mirror for pointing through the aperture of the objective into the micro-optics and eventually the detector of the micro-camera. Different targets can be projected with the spatial light modulator for measuring the modulation transfer function (MTF) of the system, fiducials in the overlap regions for registration and compositing, distortion mapping, illumination profiles, thermal stability, and focus calibration. The mathematics of the testbed mechanics are derived for finding the positions of the stages to achieve a particular incident angle into the camera, along with calibration steps for alignment of the camera and testbed coordinate axes. Measurement results for the AWARE-2 gigapixel camera are presented for MTF, focus calibration, illumination profile, fiducial mapping across the micro-camera for registration and distortion correction, thermal stability, and alignment of the camera on the testbed.

  10. Multi-Angle View of the Canary Islands

    NASA Technical Reports Server (NTRS)

    2000-01-01

    A multi-angle view of the Canary Islands in a dust storm, 29 February 2000. At left is a true-color image taken by the Multi-angle Imaging SpectroRadiometer (MISR) instrument on NASA's Terra satellite. This image was captured by the MISR camera looking at a 70.5-degree angle to the surface, ahead of the spacecraft. The middle image was taken by the MISR downward-looking (nadir) camera, and the right image is from the aftward 70.5-degree camera. The images are reproduced using the same radiometric scale, so variations in brightness, color, and contrast represent true variations in surface and atmospheric reflectance with angle. Windblown dust from the Sahara Desert is apparent in all three images, and is much brighter in the oblique views. This illustrates how MISR's oblique imaging capability makes the instrument a sensitive detector of dust and other particles in the atmosphere. Data for all channels are presented in a Space Oblique Mercator map projection to facilitate their co-registration. The images are about 400 km (250 miles) wide, with a spatial resolution of about 1.1 kilometers (1,200 yards). North is toward the top. MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is a division of the California Institute of Technology.

  11. Anterior chamber angle in the exfoliation syndrome.

    PubMed Central

    Wishart, P K; Spaeth, G L; Poryzees, E M

    1985-01-01

    The gonioscopic findings of 76 patients with the exfoliation syndrome were reviewed. A high frequency of narrowness of the anterior chamber (AC) angle was found (32%). 18% had angles considered occludable, and 14% had obvious angle-closure glaucoma as shown by the presence of peripheral anterior synechias (PAS). Increased pigmentation of the posterior trabecular meshwork (PTM) was noted in all cases. When this pigmentation was markedly asymmetrical, unilateral exfoliation with glaucoma was common in the more pigmented eye. In addition, heavy angle pigmentation in the absence of exfoliation was noted in the fellow eye of patients with characteristic exfoliated material in the other eye. Increased pigmentation of the PTM may be the earliest detectable sign of the exfoliation syndrome (ES). The clinical significance of estimating PTM pigmentation at the 12 o'clock position is discussed. In view of the accelerated optic nerve damage associated with the development of glaucoma secondary to ES, routine estimation of the pigmentation of the PTM at 12 o'clock is recommended in the hope of early detection of cases of otherwise inapparent ES. PMID:3966996

  12. Buoyancy-induced turbulent mixing in a narrow tilted tank

    NASA Astrophysics Data System (ADS)

    Lin, Tiras Y.; Caulfield, C. P.; Woods, Andrew W.

    2014-11-01

    We describe a series of experiments in which a constant buoyancy flux Bs of dyed salty water of density ρs is introduced at the top of a long narrow tank of square cross-section tilted at an angle θ from the vertical. The tank is initially filled with fresh clear water of density ρ0 < ρs, and we investigate the resulting buoyancy-driven high Reynolds number turbulent mixing at various tilt angles θ using a light-attenuation method. When θ > 0°, the ensemble-averaged reduced gravity develops a statically stable gradient normal to the walls of the tank, and this induces a counterflow. We model the evolution of the cross-tank, ensemble-averaged reduced gravity ⟨g′⟩_e as a diffusive process using Prandtl's mixing length theory, building on the model of van Sommeren et al. (JFM 701, 2012) who considered vertical tanks. We show that the counterflow acts to enhance the effective along-tank turbulent diffusivity, and from experiments, we find that the mixing length increases approximately linearly with θ, and that both the along-tank and cross-tank turbulent diffusivities are proportional to (∂⟨g′⟩_e/∂z)^(1/2).

  13. Narrow-band ELF events observed from South Pole Station

    NASA Astrophysics Data System (ADS)

    Heavisides, J.; Weaver, C.; Lessard, M.; Weatherwax, A. T.

    2012-12-01

    Extremely Low Frequency (ELF) waves are typically in the range of 3 Hz - 3 kHz and can play a role in acceleration and pitch-angle scattering of energetic particles in the radiation belts. Observations of a not uncommon, but not well studied ELF phenomenon are presented with ground-based data from South Pole Station. The narrow-band waves last approximately one or two minutes maintaining bandwidth over the course of the event, begin around 100 Hz, decrease to about 70 Hz, and typically show a higher frequency harmonic. The waves have only been documented at four locations - Heacock, 1974 (Alaska); Sentman and Ehring, 1994 (California); Wang et al, 2005 and Wang et al, 2011 (Taiwan); and Kim et al, 2006 (South Pole). The waves observed at the South Pole are not detected when the Sun drops below a 10 degree elevation angle, which is not true for the other locations. We extend the study of Kim et al, 2006, and explore possible generation mechanisms including sunlit ionosphere and ion cyclotron wave modes, as well as correspondence with energetic particle precipitation.

  14. Narrow-Band Applications of Communications Satellites.

    ERIC Educational Resources Information Center

    Cowlan, Bert; Horowitz, Andrew

    This paper attempts to describe the advantages of "narrow-band" applications of communications satellites for education. It begins by discussing the general controversy surrounding the use of satellites in education, by placing the concern within the larger context of the general debate over the uses of new technologies in education, and by…

  15. Progressive inflammatory subglottic narrowing responsive to steroids

    PubMed Central

    Phelan, Peter; Hey, Edmund

    1983-01-01

    Four children aged between 2½ and 13½ years developed insidious subglottic stenosis of unknown cause over 3-12 months. In all, the initial diagnosis was asthma which resulted in inappropriate treatment. Endoscopically there was circumferential subglottic narrowing, and biopsy in 3 showed non-specific inflammatory changes. Corticosteroid therapy led to rapid and complete resolution. PMID:6838258

  16. Adverse effects of prohibiting narrow provider networks.

    PubMed

    Howard, David H

    2014-08-14

    Many insurers participating in the new insurance exchanges are controlling costs by offering plans with narrow provider networks. Proposed regulations would promote network adequacy, but a pro-provider stance may not be inherently pro-consumer or even pro-patient. PMID:25119604

  17. Narrow-headed garter snake (Thamnophis rufipunctatus)

    USGS Publications Warehouse

    Nowak, Erika M.

    2006-01-01

    The narrow-headed garter snake is a harmless, nonvenomous snake that is distinguished by its elongated, triangular-shaped head and the red or dark spots on its olive to tan body. Today, the narrow-headed garter snake is a species of special concern in the United States because of its decline over much of its historic range. Arizona's Oak Creek has historically contained the largest population of narrow-headed garter snakes in the United States. The U.S. Geological Survey (USGS) and the Arizona Game and Fish Department jointly funded research by USGS scientists in Oak Creek to shed light on the factors causing declining population numbers. The research resulted in better understanding of the snake's habitat needs, winter and summer range, and dietary habits. Based on the research findings, the U.S. Forest Service has developed recommendations that visitors and local residents can adopt to help slow the decline of the narrow-headed garter snake in Oak Creek.

  18. Exploring the Moon at High-Resolution: First Results From the Lunar Reconnaissance Orbiter Camera (LROC)

    NASA Astrophysics Data System (ADS)

    Robinson, Mark; Hiesinger, Harald; McEwen, Alfred; Jolliff, Brad; Thomas, Peter C.; Turtle, Elizabeth; Eliason, Eric; Malin, Mike; Ravine, A.; Bowman-Cisneros, Ernest

    The Lunar Reconnaissance Orbiter (LRO) spacecraft was launched on an Atlas V 401 rocket from the Cape Canaveral Air Force Station Launch Complex 41 on June 18, 2009. After spending four days in Earth-Moon transit, the spacecraft entered a three month commissioning phase in an elliptical 30×200 km orbit. On September 15, 2009, LRO began its planned one-year nominal mapping mission in a quasi-circular 50 km orbit. A multi-year extended mission in a fixed 30×200 km orbit is optional. The Lunar Reconnaissance Orbiter Camera (LROC) consists of a Wide Angle Camera (WAC) and two Narrow Angle Cameras (NACs). The WAC is a 7-color push-frame camera, which images the Moon at 100 and 400 m/pixel in the visible and UV, respectively, while the two NACs are monochrome narrow-angle linescan imagers with 0.5 m/pixel spatial resolution. LROC was specifically designed to address two of the primary LRO mission requirements and six other key science objectives, including 1) assessment of meter-and smaller-scale features in order to select safe sites for potential lunar landings near polar resources and elsewhere on the Moon; 2) acquire multi-temporal synoptic 100 m/pixel images of the poles during every orbit to unambiguously identify regions of permanent shadow and permanent or near permanent illumination; 3) meter-scale mapping of regions with permanent or near-permanent illumination of polar massifs; 4) repeat observations of potential landing sites and other regions to derive high resolution topography; 5) global multispectral observations in seven wavelengths to characterize lunar resources, particularly ilmenite; 6) a global 100-m/pixel basemap with incidence angles (60° -80° ) favorable for morphological interpretations; 7) sub-meter imaging of a variety of geologic units to characterize their physical properties, the variability of the regolith, and other key science questions; 8) meter-scale coverage overlapping with Apollo-era panoramic images (1-2 m/pixel) to document

  20. Stratoscope 2 integrating television camera

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The development, construction, test and delivery of an integrating television camera for use as the primary data sensor on Flight 9 of Stratoscope 2 is described. The system block diagrams are presented along with the performance data, and definition of the interface of the telescope with the power, telemetry, and communication system.

  1. Making Films without a Camera.

    ERIC Educational Resources Information Center

    Cox, Carole

    1980-01-01

    Describes draw-on filmmaking as an exciting way to introduce children to the plastic, fluid nature of the film medium, to develop their appreciation and understanding of divergent cinematic techniques and themes, and to invite them into the dream world of filmmaking without the need for a camera. (AEA)

  2. High speed multiwire photon camera

    NASA Technical Reports Server (NTRS)

    Lacy, Jeffrey L. (Inventor)

    1989-01-01

    An improved multiwire proportional counter camera having particular utility in the field of clinical nuclear medicine imaging. The detector utilizes direct coupled, low impedance, high speed delay lines, the segments of which are capacitor-inductor networks. A pile-up rejection test is provided to reject confused events otherwise caused by multiple ionization events occurring during the readout window.

  3. High speed multiwire photon camera

    NASA Technical Reports Server (NTRS)

    Lacy, Jeffrey L. (Inventor)

    1991-01-01

    An improved multiwire proportional counter camera having particular utility in the field of clinical nuclear medicine imaging. The detector utilizes direct coupled, low impedance, high speed delay lines, the segments of which are capacitor-inductor networks. A pile-up rejection test is provided to reject confused events otherwise caused by multiple ionization events occurring during the readout window.

  4. Measuring Distances Using Digital Cameras

    ERIC Educational Resources Information Center

    Kendal, Dave

    2007-01-01

    This paper presents a generic method of calculating accurate horizontal and vertical object distances from digital images taken with any digital camera and lens combination, where the object plane is parallel to the image plane or tilted in the vertical plane. This method was developed for a project investigating the size, density and spatial…
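
    As a minimal illustration of the kind of calculation the paper generalizes, the sketch below gives the basic pinhole relation for the simplest case (object plane parallel to the image plane, object of known real height); the sensor parameters are assumptions taken from a camera's specification sheet, not values from the paper.

      def object_distance_m(focal_length_mm, real_height_m, image_height_px,
                            sensor_height_mm, sensor_height_px):
          # Height of the object's image on the sensor, in millimetres.
          image_height_mm = image_height_px * sensor_height_mm / sensor_height_px
          # Similar triangles: distance / real height = focal length / image height.
          return focal_length_mm * real_height_m / image_height_mm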

  5. Camera assisted multimodal user interaction

    NASA Astrophysics Data System (ADS)

    Hannuksela, Jari; Silvén, Olli; Ronkainen, Sami; Alenius, Sakari; Vehviläinen, Markku

    2010-01-01

    Since more processing power and new sensing and display technologies are already available in mobile devices, there has been increased interest in building systems that communicate via different modalities such as speech, gesture, expression, and touch. In context-identification-based user interfaces, these independent modalities are combined to create new ways for users to interact with hand-helds. While these are unlikely to completely replace traditional interfaces, they will considerably enrich and improve the user experience and task performance. We demonstrate a set of novel user interface concepts that rely on the built-in sensors of modern mobile devices for recognizing the context and sequences of actions. In particular, we use the camera to detect whether the user is watching the device, for instance, to make the decision to turn on the display backlight. In our approach the motion sensors are first employed for detecting the handling of the device. Then, based on ambient illumination information provided by a light sensor, the cameras are turned on. The frontal camera is used for face detection, while the back camera provides supplemental contextual information. The subsequent applications triggered by the context can be, for example, image capturing or bar code reading.
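
    The abstract describes a cascade in which motion sensing gates the light sensor, which gates the cameras, which gate the display decision. The sketch below is only a hypothetical rendering of that cascade; the field names, thresholds, and the fallback behaviour in low light are assumptions, not details from the paper.

      from dataclasses import dataclass

      @dataclass
      class SensorSnapshot:
          handled: bool        # motion sensors report the device is being handled
          ambient_lux: float   # reading from the light sensor
          face_visible: bool   # frontal-camera face detection result (valid only if camera is on)

      def should_enable_backlight(s: SensorSnapshot, min_lux_for_camera: float = 5.0) -> bool:
          if not s.handled:
              return False                      # device idle: keep the display off
          if s.ambient_lux < min_lux_for_camera:
              return True                       # too dark for face detection: assumed fallback
          return s.face_visible                 # light the display only if the user is looking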

  6. Gamma-ray camera flyby

    SciTech Connect

    2010-01-01

    Animation based on an actual classroom demonstration of the prototype CCI-2 gamma-ray camera's ability to image a hidden radioactive source, a cesium-137 line source, in three dimensions. For more information see http://newscenter.lbl.gov/feature-stories/2010/06/02/applied-nuclear-physics/.

  7. The Camera Comes to Court.

    ERIC Educational Resources Information Center

    Floren, Leola

    After the Lindbergh kidnapping trial in 1935, the American Bar Association sought to eliminate electronic equipment from courtroom proceedings. Eventually, all but two states adopted regulations applying that ban to some extent, and a 1965 Supreme Court decision encouraged the banning of television cameras at trials as well. Currently, some states…

  8. High-speed pulse camera

    NASA Technical Reports Server (NTRS)

    Lawson, J. R.

    1968-01-01

    Miniaturized, 16 mm high speed pulse camera takes spectral photometric photographs upon instantaneous command. The design includes a low-friction, low-inertia film transport, a very thin beryllium shutter driven by a low-inertia stepper motor for minimum actuation time after a pulse command, and a binary encoder.

  9. Payload topography camera of Chang'e-3

    NASA Astrophysics Data System (ADS)

    Yu, Guo-Bin; Liu, En-Hai; Zhao, Ru-Jin; Zhong, Jie; Zhou, Xiang-Dong; Zhou, Wu-Lin; Wang, Jin; Chen, Yuan-Pei; Hao, Yong-Jie

    2015-11-01

    Chang'e-3 was China's first soft-landing lunar probe that achieved a successful roving exploration on the Moon. A topography camera functioning as the lander's “eye” was one of the main scientific payloads installed on the lander. It was composed of a camera probe, an electronic component that performed image compression, and a cable assembly. Its exploration mission was to obtain optical images of the lunar topography in the landing zone for investigation and research. It also observed rover movement on the lunar surface and finished taking pictures of the lander and rover. After starting up successfully, the topography camera obtained static images and video of rover movement from different directions, 360° panoramic pictures of the lunar surface around the lander from multiple angles, and numerous pictures of the Earth. All images of the rover, lunar surface, and the Earth were clear, and those of the Chinese national flag were recorded in true color. This paper describes the exploration mission, system design, working principle, quality assessment of image compression, and color correction of the topography camera. Finally, test results from the lunar surface are provided to serve as a reference for scientific data processing and application.

  10. Study of laser reflection of infrared cameras with germanium optics

    NASA Astrophysics Data System (ADS)

    Chiu, Patrio; Shih, Ishiang; Shi, S.; Laou, Philips

    2003-09-01

    Infrared cameras are widely used on today's battlefield for surveillance purposes. Because of retroreflection, an incident laser beam entering the camera optics results in a beam reflected back toward the laser source. An IR detector positioned close to the laser source can then detect the reflected beam. This effect can reveal the location of the cameras and thus increases the risk of covert operations. In the present work, the characteristics of the retroreflection are studied. It is found that the reflection intensity is high when the incident beam enters through the middle part of the lenses, while it is low and the beam diverges when it enters through the outer part of the lenses. The reflection is symmetric when the incident beam is normal to the lenses and asymmetric when it is incident at an angle to the lenses. In order to study the potential effects of modified camera optics on retroreflection, low-index IR slides (ZnSe and KCl with refractive indices of 2.49 and 1.54, respectively) with different thicknesses (2 mm, 4 mm and 6 mm) were placed in the optical system. The result shows that the focal point of the lenses is changed by the addition of the slide but the optical paths of the reflection remain unchanged. The relationship between the different slides and beam intensity is also studied.
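
    The observation that the added slides shift the focal point follows from a standard paraxial result: a plane-parallel plate of thickness t and refractive index n inserted in a converging beam displaces the focus by roughly t(n-1)/n. The sketch below is a textbook estimate offered for orientation only, not an analysis from the paper.

      def focal_shift_mm(thickness_mm: float, refractive_index: float) -> float:
          # Longitudinal focus displacement caused by a plane-parallel plate.
          return thickness_mm * (refractive_index - 1.0) / refractive_index

      # e.g. a 4 mm ZnSe slide (n = 2.49) shifts the focus by about 2.4 mm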

  11. Characterization of a PET Camera Optimized for ProstateImaging

    SciTech Connect

    Huber, Jennifer S.; Choong, Woon-Seng; Moses, William W.; Qi,Jinyi; Hu, Jicun; Wang, G.C.; Wilson, David; Oh, Sang; Huesman, RonaldH.; Derenzo, Stephen E.

    2005-11-11

    We present the characterization of a positron emission tomograph for prostate imaging that centers a patient between a pair of external curved detector banks (ellipse: 45 cm minor, 70 cm major axis). The distance between detector banks adjusts to allow patient access and to position the detectors as closely as possible for maximum sensitivity with patients of various sizes. Each bank is composed of two axial rows of 20 HR+ block detectors for a total of 80 detectors in the camera. The individual detectors are angled in the transaxial plane to point towards the prostate to reduce resolution degradation in that region. The detectors are read out by modified HRRT data acquisition electronics. Compared to a standard whole-body PET camera, our dedicated-prostate camera has the same sensitivity and resolution, less background (less randoms and lower scatter fraction) and a lower cost. We have completed construction of the camera. Characterization data and reconstructed images of several phantoms are shown. Sensitivity of a point source in the center is 946 cps/μCi. Spatial resolution is 4 mm FWHM in the central region.

  12. Method for shaping and aiming narrow beams. [sonar mapping and target identification

    NASA Technical Reports Server (NTRS)

    Heyser, R. C. (Inventor)

    1981-01-01

    A sonar method and apparatus is described which utilizes a linear frequency chirp in a transmitter/receiver having a correlator to synthesize a narrow beamwidth pattern from otherwise broad-beam transducers when there is relative velocity between the transmitter/receiver and the target. The chirp is generated with a bandwidth B and duration T chosen so that the time-bandwidth product TB increases as the desired beam angle narrows. A replica of the chirp produced in a generator is time delayed and Doppler shifted for use as a reference in the receiver for correlation of received chirps from targets. This reference is Doppler shifted to select targets preferentially, thereby not only synthesizing a narrow beam but also aiming the beam in azimuth and elevation.
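
    As a rough sketch of the correlation receiver described above (with an assumed Doppler parameterisation; the patent's actual hardware and scaling are not reproduced here), the code below generates a linear FM chirp and correlates a received signal against a time-scaled replica, which is what preferentially selects targets with a given relative velocity.

      import numpy as np

      def lfm_chirp(duration_s, bandwidth_hz, fs_hz):
          # Baseband linear FM chirp of duration T and bandwidth B.
          t = np.arange(0.0, duration_s, 1.0 / fs_hz)
          k = bandwidth_hz / duration_s             # chirp rate in Hz per second
          return t, np.cos(np.pi * k * t ** 2)

      def correlate_with_doppler(received, t, reference, doppler_factor):
          # Time-scale the reference chirp to emulate a Doppler-shifted echo and
          # correlate it with the received signal; the chosen doppler_factor picks
          # out targets at a particular relative speed.
          replica = np.interp(doppler_factor * t, t, reference, left=0.0, right=0.0)
          return np.correlate(received, replica, mode="full")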

  13. Recent advances in digital camera optics

    NASA Astrophysics Data System (ADS)

    Ishiguro, Keizo

    2012-10-01

    The digital camera market has expanded dramatically over the last ten years. The zoom lens, in particular, is a key determinant of camera body size and image quality. Its technologies have been built on several advances in analog technology, including aspherical lens manufacturing methods and image stabilization mechanisms. Panasonic is one of the pioneers of both technologies. I will review past trends in zoom lens optics and the original optical technologies of the Panasonic digital camera "LUMIX", as well as the optics of 3D camera systems, and conclude by speculating on future trends in digital cameras.

  14. Analysis of Reference Sources for the Characterization and Calibration of Infrared Cameras

    NASA Astrophysics Data System (ADS)

    Gutschwager, B.; Taubert, D.; Hollandt, J.

    2015-03-01

    This paper gives an analysis of the radiometric properties of different types of reference sources applied for the characterization and calibration of infrared cameras. For the absolute radiance measurement with an infrared camera, a metrological characterization and calibration of the instrument are essential. Similar to the calibration of radiation thermometers, this calibration is generally performed with reference sources of known radiance. As infrared cameras are optically and electronically more complex than radiation thermometers, which are equipped with a single element detector, the applied reference sources have to be carefully characterized and limitations in their performance have to be considered. Each pixel of the image measured with an infrared camera should depict correctly the desired physical quantity value of the projected object area. This should be achieved for all relevant conditions of observation, e.g., at different distances or at different incident angles. The performance of cavity radiators and plate radiators is analyzed based on ray-tracing calculations and spatially and angularly resolved radiance measurements with radiation thermometers and cameras. Relevant components of a calibration facility for infrared cameras at PTB are presented with their specifications. A first analysis of the relevant characteristics of the applied infrared calibration sources and infrared cameras is presented as the essential basic information for the realization of the calibration of infrared cameras.
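
    Calibration against a reference source of known radiance ultimately rests on Planck's law. The sketch below is generic radiometry shown only to indicate how a source temperature and an assumed emissivity map to the spectral radiance an infrared camera should register; it is not the characterisation procedure of the paper.

      import numpy as np

      H = 6.62607015e-34   # Planck constant, J s
      C = 2.99792458e8     # speed of light, m/s
      KB = 1.380649e-23    # Boltzmann constant, J/K

      def planck_spectral_radiance(wavelength_m, temperature_k):
          # Blackbody spectral radiance L(lambda, T) in W m^-2 sr^-1 m^-1.
          x = H * C / (wavelength_m * KB * temperature_k)
          return (2.0 * H * C ** 2) / (wavelength_m ** 5 * np.expm1(x))

      def reference_radiance(wavelength_m, temperature_k, emissivity=0.999):
          # Near-ideal cavity radiator; ambient reflection is neglected.
          return emissivity * planck_spectral_radiance(wavelength_m, temperature_k)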

  15. a Method for Self-Calibration in Satellite with High Precision of Space Linear Array Camera

    NASA Astrophysics Data System (ADS)

    Liu, Wei; Qian, Fangming; Miao, Yuzhe; Wang, Rongjian

    2016-06-01

    At present, the on-orbit calibration of the geometric parameters of a space surveying camera is usually processed with data from a ground calibration field after capturing the images. The entire process is complicated and lengthy and cannot monitor and calibrate the geometric parameters in real time. On the basis of a large number of on-orbit calibrations, we found that owing to the influence of many factors, e.g., weather, it is often difficult to capture images of the ground calibration field; thus, regular calibration using field data cannot be ensured. This article proposes a real-time self-calibration method for a space linear array camera on a satellite using the optical auto-collimation principle. A collimating light source and small matrix-array CCD devices are installed inside the load system of the satellite; these use the same light path as the linear array camera. We can extract the location changes of the cross marks on the matrix-array CCD to determine the real-time variations in the focal length and angle parameters of the linear array camera. The on-orbit status of the camera is rapidly obtained using this method. On one hand, the camera's variation pattern can be tracked accurately and its attitude adjusted in a timely manner to ensure optimal photography; on the other hand, self-calibration of the camera aboard the satellite can be realized quickly, which improves the efficiency and reliability of photogrammetric processing.
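
    A minimal sketch of the geometric idea, assuming a simple small-angle model: a lateral displacement of the collimated cross mark on the internal matrix CCD corresponds to an angular change of the line of sight of roughly atan(d/f). The paper's full model also recovers focal-length variations, which this ignores; parameter names are illustrative.

      import numpy as np

      def angular_drift_rad(shift_px, pixel_pitch_um, focal_length_mm):
          # Convert the cross-mark displacement on the matrix CCD into an angle.
          d_mm = shift_px * pixel_pitch_um * 1e-3
          return np.arctan2(d_mm, focal_length_mm)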

  16. Resolution limitations and optimization of the LLNL streak camera focus

    SciTech Connect

    Lerche, R.A.; Griffith, R.L.

    1987-09-01

    The RCA C73435 image tube is biased at voltages far from its original design in the LLNL ultrafast (10 ps) streak camera. Its output resolution at streak camera operating potentials has been measured as a function of input slit width, incident-light wavelength, and focus-grid voltage. The temporal resolution is insensitive to focus-grid voltage for a narrow (100 μm) input slit, but is very sensitive to focus-grid voltage for a wide (2 mm) input slit. At the optimum wide-slit focus voltage, temporal resolution is insensitive to slit width. Spatial resolution is nearly independent of focus-grid voltage for values that give good temporal resolution. Both temporal and spatial resolution depend on the incident-light wavelength. Data for 1.06-μm light show significantly better focusing than for 0.53-μm light. Streak camera operation is simulated with a computer program that calculates photoelectron trajectories. Electron ray tracing describes all of the observed effects of slit width, incident-light wavelength, and focus-grid voltage on output resolution. 7 refs.

  17. Color decorrelation for the PHOBOS mission camera experiment

    NASA Astrophysics Data System (ADS)

    Hauber, E.; Regner, P.; Schmidt, K.; Neukum, G.; Schwarz, G.

    1991-02-01

    The surface characteristics of Phobos are reexamined based on new images provided by the VSK-Fregat camera experiment together with modern processing techniques for color analysis. The VSK-Fregat camera provided a quasi-simultaneous recording of panchromatic high resolution images together with lower resolution two-channel spectral images. Contrast enhancement, geometrical coregistration, band ratioing, principal component analysis, and HSI-color transformations were all performed during image processing. It is concluded that at low phase angles the crater rims appear brighter and redder than the surrounding material and that a slightly reddish patch was discovered that cannot be explained simply by topographic or illumination effects. It is also concluded, however, that the bright crater rims and the slightly reddish surface patch cannot be attributed merely to detector noise or similar effects. Similar observations were made by other research teams and it is hypothesized that a decrease in particle size may be responsible for the reddish appearance in these areas.

  18. G-APDs in Cherenkov astronomy: The FACT camera

    NASA Astrophysics Data System (ADS)

    Krähenbühl, T.; Anderhub, H.; Backes, M.; Biland, A.; Boller, A.; Braun, I.; Bretz, T.; Commichau, V.; Djambazov, L.; Dorner, D.; Farnier, C.; Gendotti, A.; Grimm, O.; von Gunten, H.; Hildebrand, D.; Horisberger, U.; Huber, B.; Kim, K.-S.; Köhne, J.-H.; Krumm, B.; Lee, M.; Lenain, J.-P.; Lorenz, E.; Lustermann, W.; Lyard, E.; Mannheim, K.; Meharga, M.; Neise, D.; Nessi-Tedaldi, F.; Overkemping, A.-K.; Pauss, F.; Renker, D.; Rhode, W.; Ribordy, M.; Rohlfs, R.; Röser, U.; Stucki, J.-P.; Schneider, J.; Thaele, J.; Tibolla, O.; Viertel, G.; Vogler, P.; Walter, R.; Warda, K.; Weitzel, Q.

    2012-12-01

    Geiger-mode avalanche photodiodes (G-APD, SiPM) are a much discussed alternative to photomultiplier tubes in Cherenkov astronomy. The First G-APD Cherenkov Telescope (FACT) collaboration builds a camera based on a hexagonal array of 1440 G-APDs and has now finalized its construction phase. A light-collecting solid PMMA cone is glued to each G-APD to eliminate dead space between the G-APDs by increasing the active area, and to restrict the light collection angle of the sensor to the reflector area in order to reduce the amount of background light. The processing of the signals is integrated in the camera and includes the digitization using the domino ring sampling chip DRS4.

  19. Surveillance of the plant growth using the camera image

    NASA Astrophysics Data System (ADS)

    Fujiwara, Nobuyuki; Terada, Kenji

    2005-12-01

    In this paper, we propose a method for monitoring plant growth using camera images. The method observes the condition of plants being raised in a greenhouse. A plate known as HORIBA is prepared for capturing harmful insects; its image is obtained by the camera at a resolution of 1280×960 and used for processing. In the first step, regions containing harmful insects (flies) are extracted from the HORIBA image using color information. In the next step, template matching is performed to examine the shape correlation at four different angles; sixteen results are obtained from the four different templates, and their logical sum is computed for the estimation. Experimental results are also presented in this paper.
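
    A hypothetical rendering of the two processing steps described above (colour-based extraction followed by template matching at four angles with a logical sum of the results); the colour rule, threshold, and function names are assumptions, not values from the paper.

      import cv2
      import numpy as np

      def detect_flies(bgr_image, templates, match_thresh=0.7):
          # Step 1: keep only dark, insect-coloured regions (assumed colour rule).
          hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
          mask = cv2.inRange(hsv, np.array([0, 0, 0]), np.array([180, 255, 80]))
          gray = cv2.cvtColor(cv2.bitwise_and(bgr_image, bgr_image, mask=mask),
                              cv2.COLOR_BGR2GRAY)
          # Step 2: template matching with the same-size template at four rotations,
          # combining the thresholded response maps with a logical sum (OR).
          combined = None
          for tpl in templates:
              score = cv2.matchTemplate(gray, tpl, cv2.TM_CCOEFF_NORMED)
              hit = score >= match_thresh
              combined = hit if combined is None else (combined | hit)
          return combined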

  20. Casting and Angling.

    ERIC Educational Resources Information Center

    Smith, Julian W.

    As part of a series of books and pamphlets on outdoor education, this manual consists of easy-to-follow instructions for fishing activities dealing with casting and angling. The manual may be used as a part of the regular physical education program in schools and colleges or as a club activity for the accomplished weekend fisherman or the…

  1. An Iterative Angle Trisection

    ERIC Educational Resources Information Center

    Muench, Donald L.

    2007-01-01

    The problem of angle trisection continues to fascinate people even though it has long been known that it can't be done with straightedge and compass alone. However, for practical purposes, a good iterative procedure can get you as close as you want. In this note, we present such a procedure. Using only straightedge and compass, our procedure…

  2. Interferometric measurement of angles.

    PubMed

    Malacara, D; Harris, O

    1970-07-01

    A new interferometric device for measuring small angles or rotations with high accuracy is described. This instrument works by counting fringes formed by the rotation of a flat-parallel plate of glass illuminated with a collimated beam from a gas laser. Some possible applications are given.

  3. Measurement of the surface wavelength distribution of narrow-band radiation by a colorimetric method

    SciTech Connect

    Kraiskii, A V; Mironova, T V; Sultanov, T T

    2010-09-10

    A method is suggested for determining the wavelength of narrow-band light from a digital photograph of a radiating surface. The digital camera used should be appropriately calibrated. The accuracy of the wavelength measurement is better than 1 nm. The method was tested on the yellow doublet of the mercury spectrum and on the adjacent continuum of an incandescent lamp radiation spectrum. The suggested method was used to study the homogeneity of holographic sensor swelling in stationary and transient cases. (laser applications and other topics in quantum electronics)

  4. The role of contact angle on unstable flow formation during infiltration and drainage in wettable porous media

    NASA Astrophysics Data System (ADS)

    Wallach, Rony; Margolis, Michal; Graber, Ellen R.

    2013-10-01

    The impact of contact angle on 2-D spatial and temporal water-content distribution during infiltration and drainage was experimentally studied. The 0.3-0.5 mm fraction of a quartz dune sand was treated and turned subcritically repellent (contact angle of 33°, 48°, 56°, and 75° for S33, S48, S56, and S75, respectively). The media were packed uniformly in transparent flow chambers and water was supplied to the surface as a point source at different rates (1-20 ml/min). A sequence of gray-value images was taken by CCD camera during infiltration and subsequent drainage; gray values were converted to volumetric water content by water volume balance. Narrow and long plumes with water accumulation behind the downward moving wetting front (tip) and negative water gradient above it (tail) developed in the S56 and S75 media during infiltration at lower water application rates. The plumes became bulbous with spatially uniform water-content distribution as water application rates increased. All plumes in these media propagated downward at a constant rate during infiltration and did not change their shape during drainage. In contrast, regular plume shapes were observed in the S33 and S48 media at all flow rates, and drainage profiles were nonmonotonic with a transition plane at the depth that water reached during infiltration. Given that the studied media have similar pore-size distributions, the conclusion is that imbibition hindered by the nonzero contact angle induced pressure buildup at the wetting front (dynamic water-entry value) that controlled the plume shape and internal water-content distribution during infiltration and drainage.

  5. Narrow-Line Seyfert 1 Galaxies

    NASA Technical Reports Server (NTRS)

    Leighly, Karen M.

    2000-01-01

    The primary work during this year has been the analysis and interpretation of our HST spectra from two extreme Narrow-line Seyfert 1 galaxies (NLS1s) Infrared Astronomy Satellite (IRAS) 13224-3809 and 1H 0707-495. This work has been presented as an invited talk at the workshop entitled "Observational and theoretical progress in the Study of Narrow-line Seyfert 1 Galaxies" held in Bad Honnef, Germany December 8-11, as a contributed talk at the January 2000 AAS meeting in Atlanta, Georgia, and as a contributed talk at the workshop "Probing the Physics of Active Galactic Nuclei by Multiwavelength Monitoring" held at Goddard Space Flight Center June 20-22, 2000.

  6. Exciton absorption in narrow armchair graphene nanoribbons

    NASA Astrophysics Data System (ADS)

    Monozon, B. S.; Schmelcher, P.

    2016-11-01

    We develop an analytical approach to the exciton optical absorption for narrow gap armchair graphene nanoribbons (AGNR). We focus on the regime of dominant size quantization in combination with the attractive electron-hole interaction. An adiabatic separation of slow and fast motions leads via the two-body Dirac equation to the isolated and coupled subband approximations. Discrete and continuous exciton states are in general coupled and form quasi-Rydberg series of purely discrete and resonance type character. The corresponding oscillator strengths and widths are derived. We show that the exciton peaks are blue-shifted, become broader and increase in magnitude upon narrowing the ribbon. At the edge of a subband the singularity related to the 1D density of states is transformed into finite absorption via the presence of the exciton. Our analytical results are in good agreement with those obtained by other methods including numerical approaches. Estimates of the expected experimental values are provided for realistic AGNR.

  7. Creep turns linear in narrow ferromagnetic nanostrips

    PubMed Central

    Leliaert, Jonathan; Van de Wiele, Ben; Vansteenkiste, Arne; Laurson, Lasse; Durin, Gianfranco; Dupré, Luc; Van Waeyenberge, Bartel

    2016-01-01

    The motion of domain walls in magnetic materials is a typical example of a creep process, usually characterised by a stretched exponential velocity-force relation. By performing large-scale micromagnetic simulations, and analyzing an extended 1D model which takes the effects of finite temperatures and material defects into account, we show that this creep scaling law breaks down in sufficiently narrow ferromagnetic strips. Our analysis of current-driven transverse domain wall motion in disordered Permalloy nanostrips reveals instead a creep regime with a linear dependence of the domain wall velocity on the applied field or current density. This originates from the essentially point-like nature of domain walls moving in narrow, line-like disordered nanostrips. An analogous linear relation is found also by analyzing existing experimental data on field-driven domain wall motion in perpendicularly magnetised media. PMID:26843125

  8. Contact Angle Measurements Using a Simplified Experimental Setup

    ERIC Educational Resources Information Center

    Lamour, Guillaume; Hamraoui, Ahmed; Buvailo, Andrii; Xing, Yangjun; Keuleyan, Sean; Prakash, Vivek; Eftekhari-Bafrooei, Ali; Borguet, Eric

    2010-01-01

    A basic and affordable experimental apparatus is described that measures the static contact angle of a liquid drop in contact with a solid. The image of the drop is made with a simple digital camera by taking a picture that is magnified by an optical lens. The profile of the drop is then processed with ImageJ free software. The ImageJ contact…
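
    For a small sessile drop that can be approximated as a spherical cap, the static contact angle follows directly from two lengths measurable in the photograph, the base width w and the apex height h, via theta = 2*atan(2h/w). The sketch below is that textbook approximation, offered as a rough cross-check of the image-based analysis described above, not as the article's procedure.

      import math

      def contact_angle_deg(base_width_mm, apex_height_mm):
          # Spherical-cap approximation; valid only when gravity flattening is negligible.
          return math.degrees(2.0 * math.atan(2.0 * apex_height_mm / base_width_mm))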

  9. A Different Angle on Perspective

    ERIC Educational Resources Information Center

    Frantz, Marc

    2012-01-01

    When a plane figure is photographed from different viewpoints, lengths and angles appear distorted. Hence it is often assumed that lengths, angles, protractors, and compasses have no place in projective geometry. Here we describe a sense in which certain angles are preserved by projective transformations. These angles can be constructed with…

  10. Angle Sense: A Valuable Connector.

    ERIC Educational Resources Information Center

    Rubenstein, Rheta N.; And Others

    1993-01-01

    Proposes angle sense as a fundamental connector between mathematical concepts for middle grade students. Introduces the use of pattern blocks and a goniometer, a tool to measure angles, to help students develop angle sense. Discusses connections between angle measurement and the concepts of rational numbers, circles, area, number theory,…

  11. Decay Modes of Narrow Molecular Resonances

    SciTech Connect

    Courtin, S.; Haas, F.; Salsac, M.-D.; Lebhertz, D.; Michalon, A.; Beck, C.; Rousseau, M.; Zafra, A. Sanchez I.; Hutcheon, D.; Davis, C.; Pearson, J. E.; Lister, C. J.

    2006-08-14

    The heavy-ion radiative capture reactions 12C(12C,γ)24Mg and 12C(16O,γ)28Si have been performed on and off resonance at TRIUMF using the Dragon separator and its associated BGO array. The decay of the studied narrow resonances has been shown to proceed predominantly through quasi-bound doorway states whose cluster and deformed configurations have a large overlap with the entry resonance states.

  12. Multiwatts narrow linewidth fiber Raman amplifiers.

    PubMed

    Feng, Yan; Taylor, Luke; Bonaccini Calia, Domenico

    2008-07-21

    A 1178 nm laser with up to 4.8 W of output power and a linewidth of approximately 10 MHz is obtained by Raman amplification of a distributed-feedback diode laser in standard single-mode fibers pumped by an 1120 nm Yb fiber laser. More than 10% efficiency and 27 dB of amplification are achieved, limited by the onset of stimulated Brillouin scattering. The ratio of the Raman to Brillouin gain coefficient of a fiber is identified as a figure of merit for building a narrow linewidth fiber Raman amplifier.

  14. Automatic Kappa Angle Estimation for Air Photos Based on Phase Only Correlation

    NASA Astrophysics Data System (ADS)

    Xiong, Z.; Stanley, D.; Xin, Y.

    2016-06-01

    Approximate values of the exterior orientation parameters are needed for air photo bundle adjustment. Usually the airborne GPS/IMU provides initial values for the camera position and attitude angles. In some cases, however, the camera's attitude angles are not available because no IMU is present or for other reasons, and the kappa angle must be estimated for each photo before bundle adjustment. The kappa angle can be obtained from Ground Control Points (GCPs) in the photo, but enough GCPs are not always available. To overcome this problem, an algorithm is developed to automatically estimate the kappa angle for air photos based on the phase-only correlation technique. This function has been embedded in PCI software. Extensive experiments show that this algorithm is fast, reliable, and stable.
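
    The core of the phase-only correlation (POC) technique is easy to state: normalise the cross-power spectrum of two images to unit magnitude and transform it back, so that the peak location gives their relative shift. The sketch below shows only that basic building block; recovering the kappa angle, as in the paper, additionally involves applying the idea to rotated or polar-resampled imagery, which is not reproduced here.

      import numpy as np

      def phase_only_correlation(img_a, img_b):
          # Both inputs are equally sized 2-D grayscale arrays.
          A = np.fft.fft2(img_a)
          B = np.fft.fft2(img_b)
          cross = A * np.conj(B)
          cross /= np.maximum(np.abs(cross), 1e-12)   # keep only the phase
          poc = np.real(np.fft.ifft2(cross))
          dy, dx = np.unravel_index(np.argmax(poc), poc.shape)
          return poc, (dy, dx)                        # peak location = translational offset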

  15. Empirical Study on Designing of Gaze Tracking Camera Based on the Information of User's Head Movement.

    PubMed

    Pan, Weiyuan; Jung, Dongwook; Yoon, Hyo Sik; Lee, Dong Eun; Naqvi, Rizwan Ali; Lee, Kwan Woo; Park, Kang Ryoung

    2016-08-31

    Gaze tracking is the technology that identifies a region in space that a user is looking at. Most previous non-wearable gaze tracking systems use a near-infrared (NIR) light camera with an NIR illuminator. Based on the kind of camera lens used, the viewing angle and depth-of-field (DOF) of a gaze tracking camera can be different, which affects the performance of the gaze tracking system. Nevertheless, to the best of our knowledge, most previous research implemented gaze tracking cameras without ground truth information for determining the optimal viewing angle and DOF of the camera lens. Eye-tracker manufacturers might also use ground truth information, but they do not make it public. Therefore, researchers and developers of gaze tracking systems cannot refer to such information when implementing a gaze tracking system. We address this problem by providing an empirical study in which we design an optimal gaze tracking camera based on experimental measurements of the amount and velocity of users' head movements. Based on our results and analyses, researchers and developers may be able to more easily implement an optimal gaze tracking system. Experimental results show that our gaze tracking system achieves high performance in terms of accuracy, user convenience and interest.
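
    The lens parameters discussed above reduce to two standard formulas: the viewing angle from the sensor size and focal length, and the depth of field from the hyperfocal distance. The sketch below applies those textbook relations with an assumed circle of confusion; the paper itself selects the values from measured head-movement statistics rather than from a fixed formula.

      import math

      def horizontal_fov_deg(sensor_width_mm, focal_length_mm):
          # Thin-lens viewing angle of a sensor/lens pair.
          return 2.0 * math.degrees(math.atan(sensor_width_mm / (2.0 * focal_length_mm)))

      def depth_of_field_mm(focal_length_mm, f_number, subject_distance_mm, coc_mm=0.015):
          # Near/far limits of acceptable focus via the hyperfocal distance H.
          h = focal_length_mm ** 2 / (f_number * coc_mm) + focal_length_mm
          near = (subject_distance_mm * (h - focal_length_mm)
                  / (h + subject_distance_mm - 2.0 * focal_length_mm))
          far = (subject_distance_mm * (h - focal_length_mm) / (h - subject_distance_mm)
                 if subject_distance_mm < h else math.inf)
          return near, far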

  17. Combustion pinhole-camera system

    DOEpatents

    Witte, A.B.

    1982-05-19

    A pinhole camera system is described utilizing a sealed optical-purge assembly which provides optical access into a coal combustor or other energy conversion reactors. The camera system basically consists of a focused-purge pinhole optical port assembly, a conventional TV vidicon receiver, an external, variable density light filter which is coupled electronically to the vidicon automatic gain control (agc). The key component of this system is the focused-purge pinhole optical port assembly which utilizes a purging inert gas to keep debris from entering the port and a lens arrangement which transfers the pinhole to the outside of the port assembly. One additional feature of the port assembly is that it is not flush with the interior of the combustor.

  18. A 10-microm infrared camera.

    PubMed

    Arens, J F; Jernigan, J G; Peck, M C; Dobson, C A; Kilk, E; Lacy, J; Gaalema, S

    1987-09-15

    An IR camera has been built at the University of California at Berkeley for astronomical observations. The camera has been used primarily for high angular resolution imaging at mid-IR wavelengths. It has been tested at the University of Arizona 61- and 90-in. telescopes near Tucson and the NASA Infrared Telescope Facility on Mauna Kea, HI. In the observations the system has been used as an imager with interference coated and Fabry-Perot filters. These measurements have demonstrated a sensitivity consistent with photon shot noise, showing that the system is limited by the radiation from the telescope and atmosphere. Measurements of read noise, crosstalk, and hysteresis have been made in our laboratory. PMID:20490151

  19. Electronographic cameras for space astronomy.

    NASA Technical Reports Server (NTRS)

    Carruthers, G. R.; Opal, C. B.

    1972-01-01

    Magnetically-focused electronographic cameras have been under development at the Naval Research Laboratory for use in far-ultraviolet imagery and spectrography, primarily in astronomical and optical-geophysical observations from sounding rockets and space vehicles. Most of this work has been with cameras incorporating internal optics of the Schmidt or wide-field all-reflecting types. More recently, we have begun development of electronographic spectrographs incorporating an internal concave grating, operating at normal or grazing incidence. We also are developing electronographic image tubes of the conventional end-window-photo-cathode type, for far-ultraviolet imagery at the focus of a large space telescope, with image formats up to 120 mm in diameter.

  20. ISO camera array development status

    NASA Technical Reports Server (NTRS)

    Sibille, F.; Cesarsky, C.; Agnese, P.; Rouan, D.

    1989-01-01

    A short outline is given of the Infrared Space Observatory Camera (ISOCAM), one of the 4 instruments onboard the Infrared Space Observatory (ISO), with the current status of its two 32x32 arrays, an InSb charge injection device (CID) and a Si:Ga direct read-out (DRO), and the results of the in orbit radiation simulation with gamma ray sources. A tentative technique for the evaluation of the flat fielding accuracy is also proposed.

  1. Graphic design of pinhole cameras

    NASA Technical Reports Server (NTRS)

    Edwards, H. B.; Chu, W. P.

    1979-01-01

    The paper describes a graphic technique for the analysis and optimization of pinhole size and focal length. The technique is based on the use of the transfer function of optical elements described by Scott (1959) to construct the transfer function of a circular pinhole camera. This transfer function is the response of a component or system to a pattern of lines having a sinusoidally varying radiance at varying spatial frequencies. Some specific examples of graphic design are presented.
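
    The paper optimises pinhole size and focal length through the system transfer function; as a much cruder point of comparison, a widely quoted rule of thumb balances geometric blur against diffraction with d ≈ k·sqrt(λ·f), where k is a constant near 1.9 in one common convention (other texts use values closer to 1.56). The sketch below implements only that rule of thumb, not the graphic method of the paper.

      import math

      def rule_of_thumb_pinhole_mm(focal_length_mm, wavelength_nm=550.0, k=1.9):
          # d ~ k * sqrt(lambda * f); k is convention-dependent.
          wavelength_mm = wavelength_nm * 1e-6
          return k * math.sqrt(wavelength_mm * focal_length_mm)

      # e.g. f = 100 mm and 550 nm light give a diameter of roughly 0.45 mm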

  2. SPEIR: A Ge Compton Camera

    SciTech Connect

    Mihailescu, L; Vetter, K M; Burks, M T; Hull, E L; Craig, W W

    2004-02-11

    The SPEctroscopic Imager for γ-Rays (SPEIR) is a new concept of a compact γ-ray imaging system of high efficiency and spectroscopic resolution with a 4π field-of-view. The system behind this concept employs double-sided segmented planar Ge detectors accompanied by the use of list-mode photon reconstruction methods to create a sensitive, compact Compton scatter camera.

  3. Ortho-Rectification of Narrow Band Multi-Spectral Imagery Assisted by Dslr RGB Imagery Acquired by a Fixed-Wing Uas

    NASA Astrophysics Data System (ADS)

    Rau, J.-Y.; Jhan, J.-P.; Huang, C.-Y.

    2015-08-01

    Miniature Multiple Camera Array (MiniMCA-12) is a frame-based multilens/multispectral sensor composed of 12 lenses with narrow band filters. Due to its small size and light weight, it is suitable to mount on an Unmanned Aerial System (UAS) for acquiring imagery of high spectral, spatial and temporal resolution for various remote sensing applications. However, because each band covers a wavelength range of only 10 nm, the images have low resolution and signal-to-noise ratio, which makes them unsuitable for image matching and digital surface model (DSM) generation. Moreover, since the spectral correlation among the 12 MiniMCA bands is low, it is difficult to perform tie-point matching and aerial triangulation at the same time. In this study, we therefore propose the use of a DSLR camera to assist automatic aerial triangulation of MiniMCA-12 imagery and to produce a higher spatial resolution DSM for MiniMCA-12 ortho-image generation. Depending on the maximum payload weight of the UAS, the two kinds of sensors can be carried at the same time or individually. In this study, we adopt a fixed-wing UAS to carry a Canon EOS 5D Mark2 DSLR camera and a MiniMCA-12 multi-spectral camera. To perform automatic aerial triangulation between the DSLR camera and the MiniMCA-12, we choose one master band from the MiniMCA-12 whose spectral range overlaps with that of the DSLR camera. However, since all lenses of the MiniMCA-12 have different perspective centers and viewing angles, the original 12 channels exhibit a significant band misregistration effect. Thus, the first issue encountered is to reduce the band misregistration effect. Because all 12 MiniMCA lenses are frame-based, their spatial offsets are smaller than 15 cm, and the images overlap by almost 98%, we propose a modified projective transformation (MPT) method together with two systematic error correction procedures to register all 12 bands of imagery in the same image space. It means that those 12 bands of images acquired at
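
    For orientation only, the sketch below shows a plain feature-based homography registration of one band onto the master band. It is explicitly not the modified projective transformation with systematic-error corrections proposed in the paper, and, as the abstract notes, direct matching can fail between spectrally distant, low-SNR bands, which is precisely what motivates the paper's approach.

      import cv2
      import numpy as np

      def register_band(master_gray, slave_gray, min_matches=20):
          orb = cv2.ORB_create(4000)
          k1, d1 = orb.detectAndCompute(master_gray, None)
          k2, d2 = orb.detectAndCompute(slave_gray, None)
          matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
          matches = sorted(matcher.match(d2, d1), key=lambda m: m.distance)[:200]
          if len(matches) < min_matches:
              raise RuntimeError("not enough matches for a reliable homography")
          src = np.float32([k2[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
          dst = np.float32([k1[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
          H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
          # Warp the slave band into the master band's image space.
          return cv2.warpPerspective(slave_gray, H, master_gray.shape[::-1])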

  4. 21 CFR 886.1120 - Opthalmic camera.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Opthalmic camera. 886.1120 Section 886.1120 Food... DEVICES OPHTHALMIC DEVICES Diagnostic Devices § 886.1120 Opthalmic camera. (a) Identification. An ophthalmic camera is an AC-powered device intended to take photographs of the eye and the surrounding...

  5. 21 CFR 892.1110 - Positron camera.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Positron camera. 892.1110 Section 892.1110 Food... DEVICES RADIOLOGY DEVICES Diagnostic Devices § 892.1110 Positron camera. (a) Identification. A positron camera is a device intended to image the distribution of positron-emitting radionuclides in the...

  6. 16 CFR 501.1 - Camera film.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 16 Commercial Practices 1 2012-01-01 2012-01-01 false Camera film. 501.1 Section 501.1 Commercial... 500 § 501.1 Camera film. Camera film packaged and labeled for retail sale is exempt from the net... should be expressed, provided: (a) The net quantity of contents on packages of movie film and bulk...

  7. 16 CFR 501.1 - Camera film.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 16 Commercial Practices 1 2014-01-01 2014-01-01 false Camera film. 501.1 Section 501.1 Commercial... 500 § 501.1 Camera film. Camera film packaged and labeled for retail sale is exempt from the net... should be expressed, provided: (a) The net quantity of contents on packages of movie film and bulk...

  8. 16 CFR 501.1 - Camera film.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 16 Commercial Practices 1 2013-01-01 2013-01-01 false Camera film. 501.1 Section 501.1 Commercial... 500 § 501.1 Camera film. Camera film packaged and labeled for retail sale is exempt from the net... should be expressed, provided: (a) The net quantity of contents on packages of movie film and bulk...

  9. 16 CFR 501.1 - Camera film.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... should be expressed, provided: (a) The net quantity of contents on packages of movie film and bulk still... 16 Commercial Practices 1 2010-01-01 2010-01-01 false Camera film. 501.1 Section 501.1 Commercial... 500 § 501.1 Camera film. Camera film packaged and labeled for retail sale is exempt from the...

  10. 16 CFR 501.1 - Camera film.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 16 Commercial Practices 1 2011-01-01 2011-01-01 false Camera film. 501.1 Section 501.1 Commercial... 500 § 501.1 Camera film. Camera film packaged and labeled for retail sale is exempt from the net... should be expressed, provided: (a) The net quantity of contents on packages of movie film and bulk...

  11. Unassisted 3D camera calibration

    NASA Astrophysics Data System (ADS)

    Atanassov, Kalin; Ramachandra, Vikas; Nash, James; Goma, Sergio R.

    2012-03-01

    With the rapid growth of 3D technology, 3D image capture has become a critical part of the 3D feature set on mobile phones. 3D image quality is affected by the scene geometry as well as on-the-device processing. An automatic 3D system usually assumes known camera poses accomplished by factory calibration using a special chart. In real-life settings, pose parameters estimated by factory calibration can be negatively impacted by movements of the lens barrel due to shaking, focusing, or camera drop. If any of these factors displaces the optical axes of either or both cameras, vertical disparity might exceed the maximum tolerable margin and the 3D user may experience eye strain or headaches. To make 3D capture more practical, one needs to consider unassisted (on arbitrary scenes) calibration. In this paper, we propose an algorithm that relies on detection and matching of keypoints between left and right images. Frames containing erroneous matches, along with frames with insufficiently rich keypoint constellations, are detected and discarded. Roll, pitch, yaw, and scale differences between left and right frames are then estimated. The algorithm performance is evaluated in terms of the remaining vertical disparity as compared to the maximum tolerable vertical disparity.
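
    A hedged sketch of the first stage such an approach relies on (keypoint detection and matching, then measuring the residual vertical disparity) is shown below using OpenCV; it is not the authors' algorithm, and the file names are placeholders.

```python
# Hedged sketch: match keypoints between left/right frames and report the
# vertical-disparity statistics an unassisted calibration would minimize.
import cv2
import numpy as np

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)    # placeholder file names
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(nfeatures=2000)
kp_l, des_l = orb.detectAndCompute(left, None)
kp_r, des_r = orb.detectAndCompute(right, None)

matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des_l, des_r), key=lambda m: m.distance)[:300]

# Vertical disparity of each match: difference of y coordinates.
dy = np.array([kp_l[m.queryIdx].pt[1] - kp_r[m.trainIdx].pt[1] for m in matches])
print("median vertical disparity (px):", np.median(dy))
print("robust spread (MAD, px):", np.median(np.abs(dy - np.median(dy))))
```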

  12. Coaxial fundus camera for ophthalmology

    NASA Astrophysics Data System (ADS)

    de Matos, Luciana; Castro, Guilherme; Castro Neto, Jarbas C.

    2015-09-01

    A fundus camera for ophthalmology is a high-definition device that must meet low-light illumination of the human retina, high resolution at the retina, and a reflection-free image. Those constraints make its optical design very sophisticated, but the most difficult requirements to comply with are the reflection-free illumination and the final alignment, due to the high number of non-coaxial optical components in the system. Reflections of the illumination, both in the objective and at the cornea, mask image quality, and poor alignment makes the sophisticated optical design useless. In this work we developed a totally axial optical system for a non-mydriatic fundus camera. The illumination is provided by an LED ring, coaxial with the optical system and composed of IR or visible LEDs. The illumination ring is projected by the objective lens onto the cornea. The objective, LED illuminator and CCD lens are coaxial, making the final alignment easy to perform. The CCD + capture lens module is a CCTV camera with built-in autofocus and zoom, added to a 175 mm focal-length doublet corrected for infinity, making the system easy to operate and very compact.

  13. Image quality testing of assembled IR camera modules

    NASA Astrophysics Data System (ADS)

    Winters, Daniel; Erichsen, Patrik

    2013-10-01

    Infrared (IR) camera modules for the LWIR (8-12 μm) that combine IR imaging optics with microbolometer focal plane array (FPA) sensors and readout electronics are becoming more and more of a mass-market product. At the same time, steady improvements in sensor resolution in the higher-priced markets raise the requirement for imaging performance of objectives and the proper alignment between objective and FPA. This puts pressure on camera manufacturers and system integrators to assess the image quality of finished camera modules in a cost-efficient and automated way for quality control or during end-of-line testing. In this paper we present recent development work done in the field of image quality testing of IR camera modules. This technology provides a wealth of additional information in contrast to the more traditional test methods such as minimum resolvable temperature difference (MRTD), which give only a subjective overall test result. Parameters that can be measured are image quality via the modulation transfer function (MTF), for broadband or with various bandpass filters, on- and off-axis, and optical parameters such as effective focal length (EFL) and distortion. If the camera module allows for refocusing the optics, additional parameters such as best focus plane, image plane tilt, auto-focus quality and chief ray angle can be characterized. Additionally, the homogeneity and response of the sensor with the optics can be characterized in order to calculate the appropriate tables for non-uniformity correction (NUC). The technology can also be used to control active alignment methods during mechanical assembly of optics to high-resolution sensors. Other important points that are discussed are the flexibility of the technology to test IR modules with different form factors and electrical interfaces and, last but not least, the suitability for fully automated measurements in mass production.
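
    As a generic illustration of one quantity mentioned above, the sketch below computes an MTF curve from a sampled line spread function via the FFT; this is a textbook calculation with a synthetic LSF and an assumed pixel pitch, not the test station's actual processing chain.

```python
# Hedged sketch: MTF from a line spread function (LSF) via the FFT.
import numpy as np

def mtf_from_lsf(lsf, pixel_pitch_mm):
    """Return (spatial frequencies in cycles/mm, normalized MTF)."""
    lsf = np.asarray(lsf, dtype=float)
    lsf = lsf / lsf.sum()                  # normalize area to 1
    otf = np.fft.rfft(lsf)                 # optical transfer function
    freqs = np.fft.rfftfreq(lsf.size, d=pixel_pitch_mm)
    mtf = np.abs(otf)
    return freqs, mtf / mtf[0]

# Example with a synthetic Gaussian LSF sampled on an assumed 17 um pitch.
x = np.arange(-32, 33) * 0.017             # mm
lsf = np.exp(-0.5 * (x / 0.03) ** 2)
f, mtf = mtf_from_lsf(lsf, 0.017)
print(f[:5], mtf[:5])
```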

  14. Silicone Contamination Camera Developed for Shuttle Payloads

    NASA Technical Reports Server (NTRS)

    1996-01-01

    On many shuttle missions, silicone contamination from unknown sources from within or external to the shuttle payload bay has been a chronic problem plaguing experiment payloads. There is currently a wide range of silicone usage on the shuttle. Silicones are used to coat the shuttle tiles to enhance their ability to shed rain, and over 100 kg of RTV 560 silicone is used to seal white tiles to the shuttle surfaces. Silicones are also used in electronic components, potting compounds, and thermal control blankets. Efforts to date to identify and eliminate the sources of silicone contamination have not been highly successful and have created much controversy. To identify the sources of silicone contamination on the space shuttle, the NASA Lewis Research Center developed a contamination camera. This specially designed pinhole camera utilizes low-Earth-orbit atomic oxygen to develop a picture that identifies sources of silicone contamination on shuttle-launched payloads. The volatile silicone species travel through the aperture of the pinhole camera, and since volatile silicone species lose their hydrocarbon functionalities under atomic oxygen attack, the silicone adheres to the substrate as SiO_x. This glassy deposit should be spatially arranged in the image of the sources of silicone contamination. To view the contamination image, one can use ultrasensitive thickness measurement techniques, such as scanning variable-angle ellipsometry, to map the surface topography of the camera's substrate. The demonstration of a functional contamination camera would resolve the controversial debate concerning the amount and location of contamination sources, would allow corrective actions to be taken, and would demonstrate a useful tool for contamination documentation on future shuttle payloads, with near negligible effect on cost and weight.

  15. Angles in the Sky?

    NASA Astrophysics Data System (ADS)

    Behr, Bradford

    2005-09-01

    Tycho Brahe lived and worked in the late 1500s before the telescope was invented. He made highly accurate observations of the positions of planets, stars, and comets using large angle-measuring devices of his own design. You can use his techniques to observe the sky as well. For example, the degree, a common unit of measurement in astronomy, can be measured by holding your fist at arm's length up to the sky. Open your fist and observe the distance across the sky covered by the width of your pinky fingernail. That is, roughly, a degree! After some practice, and knowing that one degree equals four minutes, you can measure elapsed time by measuring the angle of the distance that the Moon appears to have moved and multiplying that number by four. You can also figure distances and sizes of things. These are not precise measurements, but rough estimates that can give you a "close-enough" answer.
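
    The arithmetic behind the rule of thumb above is tiny; a sketch, purely as an illustration of the stated one-degree-per-four-minutes conversion:

```python
# The sky appears to rotate about one degree every four minutes, so an angle
# swept by the Moon or a star converts directly to elapsed time.
def elapsed_minutes(angle_degrees):
    return angle_degrees * 4.0

print(elapsed_minutes(7.5))  # 7.5 degrees of apparent motion ~ 30 minutes
```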

  16. Cooling the dark energy camera instrument

    SciTech Connect

    Schmitt, R.L.; Cease, H.; DePoy, D.; Diehl, H.T.; Estrada, J.; Flaugher, B.; Kuhlmann, S.; Onal, Birce; Stefanik, A.; /Fermilab

    2008-06-01

    DECam, the camera for the Dark Energy Survey (DES), is undergoing general design and component testing. For an overview see DePoy, et al in these proceedings. For a description of the imager, see Cease, et al in these proceedings. The CCD instrument will be mounted at the prime focus of the CTIO Blanco 4m telescope. The instrument temperature will be 173 K with a heat load of 113 W. In similar applications, cooling CCD instruments at the prime focus has been accomplished by three general methods. Liquid nitrogen reservoirs have been constructed to operate in any orientation, pulse tube cryocoolers have been used when tilt angles are limited, and Joule-Thomson or Stirling cryocoolers have been used with smaller heat loads. Gifford-McMahon cooling has been used at the Cassegrain but not at the prime focus. For DES, the combined requirements of high heat load, temperature stability, low vibration, operation in any orientation, liquid nitrogen cost and limited space available led to the design of a pumped, closed-loop, circulating nitrogen system. At zenith the instrument will be twelve meters above the pump/cryocooler station. This cooling system is expected to have a 10,000 hour maintenance interval. This paper will describe the engineering basis, including the thermal model, unbalanced forces, cooldown time, and the single- and two-phase flow models.
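
    As a hedged back-of-the-envelope aside only, a lumped-capacitance estimate of the kind a thermal model would refine looks like the sketch below; the mass, average heat capacity and net cooling power are illustrative assumptions, not DECam values.

```python
# Hedged sketch: lumped-capacitance cooldown-time estimate.
def cooldown_hours(mass_kg, avg_specific_heat_j_per_kg_k,
                   delta_t_k, net_cooling_power_w):
    energy_j = mass_kg * avg_specific_heat_j_per_kg_k * delta_t_k
    return energy_j / net_cooling_power_w / 3600.0

# e.g. 300 kg of structure, ~400 J/(kg K) average, cooled 293 K -> 173 K,
# with 500 W of net cooling available beyond the steady heat load (all assumed).
print(f"{cooldown_hours(300, 400, 120, 500):.1f} h")
```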

  17. Bio-inspired hemispherical compound eye camera

    NASA Astrophysics Data System (ADS)

    Xiao, Jianliang; Song, Young Min; Xie, Yizhu; Malyarchuk, Viktor; Jung, Inhwa; Choi, Ki-Joong; Liu, Zhuangjian; Park, Hyunsung; Lu, Chaofeng; Kim, Rak-Hwan; Li, Rui; Crozier, Kenneth B.; Huang, Yonggang; Rogers, John A.

    2014-03-01

    Compound eyes in arthropods demonstrate distinct imaging characteristics from human eyes, with wide angle field of view, low aberrations, high acuity to motion and infinite depth of field. Artificial imaging systems with similar geometries and properties are of great interest for many applications. However, the challenges in building such systems with hemispherical, compound apposition layouts cannot be met through established planar sensor technologies and conventional optics. We present our recent progress in combining optics, materials, mechanics and integration schemes to build fully functional artificial compound eye cameras. Nearly full hemispherical shapes (about 160 degrees) with densely packed artificial ommatidia were realized. The number of ommatidia (180) is comparable to those of the eyes of fire ants and bark beetles. The devices combine elastomeric compound optical elements with deformable arrays of thin silicon photodetectors, which were fabricated in the planar geometries and then integrated and elastically transformed to hemispherical shapes. Imaging results and quantitative ray-tracing-based simulations illustrate key features of operation. These general strategies seem to be applicable to other compound eye devices, such as those inspired by moths and lacewings (refracting superposition eyes), lobster and shrimp (reflecting superposition eyes), and houseflies (neural superposition eyes).

  18. Speckle Camera Imaging of the Planet Pluto

    NASA Astrophysics Data System (ADS)

    Howell, Steve B.; Horch, Elliott P.; Everett, Mark E.; Ciardi, David R.

    2012-10-01

    We have obtained optical wavelength (692 nm and 880 nm) speckle imaging of the planet Pluto and its largest moon Charon. Using our DSSI speckle camera attached to the Gemini North 8 m telescope, we collected high resolution imaging with an angular resolution of ~20 mas, a value at the Gemini-N telescope diffraction limit. We have produced for this binary system the first speckle reconstructed images, from which we can measure not only the orbital separation and position angle for Charon, but also the diameters of the two bodies. Our measurements of these parameters agree, within the uncertainties, with the current best values for Pluto and Charon. The Gemini-N speckle observations of Pluto are presented to illustrate the capabilities of our instrument and the robust production of high accuracy, high spatial resolution reconstructed images. We hope our results will suggest additional applications of high resolution speckle imaging for other objects within our solar system and beyond. Based on observations obtained at the Gemini Observatory, which is operated by the Association of Universities for Research in Astronomy, Inc., under a cooperative agreement with the National Science Foundation on behalf of the Gemini partnership: the National Science Foundation (United States), the Science and Technology Facilities Council (United Kingdom), the National Research Council (Canada), CONICYT (Chile), the Australian Research Council (Australia), Ministério da Ciência, Tecnologia e Inovação (Brazil) and Ministerio de Ciencia, Tecnología e Innovación Productiva (Argentina).

  19. Toward the design of a positron volume imaging camera

    SciTech Connect

    Rogers, J.G.; Stazyk, M.; Harrop, R.; Dykstra, C.J.; Barney, J.S.; Atkins, M.S.; Kinahan, P.E. )

    1990-04-01

    Three different computing algorithms for performing positron emission image reconstruction have been compared using Monte Carlo phantom simulations. The work was motivated by the recent announcement of the commercial availability of a positron volume imaging camera which has improved axial (slice) resolution and retractable interslice septa. The simulations demonstrate the importance of developing a complete three-dimensional reconstruction algorithm to deal with the increased gamma detection solid angle and the increased scatter fraction that result when the interslice septa are removed from a ring tomograph.

  20. Recording of essential ballistic data with a new generation of digital ballistic range camera

    NASA Astrophysics Data System (ADS)

    Haddleton, Graham P.; Honour, Jo

    2007-01-01

    Scientists and engineers still need to record essential parameters during the design and testing of new (or refined) munitions. These essential data, such as velocities, spin, pitch and yaw angles, sabot discard, impact angles, target penetration, behind-target effects and post-impact delays, need to be recorded during dynamic, high-velocity, and dangerous firings. Traditionally these parameters have been recorded on high-speed film cameras. With the demise of film as a recording medium, a new generation of electronic digital recording cameras has come to be the accepted means of allowing these parameters to be recorded and analysed. Their obvious advantage over film is instant access to records and the ability to analyse them almost immediately. This paper will detail results obtained using a new, specially designed ballistic range camera manufactured by Specialised Imaging Ltd.

  1. Laser angle sensor

    NASA Technical Reports Server (NTRS)

    Pond, C. R.; Texeira, P. D.

    1985-01-01

    A laser angle measurement system was designed and fabricated for NASA Langley Research Center. The instrument is a fringe counting interferometer that monitors the pitch attitude of a model in a wind tunnel. A laser source and detector are mounted above the model. Interference fringes are generated by a small passive element on the model. The fringe count is accumulated and displayed by a processor in the wind tunnel control room. This report includes optical and electrical schematics, system maintenance and operation procedures.
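
    As a hedged, generic illustration of fringe-counting angle measurement (not the actual Langley instrument geometry), a two-beam arrangement in which the optical path difference varies as b·sin(θ) over a baseline b gives one fringe per wavelength of path change; the wavelength and baseline below are assumed values.

```python
# Hedged sketch: map an accumulated fringe count back to an angle for an
# assumed two-beam geometry with path difference b*sin(theta).
import math

def pitch_angle_deg(fringe_count, wavelength_m, baseline_m):
    """Angle whose sine changed by count*lambda/baseline since the zero reference."""
    s = fringe_count * wavelength_m / baseline_m
    if abs(s) > 1.0:
        raise ValueError("count too large for this baseline")
    return math.degrees(math.asin(s))

# 633 nm HeNe laser, 25 mm baseline (assumed), 1200 fringes accumulated
print(f"{pitch_angle_deg(1200, 633e-9, 0.025):.3f} deg")
```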

  2. Interference-induced angle-independent acoustical transparency

    SciTech Connect

    Qi, Lehua; Yu, Gaokun Wang, Ning; Wang, Xinlong; Wang, Guibo

    2014-12-21

    It is revealed that the Fano-like interference leads to the extraordinary acoustic transmission through a slab metamaterial of thickness much smaller than the wavelength, with each unit cell consisting of a Helmholtz resonator and a narrow subwavelength slit. More importantly, both the theoretical analysis and experimental measurement show that the angle-independent acoustical transparency can be realized by grafting a Helmholtz resonator and a quarter-wave resonator to the wall of a narrow subwavelength slit in each unit cell of a slit array. The observed phenomenon results from the interferences between the waves propagating in the slit, those re-radiated by the Helmholtz resonator, and those re-radiated by the quarter-wave resonator. The proposed design may find its applications in designing angle-independent acoustical filters and controlling the phase of the transmitted waves.

  3. Advances in fast 2D camera data handling and analysis on NSTX

    SciTech Connect

    Davis, W. M.; Patel, R. I.; Boeglin, W. U.; Roquemore, A. L.; Maqueda, R. J.; Zweben, S. J.

    2010-07-01

    The use of fast 2D cameras on NSTX continues to grow. There are 6 cameras with the capability of taking up to 1–2 gigabytes (GBs) of data apiece during each plasma shot on the National Spherical Torus Experiment (NSTX). Efficient storage and retrieval of this data remains a challenge. Performance comparisons are presented for reading data stored in MDSplus, using both compressed data and segmented records, and direct access I/O with different read sizes. Encouragingly, fast 2D camera data provides considerable insight into plasma complexities, such as small-scale turbulence and particle transport. The last part of this paper is an example of more recent uses: dual cameras looking at the same region of the plasma from different angles, which can provide trajectories of incandescent particles in 3D. A laboratory simulation of the 3D trajectories is presented, as well as corresponding data from NSTX plasma where glowing dust particles can be followed.
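
    The geometric core of the dual-camera idea mentioned above is standard stereo triangulation; the hedged sketch below (not the NSTX analysis code) recovers a 3D point from its pixel positions in two cameras with known projection matrices, using synthetic test values.

```python
# Hedged sketch: linear (DLT) triangulation from two calibrated views.
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Return the 3D point minimizing the linear DLT residual."""
    u1, v1 = uv1
    u2, v2 = uv2
    A = np.vstack([
        u1 * P1[2] - P1[0],
        v1 * P1[2] - P1[1],
        u2 * P2[2] - P2[0],
        v2 * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]

# Toy check with two synthetic cameras looking at the point (0.1, 0.2, 3.0).
K = np.diag([800.0, 800.0, 1.0]); K[0, 2] = K[1, 2] = 320.0
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
R = np.array([[0.0, 0.0, -1.0], [0.0, 1.0, 0.0], [1.0, 0.0, 0.0]])  # 90 deg view
P2 = K @ np.hstack([R, np.array([[3.0], [0.0], [3.0]])])
X_true = np.array([0.1, 0.2, 3.0, 1.0])
uv = lambda P: (P @ X_true)[:2] / (P @ X_true)[2]
print(triangulate(P1, P2, uv(P1), uv(P2)))   # ~[0.1, 0.2, 3.0]
```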

  4. Passive Millimeter Wave Camera (PMMWC) at TRW

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Engineers at TRW, Redondo Beach, California, inspect the Passive Millimeter Wave Camera, a weather-piercing camera designed to see through fog, clouds, smoke and dust. Operating in the millimeter wave portion of the electromagnetic spectrum, the camera creates visual-like video images of objects, people, runways, obstacles and the horizon. A demonstration camera (shown in photo) has been completed and is scheduled for checkout tests and flight demonstration. Engineer (left) holds a compact, lightweight circuit board containing 40 complete radiometers, including antenna, monolithic millimeter wave integrated circuit (MMIC) receivers and signal processing and readout electronics that forms the basis for the camera's 1040-element focal plane array.

  5. Passive Millimeter Wave Camera (PMMWC) at TRW

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Engineers at TRW, Redondo Beach, California, inspect the Passive Millimeter Wave Camera, a weather-piercing camera designed to 'see' through fog, clouds, smoke and dust. Operating in the millimeter wave portion of the electromagnetic spectrum, the camera creates visual-like video images of objects, people, runways, obstacles and the horizon. A demonstration camera (shown in photo) has been completed and is scheduled for checkout tests and flight demonstration. Engineer (left) holds a compact, lightweight circuit board containing 40 complete radiometers, including antenna, monolithic millimeter wave integrated circuit (MMIC) receivers and signal processing and readout electronics that forms the basis for the camera's 1040-element focal plane array.

  6. HHEBBES! All sky camera system: status update

    NASA Astrophysics Data System (ADS)

    Bettonvil, F.

    2015-01-01

    A status update is given of the HHEBBES! all-sky camera system. HHEBBES!, an automatic camera for capturing bright meteor trails, is based on a DSLR camera and a liquid crystal chopper for measuring the angular velocity. The purpose of the system is to a) recover meteorites and b) identify origin/parent bodies. In 2015, two new cameras were rolled out: BINGO! (like HHEBBES!, also in the Netherlands) and POgLED, in Serbia. BINGO! is the first camera equipped with a longer-focal-length fisheye lens, to further increase the accuracy. Several minor improvements have been made, and the data reduction pipeline was used to process two prominent Dutch fireballs.

  7. CCD video camera and airborne applications

    NASA Astrophysics Data System (ADS)

    Sturz, Richard A.

    2000-11-01

    The human need to see for oneself, and to do so remotely, has given rise to video camera applications never before imagined, and the list is growing constantly. The instant understanding and verification offered by video lend themselves to every facet of life. Once an entertainment medium, video is now ever-present in our daily life. Application to the aircraft platform is one aspect of the video camera's versatility; integrating the video camera into the aircraft platform is yet another story. The typical video camera, when applied to more standard scene imaging, faces less demanding parameters and considerations. This paper explores the video camera as applied to the more complicated airborne environment.

  8. Spectrometry with consumer-quality CMOS cameras.

    PubMed

    Scheeline, Alexander

    2015-01-01

    Many modern spectrometric instruments use diode arrays, charge-coupled arrays, or CMOS cameras for detection and measurement. As portable or point-of-use instruments are desirable, one would expect that instruments using the cameras in cellular telephones and tablet computers would be the basis of numerous instruments. However, no mass market for such devices has yet developed. The difficulties in using megapixel CMOS cameras for scientific measurements are discussed, and promising avenues for instrument development reviewed. Inexpensive alternatives to use of the built-in camera are also mentioned, as the long-term question is whether it is better to overcome the constraints of CMOS cameras or to bypass them.
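
    As a hedged illustration of one simple processing chain for camera-based spectrometry (not a specific instrument described in the record), the sketch below collapses a 2D frame along the slit axis and maps pixel column to wavelength with a linear fit to two assumed reference lines; all data are synthetic.

```python
# Hedged sketch: spectrum extraction and linear wavelength calibration.
import numpy as np

def extract_spectrum(frame, dark):
    """frame, dark: 2D arrays; returns a 1D dark-subtracted spectrum."""
    return (frame.astype(float) - dark.astype(float)).sum(axis=0)

def wavelength_axis(n_columns, ref_pixels, ref_wavelengths_nm):
    """Linear pixel-to-wavelength calibration from reference lines."""
    slope, intercept = np.polyfit(ref_pixels, ref_wavelengths_nm, deg=1)
    return slope * np.arange(n_columns) + intercept

# Synthetic example: two bright columns stand in for known Hg lines.
rng = np.random.default_rng(0)
dark = rng.normal(100, 2, (100, 640))
frame = dark + 0.0
frame[:, 150] += 500
frame[:, 420] += 500
spec = extract_spectrum(frame, dark)
wl = wavelength_axis(640, [150, 420], [435.8, 546.1])
print(wl[np.argmax(spec)])  # peak lands at one of the reference wavelengths
```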

  9. Mini gamma camera, camera system and method of use

    DOEpatents

    Majewski, Stanislaw; Weisenberger, Andrew G.; Wojcik, Randolph F.

    2001-01-01

    A gamma camera comprising, essentially and in order from the front outer or gamma-ray-impinging surface: 1) a collimator, 2) a scintillator layer, 3) a light guide, 4) an array of position-sensitive, high-resolution photomultiplier tubes, and 5) printed circuitry for receipt of the output of the photomultipliers. Also described is a system wherein the output supplied by the high-resolution, position-sensitive photomultiplier tubes is communicated to: a) a digitizer and b) a computer, where it is processed using advanced image processing techniques and a specific algorithm to calculate the center of gravity of any abnormality observed during imaging, and c) optional image display and telecommunications ports.
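
    To illustrate the center-of-gravity idea mentioned in the record (not the patented algorithm itself), an intensity-weighted centroid over an image region can be computed as in this sketch with a synthetic hot spot:

```python
# Hedged sketch: intensity-weighted centroid of a 2D image region.
import numpy as np

def center_of_gravity(image):
    img = np.asarray(image, dtype=float)
    total = img.sum()
    ys, xs = np.indices(img.shape)
    return (ys * img).sum() / total, (xs * img).sum() / total

hot_spot = np.zeros((64, 64))
hot_spot[40:44, 10:16] = 1.0            # synthetic "abnormality"
print(center_of_gravity(hot_spot))      # ~(41.5, 12.5)
```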

  10. Analysis of Pictures Taken with an Underwater Camera

    NASA Astrophysics Data System (ADS)

    Biezeveld, Hubert; Elsinga, Nienke; Harmsen, Floor-Jolijn; Koopman, Rose

    2005-03-01

    In the Dutch high school system students are required to carry out a research project on a subject of their own choosing. During the section on optics, the teacher (Hubert) mentioned that a fish sees the world above the water in a cone with half-angle equal to the critical angle for the air/water interface (49°). This follows from Snell's law and has been discussed in some detail by Jearl Walker. He describes the intersection of this cone with the surface of the water as the "window" through which the fish sees the outside world. A related paper has appeared in this journal.2 Stimulated by the teacher's remark, three students (Nienke, Floor-Jolijn, and Rose) made photographs with an underwater camera in the local swimming pool in Hoorn.
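
    A small numeric companion to the classroom argument above: the critical angle follows from Snell's law, and at a given depth it sets the radius of the circular "window" through which the outside world is visible. The depth used below is just an example.

```python
# Hedged sketch: critical angle and Snell's-window radius at a given depth.
import math

def snells_window(depth_m, n_water=1.33, n_air=1.00):
    theta_c = math.asin(n_air / n_water)          # critical angle, radians
    radius = depth_m * math.tan(theta_c)          # window radius at the surface
    return math.degrees(theta_c), radius

theta_deg, r = snells_window(2.0)
print(f"critical angle ~{theta_deg:.1f} deg, window radius ~{r:.2f} m at 2 m depth")
```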

  11. LDEF yaw and pitch angle estimates

    NASA Technical Reports Server (NTRS)

    Banks, Bruce A.; Gebauer, Linda

    1992-01-01

    Quantification of the LDEF yaw and pitch misorientations is crucial to the knowledge of atomic oxygen exposure of samples placed on LDEF. Video camera documentation of the LDEF spacecraft prior to grapple attachment, atomic oxygen shadows on experiment trays and longerons, and a pinhole atomic oxygen camera placed on LDEF provided sources of documentation of the yaw and pitch misorientation. Based on uncertainty-weighted averaging of data, the LDEF yaw offset was found to be 8.1 plus or minus 0.6 degrees, allowing higher atomic oxygen exposure of row 12 than initially anticipated. The LDEF pitch angle offset was found to be 0.8 plus or minus 0.4 degrees, such that the space end was tipped forward toward the direction of travel. The resulting consequences of the yaw and pitch misorientation of LDEF on the atomic oxygen fluence is a factor of 2.16 increase for samples located on row 12, and a factor of 1.18 increase for samples located on the space end compared to that which would be expected for perfect orientation.
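
    The uncertainty-weighted averaging referred to above is standard inverse-variance weighting; a hedged sketch follows, with made-up numbers rather than the LDEF data set.

```python
# Hedged sketch: inverse-variance weighted mean and its uncertainty.
import numpy as np

def weighted_mean(values, sigmas):
    w = 1.0 / np.asarray(sigmas, dtype=float) ** 2
    mean = np.sum(w * np.asarray(values, dtype=float)) / np.sum(w)
    sigma = np.sqrt(1.0 / np.sum(w))
    return mean, sigma

yaw_estimates = [8.4, 7.9, 8.0]     # degrees, from three hypothetical sources
yaw_sigmas = [1.0, 0.8, 1.2]
print(weighted_mean(yaw_estimates, yaw_sigmas))
```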

  12. Initial laboratory evaluation of color video cameras

    SciTech Connect

    Terry, P L

    1991-01-01

    Sandia National Laboratories has considerable experience with monochrome video cameras used in alarm assessment video systems. Most of these systems, used for perimeter protection, were designed to classify rather than identify an intruder. Monochrome cameras are adequate for that application and were selected over color cameras because of their greater sensitivity and resolution. There is a growing interest in the identification function of security video systems for both access control and insider protection. Color information is useful for identification purposes, and color camera technology is rapidly changing. Thus, Sandia National Laboratories established an ongoing program to evaluate color solid-state cameras. Phase one resulted in the publishing of a report titled, Initial Laboratory Evaluation of Color Video Cameras (SAND--91-2579).'' It gave a brief discussion of imager chips and color cameras and monitors, described the camera selection, detailed traditional test parameters and procedures, and gave the results of the evaluation of twelve cameras. In phase two six additional cameras were tested by the traditional methods and all eighteen cameras were tested by newly developed methods. This report details both the traditional and newly developed test parameters and procedures, and gives the results of both evaluations.

  13. Phenology cameras observing boreal ecosystems of Finland

    NASA Astrophysics Data System (ADS)

    Peltoniemi, Mikko; Böttcher, Kristin; Aurela, Mika; Kolari, Pasi; Tanis, Cemal Melih; Linkosalmi, Maiju; Loehr, John; Metsämäki, Sari; Nadir Arslan, Ali

    2016-04-01

    Cameras have become useful tools for monitoring the seasonality of ecosystems. Low-cost cameras facilitate validation of other measurements and allow key ecological features and moments to be extracted from image time series. We installed a network of phenology cameras at selected ecosystem research sites in Finland. Cameras were installed above, at the level of, and/or below the canopies. The current network hosts cameras taking time-lapse images in coniferous and deciduous forests as well as at open wetlands, thus offering possibilities to monitor various phenological and time-associated events and elements. In this poster, we present our camera network and give examples of the use of image series for research. We will show results on the stability of camera-derived color signals and, based on that, discuss the applicability of cameras in monitoring time-dependent phenomena. We will also present results from comparisons between camera-derived color signal time series and daily satellite-derived time series (NDVI, NDWI, and fractional snow cover) from the Moderate Resolution Imaging Spectroradiometer (MODIS) at selected spruce and pine forests and in a wetland. We will discuss the applicability of cameras in supporting phenological observations derived from satellites, considering the ability of cameras to monitor both above- and below-canopy phenology and snow.
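
    One commonly used camera colour signal in phenology work is the green chromatic coordinate; the poster does not specify its exact index, so the sketch below is illustrative only, with a random image standing in for a real time-lapse frame.

```python
# Hedged sketch: green chromatic coordinate (GCC) over a region of interest.
import numpy as np

def green_chromatic_coordinate(rgb_image, roi):
    """rgb_image: HxWx3 array; roi: (row_slice, col_slice) region of interest."""
    patch = np.asarray(rgb_image, dtype=float)[roi]
    r, g, b = patch[..., 0].mean(), patch[..., 1].mean(), patch[..., 2].mean()
    return g / (r + g + b)

# A time series of GCC values from daily images tracks canopy green-up.
img = np.random.default_rng(1).integers(0, 255, (480, 640, 3))
print(green_chromatic_coordinate(img, (slice(100, 300), slice(200, 500))))
```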

  14. Characterization of the Series 1000 Camera System

    SciTech Connect

    Kimbrough, J; Moody, J; Bell, P; Landen, O

    2004-04-07

    The National Ignition Facility requires a compact network addressable scientific grade CCD camera for use in diagnostics ranging from streak cameras to gated x-ray imaging cameras. Due to the limited space inside the diagnostic, an analog and digital input/output option in the camera controller permits control of both the camera and the diagnostic by a single Ethernet link. The system consists of a Spectral Instruments Series 1000 camera, a PC104+ controller, and power supply. The 4k by 4k CCD camera has a dynamic range of 70 dB with less than 14 electron read noise at a 1MHz readout rate. The PC104+ controller includes 16 analog inputs, 4 analog outputs and 16 digital input/output lines for interfacing to diagnostic instrumentation. A description of the system and performance characterization is reported.
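
    A hedged arithmetic aside on the figures quoted above: assuming the usual 20·log10 definition of dynamic range, ~14 e- read noise and 70 dB imply a full-well capacity of roughly 44,000 electrons.

```python
# Hedged check: full well implied by read noise and dynamic range (20*log10).
read_noise_e = 14.0
dynamic_range_db = 70.0
full_well_e = read_noise_e * 10 ** (dynamic_range_db / 20.0)
print(f"implied full well ~{full_well_e:,.0f} e-")
```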

  15. Advanced camera image data acquisition system for Pi-of-the-Sky

    NASA Astrophysics Data System (ADS)

    Kwiatkowski, Maciej; Kasprowicz, Grzegorz; Pozniak, Krzysztof; Romaniuk, Ryszard; Wrochna, Grzegorz

    2008-11-01

    The paper describes a new generation of high performance, remote control, CCD cameras designed for astronomical applications. A completely new camera PCB was designed, manufactured, tested and commissioned. The CCD chip was positioned in a different way than previously resulting in better performance of the astronomical video data acquisition system. The camera was built using a low-noise, 4Mpixel CCD circuit by STA. The electronic circuit of the camera is highly parameterized and reconfigurable, as well as modular in comparison with the solution of first generation, due to application of open software solutions and FPGA circuit, Altera Cyclone EP1C6. New algorithms were implemented into the FPGA chip. There were used the following advanced electronic circuit in the camera system: microcontroller CY7C68013a (core 8051) by Cypress, image processor AD9826 by Analog Devices, GigEth interface RTL8169s by Realtec, memory SDRAM AT45DB642 by Atmel, CPU typr microprocessor ARM926EJ-S AT91SAM9260 by ARM and Atmel. Software solutions for the camera and its remote control, as well as image data acquisition are based only on the open source platform. There were used the following image interfaces ISI and API V4L2, data bus AMBA, AHB, INDI protocol. The camera will be replicated in 20 pieces and is designed for continuous on-line, wide angle observations of the sky in the research program Pi-of-the-Sky.

  16. Earth elevation map production and high resolution sensing camera imaging analysis

    NASA Astrophysics Data System (ADS)

    Yang, Xiubin; Jin, Guang; Jiang, Li; Dai, Lu; Xu, Kai

    2010-11-01

    A digital elevation model of the Earth, which affects space camera imaging, has been prepared and its influence on imaging analysed. Based on the image-motion matching error allowed by the TDI CCD integration stages, a statistical experimental approach (the Monte Carlo method) is used to calculate the distribution histogram of the Earth's elevation within an image-motion compensation model that includes satellite attitude changes, orbital angular rate changes, latitude, longitude and orbital inclination changes. Elevation information for the Earth's surface is then read from SRTM data, and the Earth elevation map produced for aerospace electronic cameras is compressed and spliced. Elevation data can be retrieved from flash memory according to the latitude and longitude of the shooting point; if the requested point falls between two stored values, linear interpolation is used, which better accommodates rugged mountain and hill terrain. Finally, the deviation framework and camera controller are used to test the characteristics of deviation angle errors, and a TDI CCD camera simulation system, based on a model that maps object points to image points, is used to analyze the imaging MTF and a mutual-correlation similarity measure. The simulation system accumulates the horizontal and vertical pixel offsets by which TDI CCD imaging exceeds the nominal pixel to simulate camera imaging when the stability of the satellite attitude changes. This process is practical: it effectively controls the camera memory space and allows the TDI CCD camera to match the image-motion velocity and imaging requirements with very good precision.
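
    The look-up-with-interpolation step described above can be sketched as follows; the grid spacing and elevation values are invented, and a real implementation would interpolate bilinearly in both latitude and longitude.

```python
# Hedged sketch: retrieve elevation at a shooting point from a stored profile,
# interpolating linearly when the point falls between grid nodes.
import numpy as np

lat_grid = np.arange(-60.0, 60.001, 0.1)                      # degrees, assumed spacing
elev_profile = 500.0 + 300.0 * np.sin(np.radians(lat_grid))   # fake elevations (m)

def elevation_at(lat_deg):
    """1D example: np.interp performs the linear interpolation between nodes."""
    return float(np.interp(lat_deg, lat_grid, elev_profile))

print(elevation_at(23.456))
```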

  17. Photometric Calibration and Image Stitching for a Large Field of View Multi-Camera System

    PubMed Central

    Lu, Yu; Wang, Keyi; Fan, Gongshu

    2016-01-01

    A new compact large field of view (FOV) multi-camera system is introduced. The camera is based on seven tiny complementary metal-oxide-semiconductor sensor modules covering over 160° × 160° FOV. Although image stitching has been studied extensively, sensor and lens differences have not been considered in previous multi-camera devices. In this study, we have calibrated the photometric characteristics of the multi-camera device. Lenses were not mounted on the sensor in the process of radiometric response calibration to eliminate the influence of the focusing effect of uniform light from an integrating sphere. Linearity range of the radiometric response, non-linearity response characteristics, sensitivity, and dark current of the camera response function are presented. The R, G, and B channels have different responses for the same illuminance. Vignetting artifact patterns have been tested. The actual luminance of the object is retrieved by sensor calibration results, and is used to blend images to make panoramas reflect the objective luminance more objectively. This compensates for the limitation of stitching images that are more realistic only through the smoothing method. The dynamic range limitation can be resolved by using multiple cameras that cover a large field of view instead of a single image sensor with a wide-angle lens. The dynamic range is expanded by 48-fold in this system. We can obtain seven images in one shot with this multi-camera system, at 13 frames per second. PMID:27077857

  18. Photometric Calibration and Image Stitching for a Large Field of View Multi-Camera System.

    PubMed

    Lu, Yu; Wang, Keyi; Fan, Gongshu

    2016-04-11

    A new compact large field of view (FOV) multi-camera system is introduced. The camera is based on seven tiny complementary metal-oxide-semiconductor sensor modules covering over 160° × 160° FOV. Although image stitching has been studied extensively, sensor and lens differences have not been considered in previous multi-camera devices. In this study, we have calibrated the photometric characteristics of the multi-camera device. Lenses were not mounted on the sensor in the process of radiometric response calibration to eliminate the influence of the focusing effect of uniform light from an integrating sphere. Linearity range of the radiometric response, non-linearity response characteristics, sensitivity, and dark current of the camera response function are presented. The R, G, and B channels have different responses for the same illuminance. Vignetting artifact patterns have been tested. The actual luminance of the object is retrieved by sensor calibration results, and is used to blend images to make panoramas reflect the objective luminance more objectively. This compensates for the limitation of stitching images that are more realistic only through the smoothing method. The dynamic range limitation can be resolved by using multiple cameras that cover a large field of view instead of a single image sensor with a wide-angle lens. The dynamic range is expanded by 48-fold in this system. We can obtain seven images in one shot with this multi-camera system, at 13 frames per second.

  19. Photometric Calibration and Image Stitching for a Large Field of View Multi-Camera System.

    PubMed

    Lu, Yu; Wang, Keyi; Fan, Gongshu

    2016-01-01

    A new compact large field of view (FOV) multi-camera system is introduced. The camera is based on seven tiny complementary metal-oxide-semiconductor sensor modules covering over 160° × 160° FOV. Although image stitching has been studied extensively, sensor and lens differences have not been considered in previous multi-camera devices. In this study, we have calibrated the photometric characteristics of the multi-camera device. Lenses were not mounted on the sensor in the process of radiometric response calibration to eliminate the influence of the focusing effect of uniform light from an integrating sphere. Linearity range of the radiometric response, non-linearity response characteristics, sensitivity, and dark current of the camera response function are presented. The R, G, and B channels have different responses for the same illuminance. Vignetting artifact patterns have been tested. The actual luminance of the object is retrieved by sensor calibration results, and is used to blend images to make panoramas reflect the objective luminance more objectively. This compensates for the limitation of stitching images that are more realistic only through the smoothing method. The dynamic range limitation can be resolved by using multiple cameras that cover a large field of view instead of a single image sensor with a wide-angle lens. The dynamic range is expanded by 48-fold in this system. We can obtain seven images in one shot with this multi-camera system, at 13 frames per second. PMID:27077857
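
    As a hedged sketch of the kind of correction chain such a calibration enables (not the authors' code), the example below subtracts a dark frame, optionally undoes a power-law response nonlinearity, and divides out a vignetting (flat-field) pattern to recover values proportional to scene luminance before blending; all arrays are synthetic.

```python
# Hedged sketch: radiometric correction of a raw frame before panorama blending.
import numpy as np

def to_relative_luminance(raw, dark, flat, gamma=1.0):
    """raw, dark, flat: 2D arrays from one sensor; gamma models a nonlinearity."""
    linear = np.clip(raw.astype(float) - dark, 0, None) ** (1.0 / gamma)
    flatn = flat.astype(float) / flat.max()      # normalized vignetting pattern
    return linear / np.clip(flatn, 1e-6, None)

rng = np.random.default_rng(2)
raw = rng.integers(20, 250, (120, 160)).astype(float)
dark = np.full_like(raw, 16.0)
yy, xx = np.indices(raw.shape)
flat = 1.0 - 0.2 * (((yy - 60) / 60) ** 2 + ((xx - 80) / 80) ** 2)
print(to_relative_luminance(raw, dark, flat).mean())
```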

  20. On the absolute calibration of SO2 cameras

    NASA Astrophysics Data System (ADS)

    Lübcke, P.; Bobrowski, N.; Illing, S.; Kern, C.; Alvarez Nieves, J. M.; Vogel, L.; Zielcke, J.; Delgado Granados, H.; Platt, U.

    2013-03-01

    Sulphur dioxide emission rate measurements are an important tool for volcanic monitoring and eruption risk assessment. The SO2 camera technique remotely measures volcanic emissions by analysing the ultraviolet absorption of SO2 in a narrow spectral window between 300 and 320 nm using solar radiation scattered in the atmosphere. The SO2 absorption is selectively detected by mounting band-pass interference filters in front of a two-dimensional, UV-sensitive CCD detector. One important step for correct SO2 emission rate measurements that can be compared with other measurement techniques is a correct calibration. This requires conversion from the measured optical density to the desired SO2 column density (CD). The conversion factor is most commonly determined by inserting quartz cells (cuvettes) with known amounts of SO2 into the light path. Another calibration method uses an additional narrow field-of-view Differential Optical Absorption Spectroscopy system (NFOV-DOAS), which measures the column density simultaneously in a small area of the camera's field-of-view. This procedure combines the very good spatial and temporal resolution of the SO2 camera technique with the more accurate column densities obtainable from DOAS measurements. This work investigates the uncertainty of results gained through the two commonly used, but quite different, calibration methods (DOAS and calibration cells). Measurements with three different instruments, an SO2 camera, a NFOV-DOAS system and an Imaging DOAS (I-DOAS), are presented. We compare the calibration-cell approach with the calibration from the NFOV-DOAS system. The respective results are compared with measurements from an I-DOAS to verify the calibration curve over the spatial extent of the image. The results show that calibration cells, while working fine in some cases, can lead to an overestimation of the SO2 CD by up to 60% compared with CDs from the DOAS measurements. Besides these errors of calibration, radiative transfer
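
    A hedged sketch of the DOAS-based calibration idea (not the authors' pipeline): fit a line mapping the camera's apparent absorbance in the DOAS field of view to the co-located DOAS column densities, then apply it to whole images. The numbers below are invented.

```python
# Hedged sketch: linear calibration of SO2-camera apparent absorbance.
import numpy as np

tau_camera = np.array([0.02, 0.05, 0.09, 0.14, 0.18])          # apparent absorbance
cd_doas = np.array([2.1e17, 5.3e17, 9.4e17, 1.45e18, 1.9e18])  # molec/cm^2 (made up)

slope, intercept = np.polyfit(tau_camera, cd_doas, deg=1)

def to_column_density(tau_image):
    """Convert an image of apparent absorbance to SO2 column density."""
    return slope * np.asarray(tau_image) + intercept

print(to_column_density(0.10))   # ~1e18 molec/cm^2 with these fake numbers
```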

  1. Narrow-band nonlinear sea waves

    NASA Technical Reports Server (NTRS)

    Tayfun, M. A.

    1980-01-01

    Probabilistic description of nonlinear waves with a narrow-band spectrum is simplified to a form in which each realization of the surface displacement becomes an amplitude-modulated Stokes wave with a mean frequency and random phase. Under appropriate conditions this simplification provides a convenient yet rigorous means of describing nonlinear effects on sea surface properties in a semiclosed or closed form. In particular, it is shown that surface displacements are non-Gaussian and skewed, as was previously predicted by the Gram-Charlier approximation; that wave heights are Rayleigh distributed, just as in the linear case; and that crests are non-Rayleigh.
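
    As a hedged illustration of the narrow-band picture described above, the sketch below generates a second-order deep-water Stokes wave with a slowly modulated amplitude and a random phase, and reports the positive skewness of the surface displacement; all parameter values are arbitrary.

```python
# Hedged sketch: amplitude-modulated second-order Stokes wave and its skewness.
import numpy as np

g = 9.81
k = 0.05                      # wavenumber, rad/m (arbitrary)
omega = np.sqrt(g * k)        # deep-water dispersion relation
t = np.linspace(0, 600, 6000)
a = 1.0 * (1 + 0.3 * np.cos(0.02 * omega * t))   # slow amplitude modulation
phase = omega * t + np.random.default_rng(3).uniform(0, 2 * np.pi)

# Linear term plus the second-order Stokes correction, which sharpens crests
# and flattens troughs (the skewness noted in the text).
eta = a * np.cos(phase) + 0.5 * k * a ** 2 * np.cos(2 * phase)
print(f"skewness ~ {((eta - eta.mean())**3).mean() / eta.std()**3:.3f}")
```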

  2. Narrow electron injector for ballistic electron spectroscopy

    SciTech Connect

    Kast, M.; Pacher, C.; Strasser, G.; Gornik, E.

    2001-06-04

    A three-terminal hot electron transistor is used to measure the normal energy distribution of ballistic electrons generated by an electron injector utilizing an improved injector design. A triple barrier resonant tunneling diode with a rectangular transmission function acts as a narrow (1 meV) energy filter. An asymmetric energy distribution with its maximum on the high-energy side with a full width at half maximum of ΔE_inj = 10 meV is derived. © 2001 American Institute of Physics.

  3. Global viscous overstabilities in narrow rings

    NASA Astrophysics Data System (ADS)

    Longaretti, Pierre-Yves; French, Richard G.; Nicholson, Philip D.

    2016-10-01

    Local viscous overstabilities have been the focus of a number of theoretical analyses in the last decades due to the rôle they are believed to play in the creation of the small scale structure of broad ring systems (Saturn, Uranus). Global viscous overstabilities have also been investigated in the 1980s and 1990s as a potential source of narrow ring eccentricities (Longaretti and Rappaport, 1995, Icarus, 116, 376).An important feature of global viscous overstabilities is that they produce slow relative librating or circulating motions of narrow ring edges; they may also produce slowly librating or circulating components of edge modes. This process is potentially relevant to explain the occurrence of unusually large apsidal shifts observed in some saturnian ringlets and may also explain the existence of the free m=2 B ring edge mode that is slowly circulating with respect to the component forced by Mimas.The time-scale of such motions is primarily controlled by the ring self-gravity and can be analytically quantified in a two-streamline analysis which yields a characteristic libration/circulation frequency Ωl = (n/π)(Mr/Mp)(a/δa)2H(q2) where n is the mean motion, Mr the ringlet or pertubed region mass, Mp the planet mass, a the semi-major axis, δa the narrow ringlet or pertubed region width and H(q2) a dimensionless factor of order unity that depends on the streamline compression parameter q. The related time-scale is of the order of a few years to a few tens of years depending on the surface density and ringlet/perturbed region geometry. Preliminary data analyzes indicate that the Maxwell and Huyghens ringlets are probably librating with periods consistent with this two-streamline estimate.The talk will briefly present the physics of global viscous overstabilities as well as more detailed applications to narrow rings, and if time permits, to edge modes.
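
    A hedged numerical companion to the two-streamline estimate quoted above: the ringlet parameters below are round numbers chosen only to show the order of magnitude, and H(q²) is simply set to 1.

```python
# Hedged sketch: evaluate Omega_l = (n/pi)(M_r/M_p)(a/delta_a)^2 * H(q^2).
import math

def libration_frequency(n, ring_mass, planet_mass, a, delta_a, H=1.0):
    return (n / math.pi) * (ring_mass / planet_mass) * (a / delta_a) ** 2 * H

n = 2.0 * math.pi / (0.6 * 86400)        # mean motion for a ~14.4 h orbit, rad/s
omega_l = libration_frequency(n, ring_mass=5e16, planet_mass=5.68e26,
                              a=1.18e8, delta_a=5e4)   # all values assumed
print(f"libration period ~ {2 * math.pi / omega_l / 86400 / 365.25:.1f} yr")
```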

  4. On the absolute calibration of SO2 cameras

    NASA Astrophysics Data System (ADS)

    Lübcke, P.; Bobrowski, N.; Illing, S.; Kern, C.; Alvarez Nieves, J. M.; Vogel, L.; Zielcke, J.; Delgado Granados, H.; Platt, U.

    2012-09-01

    Sulphur dioxide emission flux measurements are an important tool for volcanic monitoring and eruption risk assessment. The SO2 camera technique remotely measures volcanic emissions by analysing the ultraviolet absorption of SO2 in a narrow spectral window between 305 nm and 320 nm using solar radiation scattered in the atmosphere. The SO2 absorption is selectively detected by mounting band-pass interference filters in front of a two-dimensional, UV-sensitive CCD detector. While this approach is simple and delivers valuable insights into the two-dimensional SO2 distribution, absolute calibration has proven to be difficult. An accurate calibration of the SO2 camera (i.e., conversion from optical density to SO2 column density, CD) is crucial to obtain correct SO2 CDs and flux measurements that are comparable to other measurement techniques and can be used for volcanological applications. The most common approach for calibrating SO2 camera measurements is based on inserting quartz cells (cuvettes) containing known amounts of SO2 into the light path. It has been found, however, that reflections from the windows of the calibration cell can considerably affect the signal measured by the camera. Another possibility for calibration relies on performing simultaneous measurements in a small area of the camera's field-of-view (FOV) by a narrow-field-of-view Differential Optical Absorption Spectroscopy (NFOV-DOAS) system. This procedure combines the very good spatial and temporal resolution of the SO2 camera technique with the more accurate column densities obtainable from DOAS measurements. This work investigates the uncertainty of results gained through the two commonly used, but quite different calibration methods (DOAS and calibration cells). Measurements with three different instruments, an SO2 camera, a NFOV-DOAS system and an Imaging DOAS (IDOAS), are presented. We compare the calibration-cell approach with the calibration from the NFOV-DOAS system. The respective

  5. PDX infrared TV camera system

    SciTech Connect

    Jacobsen, R.A.

    1981-08-01

    An infrared TV camera system has been developed for use on PDX. This system is capable of measuring the temporal and spatial energy deposition on the limiters and divertor neutralizer plates; time resolutions of 1 ms are achievable. The system has been used to measure the energy deposition on the PDX neutralizer plates and the temperature jump of limiter surfaces during a pulse. The energy scrapeoff layer is found to have characteristic dimensions of the order of a cm. The measurement of profiles is very sensitive to variations in the thermal emissivity of the surfaces.

  6. Cryogenic mechanism for ISO camera

    NASA Astrophysics Data System (ADS)

    Luciano, G.

    1987-12-01

    The Infrared Space Observatory (ISO) camera configuration, architecture, materials, tribology, motorization, and development status are outlined. The operating temperature is 2 to 3 K, at 2.5 to 18 microns. Selected material is a titanium alloy, with MoS2/TiC lubrication. A stepping motor drives the ball-bearing mounted wheels to which the optical elements are fixed. Model test results are satisfactory, and also confirm the validity of the test facilities, particularly for vibration tests at 4K.

  7. Fraunhofer diffraction to determine the twin angle in single-crystal BaTiO3.

    PubMed

    Melnichuk, Mike; Wood, Lowell T

    2003-08-01

    We present a new method for determining the electrically induced twin angle α of a (100) bulk single crystal of barium titanate (BaTiO3) using a nondestructive optical technique based on Fraunhofer diffraction. The technique required two steps that were performed simultaneously. First, we analyzed the diffracted light intensity captured with a line camera. Second, we measured the size of the diffracting element by analyzing images of the crystal's surface taken with a CCD camera. The value obtained for the twin angle is 0.67 degrees ± 0.05 degrees, which compares favorably with the theoretical value of 0.63 degrees.

  8. NARROW-K-BAND OBSERVATIONS OF THE GJ 1214 SYSTEM

    SciTech Connect

    Colón, Knicole D.; Gaidos, Eric

    2013-10-10

    GJ 1214 is a nearby M dwarf star that hosts a transiting super-Earth-size planet, making this system an excellent target for atmospheric studies. Most studies find that the transmission spectrum of GJ 1214b is flat, which favors either a high mean molecular weight or cloudy/hazy hydrogen (H) rich atmosphere model. Photometry at short wavelengths (<0.7 μm) and in the K band can discriminate the most between these different atmosphere models for GJ 1214b, but current observations do not have sufficiently high precision. We present photometry of seven transits of GJ 1214b through a narrow K-band (2.141 μm) filter with the Wide Field Camera on the 3.8 m United Kingdom Infrared Telescope. Our photometric precision is typically 1.7 × 10⁻³ (for a single transit), comparable with other ground-based observations of GJ 1214b. We measure a planet-star radius ratio of 0.1158 ± 0.0013, which, along with other studies, also supports a flat transmission spectrum for GJ 1214b. Since this does not exclude a scenario where GJ 1214b has an H-rich envelope with heavy elements that are sequestered below a cloud/haze layer, we compare K-band observations with models of H₂ collision-induced absorption in an atmosphere for a range of temperatures. While we find no evidence for deviation from a flat spectrum (slope s = 0.0016 ± 0.0038), an H₂-dominated upper atmosphere (<60 mbar) cannot be excluded. More precise observations at <0.7 μm and in the K band, as well as a uniform analysis of all published data, would be useful for establishing more robust limits on atmosphere models for GJ 1214b.
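
    As a small arithmetic aside, the quoted planet-star radius ratio translates into a transit depth of (Rp/Rs)², the quantity the photometry actually measures:

```python
# Transit depth implied by the quoted radius ratio.
radius_ratio = 0.1158
depth = radius_ratio ** 2
print(f"transit depth ~ {depth:.4f} ({depth*100:.2f}% flux drop)")
```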

  9. Optical Enhancement of Exoskeleton-Based Estimation of Glenohumeral Angles.

    PubMed

    Cortés, Camilo; Unzueta, Luis; de Los Reyes-Guzmán, Ana; Ruiz, Oscar E; Flórez, Julián

    2016-01-01

    In Robot-Assisted Rehabilitation (RAR) the accurate estimation of the patient limb joint angles is critical for assessing therapy efficacy. In RAR, the use of classic motion capture systems (MOCAPs) (e.g., optical and electromagnetic) to estimate the Glenohumeral (GH) joint angles is hindered by the exoskeleton body, which causes occlusions and magnetic disturbances. Moreover, the exoskeleton posture does not accurately reflect limb posture, as their kinematic models differ. To address the said limitations in posture estimation, we propose installing the cameras of an optical marker-based MOCAP in the rehabilitation exoskeleton. Then, the GH joint angles are estimated by combining the estimated marker poses and exoskeleton Forward Kinematics. Such a hybrid system prevents problems related to marker occlusions, reduced camera detection volume, and imprecise joint angle estimation due to the kinematic mismatch of the patient and exoskeleton models. This paper presents the formulation, simulation, and accuracy quantification of the proposed method with simulated human movements. In addition, a sensitivity analysis of the method accuracy to marker position estimation errors, due to system calibration errors and marker drifts, has been carried out. The results show that, even with significant errors in the marker position estimation, method accuracy is adequate for RAR. PMID:27403044

  10. Optical Enhancement of Exoskeleton-Based Estimation of Glenohumeral Angles

    PubMed Central

    Cortés, Camilo; Unzueta, Luis; de los Reyes-Guzmán, Ana; Ruiz, Oscar E.; Flórez, Julián

    2016-01-01

    In Robot-Assisted Rehabilitation (RAR) the accurate estimation of the patient limb joint angles is critical for assessing therapy efficacy. In RAR, the use of classic motion capture systems (MOCAPs) (e.g., optical and electromagnetic) to estimate the Glenohumeral (GH) joint angles is hindered by the exoskeleton body, which causes occlusions and magnetic disturbances. Moreover, the exoskeleton posture does not accurately reflect limb posture, as their kinematic models differ. To address the said limitations in posture estimation, we propose installing the cameras of an optical marker-based MOCAP in the rehabilitation exoskeleton. Then, the GH joint angles are estimated by combining the estimated marker poses and exoskeleton Forward Kinematics. Such a hybrid system prevents problems related to marker occlusions, reduced camera detection volume, and imprecise joint angle estimation due to the kinematic mismatch of the patient and exoskeleton models. This paper presents the formulation, simulation, and accuracy quantification of the proposed method with simulated human movements. In addition, a sensitivity analysis of the method accuracy to marker position estimation errors, due to system calibration errors and marker drifts, has been carried out. The results show that, even with significant errors in the marker position estimation, method accuracy is adequate for RAR. PMID:27403044

  11. Optical Enhancement of Exoskeleton-Based Estimation of Glenohumeral Angles.

    PubMed

    Cortés, Camilo; Unzueta, Luis; de Los Reyes-Guzmán, Ana; Ruiz, Oscar E; Flórez, Julián

    2016-01-01

    In Robot-Assisted Rehabilitation (RAR) the accurate estimation of the patient limb joint angles is critical for assessing therapy efficacy. In RAR, the use of classic motion capture systems (MOCAPs) (e.g., optical and electromagnetic) to estimate the Glenohumeral (GH) joint angles is hindered by the exoskeleton body, which causes occlusions and magnetic disturbances. Moreover, the exoskeleton posture does not accurately reflect limb posture, as their kinematic models differ. To address the said limitations in posture estimation, we propose installing the cameras of an optical marker-based MOCAP in the rehabilitation exoskeleton. Then, the GH joint angles are estimated by combining the estimated marker poses and exoskeleton Forward Kinematics. Such a hybrid system prevents problems related to marker occlusions, reduced camera detection volume, and imprecise joint angle estimation due to the kinematic mismatch of the patient and exoskeleton models. This paper presents the formulation, simulation, and accuracy quantification of the proposed method with simulated human movements. In addition, a sensitivity analysis of the method accuracy to marker position estimation errors, due to system calibration errors and marker drifts, has been carried out. The results show that, even with significant errors in the marker position estimation, method accuracy is adequate for RAR.
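
    A hedged sketch of the pose-combination step described in these records (not the authors' implementation): chain homogeneous transforms from the MOCAP frame through a marker mounted on the exoskeleton to a limb-related frame, then read an angle from the resulting rotation. All transforms below are invented placeholders.

```python
# Hedged sketch: composing marker pose with exoskeleton forward kinematics.
import numpy as np

def make_T(R, t):
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

# Marker pose measured by the cameras, expressed in the MOCAP frame (assumed).
T_mocap_marker = make_T(rot_z(np.radians(30)), [0.2, 0.1, 1.0])
# Fixed offset from marker to an exoskeleton link (assumed calibration value).
T_marker_link = make_T(np.eye(3), [0.0, 0.0, -0.05])
# Exoskeleton forward kinematics from that link to the GH joint frame (assumed).
T_link_gh = make_T(rot_z(np.radians(15)), [0.0, 0.25, 0.0])

T_mocap_gh = T_mocap_marker @ T_marker_link @ T_link_gh
yaw = np.degrees(np.arctan2(T_mocap_gh[1, 0], T_mocap_gh[0, 0]))
print(f"GH frame yaw in the MOCAP frame: {yaw:.1f} deg")   # ~45 deg here
```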

  12. Glancing angle RF sheaths

    NASA Astrophysics Data System (ADS)

    D'Ippolito, D. A.; Myra, J. R.

    2013-10-01

    RF sheaths occur in tokamaks when ICRF waves encounter conducting boundaries. The sheath plays an important role in determining the efficiency of ICRF heating, the impurity influxes from the edge plasma, and the plasma-facing component damage. An important parameter in sheath theory is the angle θ between the equilibrium B field and the wall. Recent work with 1D and 2D sheath models has shown that the rapid variation of θ around a typical limiter can lead to enhanced sheath potentials and localized power deposition (hot spots) when the B field is near glancing incidence. The physics model used to obtain these results does not include some glancing-angle effects, e.g. possible modification of the angular dependence of the Child-Langmuir law and the role of the magnetic pre-sheath. Here, we report on calculations which explore these effects, with the goal of improving the fidelity of the rf sheath BC used in analytical and numerical calculations. Work supported by US DOE grants DE-FC02-05ER54823 and DE-FG02-97ER54392.

  13. Variable angle correlation spectroscopy

    SciTech Connect

    Lee, Y K

    1994-05-01

    In this dissertation, a novel nuclear magnetic resonance (NMR) technique, variable angle correlation spectroscopy (VACSY), is described and demonstrated with ¹³C nuclei in rapidly rotating samples. These experiments focus on one of the basic problems in solid state NMR: how to extract the wealth of information contained in the anisotropic component of the NMR signal while still maintaining spectral resolution. Analysis of the anisotropic spectral patterns from poly-crystalline systems reveals information concerning molecular structure and dynamics, yet in all but the simplest of systems, the overlap of spectral patterns from chemically distinct sites renders the spectral analysis difficult if not impossible. One solution to this problem is to perform multi-dimensional experiments where the high-resolution, isotropic spectrum in one dimension is correlated with the anisotropic spectral patterns in the other dimensions. The VACSY technique incorporates the angle between the spinner axis and the static magnetic field as an experimental parameter that may be incremented during the course of the experiment to help correlate the isotropic and anisotropic components of the spectrum. The two-dimensional version of the VACSY experiments is used to extract the chemical shift anisotropy tensor values from multi-site organic molecules, study molecular dynamics in the intermediate time regime, and to examine the ordering properties of partially oriented samples. The VACSY technique is then extended to three-dimensional experiments to study slow molecular reorientations in a multi-site polymer system.

  14. Digital cameras with designs inspired by the arthropod eye.

    PubMed

    Song, Young Min; Xie, Yizhu; Malyarchuk, Viktor; Xiao, Jianliang; Jung, Inhwa; Choi, Ki-Joong; Liu, Zhuangjian; Park, Hyunsung; Lu, Chaofeng; Kim, Rak-Hwan; Li, Rui; Crozier, Kenneth B; Huang, Yonggang; Rogers, John A

    2013-05-01

    In arthropods, evolution has created a remarkably sophisticated class of imaging systems, with a wide-angle field of view, low aberrations, high acuity to motion and an infinite depth of field. A challenge in building digital cameras with the hemispherical, compound apposition layouts of arthropod eyes is that essential design requirements cannot be met with existing planar sensor technologies or conventional optics. Here we present materials, mechanics and integration schemes that afford scalable pathways to working, arthropod-inspired cameras with nearly full hemispherical shapes (about 160 degrees). Their surfaces are densely populated by imaging elements (artificial ommatidia), which are comparable in number (180) to those of the eyes of fire ants (Solenopsis fugax) and bark beetles (Hylastes nigrinus). The devices combine elastomeric compound optical elements with deformable arrays of thin silicon photodetectors into integrated sheets that can be elastically transformed from the planar geometries in which they are fabricated to hemispherical shapes for integration into apposition cameras. Our imaging results and quantitative ray-tracing-based simulations illustrate key features of operation. These general strategies seem to be applicable to other compound eye devices, such as those inspired by moths and lacewings (refracting superposition eyes), lobster and shrimp (reflecting superposition eyes), and houseflies (neural superposition eyes). PMID:23636401

  15. Narrow field electromagnetic sensor system and method

    DOEpatents

    McEwan, Thomas E.

    1996-01-01

    A narrow field electromagnetic sensor system and method of sensing a characteristic of an object provide the capability to realize a characteristic of an object such as density, thickness, or presence, for any desired coordinate position on the object. One application is imaging. The sensor can also be used as an obstruction detector or an electronic trip wire with a narrow field without the disadvantages of impaired performance when exposed to dirt, snow, rain, or sunlight. The sensor employs a transmitter for transmitting a sequence of electromagnetic signals in response to a transmit timing signal, a receiver for sampling only the initial direct RF path of the electromagnetic signal while excluding all other electromagnetic signals in response to a receive timing signal, and a signal processor for processing the sampled direct RF path electromagnetic signal and providing an indication of the characteristic of an object. Usually, the electromagnetic signal is a short RF burst and the obstruction must provide a substantially complete eclipse of the direct RF path. By employing time-of-flight techniques, a timing circuit controls the receiver to sample only the initial direct RF path of the electromagnetic signal while not sampling indirect path electromagnetic signals. The sensor system also incorporates circuitry for ultra-wideband spread spectrum operation that reduces interference to and from other RF services while allowing co-location of multiple electronic sensors without the need for frequency assignments.
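
    A minimal numerical sketch of the time-of-flight gating idea described above (the distances, sampling aperture, and variable names are illustrative assumptions, not values from the patent): the receiver is sampled only at the arrival time of the direct path, so longer indirect returns fall outside the gate.

      # Toy time-of-flight gate: sample only at the direct-path arrival time.
      C = 3.0e8                           # propagation speed, m/s

      def time_of_flight(path_length_m):
          return path_length_m / C

      direct = time_of_flight(3.0)        # 3 m transmitter-to-receiver path
      indirect = time_of_flight(4.2)      # example reflected path, 1.2 m longer
      window = 0.5e-9                     # 0.5 ns sampling aperture

      print(f"direct: {direct * 1e9:.1f} ns, indirect: {indirect * 1e9:.1f} ns")
      print("indirect return sampled?", abs(indirect - direct) < window / 2)   # False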

  16. Narrow field electromagnetic sensor system and method

    DOEpatents

    McEwan, T.E.

    1996-11-19

    A narrow field electromagnetic sensor system and method of sensing a characteristic of an object provide the capability to realize a characteristic of an object such as density, thickness, or presence, for any desired coordinate position on the object. One application is imaging. The sensor can also be used as an obstruction detector or an electronic trip wire with a narrow field without the disadvantages of impaired performance when exposed to dirt, snow, rain, or sunlight. The sensor employs a transmitter for transmitting a sequence of electromagnetic signals in response to a transmit timing signal, a receiver for sampling only the initial direct RF path of the electromagnetic signal while excluding all other electromagnetic signals in response to a receive timing signal, and a signal processor for processing the sampled direct RF path electromagnetic signal and providing an indication of the characteristic of an object. Usually, the electromagnetic signal is a short RF burst and the obstruction must provide a substantially complete eclipse of the direct RF path. By employing time-of-flight techniques, a timing circuit controls the receiver to sample only the initial direct RF path of the electromagnetic signal while not sampling indirect path electromagnetic signals. The sensor system also incorporates circuitry for ultra-wideband spread spectrum operation that reduces interference to and from other RF services while allowing co-location of multiple electronic sensors without the need for frequency assignments. 12 figs.

  17. Studies of narrow autoionizing resonances in gadolinium

    SciTech Connect

    Bushaw, Bruce A.; Nortershauser, W.; Blaum, K.; Wendt, Klaus

    2003-06-30

    The autoionization (AI) spectrum of gadolinium between the first and second limits has been investigated by triple-resonance excitation with high-resolution cw lasers. A large number of narrow AI resonances have been observed and assigned total angular momentum J values. The resonances are further divided into members of AI Rydberg series converging to the second limit or other "interloping" levels. Fine structure in the Rydberg series has been identified and interpreted in terms of Jc j coupling. A number of detailed studies have been performed on the interloping resonances: These include lifetime determination by lineshape analysis, isotope shifts, hyperfine structure, and photoionization saturation parameters. The electronic structure of the interloping levels is discussed in terms of these studies. Linewidths generally decrease with increasing total angular momentum and the J = 7 resonances are extremely narrow with Lorentzian widths ranging from < 1 MHz up to 157 MHz. The strongest resonances are found to have cross-sections of ~10⁻¹² cm² and photoionization can be saturated with powers available from cw diode lasers.

  18. Still red light for red light cameras? An update.

    PubMed

    Høye, Alena

    2013-06-01

    The present study has replicated the results from a previous meta-analysis by Erke (2009) [Erke, A., 2009. Red light for red-light cameras? A meta-analysis of the effects of red-light cameras on crashes. Accident Analysis & Prevention 41 (5), 897-905.] based on a larger sample of RLC-studies, and provides answers to the criticisms that were raised by Lund et al. (2009) [Lund, A.K., Kyrychenko, S.Y., Retting, R.A., 2009. Caution: a comment on Alena Erke's red light for red-light cameras? A meta-analysis of the effects of red-light cameras on crashes. Accident Analysis and Prevention 41, 895-896.] against the previous meta-analysis. The addition of recent studies to the meta-analysis and a more thorough investigation of potential moderator variables lead to a slight improvement of the estimated effects of RLC in the previous meta-analysis. The present study found a non-significant increase of all crashes by 6% and a non-significant decrease of all injury crashes by 13%. Right-angle collisions were found to decrease by 13% and rear-end collisions were found to increase by 39%. For right-angle injury collisions a decrease by 33% was found and for rear-end injury collisions a smaller increase was found (+19%). The effects of RLC are likely to be more favorable when RLC-warning signs are set up at main entrances to areas with RLC enforcement than when each RLC-intersection is signposted. The effects of RLC may become more favorable over time; however, this could not be investigated empirically. Several results indicate that spillover effects may occur for right-angle collisions, but most likely not for rear-end and other crashes. If spillover effects do not occur for rear-end crashes, which increase at RLC intersections, this would be a positive result for RLC. However, the results seem to be affected to some degree by publication bias and the effects may therefore be somewhat less favorable than indicated by the results from meta-analysis.

  19. Propagation modeling results for narrow-beam undersea laser communications

    NASA Astrophysics Data System (ADS)

    Fletcher, Andrew S.; Hardy, Nicholas D.; Hamilton, Scott A.

    2016-03-01

    Communication links through ocean waters are challenging due to undersea propagation physics. Undersea optical communications at blue or green wavelengths can achieve high data rates (megabit- to gigabit-per-second class links) despite the challenging undersea medium. Absorption and scattering in ocean waters attenuate optical signals and distort the waveform through dense multipath. The exponential propagation loss and the temporal spread due to multipath limit the achievable link distance and data rate. In this paper, we describe the Monte Carlo modeling of the undersea scattering and absorption channel. We model photon signal attenuation levels, spatial photon distributions, time of arrival statistics, and angle of arrival statistics for a variety of lasercom scenarios through both clear and turbid water environments. Modeling results inform the design options for an undersea optical communication system, particularly illustrating the advantages of narrow-beam lasers compared to wide beam methods (e.g. LED sources). The modeled pupil plane and focal plane photon arrival distributions enable beam tracking techniques for robust pointing solutions, even in highly scattering harbor waters. Laser communication with collimated beams maximizes the photon transfer through the scattering medium and enables spatial and temporal filters to minimize waveform distortion and background interference.
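
    As a hedged illustration of the Monte Carlo approach summarized above (a toy model with assumed absorption and scattering coefficients and a crude small-angle scattering kernel, not the authors' code), the received fraction at a given range can be estimated by tracking photon weights through random interaction events:

      # Toy Monte Carlo of photon transport in seawater: exponential free
      # paths, absorption handled by weight attenuation, Gaussian small-angle
      # forward scattering, and a finite receiver aperture at the link range.
      import numpy as np

      rng = np.random.default_rng(0)
      a, b = 0.05, 0.20              # assumed absorption / scattering coefficients, 1/m
      c_beam = a + b                 # beam attenuation coefficient, 1/m
      link_range = 20.0              # receiver plane at 20 m
      n_photons = 20000

      received = 0.0
      for _ in range(n_photons):
          x, y, theta, w = 0.0, 0.0, 0.0, 1.0       # position, direction, weight
          while x < link_range and w > 1e-4:
              step = rng.exponential(1.0 / c_beam)  # distance to next interaction
              x += step * np.cos(theta)
              y += step * np.sin(theta)
              w *= b / c_beam                       # single-scattering albedo
              theta += rng.normal(0.0, 0.3)         # small-angle forward scatter
          if x >= link_range and abs(y) < 0.5:      # 1 m wide receiver aperture
              received += w

      print("received weight fraction:", received / n_photons)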

  20. A LARGE APERTURE NARROW QUADRUPOLE FOR THE SNS ACCUMULATOR RING.

    SciTech Connect

    TSOUPAS,N.; BRODOWSKI,J.; MENG,W.; WEI,J.; LEE,Y.Y.; TUOZZOLO,J.

    2002-06-03

    The accumulator ring of the Spallation Neutron Source (SNS) is designed to accept a high-intensity H⁻ beam of 1 GeV kinetic energy from the injecting LINAC, and to accumulate, in a time interval of 1 msec, 2 × 10¹⁴ protons in a single bunch of 700 nsec. In order to optimize the effective straight-section spaces for beam-injection, extraction and collimation, we have minimized the width of the large aperture quadrupoles which are located in the same straight sections of the accumulator ring with the injection and extraction systems. By minimizing the width of the quadrupoles to ±40.4 cm, the beam-injection and extraction angles are lowered to 8.75° and 16.8°, respectively. Further optimization of the narrow quadrupole minimizes the strength of the dodecapole multipole component of the quadrupole, thus reducing the width of the 12-pole structure resonance and allowing a larger tune space for stability of the circulating beam. In this paper we present results derived from magnetic field calculations of 2D and 3D modeling, and discuss the method of optimizing the size of the quadrupole and minimizing its dodecapole multipole component.

  1. Inclined internal tide waves at a narrow Mexican Pacific shelf

    NASA Astrophysics Data System (ADS)

    Filonov, Anatoliy

    2011-07-01

    The dynamics of a semidiurnal internal tidal wave at a narrow Mexican Pacific shelf is discussed using temperature data obtained from an anchored instrument and from field surveys. The internal tide on the shelf is dominated by an inclined wave, which propagates upward and onshore along a continental slope. Despite reflections from the bottom and from the ocean surface, the wave remains inclined and is totally destroyed over the course of one wavelength. Due to wave reflection from the inclined bottom, the horizontal and vertical wave numbers increase threefold when the wave goes into shallow waters. The wave undergoes nonlinear transformation and overturns, forming several homogeneous temperature layers up to 20 m thick. The most intense disturbances of water layers are observed near the bottom, where the slope angle approaches its critical value. Because of nonlinear effects, the wave carries cool deep water out to shallow depths and causes coastal upwelling. Intense solar warming together with vertical mixing results in a rapid rise of temperature in the observed 130-m water column.

  2. CCD Camera Lens Interface for Real-Time Theodolite Alignment

    NASA Technical Reports Server (NTRS)

    Wake, Shane; Scott, V. Stanley, III

    2012-01-01

    Theodolites are a common instrument in the testing, alignment, and building of various systems ranging from a single optical component to an entire instrument. They provide a precise way to measure horizontal and vertical angles. They can be used to align multiple objects in a desired way at specific angles. They can also be used to reference a specific location or orientation of an object that has moved. Some systems may require a small margin of error in position of components. A theodolite can assist with accurately measuring and/or minimizing that error. The technology is an adapter for a CCD camera with lens to attach to a Leica Wild T3000 Theodolite eyepiece that enables viewing on a connected monitor, and thus can be utilized with multiple theodolites simultaneously. This technology removes a substantial part of human error by relying on the CCD camera and monitors. It also allows image recording of the alignment, and therefore provides a quantitative means to measure such error.

  3. A remote camera operation system using a marker attached cap

    NASA Astrophysics Data System (ADS)

    Kawai, Hironori; Hama, Hiromitsu

    2005-12-01

    In this paper, we propose a convenient system to control a remote camera according to the eye-gazing direction of the operator, which is approximated by calculating the face direction by means of image processing. The operator puts a marker-attached cap on his head, and the system takes an image of the operator from above with only one video camera. Three markers are set up on the cap, and three is the minimum number needed to calculate the tilt angle of the head. The more markers are used, the more robust the system becomes to occlusion and the wider the tolerated moving range of the head. The markers must not lie on a single three-dimensional straight line. To compensate for marker color changes due to illumination conditions, the threshold for marker extraction is adaptively decided using a k-means clustering method. The system was implemented with MATLAB on a personal computer, and real-time operation was realized. The experimental results confirmed the robustness of the system, and the tilt and pan angles of the head could be calculated with sufficient accuracy for use.
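
    As a rough sketch of the geometry implied above (the marker layout, pixel coordinates, and calibrated distance are assumptions for illustration, not values from the paper), pan can be read from the in-plane orientation of the front marker pair seen from overhead, and tilt from the foreshortening of the front-to-back marker distance:

      # Illustrative head pan/tilt from three cap markers seen from above.
      import numpy as np

      # Assumed image coordinates (pixels) of the left-front, right-front,
      # and back markers on the cap.
      left = np.array([312.0, 240.0])
      right = np.array([368.0, 244.0])
      back = np.array([341.0, 205.0])
      level_front_to_back_px = 52.0      # calibrated distance when the head is level

      front_mid = 0.5 * (left + right)
      pan = np.arctan2(right[1] - left[1], right[0] - left[0])

      # Overhead foreshortening: apparent distance shrinks by cos(tilt).
      ratio = np.linalg.norm(back - front_mid) / level_front_to_back_px
      tilt = np.arccos(np.clip(ratio, -1.0, 1.0))

      print(f"pan  = {np.degrees(pan):.1f} deg")
      print(f"tilt = {np.degrees(tilt):.1f} deg")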

  4. Optimising camera traps for monitoring small mammals.

    PubMed

    Glen, Alistair S; Cockburn, Stuart; Nichols, Margaret; Ekanayake, Jagath; Warburton, Bruce

    2013-01-01

    Practical techniques are required to monitor invasive animals, which are often cryptic and occur at low density. Camera traps have potential for this purpose, but may have problems detecting and identifying small species. A further challenge is how to standardise the size of each camera's field of view so capture rates are comparable between different places and times. We investigated the optimal specifications for a low-cost camera trap for small mammals. The factors tested were 1) trigger speed, 2) passive infrared vs. microwave sensor, 3) white vs. infrared flash, and 4) still photographs vs. video. We also tested a new approach to standardise each camera's field of view. We compared the success rates of four camera trap designs in detecting and taking recognisable photographs of captive stoats (Mustela erminea), feral cats (Felis catus) and hedgehogs (Erinaceus europaeus). Trigger speeds of 0.2-2.1 s captured photographs of all three target species unless the animal was running at high speed. The camera with a microwave sensor was prone to false triggers, and often failed to trigger when an animal moved in front of it. A white flash produced photographs that were more readily identified to species than those obtained under infrared light. However, a white flash may be more likely to frighten target animals, potentially affecting detection probabilities. Video footage achieved similar success rates to still cameras but required more processing time and computer memory. Placing two camera traps side by side achieved a higher success rate than using a single camera. Camera traps show considerable promise for monitoring invasive mammal control operations. Further research should address how best to standardise the size of each camera's field of view, maximise the probability that an animal encountering a camera trap will be detected, and eliminate visible or audible cues emitted by camera traps.

  5. A frequency selective bolometer camera for measuring millimeter spectral energy distributions

    NASA Astrophysics Data System (ADS)

    Logan, Daniel William

    2009-06-01

    Bolometers are the most sensitive detectors for measuring millimeter and submillimeter wavelength astrophysical signals. Cameras comprised of arrays of bolometers have already made significant contributions to the field of astronomy. A challenge for bolometer cameras is obtaining observations at multiple wavelengths. Traditionally, observing in multiple bands requires a partial disassembly of the instrument to replace bandpass filters, a task which prevents immediate spectral interrogation of a source. More complex cameras have been constructed to observe in several bands using beam splitters and dichroic filters, but the added complexity leads to physically larger instruments with reduced efficiencies. The SPEctral Energy Distribution camera (SPEED) is a new type of bolometer camera designed to efficiently observe in multiple wavebands without the need for excess bandpass filters and beam splitters. SPEED is a ground-based millimeter-wave bolometer camera designed to observe at 2.1, 1.3, 1.1, and 0.85 mm simultaneously. SPEED makes use of a new type of bolometer, the frequency selective bolometer (FSB), to observe all of the wavebands within each of the camera's four pixels. FSBs incorporate frequency selective dipole surfaces as absorbing elements allowing each detector to absorb a single, narrow band of radiation and pass all other radiation with low loss. Each FSB also contains a superconducting transition-edge sensor (TES) that acts as a sensitive thermistor for measuring the temperature of the FSB. This thesis describes the development of the SPEED camera and FSB detectors. The design of the detectors used in the instrument is described as well as the general optical performance of frequency selective dipole surfaces. Laboratory results of both the optical and thermal properties of millimeter-wave FSBs are also presented. The SPEED instrument and its components are highlighted and the optical design of the optics which couple SPEED to the Heinrich Hertz

  6. Volcano surveillance using infrared cameras

    NASA Astrophysics Data System (ADS)

    Spampinato, Letizia; Calvari, Sonia; Oppenheimer, Clive; Boschi, Enzo

    2011-05-01

    Volcanic eruptions are commonly preceded, accompanied, and followed by variations of a number of detectable geophysical and geochemical manifestations. Many remote sensing techniques have been applied to tracking anomalies and eruptive precursors, and monitoring ongoing volcanic eruptions, offering obvious advantages over in situ techniques especially during hazardous activity. Whilst spaceborne instruments provide a distinct advantage for collecting data remotely in this regard, they still cannot match the spatial detail or time resolution achievable using portable imagers on the ground or aircraft. Hand-held infrared camera technology has advanced significantly over the last decade, resulting in a proliferation of commercially available instruments, such that volcano observatories are increasingly implementing them in monitoring efforts. Improved thermal surveillance of active volcanoes has not only enhanced hazard assessment but it has contributed substantially to understanding a variety of volcanic processes. Drawing on over a decade of operational volcano surveillance in Italy, we provide here a critical review of the application of infrared thermal cameras to volcano monitoring. Following a summary of key physical principles, instrument capabilities, and the practicalities and methods of data collection, we discuss the types of information that can be retrieved from thermal imagery and what they have contributed to hazard assessment and risk management, and to physical volcanology. With continued developments in thermal imager technology and lower instrument costs, there will be increasing opportunity to gather valuable observations of volcanoes. It is thus timely to review the state of the art and we hope thereby to stimulate further research and innovation in this area.

  7. Toward the camera rain gauge

    NASA Astrophysics Data System (ADS)

    Allamano, P.; Croci, A.; Laio, F.

    2015-03-01

    We propose a novel technique based on the quantitative detection of rain intensity from images, i.e., from pictures taken in rainy conditions. The method is fully analytical and based on the fundamentals of camera optics. A rigorous statistical framing of the technique allows one to obtain the rain rate estimates in terms of expected values and associated uncertainty. We show that the method can be profitably applied to real rain events, and we obtain promising results with errors of the order of ±25%. A precise quantification of the method's accuracy will require a more systematic and long-term comparison with benchmark measures. The significant step forward with respect to standard rain gauges resides in the possibility to retrieve measures at very high temporal resolution (e.g., 30 measures per minute) at a very low cost. Perspective applications include the possibility to dramatically increase the spatial density of rain observations by exporting the technique to crowdsourced pictures of rain acquired with cameras and smartphones.

  8. The Pluto System At Small Phase Angles

    NASA Astrophysics Data System (ADS)

    Verbiscer, Anne J.; Buie, Marc W.; Binzel, Richard; Ennico, Kimberly; Grundy, William M.; Olkin, Catherine B.; Showalter, Mark Robert; Spencer, John R.; Stern, S. Alan; Weaver, Harold A.; Young, Leslie; New Horizons Science Team

    2016-10-01

    Hubble Space Telescope observations of the Pluto system acquired during the New Horizons encounter epoch (HST Program 13667, M. Buie, PI) span the phase angle range from 0.06 to 1.7 degrees, enabling the measurement and characterization of the opposition effect for Pluto and its satellites at 0.58 microns using HST WFC3/UVIS with the F350LP filter, which has a broadband response and a pivot wavelength of 0.58 microns. At these small phase angles, differences in the opposition effect width and amplitude appear. The small satellites Nix and Hydra both exhibit a very narrow opposition surge, while the considerably larger moon Charon has a broader opposition surge. Microtextural surface properties derived from the shape and magnitude of the opposition surge of each surface contain a record of the collisional history of the system. We combine these small phase angle observations with those made at larger phase angles by the New Horizons Long Range Reconnaissance Imager (LORRI), which also has a broadband response with a pivot wavelength of 0.61 microns, to produce the most complete disk-integrated solar phase curves that we will have for decades to come. Modeling these disk-integrated phase curves generates sets of photometric parameters that will inform spectral modeling of the satellite surfaces as well as terrains on Pluto from spatially resolved New Horizons Ralph Linear Etalon Imaging Spectral Array (LEISA) data from 1.2 to 2.5 microns. Rotationally resolved phase curves of Pluto reveal opposition effects that only appear at phase angles less than 0.1 degree and have widths and amplitudes that are highly dependent on longitude and therefore on Pluto's diverse terrains. The high albedo region informally known as Sputnik Planum dominates the disk-integrated reflectance of Pluto on the New Horizons encounter hemisphere. These results lay the groundwork for observations at true opposition in 2018, when the Pluto system will be observable at phase angles so small that

  9. Development of narrow gap welding technology for extremely thick steel

    NASA Astrophysics Data System (ADS)

    Imai, K.; Saito, T.; Okumura, M.

    In the field of extremely thick steel, various narrow gap welding methods have been developed on the basis of earlier welding methods and are used in practice. To expand the scope of application of the method, it is important to develop and improve automatic narrow gap welding, J-edge preparation by gas cutting, the prevention of welding defects, wires for narrow gap welding, and so on. Narrow gap welding technologies based on new concepts developed by Nippon Steel Corporation are described.

  10. 2. Photocopied July 1971 from photostat Jordan Narrows Folder #1, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. Photocopied July 1971 from photostat Jordan Narrows Folder #1, Engineering Department, Utah Power and Light Co., Salt Lake City, Utah. JORDAN NARROWS STATION. PLAN AND SECTION. - Salt Lake City Water & Electrical Power Company, Jordan Narrows Hydroelectric Plant, Jordan River, Riverton, Salt Lake County, UT

  11. 33 CFR 117.561 - Kent Island Narrows.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 33 Navigation and Navigable Waters 1 2014-07-01 2014-07-01 false Kent Island Narrows. 117.561... DRAWBRIDGE OPERATION REGULATIONS Specific Requirements Maryland § 117.561 Kent Island Narrows. The draw of the U.S. Route 50/301 bridge, mile 1.0, Kent Island Narrows, operates as follows: (a) From November...

  12. 33 CFR 117.561 - Kent Island Narrows.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 33 Navigation and Navigable Waters 1 2012-07-01 2012-07-01 false Kent Island Narrows. 117.561... DRAWBRIDGE OPERATION REGULATIONS Specific Requirements Maryland § 117.561 Kent Island Narrows. The draw of the U.S. Route 50/301 bridge, mile 1.0, Kent Island Narrows, operates as follows: (a) From November...

  13. 33 CFR 117.561 - Kent Island Narrows.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 33 Navigation and Navigable Waters 1 2013-07-01 2013-07-01 false Kent Island Narrows. 117.561... DRAWBRIDGE OPERATION REGULATIONS Specific Requirements Maryland § 117.561 Kent Island Narrows. The draw of the U.S. Route 50/301 bridge, mile 1.0, Kent Island Narrows, operates as follows: (a) From November...

  14. 33 CFR 117.561 - Kent Island Narrows.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 33 Navigation and Navigable Waters 1 2010-07-01 2010-07-01 false Kent Island Narrows. 117.561... DRAWBRIDGE OPERATION REGULATIONS Specific Requirements Maryland § 117.561 Kent Island Narrows. The draw of the U.S. Route 50/301 bridge, mile 1.0, Kent Island Narrows, operates as follows: (a) From November...

  15. 33 CFR 117.561 - Kent Island Narrows.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 33 Navigation and Navigable Waters 1 2011-07-01 2011-07-01 false Kent Island Narrows. 117.561... DRAWBRIDGE OPERATION REGULATIONS Specific Requirements Maryland § 117.561 Kent Island Narrows. The draw of the U.S. Route 50/301 bridge, mile 1.0, Kent Island Narrows, operates as follows: (a) From November...

  16. Promoting L2 Vocabulary Learning through Narrow Reading

    ERIC Educational Resources Information Center

    Kang, Eun Young

    2015-01-01

    Krashen (2004) has advocated that narrow reading, i.e., reading a series of texts addressing one specific topic, is an effective method to grow vocabulary. While narrow reading has been championed to have many advantages for L2 vocabulary learning, there remains a relative dearth of empirical studies that test the impact of narrow reading on L2…

  17. Cerebellopontine Angle Lipoma

    PubMed Central

    Schuhmann, Martin U.; Lüdemann, Wolf O.; Schreiber, Hartwig; Samii, Madjid

    1997-01-01

    Intracranial lipomas in an infratentorial and extra-axial location are extremely rare. The presented case of an extensive lipoma of the cerebellopontine angle (CPA) represents 0.05% of all CPA tumors operated on in our department from 1978 to 1996. The lipoma constitutes an important differential diagnosis because the clinical management differs significantly from other CPA lesions. The clinical presentation and management of the presented case are analyzed in comparison to all previously described cases of CPA lipomas. The etiology and the radiological features of CPA lipomas are reviewed and discussed. CPA lipomas are maldevelopmental lesions that may cause slowly progressive symptoms. Neuroradiology enables a reliable preoperative diagnosis. Attempts of complete lipoma resection usually result in severe neurological deficits. Therefore, we recommend a conservative approach in managing these patients. Limited surgery is indicated if the patient has an associated vascular compression syndrome or suffers from disabling vertigo. PMID:17171031

  18. Heterodyne Interferometer Angle Metrology

    NASA Technical Reports Server (NTRS)

    Hahn, Inseob; Weilert, Mark A.; Wang, Xu; Goullioud, Renaud

    2010-01-01

    A compact, high-resolution angle measurement instrument has been developed that is based on a heterodyne interferometer. The common-path heterodyne interferometer metrology is used to measure displacements of a reflective target surface. In the interferometer setup, an optical mask is used to sample the measurement laser beam reflecting back from a target surface. Angular rotations, around two orthogonal axes in a plane perpendicular to the measurement-beam propagation direction, are determined simultaneously from the relative displacement measurement of the target surface. The device is used in a tracking telescope system where pitch and yaw measurements of a flat mirror were simultaneously performed with a sensitivity of 0.1 nrad per second and a measuring range of 0.15 mrad at a working distance on the order of a meter. The nonlinearity of the device is also measured to be less than one percent over the measurement range.
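
    A hedged small-angle illustration of the relation implied above (the 10 mm sub-aperture separation b is an assumed value, not taken from the article): if the mask samples two portions of the beam separated by a baseline b, a target tilt θ appears as a differential displacement d₁ − d₂, so

      \theta \approx \frac{d_1 - d_2}{b}, \qquad
      \Delta d = \theta\, b \approx (0.1\ \mathrm{nrad}) \times (10\ \mathrm{mm}) = 1\ \mathrm{pm},

    i.e., resolving 0.1 nrad tilts over a centimeter-scale baseline corresponds to picometer-level differential displacement resolution.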

  19. High-speed measurement of nozzle swing angle of rocket engine based on monocular vision

    NASA Astrophysics Data System (ADS)

    Qu, Yufu; Yang, Haijuan

    2015-02-01

    A nozzle angle measurement system based on monocular vision is proposed to achieve high-speed, non-contact angle measurement of a rocket engine nozzle. The measurement system consists of two illumination sources, a lens, a target board with spots, a high-speed camera, an image acquisition card and a PC. The target board with spots was fixed on the end of the rocket engine nozzle. The image of the target board, which moved along with the nozzle swing, was captured by the high-speed camera and transferred to the PC by the image acquisition card. Then a data processing algorithm was utilized to acquire the swing angle of the engine nozzle. Experiments show that the accuracy of the swing angle measurement was 0.2° and the measurement frequency was up to 500 Hz.
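
    A hedged sketch of the kind of monocular pose computation involved (an assumed approach using a standard perspective-n-point solve, not necessarily the authors' algorithm; the spot layout, intrinsics, and test rotation are made up, and opencv-python is required):

      # Recover the target-board rotation from imaged spot positions with a
      # PnP solve, then read the swing angle from the rotation vector.
      import numpy as np
      import cv2

      # Known spot layout on the target board (metres, board frame, z = 0).
      board_pts = np.array([[-0.05, -0.05, 0.0], [0.05, -0.05, 0.0],
                            [0.05, 0.05, 0.0], [-0.05, 0.05, 0.0],
                            [0.0, 0.0, 0.0]], dtype=np.float64)

      # Assumed pinhole intrinsics of the high-speed camera.
      K = np.array([[1200.0, 0.0, 640.0],
                    [0.0, 1200.0, 512.0],
                    [0.0, 0.0, 1.0]])
      dist = np.zeros(5)

      # Synthetic measurement: board swung 3 degrees about the y axis, 1 m
      # from the camera (stands in for the detected spot centroids).
      rvec_true = np.array([0.0, np.deg2rad(3.0), 0.0])
      tvec_true = np.array([0.0, 0.0, 1.0])
      img_pts, _ = cv2.projectPoints(board_pts, rvec_true, tvec_true, K, dist)

      # Recover the pose from the image measurements.
      ok, rvec, tvec = cv2.solvePnP(board_pts, img_pts, K, dist)
      print("estimated swing angle (deg):", round(float(np.degrees(np.linalg.norm(rvec))), 2))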

  20. A telephoto camera system with shooting direction control by gaze detection

    NASA Astrophysics Data System (ADS)

    Teraya, Daiki; Hachisu, Takumi; Yendo, Tomohiro

    2015-05-01

    For safe driving, it is important for the driver to check traffic conditions such as traffic lights or traffic signs as early as possible. If an on-vehicle camera captures images of important objects from a long distance and shows them to the driver, the driver can understand traffic conditions earlier. To image distant objects clearly, the camera's focal length must be long; however, with a long focal length the on-vehicle camera does not have enough field of view to check traffic conditions. Therefore, in order to obtain the necessary images from a long distance, the camera must have a long focal length and a controllable shooting direction. In a previous study, the driver indicated the shooting direction on a displayed image taken by a wide-angle camera, and a direction-controllable camera took telescopic images and displayed them to the driver. However, in that study the driver used a touch panel to indicate the shooting direction, which disturbs driving. We therefore propose a telephoto camera system for driving support whose shooting direction is controlled by the driver's gaze, so as not to disturb driving. The proposed system is composed of a gaze detector and an active telephoto camera with a controllable shooting direction. To avoid hindering driving, we adopt a gaze detection method that does not require the driver to wear a device. The gaze detector measures the driver's gaze by image processing. The shooting direction of the active telephoto camera is controlled by galvanometer scanners, and the direction can be switched within a few milliseconds. Experiments confirmed that the proposed system takes images in the direction the subject is gazing.

  1. [The influence of camera-to-object distance and focal length on the representation of faces].

    PubMed

    Verhoff, Marcel A; Witzel, Carsten; Ramsthaler, Frank; Kreutz, Kerstin

    2007-01-01

    When one thinks of the so-called barrel or wide-angle distortion, grotesquely warped faces may come to mind. For less extreme cases with primarily inconspicuous facial proportions, the question, however, still arises whether there may be a resulting impact on the identification of faces. In the first experiment, 3 test persons were photographed at a fixed camera-to-object distance of 2 m. In the second experiment, 18 test persons were each photographed at a distance of 0.5 m and 2.0 m. For both experiments photographs were taken from a fixed angle of view in alignment with the Frankfurt Plane. An isolated effect of the focal length on facial proportions could not be demonstrated. On the other hand, changes in the camera-to-object distance clearly influenced facial proportions and shape. A standardized camera-to-object distance for passport photos, as well as reconstruction of the camera-to-object distance from crime scene photos and the use of this same distance in taking photographs for comparison of suspects are called for. A proposal to refer to wide-angle distortion as the nearness effect is put forward. PMID:17879705
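
    A simple pinhole-camera calculation illustrates the distance effect described above (the 10 cm nose-to-ear depth difference is an assumed round number): a feature at depth D − Δ is magnified relative to one at depth D by

      \frac{s_\text{near}}{s_\text{far}} = \frac{D}{D - \Delta}, \qquad
      \left.\frac{D}{D - \Delta}\right|_{D = 0.5\ \mathrm{m}} = 1.25, \qquad
      \left.\frac{D}{D - \Delta}\right|_{D = 2.0\ \mathrm{m}} \approx 1.05
      \quad (\Delta = 0.1\ \mathrm{m}),

    so at 0.5 m the nose is rendered about 25% too large relative to the ears, while at 2 m the distortion drops to roughly 5%; changing the focal length alone rescales both features equally and leaves the proportions unchanged.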

  2. High Speed Digital Camera Technology Review

    NASA Technical Reports Server (NTRS)

    Clements, Sandra D.

    2009-01-01

    A High Speed Digital Camera Technology Review (HSD Review) is being conducted to evaluate the state-of-the-shelf in this rapidly progressing industry. Five HSD cameras supplied by four camera manufacturers participated in a Field Test during the Space Shuttle Discovery STS-128 launch. Each camera was also subjected to Bench Tests in the ASRC Imaging Development Laboratory. Evaluation of the data from the Field and Bench Tests is underway. Representatives from the imaging communities at NASA / KSC and the Optical Systems Group are participating as reviewers. A High Speed Digital Video Camera Draft Specification was updated to address Shuttle engineering imagery requirements based on findings from this HSD Review. This draft specification will serve as the template for a High Speed Digital Video Camera Specification to be developed for the wider OSG imaging community under OSG Task OS-33.

  3. Television camera video level control system

    NASA Technical Reports Server (NTRS)

    Kravitz, M.; Freedman, L. A.; Fredd, E. H.; Denef, D. E. (Inventor)

    1985-01-01

    A video level control system is provided which generates a normalized video signal for a camera processing circuit. The video level control system includes a lens iris which provides a controlled light signal to a camera tube. The camera tube converts the light signal provided by the lens iris into electrical signals. A feedback circuit, in response to the electrical signals generated by the camera tube, provides feedback signals to the lens iris and the camera tube. This assures that a normalized video signal is provided in a first illumination range. An automatic gain control loop, which is also responsive to the electrical signals generated by the camera tube, operates in tandem with the feedback circuit. This assures that the normalized video signal is maintained in a second illumination range.

  4. Optimum Projection Angle for Attaining Maximum Distance in a Soccer Punt Kick

    PubMed Central

    Linthorne, Nicholas P.; Patel, Dipesh S.

    2011-01-01

    To produce the greatest horizontal distance in a punt kick the ball must be projected at an appropriate angle. Here, we investigated the optimum projection angle that maximises the distance attained in a punt kick by a soccer goalkeeper. Two male players performed many maximum-effort kicks using projection angles of between 10° and 90°. The kicks were recorded by a video camera at 100 Hz and a 2 D biomechanical analysis was conducted to obtain measures of the projection velocity, projection angle, projection height, ball spin rate, and foot velocity at impact. The player’s optimum projection angle was calculated by substituting mathematical equations for the relationships between the projection variables into the equations for the aerodynamic flight of a soccer ball. The calculated optimum projection angles were in agreement with the player’s preferred projection angles (40° and 44°). In projectile sports even a small dependence of projection velocity on projection angle is sufficient to produce a substantial shift in the optimum projection angle away from 45°. In the punt kicks studied here, the optimum projection angle was close to 45° because the projection velocity of the ball remained almost constant across all projection angles. This result is in contrast to throwing and jumping for maximum distance, where the projection velocity the athlete is able to achieve decreases substantially with increasing projection angle and so the optimum projection angle is well below 45°. Key points The optimum projection angle that maximizes the distance of a punt kick by a soccer goalkeeper is about 45°. The optimum projection angle is close to 45° because the projection velocity of the ball is almost the same at all projection angles. This result is in contrast to throwing and jumping for maximum distance, where the optimum projection angle is well below 45° because the projection velocity the athlete is able to achieve decreases substantially with increasing

  5. Monitoring the night sky with the Cerro Tololo All-Sky camera for the TMT and LSST projects

    NASA Astrophysics Data System (ADS)

    Walker, David E.; Schwarz, Hugo E.; Bustos, Edison

    2006-06-01

    The All-Sky camera used in the LSST and TMT site testing campaigns is described and some early results are shown. The All-Sky camera takes images of the entire visible hemisphere of sky every 30 s in blue, red, Y and Z filters, giving enhanced contrast for the detection of clouds, airglow and the near-infrared. Animation is used to show the movement of clouds. An additional narrow band filter is centered on the most prominent line of the sodium vapor lamp spectrum and is used to monitor any man-made light pollution near the site. The camera also detects aircraft lights and contrails, satellites, meteor(ite)s, and local light polluters, and can be used for stellar extinction monitoring and for photometry of transient astronomical objects. For outreach and education the All-Sky camera can show wandering planets, diurnal rotation of the sky, the zodiacal light, and similar astronomical basics.

  6. Indium nitride: A narrow gap semiconductor

    SciTech Connect

    Wu, J.; Walukiewicz, W.; Yu, K.M.; Ager III, J.W.; Haller, E.E.; Lu, H.; Schaff, W.J.

    2002-08-14

    The optical properties of wurtzite InN grown on sapphire substrates by molecular-beam epitaxy have been characterized by optical absorption, photoluminescence, and photomodulated reflectance techniques. All three characterization techniques show an energy gap for InN between 0.7 and 0.8 eV, much lower than the commonly accepted value of 1.9 eV. The photoluminescence peak energy is found to be sensitive to the free electron concentration of the sample. The peak energy exhibits a very weak hydrostatic pressure dependence and a small, anomalous blueshift with increasing temperature. The bandgap energies of In-rich InGaN alloys were found to be consistent with the narrow gap of InN. The bandgap bowing parameter was determined to be 1.43 eV in InGaN.

  7. Synchrotron studies of narrow band materials

    SciTech Connect

    Allen, J.W.

    1993-01-01

    The objective was to determine the single-particle electronic structure of selected narrow band materials in order to understand the relation between their electronic structures and novel low energy properties, such as mixed valence, heavy Fermions, Kondo effect, insulator-metal transitions, non-Fermi liquid behavior, and high-temperature superconductivity. This program supports photoemission spectroscopy (PES) at various synchrotrons. The progress is reported under the following section titles: ZSA (Zaanen-Sawatzky-Allen) systematics and I-M transitions in 3d transition metal oxides, insulator-metal transitions in superconducting cuprates, Fermi liquid and non-Fermi liquid behavior in angular resolved PES lineshapes, heavy-Fermion and non-Fermi liquid 5f electron systems, and Kondo insulators.

  8. Isolating prompt photons with narrow cones

    NASA Astrophysics Data System (ADS)

    Catani, S.; Fontannaz, M.; Guillet, J. Ph.; Pilon, E.

    2013-09-01

    We discuss the isolation of prompt photons in hadronic collisions by means of narrow isolation cones and the QCD computation of the corresponding cross sections. We reconsider the occurrence of large perturbative terms with logarithmic dependence on the cone size and their impact on the fragmentation scale dependence. We cure the apparent perturbative violation of unitarity for small cone sizes, which had been noticed earlier in next-to-leading-order (NLO) calculations, by resumming the leading logarithmic dependence on the cone size. We discuss possible implications regarding the implementation of some hollow cone variants of the cone criterion, which simulate the experimental difficulty of imposing isolation inside the region filled by the electromagnetic shower that develops in the calorimeter.

  9. Tissue characterization by using narrow band imaging

    NASA Astrophysics Data System (ADS)

    Gono, Kazuhiro

    2010-02-01

    NBI (Narrow Band Imaging) was first introduced to the market in 2005 as a technique that enhances the image contrast of capillaries on a mucosal surface(1). It is classified as an Optical-Digital Method for Image-Enhanced Endoscopy(2). To date, the application has spread widely, not only to gastrointestinal fields such as the esophagus, stomach and colon but also to organs such as the bronchus and bladder. The main target tissue of NBI enhancement is capillaries. However, the findings of many clinical studies conducted by endoscopy physicians have revealed that NBI observation can enhance other structures in addition to capillaries. There is a close relationship between those enhanced structures and the histological microstructure of a tissue. This report introduces the tissue microstructures enhanced by NBI and discusses the possibility of optimizing the illumination wavelength for observing living tissues.

  10. Ultra-narrow metallic armchair graphene nanoribbons

    NASA Astrophysics Data System (ADS)

    Kimouche, Amina; Ervasti, Mikko M.; Drost, Robert; Halonen, Simo; Harju, Ari; Joensuu, Pekka M.; Sainio, Jani; Liljeroth, Peter

    2015-12-01

    Graphene nanoribbons (GNRs)--narrow stripes of graphene--have emerged as promising building blocks for nanoelectronic devices. Recent advances in bottom-up synthesis have allowed production of atomically well-defined armchair GNRs with different widths and doping. While all experimentally studied GNRs have exhibited wide bandgaps, theory predicts that every third armchair GNR (widths of N=3m+2, where m is an integer) should be nearly metallic with a very small bandgap. Here, we synthesize the narrowest possible GNR belonging to this family (five carbon atoms wide, N=5). We study the evolution of the electronic bandgap and orbital structure of GNR segments as a function of their length using low-temperature scanning tunnelling microscopy and density-functional theory calculations. Already GNRs with lengths of 5 nm reach almost metallic behaviour with ~100 meV bandgap. Finally, we show that defects (kinks) in the GNRs do not strongly modify their electronic structure.

  11. Ultra-narrow metallic armchair graphene nanoribbons.

    PubMed

    Kimouche, Amina; Ervasti, Mikko M; Drost, Robert; Halonen, Simo; Harju, Ari; Joensuu, Pekka M; Sainio, Jani; Liljeroth, Peter

    2015-12-14

    Graphene nanoribbons (GNRs)-narrow stripes of graphene-have emerged as promising building blocks for nanoelectronic devices. Recent advances in bottom-up synthesis have allowed production of atomically well-defined armchair GNRs with different widths and doping. While all experimentally studied GNRs have exhibited wide bandgaps, theory predicts that every third armchair GNR (widths of N=3m+2, where m is an integer) should be nearly metallic with a very small bandgap. Here, we synthesize the narrowest possible GNR belonging to this family (five carbon atoms wide, N=5). We study the evolution of the electronic bandgap and orbital structure of GNR segments as a function of their length using low-temperature scanning tunnelling microscopy and density-functional theory calculations. Already GNRs with lengths of 5 nm reach almost metallic behaviour with ∼100 meV bandgap. Finally, we show that defects (kinks) in the GNRs do not strongly modify their electronic structure.

  12. Line Narrowing Parameter Measurement by Modulation Spectroscopy

    NASA Technical Reports Server (NTRS)

    Dharamsi, Amin N.

    1998-01-01

    Accurate Characterization of Oxygen A-Band Line Parameters by Wavelength Modulation Spectroscopy with tunable diode lasers is an ongoing research at Old Dominion University, under sponsorship from NASA Langley research Center. The work proposed here will be undertaken under the guidance of Dr. William Chu and Dr. Lamont Poole of the Aerosol Research Branch at NASA Langley-Research Center in Hampton, Virginia. The research was started about two years ago and utilizes wavelength modulation absorption spectroscopy with higher harmonic detection, a technique that we developed at Old Dominion University, to obtain the absorption line characteristics of the Oxygen A-band rovibronic lines. Accurate characterization of this absorption band is needed for processing of data that will be obtained in experiments such as the NASA Stratospheric Aerosol and Gas Experiment III (SAGE III) as part of the US Mission to Planet Earth. The research work for Summer Fellowship undertook a measurement of the Dicke line-narrowing parameters of the Oxygen A-Band lines by using wavelength modulation spectroscopy. Our previous theoretical results had indicated that such a measurement could be done sensitively and in a convenient fashion by using this type of spectroscopy. In particular, theoretical results had indicated that the signal magnitude would depend on pressure in a manner that was very sensitive to the narrowing parameter. One of the major tasks undertaken during the summer of 1998 was to establish experimentally that these theoretical predictions were correct. This was done successfully and the results of the work are being prepared for publication. Experimental Results were obtained in which the magnitude of the signal was measured as a function of pressure, for various harmonic detection orders (N = 1, 2, 3, 4, 5). A comparison with theoretical results was made, and it was shown that the agreement between theory and experiment was very good. More importantly, however, it was shown

  13. Diluted magnetic semiconductors with narrow band gaps

    NASA Astrophysics Data System (ADS)

    Gu, Bo; Maekawa, Sadamichi

    2016-10-01

    We propose a method to realize diluted magnetic semiconductors (DMSs) with p- and n-type carriers by choosing host semiconductors with a narrow band gap. By employing a combination of density functional theory and quantum Monte Carlo simulation, we demonstrate such semiconductors using Mn-doped BaZn2As2, which has a band gap of 0.2 eV. In addition, we found a nontoxic DMS, Mn-doped BaZn2Sb2, whose Curie temperature Tc is predicted to be higher than that of Mn-doped BaZn2As2, the Tc of which was up to 230 K in a recent experiment.

  14. Robotic chair at steep and narrow stairways

    NASA Astrophysics Data System (ADS)

    Imazato, Masahiro; Yamaguchi, Masahiro; Moromugi, Shunji; Ishimatsu, Takakazu

    2007-12-01

    A robotic chair has been developed to support the mobility of elderly and disabled people living in houses with steep and narrow stairways. To deal with this mobility problem, the robotic chair has a compact, original configuration. The chair moves vertically by actuation of electric cylinders and horizontally by a push-pull operation performed by a caregiver. In order to navigate safely, every action of the chair is checked by the operator. Up-and-down motions of the robotic chair on the stairway are executed through combinations of motor and cylinder actuations. The performance of the robotic chair was evaluated through two kinds of experiments, which confirmed its excellent capability.

  15. Development of biostereometric experiments. [stereometric camera system

    NASA Technical Reports Server (NTRS)

    Herron, R. E.

    1978-01-01

    The stereometric camera was designed for close-range techniques in biostereometrics. The camera focusing distance of 360 mm to infinity covers a broad field of close-range photogrammetry. The design provides for a separate unit for the lens system and interchangeable backs on the camera for the use of single frame film exposure, roll-type film cassettes, or glass plates. The system incorporates the use of a surface contrast optical projector.

  16. Multiplex imaging with multiple-pinhole cameras

    NASA Technical Reports Server (NTRS)

    Brown, C.

    1974-01-01

    When making photographs in X rays or gamma rays with a multiple-pinhole camera, the individual images of an extended object such as the sun may be allowed to overlap. Then the situation is in many ways analogous to that in a multiplexing device such as a Fourier spectroscope. Some advantages and problems arising with such use of the camera are discussed, and expressions are derived to describe the relative efficacy of three exposure/postprocessing schemes using multiple-pinhole cameras.

  17. Electrostatic camera system functional design study

    NASA Technical Reports Server (NTRS)

    Botticelli, R. A.; Cook, F. J.; Moore, R. F.

    1972-01-01

    A functional design study for an electrostatic camera system for application to planetary missions is presented. The electrostatic camera can produce and store a large number of pictures and provide for transmission of the stored information at arbitrary times after exposure. Preliminary configuration drawings and circuit diagrams for the system are illustrated. The camera system's size, weight, power consumption, and performance are characterized. Tradeoffs between system weight, power, and storage capacity are identified.

  18. The CTIO CCD-TV acquisition camera

    NASA Astrophysics Data System (ADS)

    Walker, Alistair R.; Schmidt, Ricardo

    A prototype CCD-TV camera has been built at CTIO, conceptually similar to the cameras in use at Lick Observatory. A GEC CCD is used as the detector, cooled thermoelectrically to -45 °C. Pictures are displayed via an IBM PC clone computer and an ITI image display board. Results of tests at the CTIO telescopes are discussed, including comparisons with the RCA ISIT cameras used at present for acquisition and guiding.

  19. Limbus Impact on Off-angle Iris Degradation

    SciTech Connect

    Karakaya, Mahmut; Barstow, Del R; Santos-Villalobos, Hector J; Thompson, Joseph W; Bolme, David S; Boehnen, Chris Bensing

    2013-01-01

    The accuracy of iris recognition depends on the quality of data capture and is negatively affected by several factors such as angle, occlusion, and dilation. Off-angle iris recognition is a new research focus in biometrics that tries to address several issues including corneal refraction, complex 3D iris texture, and blur. In this paper, we present an additional significant challenge that degrades the performance of off-angle iris recognition systems, called the limbus effect. The limbus is the region at the border of the cornea where the cornea joins the sclera. The limbus is a semitransparent tissue that occludes a side portion of the iris plane. The amount of occluded iris texture on the side nearest the camera increases as the image acquisition angle increases. Without considering the role of the limbus effect, it is difficult to design an accurate off-angle iris recognition system. To the best of our knowledge, this is the first work that investigates the limbus effect in detail from a biometrics perspective. Based on results from real images and simulated experiments with real iris texture, the limbus effect increases the Hamming distance score between frontal and off-angle iris images by 0.05 to 0.2, depending upon the limbus height.

  20. An Instability in Narrow Planetary Rings

    NASA Astrophysics Data System (ADS)

    Weiss, J. W.; Stewart, G. R.

    2003-08-01

    We will present our work investigating the behavior of narrow planetary rings with low dispersion velocities. Such a narrow ring will be initially unstable to self-gravitational collapse. After the collapse, the ring is collisionally very dense. At this stage, it is subject to a new instability. Waves appear on the inner and outer edges of the ring within half of an orbital period. The ring then breaks apart radially, taking approximately a quarter of an orbital period to do so. As clumps of ring particles expand radially away from the dense ring, Kepler shear causes these clumps to stretch out azimuthally, and eventually collapse into a new set of dense rings. Small-scale repetitions of the original instability in these new rings eventually lead to a stabilized broad ring with higher dispersion velocities than the initial ring. Preliminary results indicate that this instability may be operating on small scales in broad rings in the wake-like features seen by Salo and others. Some intriguing properties have been observed during this instability. The most significant is a coherence in the epicyclic phases of the particles. Both self-gravity and collisions in the ring operated to create and enforce this coherence. The coherence might also be responsible for the instability to radial expansion. We also observe that guiding centers of the particles do not migrate to the center of the ring during the collapse phase of the ring. In fact, guiding centers move radially away from the core of the ring during this phase, consistent with global conservation of angular momentum. We will show the results of our simulations to date, including movies of the evolution of various parameters. (Audience members wanting popcorn are advised to bring their own.) This work is supported by a NASA Graduate Student Research Program grant and by the Cassini mission.