These are representative sample records from Science.gov related to your search topic.
For comprehensive and current results, perform a real-time search at Science.gov.
1

Narrow Angle movie  

NASA Technical Reports Server (NTRS)

This brief three-frame movie of the Moon was made from three Cassini narrow-angle images as the spacecraft passed by the Moon on the way to its closest approach to Earth on August 17, 1999. The purpose of this particular set of images was to calibrate the spectral response of the narrow-angle camera and to test its 'on-chip summing mode' data compression technique in flight. From left to right, the frames show the Moon in the green, blue and ultraviolet regions of the spectrum in 40, 60 and 80 millisecond exposures, respectively. All three images have been scaled so that the brightness of Crisium basin, the dark circular region in the upper right, is the same in each image. The spatial scale in the blue and ultraviolet images is 1.4 miles per pixel (2.3 kilometers). The original scale in the green image (which was captured in the usual manner and then reduced in size by 2x2 pixel summing within the camera system) was 2.8 miles per pixel (4.6 kilometers); it has been enlarged for display to the same scale as the other two. The imaging data were processed and released by the Cassini Imaging Central Laboratory for Operations (CICLOPS) at the University of Arizona's Lunar and Planetary Laboratory, Tucson, AZ.
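The 'on-chip summing mode' mentioned above is 2x2 pixel binning, which is why the green frame's scale (2.8 mi/px) is exactly double that of the other two (1.4 mi/px). A minimal NumPy sketch of the idea; the function and variable names are illustrative, not Cassini flight software:

```python
import numpy as np

def bin_2x2(image):
    """Sum each 2x2 block of pixels, as in on-chip summing:
    half the pixels per axis, four times the signal per output pixel."""
    h, w = image.shape
    img = image[: h // 2 * 2, : w // 2 * 2]  # trim odd edges if any
    return img.reshape(h // 2, 2, w // 2, 2).sum(axis=(1, 3))

frame = np.arange(16, dtype=float).reshape(4, 4)
binned = bin_2x2(frame)              # shape (2, 2); each pixel sums a 2x2 block
full_scale_km = 2.3                  # km/pixel unbinned (from the caption)
binned_scale_km = full_scale_km * 2  # 2x2 summing doubles the per-pixel scale
```

Binning trades spatial resolution for signal and downlink volume, which is why the binned green frame had to be enlarged 2x for display alongside the unbinned frames.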

Photo Credit: NASA/JPL/Cassini Imaging Team/University of Arizona

Cassini, launched in 1997, is a joint mission of NASA, the European Space Agency and Italian Space Agency. The mission is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Space Science, Washington DC. JPL is a division of the California Institute of Technology, Pasadena, CA.

1999-01-01

2

MarcoPolo-R narrow angle camera: a three-mirror anastigmat design proposal with a smart finite conjugates refocusing optical system  

E-print Network

MarcoPolo-R is a medium-class space mission proposed for the 2015-2025 ESA Cosmic Vision Program whose primary goal is to return to Earth an unaltered sample from a primitive near-Earth asteroid (NEA). Among the proposed instruments on board, its narrow-angle camera (NAC) should be able to image the candidate object with a spatial resolution of 3 mm per pixel at 200 m from its surface. The camera should also be able to support the lander descent operations by imaging the target from several distances in order to locate a suitable place for the landing. Hence a refocusing system is required to accomplish this task, extending its imaging capabilities. Here we present a three-mirror anastigmat (TMA) common-axis optical design, providing high-quality imaging performance by selecting the system aperture stop as the entrance pupil and exploiting the motion of a single mirror inside the instrument to allow the wide image refocusing required, from infinity down to 200 m above the NEA surface. Such proposal matches with the NA...

Antichi, Jacopo; Magrin, Demetrio; Ragazzoni, Roberto; Cremonese, Gabriele

2012-01-01

3

Methane Band and Continuum Band Imaging of Titan's Atmosphere Using Cassini ISS Narrow Angle Camera Pictures from the CURE/Cassini Imaging Project  

NASA Astrophysics Data System (ADS)

The study of Titan's atmosphere, which bears resemblance to early Earth's, may help us understand more of our own. Constructing a Monte Carlo model of Titan's atmosphere is helpful to achieve this goal. Methane (MT) and continuum band (CB) images of Titan taken by the CURE/Cassini Imaging Project, using the Cassini Narrow Angle Camera (NAC), were analyzed. They were scheduled by Cassini Optical Navigation. Images were obtained at phase angles of 53°, 112°, 161°, and 165°. They include 22 total MT1 (center wavelength 619 nm), MT2 (727 nm), MT3 (889 nm), CB1 (635 nm), CB2 (751 nm), and CB3 (938 nm) images. They were reduced with previously written scripts using the National Optical Astronomy Observatory Image Reduction and Analysis Facility scientific analysis suite. Corrections for horizontal and vertical banding and cosmic ray hits were made. The MT images were registered with corresponding CB images to ensure that subsequently measured flux ratios came from the same parts of the atmosphere. Preliminary DN limb-to-limb scans and loci of the haze layers will be presented. Accurate estimates of the sub-spacecraft points on each picture will be presented. Flux ratios (FMT/FCB=Q0) along the scans and total absorption coefficients along the lines of sight from the spacecraft through the pixels (and into Titan) will also be presented.
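The quantity the abstract calls Q0 is a per-pixel ratio of methane-band to continuum-band flux along registered limb-to-limb scans. A minimal sketch of that computation, assuming the two scans are already co-registered; names are illustrative, not the project's actual pipeline:

```python
import numpy as np

def flux_ratio(mt_scan, cb_scan, eps=1e-12):
    """Pixel-by-pixel flux ratio Q0 = F_MT / F_CB along a registered scan."""
    mt = np.asarray(mt_scan, dtype=float)
    cb = np.asarray(cb_scan, dtype=float)
    return mt / np.maximum(cb, eps)  # guard against zero continuum flux

mt = [120.0, 80.0, 40.0]   # methane-band DN along a scan (made-up values)
cb = [240.0, 160.0, 160.0] # continuum-band DN at the same pixels
q0 = flux_ratio(mt, cb)    # array([0.5, 0.5, 0.25])
```

Registration matters because the ratio is only meaningful when both fluxes come from the same line of sight through the atmosphere, as the abstract notes.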

Shitanishi, Jennifer; Gillam, S. D.

2009-05-01

4

Fluorescence nanoscopy by polarization modulation and polarization angle narrowing.  

PubMed

When excited with rotating linear polarized light, differently oriented fluorescent dyes emit periodic signals peaking at different times. We show that measurement of the average orientation of fluorescent dyes attached to rigid sample structures mapped to regularly defined (50 nm)² image nanoareas can provide subdiffraction resolution (super resolution by polarization demodulation, SPoD). Because the polarization angle range for effective excitation of an oriented molecule is rather broad and unspecific, we narrowed this range by simultaneous irradiation with a second, de-excitation, beam possessing a polarization perpendicular to the excitation beam (excitation polarization angle narrowing, ExPAN). This shortened the periodic emission flashes, allowing better discrimination between molecules or nanoareas. Our method requires neither the generation of nanometric interference structures nor the use of switchable or blinking fluorescent probes. We applied the method to standard wide-field microscopy with camera detection and to two-photon scanning microscopy, imaging the fine structural details of neuronal spines. PMID:24705472

Hafi, Nour; Grunwald, Matthias; van den Heuvel, Laura S; Aspelmeier, Timo; Chen, Jian-Hua; Zagrebelsky, Marta; Schütte, Ole M; Steinem, Claudia; Korte, Martin; Munk, Axel; Walla, Peter J

2014-05-01

5

A camera for a narrow and deep welding groove  

NASA Astrophysics Data System (ADS)

In this paper welding seam imaging in a very narrow and deep groove is presented. Standard camera optics cannot be used as they do not reach the bottom of the groove; selecting suitable imaging optics and components was therefore the main challenge of the study. The implementation is based on image transmission via a borescope. The borescope has a long and narrow tube with graded-index relay optics inside. To avoid excessive heating, the borescope tube is enclosed in a cooling pipe. The performance of the imaging system was tested by measuring its modulation transfer function (MTF) and by visually evaluating its distortion. The results show that a borescope providing VGA resolution is adequate for the application. The spectrum of the welding process was studied to determine the optimum window for observing the welding seam and electrode. The optimal bandwidth was found to lie in the region of 700-1000 nm.

Vehmanen, Miika S.; Korhonen, Mika; Mäkynen, Anssi J.

2008-06-01

6

10. 22'X34' original blueprint, Variable-Angle Launcher, 'SIDE VIEW CAMERA CAR-STEEL ...

Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

10. 22'X34' original blueprint, Variable-Angle Launcher, 'SIDE VIEW CAMERA CAR-STEEL FRAME AND AXLES' drawn at 1/2'=1'-0'. (BUORD Sketch # 209124). - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

7

13. 22'X34' original vellum, Variable-Angle Launcher, 'SIDEVIEW CAMERA CAR TRACK ...

Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

13. 22'X34' original vellum, Variable-Angle Launcher, 'SIDEVIEW CAMERA CAR TRACK DETAILS' drawn at 1/4'=1'-0' (BUORD Sketch # 208078, PAPW 908). - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

8

Improved iris localization by using wide and narrow field of view cameras for iris recognition  

NASA Astrophysics Data System (ADS)

Biometrics is a method of identifying individuals by their physiological or behavioral characteristics. Among other biometric identifiers, iris recognition has been widely used for various applications that require a high level of security. When a conventional iris recognition camera is used, the size and position of the iris region in a captured image vary according to the X, Y positions of a user's eye and the Z distance between the user and the camera. Therefore, the search area of the iris detection algorithm is increased, which inevitably decreases both detection speed and accuracy. To solve these problems, we propose a new method of iris localization that uses wide field of view (WFOV) and narrow field of view (NFOV) cameras. Our study is new compared to previous studies in the following four ways. First, the device used in our research acquires three images, one of the face and one of each iris, using one WFOV and two NFOV cameras simultaneously. The relation between the WFOV and NFOV cameras is determined by simple geometric transformation without complex calibration. Second, the Z distance (between a user's eye and the iris camera) is estimated based on the iris size in the WFOV image and anthropometric data on the size of the human iris. Third, the accuracy of the geometric transformation between the WFOV and NFOV cameras is enhanced by using multiple transformation matrices according to the Z distance. Fourth, the search region for iris localization in the NFOV image is significantly reduced based on the detected iris region in the WFOV image and the matrix of geometric transformation corresponding to the estimated Z distance. Experimental results showed that the performance of the proposed iris localization method is better than that of conventional methods in terms of accuracy and processing time.
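The second step above, estimating Z from the apparent iris size, follows from the pinhole camera model: Z = f · D_real / d_image. A hedged sketch of that step and of selecting a Z-dependent transformation matrix; the 11.7 mm iris diameter and all names are illustrative assumptions, not values from the paper:

```python
def estimate_z(iris_diameter_px, focal_length_px, iris_diameter_mm=11.7):
    """Pinhole-model range estimate: Z = f * D_real / d_image.
    11.7 mm is an assumed anthropometric mean human iris diameter."""
    return focal_length_px * iris_diameter_mm / iris_diameter_px

def pick_transform(z_mm, calibrated):
    """Choose the WFOV->NFOV transformation calibrated at the Z nearest
    the estimate; `calibrated` is a list of (z_mm, transform) pairs."""
    return min(calibrated, key=lambda pair: abs(pair[0] - z_mm))[1]

# An iris spanning 50 px with a 1000 px focal length sits ~234 mm away;
# the transform calibrated at 250 mm is the nearest match.
z = estimate_z(50, 1000)
t = pick_transform(z, [(200, "T200"), (250, "T250"), (300, "T300")])
```

Keeping several transforms indexed by Z, rather than one global calibration, is what lets the method shrink the NFOV search region without a full stereo calibration.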

Kim, Yeong Gon; Shin, Kwang Yong; Park, Kang Ryoung

2013-10-01

9

Use of a wide angle CCD line camera for BRDF measurements  

NASA Astrophysics Data System (ADS)

In order to determine the Bi-directional Reflectance Distribution Function (BRDF) of natural surfaces, a CCD line camera is used. This allows measurements under natural conditions with high azimuth and zenith angular resolution in a short time. The CCD line spans a field of view of 80° as the zenith angle range. To cover the azimuth range, the camera is mounted on a rotating device, and an extendible boom provides an aerial platform. This setup allows the measurement of the almost complete reflectance distribution of the surface below the camera within the 30-s rotation period of the camera. The camera used for this setup is the wide angle airborne camera (WAAC), which was developed at DLR for airborne stereo imaging purposes. This paper presents the radiometric calibration of the system and shows the initial results of our approach in measuring the BRDF with high angular resolution in a short period.

Demircan, A.; Schuster, R.; Radke, M.; Schönermark, M.; Röser, H. P.

2000-02-01

10

Use of a wide angle CCD line camera for BRDF measurements  

Microsoft Academic Search

In order to determine the Bi-directional Reflectance Distribution Function (BRDF) of natural surfaces a CCD line camera is used. This allows measurements under natural conditions with a high azimuth and zenith angular resolution in a short time. The CCD line spans a field of view of 80° as the zenith angle range. For covering the azimuth range, the camera is

A. Demircan; R. Schuster; M. Radke; M. Schönermark; H. P Röser

2000-01-01

11

On-Orbit Geometric Calibration of the Lunar Reconnaissance Orbiter Wide Angle Camera  

NASA Astrophysics Data System (ADS)

Lunar Reconnaissance Orbiter (LRO) is equipped with a single Wide Angle Camera (WAC) [1] designed to collect monochromatic and multispectral observations of the lunar surface. Cartographically accurate image mosaics and stereo image-based terrain models require that the position of each pixel in a given image be known relative to a corresponding point on the lunar surface with a high degree of accuracy and precision. The Lunar Reconnaissance Orbiter Camera (LROC) team initially characterized the WAC geometry prior to launch at the Malin Space Science Systems calibration facility. After lunar orbit insertion, the LROC team recognized spatially varying geometric offsets between color bands. These misregistrations made analysis of the color data problematic and showed that refinements to the pre-launch geometric analysis were necessary. The geometric parameters that define the WAC optical system were characterized from statistics gathered from co-registering over 84,000 image pairs. For each pair, we registered all five visible WAC bands to a precisely rectified Narrow Angle Camera (NAC) image (accuracy <15 m) [2] to compute key geometric parameters. In total, we registered 2,896 monochrome and 1,079 color WAC observations to nearly 34,000 NAC observations and collected over 13.7 million data points across the visible portion of the WAC CCD. Using the collected statistics, we refined the relative pointing (yaw, pitch and roll), effective focal length, principal point coordinates, and radial distortion coefficients. This large dataset also revealed spatial offsets between bands after orthorectification due to chromatic aberrations in the optical system. As white light enters the optical system, the light bends at different magnitudes as a function of wavelength, causing a single incident ray to disperse in a spectral spread of color [3,4].
This lateral chromatic aberration effect, also known as 'chromatic difference in magnification' [5], introduces variation in the effective focal length for each WAC band. In addition, tangential distortions caused by minor decentering in the optical system altered the derived exterior orientation parameters for each 14-line WAC band. We computed the geometric parameter sets separately for each band to characterize the lateral chromatic aberrations and the decentering components in the WAC optical system. With this approach, we negated the need for additional tangential terms in the distortion model, reducing the number of computations during image orthorectification and thereby expediting the process. We undertook a similar process for refining the geometry of the UV bands (321 and 360 nm), except that we registered each UV band to orthorectified visible bands of the same WAC observation (the visible bands have resolutions 4 times greater than the UV). The resulting 7-band camera model with refined geometric parameters enables map projection with sub-pixel accuracy. References: [1] Robinson et al. (2010) Space Sci. Rev. 150, 81-124 [2] Wagner et al. (2013) Lunar Sci. Forum [3] Mahajan, V.N. (1998) Optical Imaging and Aberrations [4] Fiete, R.D. (2013) Manual of Photogrammetry, pp. 359-450 [5] Brown, D.C. (1966) Photogrammetric Eng. 32, 444-462.
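The per-band refinement described above amounts to fitting, for each band, its own effective focal length plus radial distortion coefficients. A minimal sketch of how such a two-term radial model is applied; the direction of correction, coefficients and names are illustrative assumptions, not the LROC team's actual calibration:

```python
def radial_correct(x_mm, y_mm, fl_mm, k1, k2):
    """Two-term radial distortion correction in normalized focal-plane
    coordinates; using a separate fl_mm (and k1, k2) per band is what
    absorbs the lateral chromatic aberration between bands."""
    xn, yn = x_mm / fl_mm, y_mm / fl_mm   # normalize by the band's focal length
    r2 = xn * xn + yn * yn                # squared radial distance from axis
    scale = 1.0 + k1 * r2 + k2 * r2 * r2  # radial polynomial
    return xn * scale * fl_mm, yn * scale * fl_mm
```

With zero coefficients the mapping is the identity; nonzero k1, k2 push points radially in or out, more strongly far from the principal point, which is exactly where band-to-band misregistration was worst.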

Speyerer, E. J.; Wagner, R.; Robinson, M. S.

2013-12-01

12

Camera Calibration for Miniature, Low-cost, Wide-angle Imaging Systems  

Microsoft Academic Search

This paper presents a new model and an extension to an existing algorithm for camera calibration. The main goal of the proposed approach is to calibrate miniature, low-cost, wide-angle fisheye lenses. The model has been verified with a calibration implementation and was tested on real data. Experiments show that the proposed model improves the accuracy compared to the

Oliver Frank; Roman Katz; Christel-Loic Tisse; Hugh F. Durrant-Whyte

2007-01-01

13

Large zenith angle observations with the high-resolution GRANITE III camera  

E-print Network

The GRANITE III camera of the Whipple Cherenkov Telescope at the Fred Lawrence Whipple Observatory on Mount Hopkins, Arizona (2300 m a.s.l.) has the highest angular resolution of all cameras used on this telescope so far. The central region of the camera has 379 pixels with an individual angular diameter of 0.12 degrees. This makes the instrument especially suitable for observations of gamma-induced air-showers at large zenith angles since the increase in average distance to the shower maximum leads to smaller shower images in the focal plane of the telescope. We examine the performance of the telescope for observations of gamma-induced air-showers at zenith angles up to 63 degrees based on observations of Mkn 421 and using Monte Carlo Simulations. An improvement to the standard data analysis is suggested.

D. Petry; the VERITAS Collaboration

2001-08-06

14

Calibration of a trinocular system formed with wide angle lens cameras.  

PubMed

To obtain 3D information of large areas, wide angle lens cameras are used to reduce the number of cameras as much as possible. However, since the images are highly distorted, errors in point correspondences increase and the 3D information can be erroneous. To increase the amount of data from the images and to improve the 3D information, trinocular sensors are used. In this paper a calibration method for a trinocular sensor formed with wide angle lens cameras is proposed. First, pixel locations in the images are corrected using a set of constraints that define the image formation in a trinocular system. Once the pixel locations are corrected, the lens distortion and the trifocal tensor are computed. PMID:23262716

Ricolfe-Viala, Carlos; Sanchez-Salmeron, Antonio-Jose; Valera, Angel

2012-12-01

15

Simulating a dual beam combiner at SUSI for narrow-angle astrometry  

NASA Astrophysics Data System (ADS)

The Sydney University Stellar Interferometer (SUSI) has two beam combiners, i.e. the Precision Astronomical Visible Observations (PAVO) and the Microarcsecond University of Sydney Companion Astrometry (MUSCA). The primary beam combiner, PAVO, can be operated independently and is typically used to measure properties of binary stars of less than 50 milliarcsec (mas) separation and the angular diameters of single stars. On the other hand, MUSCA was recently installed and must be used in tandem with the former. It is dedicated to microarcsecond-precision narrow-angle astrometry of close binary stars. The performance evaluation and development of the data reduction pipeline for the new setup were assisted by an in-house computer simulation tool developed for this and related purposes. This paper describes the framework of the simulation tool, the simulations carried out to evaluate the performance of each beam combiner, and the expected astrometric precision of the dual beam combiner setup, both at SUSI and at possible future sites.

Kok, Yitping; Maestro, Vicente; Ireland, Michael J.; Tuthill, Peter G.; Robertson, J. Gordon

2013-08-01

16

Narrow Angle Wide Spectral Range Radiometer Design FEANICS/REEFS Radiometer Design Report  

NASA Technical Reports Server (NTRS)

A critical measurement for the Radiative Enhancement Effects on Flame Spread (REEFS) microgravity combustion experiment is the net radiative flux emitted from the gases and from the solid fuel bed. These quantities are measured using a set of narrow angle, wide spectral range radiometers. The radiometers are required to have an angular field of view of 1.2 degrees and measure over the spectral range of 0.6 to 30 microns, which presents a challenging design effort. This report details the design of this radiometer system including field of view, radiometer response, radiometric calculations, temperature effects, error sources, baffling and amplifiers. This report presents some radiometer specific data but does not present any REEFS experiment data.

Camperchioli, William

2005-01-01

17

In situ measurements of particle friction angles in steep, narrow channels  

NASA Astrophysics Data System (ADS)

The persistent observation that sediment requires increased fluid stresses to move on steeper channels has inspired a wide range of explanations, which can loosely be divided into those that invoke increased grain stability (friction angle, φ) and those that require altered flow hydraulics in steep channels. Measurements of bulk fluid forces over a wide range of channel slopes (θ ≤ 22°) have been obtained using laboratory flume experiments that can control for grain stability and show that altered flow hydraulics do play a role in increased critical shear stress. However, measurements of grain stability are almost all limited to channel slopes less than a few degrees. These friction angle studies have been conducted by tilting a fixed gravel bed with a single loose particle until dislodgment, or by directly measuring the forces required to dislodge a particle using a load cell. The latter methodology is less common but offers the advantage of quickly measuring the friction angles of in situ grains in natural river channels. Indeed, it has enabled the collection of extremely large datasets at low slopes [e.g., Johnston et al., 1998]. We are adding to this dataset with measurements from several natural steep channels in the San Gabriel Mountains, CA to test if the particle friction angle changes systematically as a function of slope or width-to-grain size ratio (W/D50), which is thought to determine the propensity for particle jamming. Using a load cell that records peak forces we measure the minimum force required to pull a particle from its pocket in the downstream direction and the particle weight. Particles are sampled over a regular grid and we record the percentage of the particle buried by fines and the qualitative degree of interlocking. Preliminary results from three sites with bed slopes of θ = 2.9°, 3.2°, and 9.0° suggest that the at-a-site variability in friction angle is much higher than between-site variability, and that median values do not vary in a consistent manner with bed slope (φ = 51°, 67°, and 65°, respectively). At an individual site the degree of interlocking is the primary control on particle friction angle. However, the degree of interlocking was not higher in the steep (θ = 9.0°), narrow (W/D50 = 12.5) channel. This indicates that increased grain stability may not play a crucial role in increasing the threshold shear stresses required for sediment motion on very steep slopes.

Prancevic, J.; Lamb, M. P.

2013-12-01

18

Large-angle pinhole gamma camera with depth-of-interaction detector for contamination monitoring  

NASA Astrophysics Data System (ADS)

The gamma camera system was designed for monitoring medical areas such as a radiopharmaceutical preparation lab or a patient waiting room (after source injection) in the division of nuclear medicine. However, gamma cameras equipped with a large-angle pinhole collimator and a thick monolithic crystal suffer from degradation of the spatial resolution at the periphery due to parallax error from obliquely incident photons. To improve the uniformity of the spatial resolution across the field of view (FOV), we proposed a three-layer crystal detector with a maximum-likelihood position-estimation (MLPE) method, which can measure depth-of-interaction (DOI) information. The aim of this study was to develop and experimentally evaluate the performance of the new detector. The proposed detector employed three layers of monolithic CsI(Tl) crystals, each 50.0×50.0×2.0 mm³, and a large-angle pinhole collimator with an acceptance angle of 120°. The bottom surface of the third layer was directly coupled to an 8×8 channel position-sensitive photomultiplier tube (PSPMT, Hamamatsu H8500C). The PSPMT was read out using a resistive charge divider, which multiplexes 64 anodes into 8(X)+8(Y) channels. A Gaussian-based MLPE method was implemented using experimentally measured detector response functions (DRFs). A Tc-99m point source was imaged at different positions with and without DOI measurements. Experimental results showed that the spatial resolution degraded gradually as the source moved from the center to the periphery of the FOV without DOI information, but the DOI detector showed marked improvement in spatial resolution, especially off-center, by correcting the parallax error. The new detector with DOI capability proved to characterize the gamma event position reliably with high and uniform spatial resolution, so that the large-angle pinhole gamma camera could be a useful tool in contamination monitoring.
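The Gaussian-based MLPE step can be sketched as follows: each calibrated (x, y, depth) candidate position has a measured mean and spread per readout channel (its DRF), and the estimator picks the candidate that maximizes the Gaussian log-likelihood of the observed channel values. This is a simplified illustration, not the authors' implementation:

```python
import numpy as np

def mlpe_pick(measured, drf_means, drf_sigmas):
    """Return the index of the calibrated (x, y, depth) candidate whose
    Gaussian detector response function best explains the measurement.
    drf_means/drf_sigmas: one row per candidate, one column per channel."""
    m = np.asarray(measured, dtype=float)
    # Per-candidate Gaussian log-likelihood (constant terms dropped).
    log_l = -np.sum(((m - drf_means) / drf_sigmas) ** 2
                    + 2.0 * np.log(drf_sigmas), axis=1)
    return int(np.argmax(log_l))
```

Because depth is just another axis of the candidate grid, the same maximization that localizes the event in (x, y) also yields the DOI layer used to correct the parallax error.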

Baek, Cheol-Ha; Kim, Hyun-Il; Hwang, Ji Yeon; Jung An, Su; Kim, Kwang Hyun; Kwak, Sung-Woo; Chung, Yong Hyun

2011-08-01

19

Comparison of Scheimpflug imaging and spectral domain anterior segment optical coherence tomography for detection of narrow anterior chamber angles  

Microsoft Academic Search

Purpose: To compare the performance of anterior chamber volume (ACV) and anterior chamber depth (ACD) obtained using Scheimpflug imaging with angle opening distance (AOD500) and trabecular-iris space area (TISA500) obtained using spectral domain anterior segment optical coherence tomography (SD-ASOCT) in detecting narrow angles classified using gonioscopy. Methods: In this prospective, cross-sectional observational study, 265 eyes of 265 consecutive patients underwent sequential Scheimpflug imaging,

D S Grewal; G S Brar; R Jain; S P S Grewal

2011-01-01

20

Optical design of the wide angle camera for the Rosetta mission.  

PubMed

The final optical design of the Wide Angle Camera for the Rosetta mission to the P/Wirtanen comet is described. This camera is an F/5.6 telescope with a rather large 12 degrees x 12 degrees field of view. To satisfy the scientific requirements for spatial resolution, contrast capability, and spectral coverage, a two-mirror, off-axis, and unobstructed optical design, believed to be novel, has been adopted. This configuration has been simulated with a ray-tracing code, showing that theoretically more than 80% of the collimated beam energy falls within a single pixel (20" x 20") over the whole camera field of view and that the possible contrast ratio is smaller than 1/1000. Moreover, this novel optical design is rather simple from a mechanical point of view and is compact and relatively easy to align. All these characteristics make this type of camera rather flexible and also suitable for other space missions with similar performance requirements. PMID:11900025

Naletto, Giampiero; Da Deppo, Vania; Pelizzo, Maria Guglielmina; Ragazzoni, Roberto; Marchetti, Enrico

2002-03-01

21

Dynamic closed-loop test for real-time drift angle adjustment of space camera on the Earth  

NASA Astrophysics Data System (ADS)

In order to eliminate the influence of the aircraft attitude angle on the image quality of a space camera, and to ensure that the drift angle of the space camera can be accurately adjusted in orbit, a novel closed-loop test method is provided for real-time drift angle adjustment of a space camera on the ground. A long focal length dynamic aim generator is applied to simulate the image motion and the varying drift angle, and to detect the precision of the image motion compensation mechanism and the capability of the drift angle control system. A computer system is used to control the dynamic aim generator, accomplish the data processing, and transmit and receive data. The seamless connection and data transmission between the aim generator and the aircraft simulation devices are established. The commands, parameters and drift angle data transmitted by the simulation devices are received by the space camera in real time; photos are then taken and the drift angle is adjusted simultaneously. It is shown that the drift angle can be accurately tracked by the space camera in real time, and that the method satisfies the test requirements.

Hu, Jun; Cao, Xiaotao; Wang, Dong; Wu, Weiping; Xu, Shuyan

2010-10-01

22

Lunar Reconnaissance Orbiter Camera (LROC) Instrument Overview  

Microsoft Academic Search

The Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) and Narrow Angle Cameras (NACs) are on the NASA Lunar Reconnaissance Orbiter (LRO). The WAC is a 7-color push-frame camera (100 and 400 m/pixel visible and UV, respectively), while the two NACs are monochrome narrow-angle linescan imagers (0.5 m/pixel). The primary mission of LRO is to obtain measurements of the Moon that

M. S. Robinson; S. M. Brylow; M. Tschimmel; D. Humm; S. J. Lawrence; P. C. Thomas; B. W. Denevi; E. Bowman-Cisneros; J. Zerr; M. A. Ravine; M. A. Caplinger; F. T. Ghaemi; J. A. Schaffner; M. C. Malin; P. Mahanti; A. Bartels; J. Anderson; T. N. Tran; E. M. Eliason; A. S. McEwen; E. Turtle; B. L. Jolliff; H. Hiesinger

2010-01-01

23

Development of soft x-ray large solid angle camera onboard WF-MAXI  

NASA Astrophysics Data System (ADS)

Wide-Field MAXI (WF-MAXI) is planned to be installed on the Japanese Experiment Module "Kibo" Exposed Facility of the International Space Station (ISS). WF-MAXI consists of two types of cameras, the Soft X-ray Large Solid Angle Camera (SLC) and the Hard X-ray Monitor (HXM). HXM is a multi-channel array of CsI scintillators coupled with avalanche photodiodes (APDs), covering the energy range of 20-200 keV. SLC is an array of CCDs, an evolved version of MAXI/SSC. Instead of the slit and collimator of SSC, SLC is equipped with a coded mask, extending its field of view to 20% of the sky at any given time and its location determination accuracy to a few arcminutes. In order to achieve a larger effective area, the number of CCD chips and the size of each chip will be larger than those of SSC. We are planning to use 59 x 31 mm² CCD chips provided by Hamamatsu Photonics. Each camera will be equipped with 16 CCDs, and a total of 4 cameras will be installed in WF-MAXI. Since SLC utilizes X-ray CCDs, it must have an active cooling system for the CCDs. Instead of using a Peltier cooler, we use mechanical coolers of the type also employed in Astro-H; in this way we can cool the CCDs down to -100°C. The ISS orbits the Earth in 90 minutes; therefore a point source moves 4 arcminutes per second. In order to achieve the location determination accuracy, we need fast readout from the CCD. The pulse heights are stacked into a single row along the vertical direction. Charge is transferred continuously, so the spatial information along the vertical direction is lost and replaced with precise arrival time information. Currently we are building an experimental model of the camera body including the CCD and its electronics. In this paper, we show the development status of SLC.
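The quoted drift rate is simple orbital arithmetic and can be checked directly:

```python
# Sanity check of the abstract's figure: a ~90-minute orbit sweeps 360 degrees,
# so a fixed point source appears to drift across the sky at
deg_per_min = 360 / 90                    # 4 degrees per minute
arcmin_per_sec = deg_per_min * 60 / 60.0  # x60 arcmin/deg, /60 s/min -> 4 arcmin/s
```

This is why continuous charge transfer with precise timing, rather than a conventional staring readout, is needed to keep the few-arcminute localization budget.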

Kimura, Masashi; Tomida, Hiroshi; Ueno, Shiro; Kawai, Nobuyuki; Yatsu, Yoichi; Arimoto, Makoto; Mihara, Tatehiro; Serino, Motoko; Tsunemi, Hiroshi; Yoshida, Atsumasa; Sakamoto, Takanori; Kohmura, Takayoshi; Negoro, Hitoshi

2014-07-01

24

Initial Results of 3D Topographic Mapping Using Lunar Reconnaissance Orbiter Camera (LROC) Stereo Imagery  

Microsoft Academic Search

The Lunar Reconnaissance Orbiter (LRO), launched June 18, 2009, carries the Lunar Reconnaissance Orbiter Camera (LROC) as one of seven remote sensing instruments on board. The camera system is equipped with a Wide Angle Camera (WAC) and two Narrow Angle Cameras (NAC) for systematic lunar surface mapping and detailed site characterization for potential landing site selection and resource identification. The

R. Li; J. Oberst; A. S. McEwen; B. A. Archinal; R. A. Beyer; P. C. Thomas; Y. Chen; J. Hwangbo; J. D. Lawver; F. Scholten; S. S. Mattson; A. E. Howington-Kraus; M. S. Robinson

2009-01-01

25

A New Approach to Micro-arcsecond Astrometry with SIM Allowing Early Mission Narrow Angle Measurements of Compelling Astronomical Targets  

NASA Technical Reports Server (NTRS)

The Space Interferometry Mission (SIM) is capable of detecting and measuring the mass of terrestrial planets around stars other than our own. It can measure the mass of black holes and the visual orbits of radio and x-ray binary sources. SIM makes possible a new level of understanding of complex astrophysical processes. SIM achieves its high precision in the so-called narrow-angle regime. This is defined by a 1 degree diameter field in which the position of a target star is measured with respect to a set of reference stars. The observation is performed in two parts: first, SIM observes a grid of stars that spans the full sky. After a few years, repeated observations of the grid allow one to determine the orientation of the interferometer baseline. Second, throughout the mission, SIM periodically observes in the narrow-angle mode. Every narrow-angle observation is linked to the grid to determine the precise attitude and length of the baseline. The narrow angle process demands patience. It is not until five years after launch that SIM achieves its ultimate accuracy of 1 microarcsecond. The accuracy is degraded by a factor of approx. 2 at mid-mission. Our work proposes a technique for narrow angle astrometry that does not rely on the measurement of grid stars. This technique, called Gridless Narrow Angle Astrometry (GNAA) can obtain microarcsecond accuracy and can detect extra-solar planets and other exciting objects with a few days of observation. It can be applied as early as during the first six months of in-orbit calibration (IOC). The motivations for doing this are strong. First, and obviously, it is an insurance policy against a catastrophic mid-mission failure. Second, at the start of the mission, with several space-based interferometers in the planning or implementation phase, NASA will be eager to capture the public's imagination with interferometric science. 
Third, early results and a technique that can duplicate those results throughout the mission will give the analysts important experience in the proper use and calibration of SIM.

Shaklan, Stuart; Pan, Xiaopei

2004-01-01

26

Development of a large-angle pinhole gamma camera with depth-of-interaction capability for small animal imaging  

NASA Astrophysics Data System (ADS)

A large-angle gamma camera was developed for imaging small animal models used in medical and biological research. The simulation study shows that a large field of view (FOV) system provides higher sensitivity than a typical pinhole gamma camera by reducing the distance between the pinhole and the object. However, this gamma camera suffers from degradation of the spatial resolution at the periphery due to parallax error from obliquely incident photons. We propose a new method to measure the depth of interaction (DOI) using three layers of monolithic scintillators to reduce the parallax error. The detector module consists of three layers of monolithic CsI(Tl) crystals with dimensions of 50.0 × 50.0 × 2.0 mm³, a Hamamatsu H8500 PSPMT and a large-angle pinhole collimator with an acceptance angle of 120°. The 3-dimensional event positions were determined by the maximum-likelihood position-estimation (MLPE) algorithm and a pre-generated look-up table (LUT). The spatial resolution (FWHM) of a Co-57 point-like source was measured at different source positions with the conventional method (Anger logic) and with DOI information. We proved that high sensitivity can be achieved without degradation of spatial resolution using a large-angle pinhole gamma camera: this system can be used as a small animal imaging tool.
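The positioning step described above pairs a pre-generated look-up table with maximum-likelihood position estimation (MLPE). A minimal sketch of that idea in Python, assuming Poisson statistics for the anode counts; the array names and shapes are illustrative, not the published implementation:

```python
import numpy as np

def mlpe_lookup(measured, lut):
    """Maximum-likelihood position estimation against a pre-generated LUT.

    `lut` has one row per candidate 3-D interaction position, holding the
    expected light distribution over the PSPMT anodes; `measured` is the
    observed anode counts.  The estimate is the row index maximizing the
    Poisson log-likelihood of the measurement.
    """
    expected = np.clip(lut, 1e-12, None)  # guard against log(0)
    # Poisson log-likelihood (dropping the measured-only factorial term,
    # which is constant across candidates)
    loglik = (measured * np.log(expected) - expected).sum(axis=1)
    return int(np.argmax(loglik))
```

In the real system the LUT would be indexed over positions in all three CsI(Tl) layers, so the winning index carries the depth of interaction as well as the lateral position.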

Baek, C.-H.; An, S. J.; Kim, H.-I.; Choi, Y.; Chung, Y. H.

2012-01-01

27

A New Lunar Atlas: Mapping the Moon with the Wide Angle Camera  

NASA Astrophysics Data System (ADS)

The Lunar Reconnaissance Orbiter (LRO) spacecraft launched in June 2009 and began systematically mapping the lunar surface, providing a priceless dataset for the planetary science community and future mission planners. From 20 September 2009 to 11 December 2011, the spacecraft was in a nominal 50 km polar orbit, except for two month-long periods when a series of spacecraft maneuvers enabled low-altitude flyovers (as low as 22 km) of key exploration and scientifically interesting targets. One of the instruments, the Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) [1], captured nearly continuous synoptic views of the illuminated lunar surface. The WAC is a 7-band (321, 360, 415, 566, 604, 643, 689 nm) push-frame imager with a field of view of 60° in color mode and 90° in monochrome mode. This broad field of view enables the WAC to reimage nearly 50% (at the equator, where the orbit tracks are spaced the furthest) of the terrain it imaged in the previous orbit. The visible bands of map-projected WAC images have a pixel scale of 100 m, while the UV bands have a pixel scale of 400 m due to 4x4 pixel on-chip binning that increases signal-to-noise. The nearly circular polar orbit and short (two-hour) orbital period enable seamless mosaics of broad areas of the surface with uniform lighting and resolution. In March of 2011, the LROC team released the first version of the global monochrome (643 nm) morphologic map [2], which comprised 15,000 WAC images collected over three periods. With the over 130,000 WAC images collected while the spacecraft was in the 50 km orbit, a new set of mosaics is being produced by the LROC team and will be released to the Planetary Data System. These new maps include an updated morphologic map with an improved set of images (limiting illumination variations and gores due to off-nadir observations by other instruments) and a new photometric correction derived from the LROC WAC dataset. 
In addition, a higher-sun (lower incidence angle) mosaic will also be released; this map has minimal shadows and highlights albedo differences. Seamless regional WAC mosaics acquired under multiple lighting geometries (sunlight coming from the east, overhead, and west) will also be produced for key areas of interest. These new maps use the latest terrain model (LROC WAC GLD100) [3], updated spacecraft ephemeris provided by the LOLA team [4], and an improved WAC distortion model [5] to provide accurate placement of each WAC pixel on the lunar surface. References: [1] Robinson et al. (2010) Space Sci. Rev. [2] Speyerer et al. (2011) LPSC, #2387. [3] Scholten et al. (2012) JGR. [4] Mazarico et al. (2012) J. of Geodesy [5] Speyerer et al. (2012) ISPRS Congress.

Speyerer, E.; Robinson, M. S.; Boyd, A.; Sato, H.

2012-12-01

28

The measurement and modelling of light scattering by phytoplankton cells at narrow forward angles  

NASA Astrophysics Data System (ADS)

A procedure has been devised for measuring the angular dependence of light scattering from suspensions of phytoplankton cells at forward angles from 0.25° to 8°. The cells were illuminated with a spatially-filtered laser beam and the angular distribution of scattered light measured by tracking a photodetector across the Fourier plane of a collecting lens using a stepper-motor driven stage. The procedure was calibrated by measuring scattering from latex bead suspensions with known size distributions. It was then used to examine the scattering from cultures of the unicellular algae Isochrysis galbana (4 µm × 5 µm), Dunaliella primolecta (6 µm × 7 µm) and Rhinomonas reticulata (5 µm × 11 µm). The results were compared with the predictions of Mie theory. Excellent agreement was obtained for spherical particles. A suitable choice of spherical-equivalent scattering parameters was required to enable reasonable agreement within the first diffraction lobe for ellipsoidal particles.

MacCallum, Iain; Cunningham, Alex; McKee, David

2004-07-01

29

Erratum: The Wide Angle Camera of the ROSETTA Mission [Mem.SAIt 74, 434-435 (2003)]  

NASA Astrophysics Data System (ADS)

The authors acknowledge that the paper fails to convey the correct information about the respective contributions and roles of the partners of the OSIRIS consortium. In particular, the hardware contributions to the Wide Angle Camera of the Max-Planck-Institut für Sonnensystemforschung, MPS (Katlenburg-Lindau, Germany, formerly MPAe), of the Instituto de Astrofisica de Andalucia (Granada, Spain), of the Department of Astronomy and Space Physics of Uppsala University (DASP), and of the ESA Research and Scientific Support Department (ESA/RSSD) have either not been mentioned or have been incorrectly expounded. The overall responsibility (PI-ship) of MPS (MPAe) for OSIRIS, and hence for the Wide Angle Camera, is not correctly mentioned either. The correct information is given in the paper by Keller et al. (2006, Space Science Review, in press). The authors take this opportunity to acknowledge that the activity of the Italian team has been partly supported by the Italian Space Agency ASI through a contract to CISAS.

Barbieri, C.; Fornasier, S.; Verani, S.; Bertini, I.; Lazzarin, M.; Rampazzi, F.; Cremonese, G.; Ragazzoni, R.; Marzari, F.; Angrilli, F.; Bianchini, G. A.; Debei, S.; Dececco, M.; Guizzo, G.; Parzianello, G.; Ramous, P.; Saggin, B.; Zaccariotto, M.; da Deppo, V.; Naletto, G.; Nicolosi, G.; Pelizzo, M. G.; Tondello, G.; Brunello, P.; Peron, F.

30

Observations of Comet 9P/Tempel 1 around the Deep Impact event by the OSIRIS cameras onboard Rosetta  

Microsoft Academic Search

The OSIRIS cameras on the Rosetta spacecraft observed Comet 9P/Tempel 1 from 5 days before to 10 days after it was hit by the Deep Impact projectile. The Narrow Angle Camera (NAC) monitored the cometary dust in 5 different filters. The Wide Angle Camera (WAC) observed through

Horst Uwe Keller; Michael Küppers; Sonia Fornasier; Pedro J. Gutiérrez; Stubbe F. Hviid; Laurent Jorda; Jörg Knollenberg; Stephen C. Lowry; Miriam Rengel; Ivano Bertini; Rainer Kramm; Ekkehard Kührt; Luisa-Maria Lara; Holger Sierks; Cesare Barbieri; Philippe Lamy; Hans Rickman; Rafael Rodrigo; Michael F. A'Hearn; Björn J. R. Davidsson; Marco Fulle; Fritz Gliem; Olivier Groussin; José J. Lopez Moreno; Francesco Marzari; Angel Sanz

2006-01-01

31

Angles  

NSDL National Science Digital Library

Play these games to determine the best angles for success! Alien Angles Set the angle to rescue the alien. Space Angles Target the angle to shoot the alien spaceship. Mini Golf Knowing the angles will help you get the ball in the hole. ...

Clark, Mr

2012-10-31

32

Early direct-injection, low-temperature combustion of diesel fuel in an optical engine utilizing a 15-hole, dual-row, narrow-included-angle nozzle.  

SciTech Connect

Low-temperature combustion of diesel fuel was studied in a heavy-duty, single-cylinder optical engine employing a 15-hole, dual-row, narrow-included-angle nozzle (10 holes x 70° and 5 holes x 35°) with 103-µm-diameter orifices. This nozzle configuration provided the spray targeting necessary to contain the direct-injected diesel fuel within the piston bowl for injection timings as early as 70° before top dead center. Spray-visualization movies, acquired using a high-speed camera, show that impingement of liquid fuel on the piston surface can result when the in-cylinder temperature and density at the time of injection are sufficiently low. Seven single- and two-parameter sweeps around a 4.82-bar gross indicated mean effective pressure load point were performed to map the sensitivity of the combustion and emissions to variations in injection timing, injection pressure, equivalence ratio, simulated exhaust-gas recirculation, intake temperature, intake boost pressure, and load. High-speed movies of natural luminosity were acquired by viewing through a window in the cylinder wall and through a window in the piston to provide quasi-3D information about the combustion process. These movies revealed that advanced combustion phasing resulted in intense pool fires within the piston bowl, after the end of significant heat release. These pool fires are a result of fuel films created when the injected fuel impinged on the piston surface. The emissions results showed a strong correlation with pool-fire activity. Smoke and NOx emissions rose steadily as pool-fire intensity increased, whereas HC and CO showed a dramatic increase with near-zero pool-fire activity.

Gehrke, Christopher R. (Caterpillar Inc.); Radovanovic, Michael S. (Caterpillar Inc.); Milam, David M. (Caterpillar Inc.); Martin, Glen C.; Mueller, Charles J.

2008-04-01

33

Angles  

NSDL National Science Digital Library

This set of eight interactive activities lets the user explore angles from many different perspectives. Activities include (1) visualizing the size of an angle; (2) examining objects that will stand or fall with right and non-right angles; (3) identifying obtuse, right, acute and straight angles; (4) guessing angle measures with different levels of precision; (5) exploring regular shapes and their angle measures; (6) studying angles in a fractal tree that is drawn with user inputs of the same angle measure between the branches at each stage; (7) exploring angle measures through firing a cannon; and (8) drawing with a Logo activity.

Edkins, Jo

2007-01-01

34

Lunar Reconnaissance Orbiter Camera (LROC) instrument overview  

USGS Publications Warehouse

The Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) and Narrow Angle Cameras (NACs) are on the NASA Lunar Reconnaissance Orbiter (LRO). The WAC is a 7-color push-frame camera (100 and 400 m/pixel visible and UV, respectively), while the two NACs are monochrome narrow-angle linescan imagers (0.5 m/pixel). The primary mission of LRO is to obtain measurements of the Moon that will enable future lunar human exploration. The overarching goals of the LROC investigation include landing site identification and certification, mapping of permanently polar shadowed and sunlit regions, meter-scale mapping of polar regions, global multispectral imaging, a global morphology base map, characterization of regolith properties, and determination of current impact hazards.

Robinson, M.S.; Brylow, S.M.; Tschimmel, M.; Humm, D.; Lawrence, S.J.; Thomas, P.C.; Denevi, B.W.; Bowman-Cisneros, E.; Zerr, J.; Ravine, M.A.; Caplinger, M.A.; Ghaemi, F.T.; Schaffner, J.A.; Malin, M.C.; Mahanti, P.; Bartels, A.; Anderson, J.; Tran, T.N.; Eliason, E.M.; McEwen, A.S.; Turtle, E.; Jolliff, B.L.; Hiesinger, H.

2010-01-01

35

Miniature Wide-Angle Lens for Small-Pixel Electronic Camera  

NASA Technical Reports Server (NTRS)

A proposed wide-angle lens is shown that would be especially well suited for an electronic camera in which the focal plane is occupied by an image sensor with small pixels. The design of the lens is intended to satisfy requirements for compactness, high image quality, and reasonably low cost, while addressing issues peculiar to the operation of small-pixel image sensors. Hence, this design is expected to enable the development of a new generation of compact, high-performance electronic cameras. The lens example shown has a 60 degree field of view and a relative aperture (f-number) of 3.2. The main issues affecting the design are also shown.

Mouroulis, Pantazis; Blazejewski, Edward

2009-01-01

36

On an assessment of surface roughness estimates from lunar laser altimetry pulse-widths for the Moon from LOLA using LROC narrow-angle stereo DTMs.  

NASA Astrophysics Data System (ADS)

Neumann et al. [1] proposed that laser altimetry pulse-widths could be employed to derive "within-footprint" surface roughness, as opposed to surface roughness estimated between laser altimetry pierce-points, such as the example for Mars [2] and more recently from the 4-pointed star-shaped LOLA (Lunar Orbiter Laser Altimeter) onboard the NASA LRO [3]. Since 2009, LOLA has been collecting extensive global laser altimetry data with a 5 m footprint and ~25 m between the 5 points in a star shape. In order to assess how well surface roughness (defined as simple RMS after slope correction) derived from LROC matches surface roughness derived from LOLA footprints, publicly released LROC-NA (LRO Camera Narrow Angle) 1 m Digital Terrain Models (DTMs) were employed to measure the surface roughness directly within each 5 m footprint. A set of 20 LROC-NA DTMs were examined. Initially the match-up between the LOLA and LROC-NA orthorectified images (ORIs) is assessed visually to ensure that the co-registration is better than the LOLA footprint resolution. For each LOLA footprint, the pulse-width geolocation is then retrieved and used to "cookie-cut" the surface roughness and slopes derived from the LROC-NA DTMs. The investigation, which includes data from a variety of different landforms, shows little, if any, correlation between surface roughness estimated from DTMs and LOLA pulse-widths at sub-footprint scale. In fact, a perceptible correlation between LOLA and LROC DTMs appears only at baselines of 40-60 m for surface roughness and 20 m for slopes. [1] Neumann et al. Mars Orbiter Laser Altimeter pulse width measurements and footprint-scale roughness. Geophysical Research Letters (2003) vol. 30 (11), paper 1561. DOI: 10.1029/2003GL017048 [2] Kreslavsky and Head. Kilometer-scale roughness of Mars: results from MOLA data analysis. J Geophys Res (2000) vol. 105 (E11) pp. 26695-26711. [3] Rosenburg et al. 
Global surface slopes and roughness of the Moon from the Lunar Orbiter Laser Altimeter. Journal of Geophysical Research (2011) vol. 116, paper E02001. DOI: 10.1029/2010JE003716 [4] Chin et al. Lunar Reconnaissance Orbiter Overview: The Instrument Suite and Mission. Space Science Reviews (2007) vol. 129 (4) pp. 391-419
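The roughness definition used above, "simple RMS after slope correction", amounts to fitting and removing a best-fit plane from the DTM elevations inside each footprint, then taking the RMS of the residuals. A small NumPy sketch under that definition (array names are illustrative):

```python
import numpy as np

def footprint_roughness(z, x, y):
    """RMS surface roughness after slope correction.

    Fit a least-squares plane z ≈ a*x + b*y + c to the elevation samples
    inside a footprint, then return the RMS of the residuals.
    """
    A = np.column_stack([x, y, np.ones_like(x)])   # plane design matrix
    coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)  # best-fit plane
    residuals = z - A @ coeffs                      # detrended heights
    return np.sqrt(np.mean(residuals**2))
```

In the study's setting, `x`, `y`, `z` would be the 1 m LROC-NA DTM posts falling inside each 5 m LOLA footprint after co-registration.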

Muller, Jan-Peter; Poole, William

2013-04-01

37

Angles  

NSDL National Science Digital Library

This Java applet enables students to investigate acute, obtuse, and right angles. The student decides to work with one or two transversals and a pair of parallel lines. Angle measure is given for one angle. The student answers a short series of questions about the size of other angles, identifying relationships such as vertical and adjacent angles and alternate interior and alternate exterior angles. In addition to automatically checking the student's answers, the applet can keep score of correct answers. From the activity page, What, How, and Why buttons open pages that explain the activity's purpose, function, and how the mathematics fits into the curriculum. Supplemental resources include lesson plans and a handout with a grid for showing the relationship between all possible angles that occur when parallel lines are cut by a transversal. Copyright 2005 Eisenhower National Clearinghouse

Foundation, Shodor E.

2004-01-01

38

HAWC+: A Detector, Polarimetry, and Narrow-Band Imaging Upgrade to SOFIA's Far-Infrared Facility Camera  

NASA Astrophysics Data System (ADS)

HAWC, the High-resolution Airborne Widebandwidth Camera, is the facility far-infrared camera for SOFIA, providing continuum imaging from 50 to 250 microns wavelength. As a result of NASA selection as a SOFIA Second Generation Instruments upgrade investigation, HAWC will be upgraded with enhanced capability for addressing current problems in star formation and interstellar medium physics prior to commissioning in early 2015. We describe the capabilities of the upgraded HAWC+, as well as our initial science program. The mapping speed of HAWC is increased by a factor of 9, accomplished by using NASA/Goddard's Backshort-Under-Grid bolometer detectors in a 64x40 format. Two arrays are used in a dual-beam polarimeter format, and the full complement of 5120 transition-edge detectors is read using NIST SQUID multiplexers and U.B.C. Multi-Channel Electronics. A multi-band polarimeter is added to the HAWC opto-mechanical system, at the cryogenic pupil image, employing rotating quartz half-wave plates. Six new filters are added to HAWC+, bringing the full set to 53, 63, 89, 155, and 216 microns at R = 5 resolution and 52, 63, 88, 158, and 205 microns at R = 300 resolution. The latter filters are fixed-tuned to key fine-structure emission lines from [OIII], [OI], [CII], and [NII]. Polarimetry can be performed in any of the filter bands. The first-light science program with HAWC+ emphasizes polarimetry for the purpose of mapping magnetic fields in Galactic clouds. The strength and character of magnetic fields in molecular clouds before, during, and after the star formation phase are largely unknown, despite pioneering efforts on the KAO and ground-based telescopes. 
SOFIA and HAWC+ provide significant new capability: sensitivity to extended dust emission (to A_V ~ 1) which is unmatched, ~10 arcsec angular resolution combined with wide-field mapping which allows statistical estimates of magnetic field strength, and wavelength coverage spanning the peak of the far-infrared spectrum of star-forming clouds. Our initial targets include nearby quiescent clouds, active sites of high- and low-mass star formation, remnants of dispersing clouds, and the Galactic center.

Dowell, C. D.; Staguhn, J.; Harper, D. A.; Ames, T. J.; Benford, D. J.; Berthoud, M.; Chapman, N. L.; Chuss, D. T.; Dotson, J. L.; Irwin, K. D.; Jhabvala, C. A.; Kovacs, A.; Looney, L.; Novak, G.; Stacey, G. J.; Vaillancourt, J. E.; HAWC+ Science Collaboration

2013-01-01

39

Large-area proportional counter camera for the US National Small-Angle Neutron Scattering Facility  

SciTech Connect

An engineering model of a multiwire position-sensitive proportional counter (PSPC) was developed, tested, and installed at the US National Small-Angle Neutron Scattering Facility at ORNL. The PSPC is based on the RC-encoding and time-difference decoding method to measure the spatial coordinates of the interaction loci of individual scattered neutrons. The active area of the PSPC is 65 cm x 65 cm, and the active depth is 3.6 cm. The spatial uncertainty in both coordinates is approx. 1.0 cm (FWHM) for thermal neutrons; thus, a matrix of 64 x 64 picture elements is resolved. The count rate capability for randomly detected neutrons is 10^4 counts per second, with < 3% coincidence loss. The PSPC gas composition is 63% ³He, 32% Xe, and 5% CO₂ at an absolute pressure of approx. 3 x 10^5 Pa (3 atm). The detection efficiency is approx. 90% for the 0.475-nm (4.75-Å) neutrons used in the scattering experiments.

Abele, R.K.; Allin, G.W.; Clay, W.T.; Fowler, C.E.; Kopp, M.K.

1980-01-01

40

Angles  

NSDL National Science Digital Library

In this activity, students practice comparing angles when a transversal intersects two parallel lines. This activity allows students to explore the vocabulary used when comparing angles (e.g., alternate, same-side, interior, corresponding, etc.). This activity includes supplemental materials, including background information about the topics covered, a description of how to use the application, and exploration questions for use with the java applet.

2011-03-04

41

Post-trial anatomical frame alignment procedure for comparison of 3D joint angle measurement from magnetic/inertial measurement units and camera-based systems.  

PubMed

Magnetic and inertial measurement units (MIMUs) have been widely used as an alternative to traditional camera-based motion capture systems for 3D joint kinematics measurement. Since these sensors do not directly measure position, a pre-trial anatomical calibration, either with the assistance of a special protocol/apparatus or with another motion capture system, is required to establish the transformation matrices between the local sensor frame and the anatomical frame (AF) of each body segment on which the sensors are attached. Because the axes of AFs are often used as the rotational axes in the joint angle calculation, any difference in the AF determination will cause discrepancies in the calculated joint angles. Therefore, a direct comparison of joint angles between MIMU systems and camera-based systems is less meaningful because the calculated joint angles contain a systematic error due to the differences in the AF determination. To solve this problem, a new post-trial AF alignment procedure is proposed. By correcting the AF misalignments, the joint angle differences caused by the difference in AF determination are eliminated and the remaining discrepancies are mainly from the measurement accuracy of the systems themselves. Lower limb joint angles from 30 walking trials were used to validate the effectiveness of the proposed AF alignment procedure. This technique could serve as a new means for calibrating magnetic/inertial sensor-based motion capture systems and correcting for AF misalignment in scenarios where joint angles are compared directly. PMID:25340557
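The core of such an alignment is estimating the constant rotation between the two systems' anatomical frames and removing it before comparing joint angles. A minimal sketch, assuming both systems report segment-to-global rotation matrices and that the misalignment is a fixed right-multiplied rotation (conventions here are assumptions, not the paper's exact procedure):

```python
import numpy as np

def af_misalignment(R_cam, R_mimu):
    """Estimate the constant AF misalignment from one reference posture.

    If R_mimu = R_cam @ M for a fixed misalignment M between the two
    anatomical-frame definitions, then M = R_cam^T @ R_mimu.
    """
    return R_cam.T @ R_mimu

def apply_alignment(R_mimu_trial, M):
    """Re-express a MIMU AF orientation with the misalignment removed,
    so both systems share the same rotational axes for joint angles."""
    return R_mimu_trial @ M.T  # rotation inverse = transpose
```

With the misalignment removed, any remaining joint-angle discrepancy reflects the systems' own measurement accuracy rather than AF determination differences, which is the point of the post-trial procedure.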

Li, Qingguo; Zhang, Jun-Tian

2014-11-01

42

Angle and polarization independent narrow-band thermal emitter made of metallic disk on SiO2  

NASA Astrophysics Data System (ADS)

It is shown that the metallic disk structure can be used as an efficient narrow-band thermal emitter in the IR region. The absorption spectra of such structure are investigated both theoretically and experimentally. Calculations of thermal radiation properties of the metallic disk show that the metallic disk is a perfect emitter at a specific wavelength, which can be tuned by varying the diameter of the disk. The metallic disk exhibits only one significant localized surface plasmon polariton (LSPP) mode for both TM and TE polarizations simultaneously. The LSPP mode can be tuned by either varying the disk diameter or the spacer (made of SiO2).

Abbas, Mohammed Nadhim; Cheng, Cheng-Wen; Chang, Yia-Chung; Shih, Min-Hsiung; Chen, Hung-Hsin; Lee, Si-Chen

2011-03-01

43

7. VAL CAMERA CAR, DETAIL OF 'FLARE' OR TRAJECTORY CAMERA ...  

Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

7. VAL CAMERA CAR, DETAIL OF 'FLARE' OR TRAJECTORY CAMERA INSIDE CAMERA CAR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

44

6. VAL CAMERA CAR, DETAIL OF COMMUNICATION EQUIPMENT INSIDE CAMERA ...  

Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

6. VAL CAMERA CAR, DETAIL OF COMMUNICATION EQUIPMENT INSIDE CAMERA CAR WITH CAMERA MOUNT IN FOREGROUND. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

45

2. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK ...  

Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

2. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH CAMERA STATION ABOVE LOOKING WEST TAKEN FROM RESERVOIR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

46

Erratum: First Results from the Wide Angle Camera of the ROSETTA Mission [Mem.SAIt Suppl. 6, 28-33 (2005)]  

NASA Astrophysics Data System (ADS)

The authors acknowledge that the paper fails to convey the correct information about the respective contributions and roles of the partners of the OSIRIS consortium. In particular, the hardware contributions to the Wide Angle Camera of the Max-Planck-Institut für Sonnensystemforschung, MPS (Katlenburg-Lindau, Germany, formerly MPAe), of the Instituto de Astrofisica de Andalucia (Granada, Spain), of the Department of Astronomy and Space Physics of Uppsala University (DASP), and of the ESA Research and Scientific Support Department (ESA/RSSD) have either not been mentioned or have been incorrectly expounded. The overall responsibility (PI-ship) of MPS (MPAe) for OSIRIS, and hence for the Wide Angle Camera, is not correctly mentioned either. The correct information is given in the paper by Keller et al. (2006, Space Science Review, in press). The authors take this opportunity to acknowledge that the activity of the Italian team has been partly supported by the Italian Space Agency ASI through a contract to CISAS.

Barbieri, C.; Fornasier, S.; Bertini, I.; Angrilli, F.; Bianchini, G. A.; Debei, S.; de Cecco, M.; Parzianello, G.; Zaccariotto, M.; da Deppo, V.; Naletto, G.

47

The calibration of wide-angle lens cameras using perspective and non-perspective projections in the context of real-time tracking applications  

NASA Astrophysics Data System (ADS)

In most close-range photogrammetry applications, cameras are modelled as imaging systems with perspective projection combined with the lens distortion correction proposed by Brown in 1971. In the 1980s, the calibration of video cameras received considerable attention, requiring compensation for further systematic effects caused by the digitization of the analogue image signal. Modelling the imaging process in that manner has since become the widely applied standard. To take advantage of the increased field of view of individual cameras, the use of wide-angle as well as fisheye lenses became common in computer vision and close-range photogrammetry, again requiring appropriate modelling of the imaging process to ensure high accuracies. A.R.T. provides real-time tracking systems with infra-red cameras, which are in some cases equipped with short-focal-length lenses to increase the field of view and thus the trackable object volume. Unfortunately, the lens distortion of these cameras reaches magnitudes that cannot be sufficiently modelled with the customary Brown model, as the correction calculation breaks down, mainly at high eccentricities such as the image corners. Considerations to avoid modelling these lenses as fisheye projections led to an alternate and rather pragmatic approach in which the distortion model is extended by a fourth radial distortion coefficient. Due to numeric instabilities, a stepwise camera calibration is required to achieve convergence in the bundle adjustment process. This paper presents the modified lens distortion model, describes the stepwise calibration procedure and compares results with the conventional approach. The results are also compared to the approach in which the camera lens is modelled as a fisheye projection. 
The introduction of a fourth radial lens distortion parameter allows the correction of lens distortion effects over the full sensor area of wide angle lenses, which increases the usable field of view of that specific camera and therefore the size of the trackable observed object volume. The approaches with the extended lens distortion model and the fisheye projection were successfully implemented and tested, and are on target to become part of the A.R.T. product range.
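The extension described, a fourth coefficient appended to Brown's radial distortion series, can be sketched as follows. The coefficient naming and normalization are assumptions for illustration, not the exact A.R.T. parameterization:

```python
def radial_distortion(x, y, k):
    """Brown-style radial distortion with a fourth radial coefficient.

    (x, y) are image coordinates relative to the principal point;
    k = (k1, k2, k3, k4).  The radial correction series is
        dr/r = k1*r^2 + k2*r^4 + k3*r^6 + k4*r^8,
    where the k4 term is the extension that keeps the model usable
    at high eccentricities (e.g. the corners of a wide-angle image).
    """
    r2 = x * x + y * y
    factor = k[0] * r2 + k[1] * r2**2 + k[2] * r2**3 + k[3] * r2**4
    return x + x * factor, y + y * factor
```

Because the four radial terms are highly correlated, estimating them all at once is numerically fragile, which is consistent with the stepwise calibration the paper requires for the bundle adjustment to converge.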

Willneff, Jochen; Wenisch, Oliver

2011-07-01

48

Mars Global Surveyor Mars Orbiter Camera Image Gallery  

NSDL National Science Digital Library

This site from Malin Space Science Systems provides access to all of the images acquired by the Mars Orbiter Camera (MOC) during the Mars Global Surveyor mission through March 2005. MOC consists of several cameras: A narrow angle system that provides grayscale high resolution views of the planet's surface (typically, 1.5 to 12 meters/pixel), and red and blue wide angle cameras that provide daily global weather monitoring, context images to determine where the narrow angle views were actually acquired, and regional coverage to monitor variable surface features such as polar frost and wind streaks. Ancillary data for each image is provided and instructions regarding gallery usage are also available on the site.

Malin Space Science Systems

49

3. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK ...  

Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

3. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH THE VAL TO THE RIGHT, LOOKING NORTHEAST. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

50

1. VARIABLE-ANGLE LAUNCHER CAMERA CAR, VIEW OF CAMERA CAR AND ...  

Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

1. VARIABLE-ANGLE LAUNCHER CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH CAMERA STATION ABOVE LOOKING NORTH TAKEN FROM RESERVOIR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

51

A survey of Martian dust devil activity using Mars Global Surveyor Mars Orbiter Camera images  

Microsoft Academic Search

A survey of dust devils using the Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) wide- and narrow-angle (WA and NA) images has been undertaken. The survey comprises two parts: (1) sampling of nine broad regions from September 1997 to July 2001 and (2) a focused seasonal monitoring of variability in the Amazonis region, an active dust devil site, from

Jenny A. Fisher; Mark I. Richardson; Claire E. Newman; Mark A. Szwast; Chelsea Graf; Shabari Basu; Shawn P. Ewald; Anthony D. Toigo; R. John Wilson

2005-01-01

52

Observations of comet 9P/Tempel 1 around the Deep Impact event with the OSIRIS cameras on Rosetta  

Microsoft Academic Search

The scientific imaging system OSIRIS on Rosetta observed comet 9P/Tempel 1 nearly continuously from 5 days before it was hit by the Deep Impact projectile on 4 July 2005 to 10 days after the impact. The narrow-angle camera (NAC) of OSIRIS monitored the evolution of the impact-created dust cloud with a resolution of 1500 km/pixel through

H. U. Keller; M. Küppers; M. Rengel; S. Fornasier; G. Cremonese; P. Gutierrez; W. H. Ip; J. Knollenberg; L. Jorda

2006-01-01

53

Wide Angle Movie  

NASA Technical Reports Server (NTRS)

This brief movie illustrates the passage of the Moon through the Saturn-bound Cassini spacecraft's wide-angle camera field of view as the spacecraft passed by the Moon on the way to its closest approach with Earth on August 17, 1999. From beginning to end of the sequence, 25 wide-angle images (with a spatial image scale of about 14 miles (23 kilometers) per pixel) were taken over the course of 7 and 1/2 minutes through a series of narrow and broadband spectral filters and polarizers, ranging from the violet to the near-infrared regions of the spectrum, to calibrate the spectral response of the wide-angle camera. The exposure times range from 5 milliseconds to 1.5 seconds. Two of the exposures were smeared and have been discarded and replaced with nearby images to make a smooth movie sequence. All images were scaled so that the brightness of Crisium basin, the dark circular region in the upper right, is approximately the same in every image. The imaging data were processed and released by the Cassini Imaging Central Laboratory for Operations (CICLOPS) at the University of Arizona's Lunar and Planetary Laboratory, Tucson, AZ.

Photo Credit: NASA/JPL/Cassini Imaging Team/University of Arizona

Cassini, launched in 1997, is a joint mission of NASA, the European Space Agency and Italian Space Agency. The mission is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Space Science, Washington DC. JPL is a division of the California Institute of Technology, Pasadena, CA.

1999-01-01

54

Optimum camera placement considering camera specification for security monitoring  

Microsoft Academic Search

We present an optimum camera placement algorithm. We are motivated by the fact that the installation of security cameras is increasing rapidly. From the system cost point of view, it is desirable to observe all of the area of interest with the smallest number of cameras. We propose a method for deciding optimum camera placement automatically, considering camera specifications such as visual distance, visual angle, and resolution. Moreover, to reduce the number of cameras, we divide the

Kenichi Yabuta; Hitoshi Kitazawa

2008-01-01
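The coverage objective described in the abstract above (observe all of the area of interest with the fewest cameras) is a set-cover problem, and a common heuristic for it is greedy selection. Below is a minimal sketch, assuming each candidate placement has already been mapped to the set of grid cells it can see given its visual distance, angle, and resolution; all names are hypothetical, not from the paper.

```python
# Greedy set-cover heuristic for minimum-camera placement. Each candidate
# position is pre-mapped to the set of grid cells it can observe given its
# visual distance, angle and resolution; names below are hypothetical.
def greedy_camera_placement(coverage):
    """coverage: dict of candidate id -> set of observable cells.
    Returns a small (not always minimal) covering set of candidates."""
    uncovered = set().union(*coverage.values())
    chosen = []
    while uncovered:
        # Take the candidate covering the most still-uncovered cells.
        best = max(coverage, key=lambda c: len(coverage[c] & uncovered))
        gained = coverage[best] & uncovered
        if not gained:          # remaining cells are unobservable by any camera
            break
        chosen.append(best)
        uncovered -= gained
    return chosen

candidates = {
    "cam_A": {1, 2, 3, 4},
    "cam_B": {3, 4, 5, 6},
    "cam_C": {5, 6, 7, 8},
    "cam_D": {2, 4, 6, 8},
}
print(greedy_camera_placement(candidates))   # ['cam_A', 'cam_C']
```

The greedy heuristic is not guaranteed optimal, but it is within a logarithmic factor of the minimum and is the standard starting point for placement problems of this kind.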

55

Angles, Angles and More Angles!  

NSDL National Science Digital Library

Test Your Angle Knowledge! Angles Telescope Star Gazing Help diget fill up his scrapbook of stars by using his telescope and pointing at each planet during the night! But make sure you hurry before the sun comes up! Shoot The Space Ship Angles Game Try and figure out which angle you need to use to shoot down the aliens' spaceship! ...

Smith, Miss

2011-03-23

56

Camera Obscura  

NSDL National Science Digital Library

Before photography was invented there was the camera obscura, useful for studying the sun, as an aid to artists, and for general entertainment. What is a camera obscura and how does it work? Camera = Latin for room; Obscura = Latin for dark. But what is a Camera Obscura? The Magic Mirror of Life. A French drawing camera with supplies. Drawing camera obscuras with lens at the top. Read the first three paragraphs of this article. Under the portion Early Observations and Use in Astronomy you will find the answers to the ...

Engelman, Mr.

2008-10-28

57

A Survey of Martian Dust Devil Activity Using Mars Global Surveyor Mars Orbiter Camera Images  

Microsoft Academic Search

We present results from an orbital survey of Martian dust devils using the Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) wide- and narrow-angle (WA and NA) images. The survey includes all available imaging data (mapping and pre-mapping orbit), through to mission phase E06. Due to the large volume of data, we have concentrated on surveying limited regions, selected variously

J. Fisher; M. I. Richardson; S. P. Ewald; A. D. Toigo; R. J. Wilson

2002-01-01

58

8. VAL CAMERA CAR, CLOSEUP VIEW OF 'FLARE' OR TRAJECTORY ...  

Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

8. VAL CAMERA CAR, CLOSE-UP VIEW OF 'FLARE' OR TRAJECTORY CAMERA ON SLIDING MOUNT. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

59

Imaging Narrow Angle The Voyager Spacecraft  

E-print Network

three times farther away from Earth and the Sun than is Pluto. The Voyagers are involved in a mission for meeting several science objectives of NASA's Heliophysics System Observatory. The Voyagers are the only

Waliser, Duane E.

60

High-performance IR cameras  

NASA Astrophysics Data System (ADS)

Aerojet has developed high-performance infrared (IR) cameras based on Kodak's 640 × 486-pixel platinum silicide (PtSi) array. Several versions of the camera have been developed for various applications. The cameras have multiple field-of-view (FOV) optics, with the narrow FOV used for long-range observation. The large PtSi array yields very high image quality as well as high resolution. Operation in the medium-wavelength infrared (MWIR) allows observation at very long ranges in humid, warm atmospheres. Field tests have shown the advantages of these cameras, particularly in coastal and marine applications.

Shaham, Yifal J.; Schellhase, R.

1996-06-01

61

The DSLR Camera  

NASA Astrophysics Data System (ADS)

Cameras have developed significantly in the past decade; in particular, digital single-lens reflex (DSLR) cameras have appeared. As a consequence we can buy cameras of higher and higher pixel counts, and mass production has greatly reduced prices. The CMOS sensors used for imaging are increasingly sensitive, and the electronics in the cameras allows images to be taken with much less noise. The software background is developing in a similar way: intelligent programs are created for post-processing and other supplementary work. Nowadays we can find a digital camera in almost every household, and most of these cameras are DSLRs. These can be used very well for astronomical imaging, which is nicely demonstrated by the amount and quality of the spectacular astrophotos appearing in different publications. These examples also show how much post-processing software contributes to the rising standard of the pictures. To sum up, the DSLR camera serves as a cheap alternative to the CCD camera, with somewhat weaker technical characteristics. In the following, I will introduce how we can measure the main parameters (position angle and separation) of double stars, based on the methods, software and equipment I use. Others can easily apply these to their own circumstances.

Berkó, Ernő; Argyle, R. W.
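The two double-star parameters the article sets out to measure, position angle and separation, follow directly from the calibrated pixel offsets between the two stars. A minimal sketch, assuming the image axes are aligned with the sky (+x east, +y north); the plate scale and coordinates below are hypothetical, not the author's.

```python
# Position angle and separation of a double star from calibrated pixel
# coordinates. Axes are assumed aligned with the sky (+x east, +y north);
# the plate scale and coordinates below are hypothetical.
import math

def double_star(primary, secondary, arcsec_per_pixel):
    de = (secondary[0] - primary[0]) * arcsec_per_pixel   # eastward offset
    dn = (secondary[1] - primary[1]) * arcsec_per_pixel   # northward offset
    separation = math.hypot(de, dn)                       # arcseconds
    position_angle = math.degrees(math.atan2(de, dn)) % 360   # N = 0, E = 90
    return separation, position_angle

sep, pa = double_star((512.0, 512.0), (520.0, 518.0), arcsec_per_pixel=0.5)
print(f'separation {sep:.2f}", position angle {pa:.1f} deg')
```

In practice the camera's orientation on the sky must itself be calibrated (for example against a star pair of known position angle) before this mapping is valid.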

62

Camera Animation  

NSDL National Science Digital Library

A general discussion of the use of cameras in computer animation. This section includes principles of traditional film techniques and suggestions for the use of a camera during an architectural walkthrough. This section includes html pages, images and one video.

63

Camera Calibration using the Damped Bundle Adjustment Toolbox  

NASA Astrophysics Data System (ADS)

Camera calibration is one of the fundamental photogrammetric tasks. The standard procedure is to apply an iterative adjustment to measurements of known control points. The iterative adjustment needs initial values of the internal and external parameters. In this paper we investigate a procedure where only one parameter, the focal length, is given a specific initial value. The procedure is validated using the freely available Damped Bundle Adjustment Toolbox on five calibration data sets using varying narrow- and wide-angle lenses. The results show that the Gauss-Newton-Armijo and Levenberg-Marquardt-Powell bundle adjustment methods implemented in the toolbox converge even if the initial values of the focal length are between 1/2 and 32 times the true focal length, and even if the parameters are highly correlated. Standard statistical analysis methods in the toolbox enable manual selection of the lens distortion parameters to estimate, something not available in other camera calibration toolboxes. Based on the convergence results, a standardised camera calibration procedure that does not require any information about the camera sensor or focal length is suggested. The toolbox source and data sets used in this paper are available from the authors.

Börlin, N.; Grussenmeyer, P.

2014-05-01
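The iterative adjustment described above can be illustrated on a toy one-dimensional camera model. This is not the toolbox's implementation, only a hedged sketch of Gauss-Newton refinement of a focal length and one radial distortion term from synthetic control-point measurements, starting from a deliberately poor focal-length guess (1/10 of the true value), in the spirit of the paper's convergence experiments.

```python
# Toy Gauss-Newton refinement of internal camera parameters (focal length f
# and one cubic radial distortion term k1) from synthetic measurements.
# All values are illustrative; the model is u = f * (x + k1 * x**3).
import numpy as np

def project(f, k1, x):
    # 1-D pinhole model with cubic radial distortion
    return f * (x + k1 * x**3)

rng = np.random.default_rng(0)
x = np.linspace(-0.5, 0.5, 21)               # normalized image coordinates
f_true, k1_true = 1000.0, -0.1
u = project(f_true, k1_true, x) + rng.normal(0, 0.05, x.size)  # noisy data

f, k1 = 100.0, 0.0                           # poor initial focal length (f_true / 10)
for _ in range(20):
    r = u - project(f, k1, x)                # residuals
    J = np.column_stack([x + k1 * x**3,      # d(model)/df
                         f * x**3])          # d(model)/dk1
    delta, *_ = np.linalg.lstsq(J, r, rcond=None)
    f, k1 = f + delta[0], k1 + delta[1]

print(round(f, 1), round(k1, 3))   # close to 1000.0 and -0.1
```

A real bundle adjustment additionally estimates exterior orientation and uses damping (Armijo line search or Levenberg-Marquardt) to guarantee descent; the plain Gauss-Newton step above is the undamped core of both methods named in the abstract.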

64

Automatic camera selection for activity monitoring in a multi-camera system for tennis  

Microsoft Academic Search

In professional tennis training matches, the coach needs to be able to view play from the most appropriate angle in order to monitor players' activities. In this paper, we describe and evaluate a system for automatic camera selection from a network of synchronised cameras within a tennis sporting arena. This work combines synchronised video streams from multiple cameras into a

Philip Kelly; C. O. Conaire; Chanyul Kim; Noel E. O'Connor

2009-01-01

65

5. VAL CAMERA CAR, DETAIL OF HOIST AT SIDE OF ...  

Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

5. VAL CAMERA CAR, DETAIL OF HOIST AT SIDE OF BRIDGE AND ENGINE CAR ON TRACKS, LOOKING NORTHEAST. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

66

Electronic Still Camera  

NASA Technical Reports Server (NTRS)

A handheld, programmable, digital camera is disclosed that supports a variety of sensors and has program control over the system components to provide versatility. The camera uses a high performance design which produces near film quality images from an electronic system. The optical system of the camera incorporates a conventional camera body that was slightly modified, thus permitting the use of conventional camera accessories, such as telephoto lenses, wide-angle lenses, auto-focusing circuitry, auto-exposure circuitry, flash units, and the like. An image sensor, such as a charge coupled device ('CCD') collects the photons that pass through the camera aperture when the shutter is opened, and produces an analog electrical signal indicative of the image. The analog image signal is read out of the CCD and is processed by preamplifier circuitry, a correlated double sampler, and a sample and hold circuit before it is converted to a digital signal. The analog-to-digital converter has an accuracy of eight bits to insure accuracy during the conversion. Two types of data ports are included for two different data transfer needs. One data port comprises a general purpose industrial standard port and the other a high speed/high performance application specific port. The system uses removable hard disks as its permanent storage media. The hard disk receives the digital image signal from the memory buffer and correlates the image signal with other sensed parameters, such as longitudinal or other information. When the storage capacity of the hard disk has been filled, the disk can be replaced with a new disk.

Holland, S. Douglas (inventor)

1992-01-01
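The patent abstract above specifies an analog-to-digital converter with eight bits of accuracy, i.e. each analog sample is mapped to one of 256 digital codes. A minimal illustration; the 0-1 V input range and the clamping behaviour are assumptions for the example, not taken from the patent.

```python
# 8-bit quantization of an analog sample, as performed by the camera's ADC.
# The 0-1 V input range and clamping are illustrative assumptions.
def adc_8bit(voltage, v_min=0.0, v_max=1.0):
    """Map an analog voltage onto one of 2**8 = 256 digital codes."""
    voltage = min(max(voltage, v_min), v_max)   # clamp to the input range
    return round((voltage - v_min) / (v_max - v_min) * 255)

print(adc_8bit(0.5))    # 128: mid-scale
print(adc_8bit(1.2))    # 255: clipped at full scale
```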

67

Neutron Guinier camera  

NASA Astrophysics Data System (ADS)

The feasibility of Guinier cameras for small angle neutron scattering (SANS) is analyzed theoretically and experimentally. Small angle X-ray scattering (SAXS) is commonly measured with Guinier cameras that use bent perfect crystals to focus beams from point sources of characteristic X-rays onto the detector. Neutron Guinier cameras do not exist yet, although focusing onto the detector has occasionally been tried. The philosophy of current SANS pinhole instruments is to gain intensity from broad wavelength bands at tight collimation. With characteristic X-rays, intensity gains can only come from broad angular divergences. Neutron focusing instruments represent a return, at a higher level, to the philosophy of characteristic X-rays. Such a return is advocated in this paper for SANS. The resolution of Guinier cameras is defined not by the collimation (which is relaxed), but by the beam size at focus and the spatial resolution of the position sensitive detector (which should match each other). Within the recent concept of neutron imaging, multi-wafer monochromators can provide image sizes comparable to the thickness of one wafer in the bent packet. The imaging may be non-dispersive, at broad wavelength bands, as with mirrors in conventional optics. These are the right ingredients for convergent neutron beams in Guinier cameras. The paper addresses the question of whether the increased angular divergence can compensate for the reduced size of the source that is imaged into a sharp spot at the detector. A neutron Guinier camera at thermal neutron energies is evaluated. It turns out to be quite feasible, providing moderate resolution at high intensity with detection systems in current use for high-resolution neutron diffraction. High-resolution SANS is also possible with detection by image plates or microchannel plate systems. Tests were performed using a single wafer and a packet of bent silicon wafers in both Bragg and Laue (transmission) geometry in non-dispersive imaging arrangements. Experiments have confirmed expectations. SANS data obtained in neutron Guinier camera conditions on samples of collagen and lipids are presented.

Popovici, Mihai; Stoica, Alexandru D.; Worcester, David L.

2002-11-01

68

Infrared Camera  

NASA Technical Reports Server (NTRS)

A sensitive infrared camera that observes the blazing plumes from the Space Shuttle or expendable rocket lift-offs is capable of scanning for fires, monitoring the environment and providing medical imaging. The hand-held camera uses highly sensitive arrays in infrared photodetectors known as quantum well infrared photo detectors (QWIPS). QWIPS were developed by the Jet Propulsion Laboratory's Center for Space Microelectronics Technology in partnership with Amber, a Raytheon company. In October 1996, QWIP detectors pointed out hot spots of the destructive fires speeding through Malibu, California. Night vision, early warning systems, navigation, flight control systems, weather monitoring, security and surveillance are among the duties for which the camera is suited. Medical applications are also expected.

1997-01-01

69

CCD Camera  

DOEpatents

A CCD camera capable of observing a moving object which has varying intensities of radiation eminating therefrom and which may move at varying speeds is shown wherein there is substantially no overlapping of successive images and wherein the exposure times and scan times may be varied independently of each other.

Roth, Roger R. (Minnetonka, MN)

1983-01-01

70

Security Cameras  

NSDL National Science Digital Library

Using the real world example of security cameras, this lesson has students explore properties of polygons. Using this example, students will be able to discover a formula as related to polygons. An activity sheet and student questions are included. This material is intended for students from grades 9-12 and should require 1 class period to complete.

2010-12-16

71

Differing angles on angle  

Microsoft Academic Search

Values of plane angles are expressed with a choice of several units. Historically the quantity needed a unit because it was, and still is, used as a base quantity. ISO\\/TC 12 defines it as a derived, dimensionless quantity, and the International System of Units (SI) gives it the ‘dimensionless unit’ radian, which now means no more than ‘one’. This paper

W H Emerson

2005-01-01

72

Characterization of suspension poly(vinyl chloride) resins and narrow polystyrene standards by size exclusion chromatography with multiple detectors: Online right angle laser-light scattering and differential viscometric detectors  

Microsoft Academic Search

This work reports the utilization of a multi-detector size chromatography for the characterization of poly(vinyl chloride) (PVC) resins prepared by suspension polymerization in the range of temperatures between 21 and 75°C. The chromatography equipment offers the possibility of analyzing the samples in terms of their absolute molecular mass using a combination of three detectors (TriSEC): right angle light scattering (RALLS),

Jorge F. J. Coelho; Pedro M. F. O. Gonçalves; Dionísio Miranda; M. H. Gil

2006-01-01

73

Camera Projector  

NSDL National Science Digital Library

In this activity (posted on March 14, 2011), learners follow the steps to construct a camera projector to explore lenses and refraction. First, learners use relatively simple materials to construct the projector. Then, learners discover that lenses project images upside down and backwards. They explore this phenomenon by creating their own slides (must be drawn upside down and backwards to appear normally). Use this activity to also introduce learners to spherical aberration and chromatic aberration.

2013-07-30

74

Pinhole Cameras: For Science, Art, and Fun!  

ERIC Educational Resources Information Center

A pinhole camera is a camera without a lens. A tiny hole replaces the lens, and light is allowed to come in for a short amount of time by means of a hand-operated shutter. The pinhole allows only a very narrow beam of light to enter, which reduces confusion due to scattered light on the film. This results in an image that is focused, reversed, and…

Button, Clare

2007-01-01
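The focused, reversed image described above follows from similar triangles through the pinhole: the image is inverted, and scaled by the ratio of pinhole-to-film distance to pinhole-to-object distance. A small worked example; all dimensions are hypothetical.

```python
# Similar-triangle geometry of a pinhole image: inverted, and scaled by
# (film distance) / (object distance). All dimensions here are hypothetical.
def pinhole_image_height(object_height, object_distance, film_distance):
    """Negative result encodes the inverted (reversed) image."""
    return -object_height * film_distance / object_distance

# A 1.8 m tall subject 5 m from the pinhole, film plane 0.1 m behind it:
print(round(pinhole_image_height(1.8, 5.0, 0.1), 4))   # -0.036 m: 3.6 cm, upside down
```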

75

Narrow Bandwidth Telecommunications.  

ERIC Educational Resources Information Center

The basic principles of narrow bandwidth telecommunications are treated in a manner understandable to the non-engineer. The comparative characteristics of the various narrow bandwidth communications circuits are examined. Currently available graphics transmission and reception equipment are described and their capabilities and limitations…

Kessler, William J.; Wilhelm, Michael J.

76

Readout electronics of physics of accelerating universe camera  

NASA Astrophysics Data System (ADS)

The Physics of Accelerating Universe Camera (PAUCam) is a new camera for dark energy studies that will be installed in the William Herschel telescope. The main characteristic of the camera is the capacity for high precision photometric redshift measurement. The camera is composed of eighteen Hamamatsu Photonics CCDs providing a wide field of view covering a diameter of one degree. Unlike the common five optical filters of other similar surveys, PAUCam has forty optical narrow band filters which will provide higher resolution in photometric redshifts. In this paper a general description of the electronics of the camera and its status is presented.

de Vicente, Juan; Castilla, Javier; Jiménez, Jorge; Cardiel-Sas, L.; Illa, José M.

2014-08-01

77

Automatic commanding of the Mars Observer Camera  

NASA Technical Reports Server (NTRS)

Mars Observer, launched in September 1992, was intended to be a 'survey-type' mission that acquired global coverage of Mars from a low, circular, near-polar orbit during an entire Martian year. As such, most of its instruments had fixed data rates, wide fields of view, and relatively low resolution, with fairly limited requirements for commanding. An exception is the Mars Observer Camera, or MOC. The MOC consists of a two-color Wide Angle (WA) system that can acquire both global images at low resolution (7.5 km/pixel) and regional images at commandable resolutions up to 250 m/pixel. Complementing the WA is the Narrow Angle (NA) system, which can acquire images at 8 resolutions from 12 m/pixel to 1.5 m/pixel, with a maximum crosstrack dimension of 3 km. The MOC also provides various forms of data compression (both lossless and lossy), and is designed to work at data rates from 700 bits per second (bps) to over 80k bps. Because of this flexibility, developing MOC command sequences is much more difficult than the routine mode-changing that characterizes other instrument operations. Although the MOC cannot be pointed (the spacecraft is fixed nadir-pointing and has no scan platform), the timing, downlink stream allocation, compression type and parameters, and image dimensions of each image must be commanded from the ground, subject to the constraints inherent in the MOC and the spacecraft. To minimize the need for a large operations staff, the entire command generation process has been automated within the MOC Ground Data System. Following the loss of the Mars Observer spacecraft in August 1993, NASA intends to launch a new spacecraft, Mars Global Surveyor (MGS), in late 1996. This spacecraft will carry the MOC flight spare (MOC 2). The MOC 2 operations plan will be largely identical to that developed for MOC, and all of the algorithms described here are applicable to it.

Caplinger, Michael

1994-01-01
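The data rates quoted above (700 bps to over 80k bps) are what make automated commanding of image timing, dimensions, and compression necessary. A back-of-envelope sketch of the downlink arithmetic; the 2048 × 2048 frame size and the 4:1 compression ratio are illustrative assumptions, not MOC's actual values.

```python
# Rough downlink time for one narrow-angle frame at the rates quoted above.
# The 2048 x 2048 frame size and the 4:1 compression are example assumptions.
def downlink_seconds(width_px, height_px, bits_per_px, rate_bps, compression=1.0):
    return width_px * height_px * bits_per_px / compression / rate_bps

frame = downlink_seconds(2048, 2048, 8, 80_000, compression=4.0)
print(f"{frame / 60:.1f} minutes at 80 kbps with 4:1 compression")   # ~1.7 minutes
```

At the 700 bps floor the same frame would take over a hundred times longer, which is why compression parameters and downlink allocation must be commanded per image.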

78

Caught on Camera.  

ERIC Educational Resources Information Center

Describes the benefits of and rules to be followed when using surveillance cameras for school security. Discusses various camera models, including indoor and outdoor fixed position cameras, pan-tilt zoom cameras, and pinhole-lens cameras for covert surveillance. (EV)

Milshtein, Amy

2002-01-01

79

System Synchronizes Recordings from Separated Video Cameras  

NASA Technical Reports Server (NTRS)

A system of electronic hardware and software for synchronizing recordings from multiple, physically separated video cameras is being developed, primarily for use in multiple-look-angle video production. The system, the time code used in the system, and the underlying method of synchronization upon which the design of the system is based are denoted generally by the term "Geo-TimeCode(TradeMark)." The system is embodied mostly in compact, lightweight, portable units (see figure) denoted video time-code units (VTUs) - one VTU for each video camera. The system is scalable in that any number of camera recordings can be synchronized. The estimated retail price per unit would be about $350 (in 2006 dollars). The need for this or another synchronization system external to video cameras arises because most video cameras do not include internal means for maintaining synchronization with other video cameras. Unlike prior video-camera-synchronization systems, this system does not depend on continuous cable or radio links between cameras (however, it does depend on occasional cable links lasting a few seconds). Also, whereas the time codes used in prior video-camera-synchronization systems typically repeat after 24 hours, the time code used in this system does not repeat for slightly more than 136 years; hence, this system is much better suited for long-term deployment of multiple cameras.

Nail, William; Nail, William L.; Nail, Jasper M.; Le, Doung T.

2009-01-01
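The "slightly more than 136 years" figure above is consistent with a time code that counts seconds in a 32-bit field, since 2^32 seconds is just over 136 years. The actual Geo-TimeCode format is not specified in the abstract; the following is only the supporting arithmetic.

```python
# 2**32 seconds expressed in years; the "slightly more than 136 years"
# in the abstract matches a 32-bit seconds counter (format assumption ours).
SECONDS_PER_YEAR = 365.25 * 24 * 3600      # Julian year in seconds
wrap_years = 2**32 / SECONDS_PER_YEAR
print(f"wraps after {wrap_years:.1f} years")   # ~136.1 years
```

By contrast, the SMPTE-style time codes the abstract alludes to wrap every 24 hours, which is why they cannot disambiguate recordings made on different days.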

80

Boresight calibration of the aerial multi-head camera system  

NASA Astrophysics Data System (ADS)

This paper introduces a novel geometric constraint for boresight calibration of aerial multi-head camera systems. Using the precise EOPs (exterior orientation parameters) estimated for each physical camera and the surface information of the area of interest, the multi-head camera provides a synthetic image at each time epoch. The camera EOPs can be computed directly from the navigation solution provided by an onboard GPS/INS system and the camera platform geometric calibration parameters, which represent the geometric relationship between the camera heads. For direct acquisition of EOPs from the navigation system, the camera frame and the INS frame should be precisely aligned. Boresight can be defined as the mounting angles between the INS frame and the camera frame. Small but unknown misalignment angles can cause large errors on the ground, so they should be precisely estimated. In this paper, unknown boresight angles are estimated by using the camera platform geometric calibration parameters as constraints. Since each physical camera of the multi-head camera system is tightly affixed to the platform, the geometry between camera frames remains constant. Simulation results show that the constrained method provides better estimation, in terms of both accuracy and precision, than the traditional approach, which does not use the constraint.

Lee, Young-Jin; Yilmaz, Alper

2011-06-01
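Direct acquisition of camera EOPs, as described above, composes the INS attitude with the boresight mounting angles. The sketch below shows that composition and why even a small misalignment matters at typical flying heights; the rotation conventions and numbers are illustrative, not taken from the paper.

```python
# Composing a camera rotation from the INS attitude and the boresight
# mounting angles (direct georeferencing); conventions and numbers are
# illustrative, not taken from the paper.
import math

def rot_z(a):
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def rot_x(a):
    c, s = math.cos(a), math.sin(a)
    return [[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

R_ins = rot_z(math.radians(30.0))          # INS attitude (yaw only, for brevity)
R_boresight = rot_x(math.radians(0.2))     # small mounting error, unknown in practice
R_camera = matmul(R_ins, R_boresight)      # camera attitude used for direct EOPs

# An uncorrected 0.2 deg boresight error at 1000 m flying height displaces
# ground points by roughly 1000 * tan(0.2 deg):
print(f"{1000 * math.tan(math.radians(0.2)):.1f} m")   # ~3.5 m
```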

81

Pre-hibernation performances of the OSIRIS cameras onboard the Rosetta spacecraft  

NASA Astrophysics Data System (ADS)

Context. The ESA cometary mission Rosetta was launched in 2004. In the following years, and until the spacecraft hibernation in June 2011, the two cameras of the OSIRIS imaging system (the Narrow Angle and Wide Angle Camera, NAC and WAC) observed many different sources. On 20 January 2014 the spacecraft successfully exited hibernation to start observing the primary scientific target of the mission, comet 67P/Churyumov-Gerasimenko. Aims: A study of the past performance of the cameras is now mandatory to determine whether the system has been stable over time and to derive, if necessary, additional analysis methods for the future precise calibration of the cometary data. Methods: The instrumental responses and filter passbands were used to estimate the efficiency of the system. A comparison with acquired images of specific calibration stars was made, and a refined photometric calibration was computed, both for the absolute flux and for the reflectivity of small bodies of the solar system. Results: We found the instrumental performance to be stable within ±1.5% from 2007 to 2010, with no evidence of an aging effect on the optics or detectors. The efficiency of the instrumentation is as expected in the visible range, but lower than expected in the UV and IR ranges. A photometric calibration implementation was discussed for the two cameras. Conclusions: The calibration derived from the pre-hibernation phases of the mission will be checked as soon as possible after the awakening of OSIRIS and will be continuously monitored until the end of the mission in December 2015. A list of additional calibration sources has been determined that are to be observed during the forthcoming phases of the mission to ensure better coverage across the wavelength range of the cameras and to study possible dust contamination of the optics.

Magrin, S.; La Forgia, F.; Da Deppo, V.; Lazzarin, M.; Bertini, I.; Ferri, F.; Pajola, M.; Barbieri, M.; Naletto, G.; Barbieri, C.; Tubiana, C.; Küppers, M.; Fornasier, S.; Jorda, L.; Sierks, H.

2015-02-01

82

Multispectral calibration to enhance the metrology performance of C-mount camera systems  

NASA Astrophysics Data System (ADS)

Low cost monochrome camera systems based on CMOS sensors and C-mount lenses have been successfully applied to a wide variety of metrology tasks. For high accuracy work such cameras are typically equipped with ring lights to image retro-reflective targets as high contrast image features. Whilst algorithms for target image measurement and lens modelling are highly advanced, including separate RGB channel lens distortion correction, target image circularity compensation and a wide variety of detection and centroiding approaches, less effort has been directed towards optimising physical target image quality by considering optical performance in narrow wavelength bands. This paper describes an initial investigation to assess the effect of wavelength on camera calibration parameters for two different camera bodies and the same 'C-mount' wide-angle lens. Results demonstrate the expected strong influence on principal distance, radial and tangential distortion, and also highlight possible trends in principal point, orthogonality and affinity parameters which are close to the parameter estimation noise level from the strong convergent self-calibrating image networks.

Robson, S.; MacDonald, L.; Kyle, S. A.; Shortis, M. R.

2014-06-01

83

Angle Sums  

NSDL National Science Digital Library

"Examine the angles in a triangle, quadrilateral, pentagon, hexagon, heptagon or octagon and find a relationship between the number of sides and the sum of the interior angles." (Source: 2000-2012 National Council of Teachers of Mathematics)

Mathematics, Illuminations N.

2010-05-20
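The relationship the applet leads students toward is that the interior angles of an n-sided polygon sum to (n - 2) * 180 degrees, since any simple polygon can be split into n - 2 triangles:

```python
# Interior angle sum of a simple n-sided polygon: (n - 2) * 180 degrees.
def interior_angle_sum(n_sides):
    if n_sides < 3:
        raise ValueError("a polygon needs at least 3 sides")
    return (n_sides - 2) * 180

for name, n in [("triangle", 3), ("quadrilateral", 4), ("hexagon", 6)]:
    print(name, interior_angle_sum(n))   # 180, 360, 720
```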

84

Angle Practice!  

NSDL National Science Digital Library

How well do you know your angles? Check out these games and put your knowledge to the test! They will stump you if you don't pay close attention to the different angles they give you! Alien Angles! - Use the protractor to guess where the alien has flown away to. If you pick the right spot, you can save all the aliens! Squirt the Dog! Angle practice - Move the hose using different measures of angles to try and squirt the dog. To what degree? - Think you're ready to challenge yourself? Check out ...

Hume, Ms.

2012-11-02

85

Object tracking using multiple camera video streams  

NASA Astrophysics Data System (ADS)

Two synchronized cameras are utilized to obtain independent video streams to detect moving objects from two different viewing angles. The video frames are directly correlated in time. Moving objects in image frames from the two cameras are identified and tagged for tracking. One advantage of such a system is overcoming the effects of occlusions, where an object is in partial or full view in one camera while the same object is fully visible in another camera. Object registration is achieved by determining the location of common features of the moving object across simultaneous frames. Perspective differences are adjusted. Combining information from the images from multiple cameras increases the robustness of the tracking process. Motion tracking is achieved by determining anomalies caused by the objects' movement across frames in time, in each video stream and in the combined video information. The path of each object is determined heuristically. Accuracy of detection depends on the speed of the object as well as variations in the direction of motion. Fast cameras increase accuracy but limit the speed and complexity of the algorithm. Such an imaging system has applications in traffic analysis, surveillance and security, as well as object modeling from multi-view images. The system can easily be expanded by increasing the number of cameras such that there is an overlap between the scenes from at least two cameras in proximity. An object can then be tracked over long distances or across multiple cameras continuously, applicable, for example, in wireless sensor networks for surveillance or navigation.

Mehrubeoglu, Mehrube; Rojas, Diego; McLauchlan, Lifford

2010-05-01
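The motion detection described above, finding anomalies caused by object movement across frames in time, can be reduced to its simplest form: per-pixel frame differencing against a threshold. A pure-Python sketch on toy grayscale frames; a real system would add noise filtering and blob grouping before tagging objects.

```python
# Per-pixel frame differencing, the simplest form of the cross-frame anomaly
# detection described above (toy grayscale frames as nested lists; a real
# system would add noise filtering and blob grouping).
def moving_pixels(prev_frame, frame, threshold=25):
    """Return (row, col) positions whose brightness changed by > threshold."""
    return [(r, c)
            for r, row in enumerate(frame)
            for c, v in enumerate(row)
            if abs(v - prev_frame[r][c]) > threshold]

prev = [[10, 10, 10], [10, 10, 10], [10, 10, 10]]
curr = [[10, 10, 10], [10, 200, 10], [10, 10, 10]]
print(moving_pixels(prev, curr))   # [(1, 1)]
```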

86

Internet Archive: Tacoma Narrows Bridge Collapse  

NSDL National Science Digital Library

This web page contains the original footage of the 1940 collapse of the Tacoma Narrows Bridge, a mile-long suspension bridge in the state of Washington. The collapse occurred due to strong gusting winds which drove the span into large-amplitude oscillatory motion. The footage is 2 minutes, 35 seconds in length. The collapse of the bridge was recorded on film by Barney Elliott, owner of a Tacoma camera shop. In 1998, The Tacoma Narrows Bridge Collapse was selected for preservation in the United States National Film Registry by the Library of Congress as being "culturally, historically, or aesthetically significant." The sound track which accompanies this archived film clip is also an historic item, originally recorded for a 1940's documentary.

2009-03-28

87

Angle performance on optima MDxt  

NASA Astrophysics Data System (ADS)

Angle control on medium current implanters is important due to the high angle-sensitivity of typical medium current implants, such as halo implants. On the Optima MDxt, beam-to-wafer angles are controlled in both the horizontal and vertical directions. In the horizontal direction, the beam angle is measured through six narrow slits, and any angle adjustment is made by electrostatically steering the beam, while cross-wafer beam parallelism is adjusted by changing the focus of the electrostatic parallelizing lens (P-lens). In the vertical direction, the beam angle is measured through a high aspect ratio mask, and any angle adjustment is made by slightly tilting the wafer platen prior to implant. A variety of tests were run to measure the accuracy and repeatability of Optima MDxt's angle control. SIMS profiles of a high energy, channeling sensitive condition show both the cross-wafer angle uniformity, along with the small-angle resolution of the system. Angle repeatability was quantified by running a channeling sensitive implant as a regular monitor over a seven month period and measuring the sheet resistance-to-angle sensitivity. Even though crystal cut error was not controlled for in this case, when attributing all Rs variation to angle changes, the overall angle repeatability was measured as 0.16° (1σ). A separate angle repeatability test involved running a series of V-curves tests over a four month period using low crystal cut wafers selected from the same boule. The results of this test showed the angle repeatability to be <0.1° (1σ).

David, Jonathan; Kamenitsa, Dennis

2012-11-01

88

Angle performance on optima MDxt  

SciTech Connect

Angle control on medium current implanters is important due to the high angle-sensitivity of typical medium current implants, such as halo implants. On the Optima MDxt, beam-to-wafer angles are controlled in both the horizontal and vertical directions. In the horizontal direction, the beam angle is measured through six narrow slits, and any angle adjustment is made by electrostatically steering the beam, while cross-wafer beam parallelism is adjusted by changing the focus of the electrostatic parallelizing lens (P-lens). In the vertical direction, the beam angle is measured through a high aspect ratio mask, and any angle adjustment is made by slightly tilting the wafer platen prior to implant. A variety of tests were run to measure the accuracy and repeatability of Optima MDxt's angle control. SIMS profiles of a high energy, channeling sensitive condition show both the cross-wafer angle uniformity and the small-angle resolution of the system. Angle repeatability was quantified by running a channeling sensitive implant as a regular monitor over a seven month period and measuring the sheet resistance-to-angle sensitivity. Even though crystal cut error was not controlled for in this case, when attributing all Rs variation to angle changes, the overall angle repeatability was measured as 0.16° (1σ). A separate angle repeatability test involved running a series of V-curves tests over a four month period using low crystal cut wafers selected from the same boule. The results of this test showed the angle repeatability to be <0.1° (1σ).

David, Jonathan; Kamenitsa, Dennis [Axcelis Technologies, Inc., 108 Cherry Hill Dr, Beverly, MA 01915 (United States)

2012-11-06

89

Two-Camera Acquisition and Tracking of a Flying Target  

NASA Technical Reports Server (NTRS)

A method and apparatus have been developed to solve the problem of automated acquisition and tracking, from a location on the ground, of a luminous moving target in the sky. The method involves the use of two electronic cameras: (1) a stationary camera having a wide field of view, positioned and oriented to image the entire sky; and (2) a camera that has a much narrower field of view (a few degrees wide) and is mounted on a two-axis gimbal. The wide-field-of-view stationary camera is used to initially identify the target against the background sky. So that the approximate position of the target can be determined, pixel locations on the image-detector plane in the stationary camera are calibrated with respect to azimuth and elevation. The approximate target position is used to initially aim the gimballed narrow-field-of-view camera in the approximate direction of the target. Next, the narrow-field-of-view camera locks onto the target image, and thereafter the gimbals are actuated as needed to maintain lock and thereby track the target with precision greater than that attainable by use of the stationary camera.
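The handoff described above — a calibrated pixel-to-azimuth/elevation mapping in the wide-field camera followed by iterative gimbal pointing — might be sketched like this. The affine calibration and the slew-rate limit are invented for illustration; the real system's calibration is per-pixel and its tracking loop closes on the narrow-field image:

```python
def pixel_to_azel(px, py, cal):
    """Map a pixel in the all-sky camera to approximate azimuth/elevation.

    A simple affine calibration stands in for the real per-pixel calibration
    described in the abstract. cal = (az0, el0, deg_per_px) -- hypothetical.
    """
    az0, el0, deg_per_px = cal
    return az0 + px * deg_per_px, el0 + py * deg_per_px

def point_gimbal(current, target, max_step_deg=5.0):
    """One update of a rate-limited pointing loop: slew each gimbal axis
    toward the target, moving at most max_step_deg per update."""
    def step(cur, tgt):
        return cur + max(-max_step_deg, min(max_step_deg, tgt - cur))
    return step(current[0], target[0]), step(current[1], target[1])

# Target detected at pixel (120, 40); gimbal starts at (0, 0) degrees
tgt = pixel_to_azel(120, 40, cal=(0.0, 0.0, 0.25))
pos = (0.0, 0.0)
for _ in range(6):
    pos = point_gimbal(pos, tgt)
print(pos)  # -> (30.0, 10.0): aimed close enough for the narrow camera to lock on
```

Once the narrow-field camera acquires the target, the same rate-limited loop would be driven by the target's offset from that camera's boresight rather than by the all-sky calibration.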

Biswas, Abhijit; Assad, Christopher; Kovalik, Joseph M.; Pain, Bedabrata; Wrigley, Chris J.; Twiss, Peter

2008-01-01

90

Narrowness and Liberality  

ERIC Educational Resources Information Center

John Agresto, whose task has been to rebuild the war-ravaged infrastructure of a Middle-Eastern university system, is discouraged to see that narrow expertise is the only goal of education there, to the utter exclusion of intellectual breadth. He comments that, although it is not that bad in the U.S., he feels that doctoral programs as currently…

Agresto, John

2003-01-01

91

Determining Camera Gain in Room Temperature Cameras  

SciTech Connect

James R. Janesick provides a method for determining the amplification of a CCD or CMOS camera when only access to the raw images is provided. However, the equation that is provided ignores the contribution of dark current. For CCD or CMOS cameras that are cooled well below room temperature this is not a problem; for room-temperature cameras, however, the technique needs adjustment. This article describes the adjustment made to the equation and a test of this method.
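The kind of adjustment described — removing the dark-current contribution from Janesick's mean-variance (photon transfer) gain estimate — can be sketched as below. This is one common form of the correction, using pairs of flat-field and dark frames; the article's exact equation may differ:

```python
import numpy as np

def camera_gain(flat1, flat2, dark1, dark2):
    """Estimate camera gain (electrons/DN) via the photon-transfer
    mean-variance method, with a dark-frame correction so the dark-current
    contribution cancels (one common form of the adjustment).

    Frame differences are used so fixed-pattern noise cancels; the variance
    of a difference of two like frames is twice the per-frame temporal
    variance, hence the factor of 2.
    """
    flat1, flat2, dark1, dark2 = map(np.asarray, (flat1, flat2, dark1, dark2))
    signal = 0.5 * (flat1 + flat2).mean() - 0.5 * (dark1 + dark2).mean()
    var_flat = np.var(flat1 - flat2) / 2.0
    var_dark = np.var(dark1 - dark2) / 2.0
    return signal / (var_flat - var_dark)

# Synthetic check: simulate a shot-noise-limited camera with true gain 2.0 e-/DN
rng = np.random.default_rng(0)
gain = 2.0
ne_signal, ne_dark = 20000.0, 1000.0   # mean electrons: photon signal, dark current
shape = (512, 512)
def frame(ne):                          # Poisson electrons -> digital numbers (DN)
    return rng.poisson(ne, shape) / gain
est = camera_gain(frame(ne_signal + ne_dark), frame(ne_signal + ne_dark),
                  frame(ne_dark), frame(ne_dark))
print(round(est, 2))                    # close to the true gain of 2.0
```

The algebra behind the sketch: with DN = Ne/K, the dark-corrected mean is Ns/K and the dark-corrected variance is Ns/K², so their ratio recovers K regardless of the dark-current level.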

Joshua Cogliati

2010-12-01

92

Investigating the Pinhole Camera and Camera Obscura  

NSDL National Science Digital Library

In this activity, students explore the nature of light, including the fact that it travels in straight lines, by building and using two visual tools. The first is a simple pinhole camera--a box with a pinhole opening. The second is a camera obscura--a tool

John Eichinger

2009-05-30

93

9. COMPLETED ROLLING CAMERA CAR ON RAILROAD TRACK AND BRIDGE ...  

Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

9. COMPLETED ROLLING CAMERA CAR ON RAILROAD TRACK AND BRIDGE LOOKING WEST, APRIL 26, 1948. (ORIGINAL PHOTOGRAPH IN POSSESSION OF DAVE WILLIS, SAN DIEGO, CALIFORNIA.) - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

94

Project Echo: Boresight Cameras for Recording Antenna Pointing Accuracy  

NASA Technical Reports Server (NTRS)

Motion picture cameras equipped with telephoto lenses were installed on the transmitting and receiving antennas at Holmdel, New Jersey. When the Echo satellite was visible, a camera obtained a photographic record of the pointing accuracy of the antenna. These data were then used to correlate variations of signal strength with deviations in antenna pointing angle.

Warthman, K. L.

1961-01-01

95

Prediction of Viking lander camera image quality  

NASA Technical Reports Server (NTRS)

Formulations are presented that permit prediction of image quality as a function of camera performance, surface radiance properties, and lighting and viewing geometry. Predictions made for a wide range of surface radiance properties reveal that image quality depends strongly on proper camera dynamic range command and on favorable lighting and viewing geometry. Proper camera dynamic range commands depend mostly on the surface albedo that will be encountered. Favorable lighting and viewing geometries depend mostly on lander orientation with respect to the diurnal sun path over the landing site, and tend to be independent of surface albedo and illumination scattering function. Side lighting with low sun elevation angles (10 to 30 deg) is generally favorable for imaging spatial details and slopes, whereas high sun elevation angles are favorable for measuring spectral reflectances.

Huck, F. O.; Burcher, E. E.; Jobson, D. J.; Wall, S. D.

1976-01-01

96

Constrained space camera assembly  

DOEpatents

A constrained space camera assembly is intended to be lowered through a hole into a tank, a borehole or another cavity. The assembly includes a generally cylindrical chamber comprising a head and a body and a wiring-carrying conduit extending from the chamber. Means are included in the chamber for rotating the body about the head without breaking an airtight seal formed therebetween. The assembly may be pressurized and accompanied with a pressure sensing means for sensing if a breach has occurred in the assembly. In one embodiment, two cameras, separated from their respective lenses, are installed on a mounting apparatus disposed in the chamber. The mounting apparatus includes means allowing both longitudinal and lateral movement of the cameras. Moving the cameras longitudinally focuses the cameras, and moving the cameras laterally away from one another effectively converges the cameras so that close objects can be viewed. The assembly further includes means for moving lenses of different magnification forward of the cameras.

Heckendorn, Frank M. (Aiken, SC); Anderson, Erin K. (Augusta, GA); Robinson, Casandra W. (Trenton, SC); Haynes, Harriet B. (Aiken, SC)

1999-01-01

97

Vacuum Camera Cooler  

NASA Technical Reports Server (NTRS)

Acquiring inexpensive moving video in a vacuum environment was impossible due to camera overheating, which is brought on by the lack of cooling media in vacuum. A water-jacketed camera cooler enclosure, machined and assembled from copper plate and tube, has been developed. The camera cooler (see figure) is cup-shaped and cooled by circulating water or nitrogen gas through copper tubing. The camera, a store-bought "spy type," is not designed to work in a vacuum. With some modifications, the unit can be thermally connected when mounted in the cup portion of the camera cooler. The thermal conductivity is provided by copper tape between parts of the camera and the cooled enclosure. During initial testing of the demonstration unit, the camera cooler kept the CPU (central processing unit) of this video camera at operating temperature. This development allowed video recording of an in-progress test within a vacuum environment.

Laugen, Geoffrey A.

2011-01-01

98

Advanced camera for surveys  

Microsoft Academic Search

The Advanced Camera for Surveys (ACS) is a third generation instrument for the Hubble Space Telescope (HST). It is currently planned for installation in HST during the fourth servicing mission in Summer 2001. The ACS will have three cameras.

Mark Clampin; Holland C. Ford; Frank Bartko; Pierre Y. Bely; Tom Broadhurst; Christopher J. Burrows; Edward S. Cheng; James H. Crocker; Marijn Franx; Paul D. Feldman; David A. Golimowski; George F. Hartig; Garth Illingworth; Randy A. Kimble; Michael P. Lesser; George H. Miley; Marc Postman; Marc D. Rafal; Piero Rosati; William B. Sparks; Zlatan Tsvetanov; Richard L. White; Pamela Sullivan; Paul Volmer; Tom LaJeunesse

2000-01-01

99

ISIS Camera Model Design  

NASA Astrophysics Data System (ADS)

The Integrated Software for Imagers and Spectrometers (ISIS) provides camera model support for a variety of past and current NASA missions. Adding new camera models to ISIS has become easier due to its object-oriented design.

Anderson, J. A.

2008-03-01

100

Contrail study with ground-based cameras  

NASA Astrophysics Data System (ADS)

Photogrammetric methods and analysis results for contrails observed with wide-angle cameras are described. Four cameras of two different types (view angle < 90° or whole-sky imager) at the ground at various positions are used to track contrails and to derive their altitude, width, and horizontal speed. Camera models for both types are described to derive the observation angles for given image coordinates and their inverse. The models are calibrated with sightings of the Sun, the Moon and a few bright stars. The methods are applied and tested in a case study. Four persistent contrails crossing each other, together with a short-lived one, are observed with the cameras. Vertical and horizontal positions of the contrails are determined from the camera images to an accuracy of better than 230 m and horizontal speed to 0.2 m s-1. With this information, the aircraft causing the contrails are identified by comparison to traffic waypoint data. The observations are compared with synthetic camera pictures of contrails simulated with the contrail prediction model CoCiP, a Lagrangian model using air traffic movement data and numerical weather prediction (NWP) data as input. The results provide tests for the NWP and contrail models. The cameras show spreading and thickening contrails, suggesting ice-supersaturation in the ambient air. The ice-supersaturated layer is found thicker and more humid in this case than predicted by the NWP model used. The simulated and observed contrail positions agree up to differences caused by uncertain wind data. The contrail widths, which depend on wake vortex spreading, ambient shear and turbulence, were partly wider than simulated.
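The core geometric step behind the altitude retrieval described above — intersecting sight lines from cameras at known ground positions — can be illustrated with standard midpoint triangulation of two rays. This is a simplified stand-in for the paper's calibrated camera models, with made-up camera and contrail positions:

```python
import numpy as np

def triangulate(p1, d1, p2, d2):
    """Closest point between two sight-line rays (camera position p,
    direction d). Standard least-squares/midpoint triangulation."""
    p1, d1, p2, d2 = (np.asarray(v, float) for v in (p1, d1, p2, d2))
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    # Solve [d1, -d2] [t1, t2]^T = p2 - p1 in the least-squares sense
    A = np.stack([d1, -d2], axis=1)
    t = np.linalg.lstsq(A, p2 - p1, rcond=None)[0]
    q1, q2 = p1 + t[0] * d1, p2 + t[1] * d2
    return 0.5 * (q1 + q2)   # midpoint of the shortest connecting segment

# Two hypothetical ground cameras 20 km apart sighting a contrail point
# at 10 km altitude (coordinates in meters; z is height above ground)
target = np.array([5000.0, 3000.0, 10000.0])
c1, c2 = np.array([0.0, 0.0, 0.0]), np.array([20000.0, 0.0, 0.0])
point = triangulate(c1, target - c1, c2, target - c2)
print(point.round(1))
```

In practice the directions come from the calibrated camera models (image coordinates to observation angles), and residual ray miss-distance gives a direct check on the position accuracy quoted in the abstract.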

Schumann, U.; Hempel, R.; Flentje, H.; Garhammer, M.; Graf, K.; Kox, S.; Lösslein, H.; Mayer, B.

2013-12-01

101

Contrail study with ground-based cameras  

NASA Astrophysics Data System (ADS)

Photogrammetric methods and analysis results for contrails observed with wide-angle cameras are described. Four cameras of two different types (view angle < 90° or whole-sky imager) at the ground at various positions are used to track contrails and to derive their altitude, width, and horizontal speed. Camera models for both types are described to derive the observation angles for given image coordinates and their inverse. The models are calibrated with sightings of the Sun, the Moon and a few bright stars. The methods are applied and tested in a case study. Four persistent contrails crossing each other together with a short-lived one are observed with the cameras. Vertical and horizontal positions of the contrails are determined from the camera images to an accuracy of better than 200 m and horizontal speed to 0.2 m s-1. With this information, the aircraft causing the contrails are identified by comparison to traffic waypoint data. The observations are compared with synthetic camera pictures of contrails simulated with the contrail prediction model CoCiP, a Lagrangian model using air traffic movement data and numerical weather prediction (NWP) data as input. The results provide tests for the NWP and contrail models. The cameras show spreading and thickening contrails suggesting ice-supersaturation in the ambient air. The ice-supersaturated layer is found thicker and more humid in this case than predicted by the NWP model used. The simulated and observed contrail positions agree up to differences caused by uncertain wind data. The contrail widths, which depend on wake vortex spreading, ambient shear and turbulence, were partly wider than simulated.

Schumann, U.; Hempel, R.; Flentje, H.; Garhammer, M.; Graf, K.; Kox, S.; Lösslein, H.; Mayer, B.

2013-08-01

102

Classroom multispectral imaging using inexpensive digital cameras.  

NASA Astrophysics Data System (ADS)

The proliferation of increasingly cheap digital cameras in recent years means that it has become easier to exploit the broad wavelength sensitivity of their CCDs (360 - 1100 nm) for classroom-based teaching. With the right tools, it is possible to open children's eyes to the invisible world of UVA and near-IR radiation either side of our narrow visual band. The camera-filter combinations I describe can be used to explore the world of animal vision, looking for invisible markings on flowers, or in bird plumage, for example. In combination with a basic spectroscope (such as the Project-STAR handheld plastic spectrometer, $25), it is possible to investigate the range of human vision and camera sensitivity, and to explore the atomic and molecular absorption lines from the solar and terrestrial atmospheres. My principal use of the cameras has been to teach multispectral imaging of the kind used to determine remotely the composition of planetary surfaces. A range of camera options, from $50 circuit-board mounted CCDs up to $900 semi-pro infrared camera kits (including mobile phones along the way), and various UV-vis-IR filter options will be presented. Examples of multispectral images taken with these systems are used to illustrate the range of classroom topics that can be covered. Particular attention is given to learning about spectral reflectance curves and comparing images from Earth and Mars taken using the same filter combination that is used on the Mars Rovers.
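The spectral-reflectance comparisons mentioned above are typically done with a normalized band ratio between two registered filter images (near-IR vs. red is the classic vegetation example). A minimal sketch, with tiny made-up pixel values:

```python
import numpy as np

def normalized_ratio(band_a, band_b):
    """Classic normalized difference index (a - b) / (a + b) between two
    registered images taken through different filters -- the kind of
    comparison used to highlight vegetation or surface composition.
    Returns 0 where both bands are dark (to avoid division by zero)."""
    a = np.asarray(band_a, float)
    b = np.asarray(band_b, float)
    with np.errstate(divide="ignore", invalid="ignore"):
        idx = (a - b) / (a + b)
    return np.nan_to_num(idx)

# Tiny synthetic example: a vegetation-like pixel (bright IR, dark red)
# next to a soil-like pixel (similar IR and red)
nir = np.array([[200.0, 80.0]])
red = np.array([[40.0, 60.0]])
print(normalized_ratio(nir, red))   # vegetation pixel scores much higher
```

With real class data, `band_a` and `band_b` would be grayscale images from the same camera through two filters; the index image makes the reflectance differences visible at a glance.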

Fortes, A. D.

2007-12-01

103

cameras are watching you  

E-print Network

New surveillance camera software being developed at Ohio State expands the small field of view that traditional pan-tilt-zoom security cameras offer. When the viewspaces of all the security cameras in an area overlap, it can determine the geo…

Davis, James W.

104

Ultra-fast framing camera tube  

DOEpatents

An electronic framing camera tube features focal plane image dissection and synchronized restoration of the dissected electron line images to form two-dimensional framed images. Ultra-fast framing is performed by first streaking a two-dimensional electron image across a narrow slit, thereby dissecting the two-dimensional electron image into sequential electron line images. The dissected electron line images are then restored into a framed image by a restorer deflector operated synchronously with the dissector deflector. The number of framed images on the tube's viewing screen is equal to the number of dissecting slits in the tube. The distinguishing features of this ultra-fast framing camera tube are the focal plane dissecting slits, and the synchronously-operated restorer deflector which restores the dissected electron line images into a two-dimensional framed image. The framing camera tube can produce image frames having high spatial resolution of optical events in the sub-100 picosecond range.

Kalibjian, Ralph (1051 Batavia Ave., Livermore, CA 94550)

1981-01-01

105

Ultrabroadband absorber using a deep metallic grating with narrow slits  

NASA Astrophysics Data System (ADS)

We report an ultrabroadband absorber using a deep metallic grating with narrow slits in the infrared regime. In this absorber, the ultrabroadband electromagnetic wave is perfectly transmitted through the vacuum-grating interface due to the plasmonic Brewster angle effect, and its energy attenuates effectively through the slits because of the enhanced electric field inside the slits. In addition, simulation results show that this ultrabroadband absorber works over a narrow angle range, which is a very important feature of a directional thermal source.

Liao, Yan-Lin; Zhao, Yan

2015-01-01

106

NYC Surveillance Camera Project  

NSDL National Science Digital Library

These two sites focus on the increasing numbers of surveillance cameras in New York City. The first provides a .pdf-formatted map of the more than 2,300 camera locations throughout New York as well as text listings broken down by community. The information was compiled by volunteers from the New York Civil Liberties Union (NYCLU). In addition to information on camera locations, in the news section of the site, users will find links to related Websites, FAQs, and sites related to taxi cameras and traffic cameras. Both of these sites are unabashedly anti-surveillance technology and will be appreciated by New Yorkers concerned with civil liberties issues.

1998-01-01

107

Tower Camera Handbook  

SciTech Connect

The tower camera in Barrow provides hourly images of the ground surrounding the tower. These images may be used to determine fractional snow cover as winter arrives, for comparison with the albedo that can be calculated from downward-looking radiometers, as well as to give some indication of present weather. Similarly, during springtime, the camera images show the changes in the ground albedo as the snow melts. The tower images are saved at hourly intervals. In addition, two other cameras, the skydeck camera in Barrow and the piling camera in Atqasuk, show the current conditions at those sites.
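The fractional snow cover use case above amounts to classifying pixels as snow or not-snow and taking the snow fraction. A crude brightness-threshold sketch (the threshold value is an assumption for illustration, not an ARM-documented setting):

```python
import numpy as np

def snow_fraction(image, threshold=200):
    """Fraction of pixels at or above a brightness threshold -- a crude
    snow-cover estimate of the kind the hourly tower-camera images support.
    `image` is a grayscale array (0-255); the threshold is a made-up value."""
    img = np.asarray(image)
    return float((img >= threshold).mean())

# 2x2 toy image: two snow-bright pixels, two dark ground pixels
print(snow_fraction(np.array([[250, 240], [60, 30]])))  # -> 0.5
```

A real analysis would mask out the sky and tower hardware first and might classify on color rather than raw brightness, but the per-image fraction computed this way is already comparable against radiometer-derived albedo.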

Moudry, D

2005-01-01

108

Ultraviolet spectroscopy of narrow coronal mass ejections  

E-print Network

We present Ultraviolet Coronagraph Spectrometer (UVCS) observations of 5 narrow coronal mass ejections (CMEs) that were among 15 narrow CMEs originally selected by Gilbert et al. (2001). Two events (1999 March 27, April 15) were "structured", i.e. in white light data they exhibited well defined interior features, and three (1999 May 9, May 21, June 3) were "unstructured", i.e. appeared featureless. In UVCS data the events were seen as 4-13 deg wide enhancements of the strongest coronal lines HI Ly-alpha and OVI (1032,1037 A). We derived electron densities for several of the events from the Large Angle Spectrometric Coronagraph (LASCO) C2 white light observations. They are comparable to or smaller than densities inferred for other CMEs. We modeled the observable properties of examples of the structured (1999 April 15) and unstructured (1999 May 9) narrow CMEs at different heights in the corona between 1.5 and 2 R(Sun). The derived electron temperatures, densities and outflow speeds are similar for those two types of ejections. They were compared with properties of polar coronal jets and other CMEs. We discuss different scenarios of narrow CME formation either as a jet formed by reconnection onto open field lines or CME ejected by expansion of closed field structures. Overall, we conclude that the existing observations do not definitively place the narrow CMEs into the jet or the CME picture, but the acceleration of the 1999 April 15 event resembles acceleration seen in many CMEs, rather than constant speeds or deceleration observed in jets.

D. Dobrzycka; J. C. Raymond; D. A. Biesecker; J. Li; A. Ciaravella

2003-01-31

109

LIGHTWEIGHT PEOPLE COUNTING AND LOCALIZING IN INDOOR SPACES USING CAMERA SENSOR NODES  

E-print Network

…of wide-angle camera sensor nodes mounted to the ceiling, facing down. Our design targets the architecture… The system is implemented and tested on a set of iMote2 camera sensor nodes deployed in our lab.

Teixeira, Thiago

110

Narrow band 3 × 3 Mueller polarimetric endoscopy.  

PubMed

Mueller matrix polarimetric imaging has shown potential in tissue diagnosis but is challenging to implement endoscopically. In this work, a narrow band 3 × 3 Mueller matrix polarimetric endoscope was designed by rotating the endoscope to generate 0°, 45° and 90° linearly polarized illumination and positioning a rotating filter wheel in front of the camera containing three polarisers to permit polarization state analysis for backscattered light. The system was validated with a rotating linear polarizer and a diffuse reflection target. Initial measurements of 3 × 3 Mueller matrices on a rat are demonstrated, followed by matrix decomposition into the depolarization and retardance matrices for further analysis. Our work shows the feasibility of implementing polarimetric imaging in a rigid endoscope conveniently and economically in order to reveal diagnostic information. PMID:24298405
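The 3 × 3 reconstruction described above — three linear illumination states crossed with three analyzer states — can be illustrated with the generic textbook inversion below. The endoscope's actual calibration is more involved; this sketch assumes ideal polarizers and uses reduced (linear-only) Stokes vectors:

```python
import numpy as np

def stokes(deg):
    """Reduced (linear-only) Stokes vector for a polarizer at `deg` degrees."""
    t = np.deg2rad(deg)
    return np.array([1.0, np.cos(2 * t), np.sin(2 * t)])

def reconstruct_mueller(I, gen_angles, ana_angles):
    """Recover the 3x3 linear Mueller matrix from a 3x3 grid of intensities
    (rows = analyzer angles, columns = generator angles), assuming ideal
    polarizers: I[i, j] = a_i^T @ M @ s_j with a_i = 0.5 * stokes(ana_i)."""
    S = np.stack([stokes(g) for g in gen_angles], axis=1)        # columns s_j
    A = 0.5 * np.stack([stokes(a) for a in ana_angles], axis=0)  # rows a_i
    return np.linalg.inv(A) @ np.asarray(I, float) @ np.linalg.inv(S)

# Round-trip check with a known diagonal (depolarizer-like) Mueller matrix
M_true = np.diag([1.0, 0.6, 0.4])
angles = [0.0, 45.0, 90.0]                 # the 0/45/90-degree states of the paper
I = np.array([[0.5 * stokes(a) @ M_true @ stokes(g) for g in angles]
              for a in angles])
M_est = reconstruct_mueller(I, angles, angles)
print(np.round(M_est, 6))
```

The recovered matrix can then be fed into a polar decomposition to separate depolarization and retardance, as the abstract describes for the rat-tissue measurements.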

Qi, Ji; Ye, Menglong; Singh, Mohan; Clancy, Neil T; Elson, Daniel S

2013-01-01

111

Narrow band 3 × 3 Mueller polarimetric endoscopy  

PubMed Central

Mueller matrix polarimetric imaging has shown potential in tissue diagnosis but is challenging to implement endoscopically. In this work, a narrow band 3 × 3 Mueller matrix polarimetric endoscope was designed by rotating the endoscope to generate 0°, 45° and 90° linearly polarized illumination and positioning a rotating filter wheel in front of the camera containing three polarisers to permit polarization state analysis for backscattered light. The system was validated with a rotating linear polarizer and a diffuse reflection target. Initial measurements of 3 × 3 Mueller matrices on a rat are demonstrated, followed by matrix decomposition into the depolarization and retardance matrices for further analysis. Our work shows the feasibility of implementing polarimetric imaging in a rigid endoscope conveniently and economically in order to reveal diagnostic information. PMID:24298405

Qi, Ji; Ye, Menglong; Singh, Mohan; Clancy, Neil T.; Elson, Daniel S.

2013-01-01

112

Microchannel plate streak camera  

DOEpatents

An improved streak camera in which a microchannel plate electron multiplier is used in place of or in combination with the photocathode used in prior streak cameras. The improved streak camera is far more sensitive to photons (uv to gamma-rays) than the conventional x-ray streak camera which uses a photocathode. The improved streak camera offers gamma-ray detection with high temporal resolution. It also offers low-energy x-ray detection without attenuation inside the cathode. Using the microchannel plate in the improved camera has resulted in a time resolution of about 150 ps, and has provided a sensitivity sufficient for 1000 keV x-rays.

Wang, C.L.

1984-09-28

113

Microchannel plate streak camera  

DOEpatents

An improved streak camera in which a microchannel plate electron multiplier is used in place of or in combination with the photocathode used in prior streak cameras. The improved streak camera is far more sensitive to photons (UV to gamma-rays) than the conventional x-ray streak camera which uses a photocathode. The improved streak camera offers gamma-ray detection with high temporal resolution. It also offers low-energy x-ray detection without attenuation inside the cathode. Using the microchannel plate in the improved camera has resulted in a time resolution of about 150 ps, and has provided a sensitivity sufficient for 1000 keV x-rays.

Wang, Ching L. (Livermore, CA)

1989-01-01

114

Microchannel plate streak camera  

DOEpatents

An improved streak camera in which a microchannel plate electron multiplier is used in place of or in combination with the photocathode used in prior streak cameras is disclosed. The improved streak camera is far more sensitive to photons (UV to gamma-rays) than the conventional x-ray streak camera which uses a photocathode. The improved streak camera offers gamma-ray detection with high temporal resolution. It also offers low-energy x-ray detection without attenuation inside the cathode. Using the microchannel plate in the improved camera has resulted in a time resolution of about 150 ps, and has provided a sensitivity sufficient for 1,000 keV x-rays. 3 figs.

Wang, C.L.

1989-03-21

115

Analytical multicollimator camera calibration  

USGS Publications Warehouse

Calibration with the U.S. Geological Survey multicollimator determines the calibrated focal length, the point of symmetry, the radial distortion referred to the point of symmetry, and the asymmetric characteristics of the camera lens. For this project, two cameras were calibrated, a Zeiss RMK A 15/23 and a Wild RC 8. Four test exposures were made with each camera. Results are tabulated for each exposure and averaged for each set. Copies of the standard USGS calibration reports are included. © 1978.
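Radial distortion "referred to the point of symmetry" is commonly modeled as an odd polynomial in the radial distance from that point. The sketch below applies such a generic model; the coefficient value and the polynomial form are illustrative assumptions, not the USGS report's tabulation:

```python
def radial_distortion(x, y, sym, k):
    """Apply an odd-polynomial radial distortion about the point of
    symmetry (xs, ys): dr = k1*r^3 + k2*r^5 + ...  Coordinates and
    coefficients are in consistent units (e.g. mm on the image plane).
    Returns the distorted image coordinates."""
    xs, ys = sym
    dx, dy = x - xs, y - ys
    r = (dx * dx + dy * dy) ** 0.5
    if r == 0.0:
        return x, y                       # the symmetry point itself is unmoved
    dr = sum(ki * r ** (2 * i + 3) for i, ki in enumerate(k))
    scale = (r + dr) / r                  # displace radially, keep direction
    return xs + dx * scale, ys + dy * scale

# A point 10 mm from the symmetry point with a hypothetical k1 = 1e-5 mm^-2
print(radial_distortion(10.0, 0.0, (0.0, 0.0), [1e-5]))  # shifts outward by 0.01 mm
```

Calibration reports like the ones described tabulate dr at a series of radial distances; fitting this polynomial to such a table recovers the k coefficients.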

Tayman, W.P.

1978-01-01

116

Polarization encoded color camera.  

PubMed

Digital cameras would be colorblind if they did not have pixelated color filters integrated into their image sensors. Integration of conventional fixed filters, however, comes at the expense of an inability to modify the camera's spectral properties. Instead, we demonstrate a micropolarizer-based camera that can reconfigure its spectral response. Color is encoded into a linear polarization state by a chiral dispersive element and then read out in a single exposure. The polarization encoded color camera is capable of capturing three-color images at wavelengths spanning the visible to the near infrared. PMID:24690806

Schonbrun, Ethan; Möller, Guðfríður; Di Caprio, Giuseppe

2014-03-15

117

Impact of CCD camera SNR on polarimetric accuracy.  

PubMed

A comprehensive charge-coupled device (CCD) camera noise model is employed to study the impact of CCD camera signal-to-noise ratio (SNR) on polarimetric accuracy. The study shows that the standard deviations of the measured degree of linear polarization (DoLP) and angle of linear polarization (AoLP) are mainly dependent on the camera SNR. With increase in the camera SNR, both the measurement errors and the standard deviations caused by the CCD camera noise decrease. When the DoLP of the incident light is smaller than 0.1, the camera SNR should be at least 75 to achieve a measurement error of less than 0.01. When the input DoLP is larger than 0.5, a SNR of 15 is sufficient to achieve the same measurement accuracy. An experiment is carried out to verify the simulation results. PMID:25402986
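The DoLP and AoLP quantities whose error budgets are studied above come from the standard Stokes recipe applied to intensities behind polarizers at 0°, 45°, 90° and 135°. A minimal sketch of that recipe (the paper's contribution is the detailed CCD noise model layered on top of it, which is not reproduced here):

```python
import numpy as np

def dolp_aolp(i0, i45, i90, i135):
    """Degree and angle of linear polarization from four polarizer-filtered
    intensities (0/45/90/135 degrees), via the linear Stokes parameters."""
    s0 = 0.5 * (i0 + i45 + i90 + i135)   # total intensity
    s1 = i0 - i90
    s2 = i45 - i135
    dolp = np.hypot(s1, s2) / s0
    aolp = 0.5 * np.arctan2(s2, s1)      # radians, in (-pi/2, pi/2]
    return dolp, aolp

# Noise-free check: fully polarized light oriented at 30 degrees, S0 = 1
theta = np.deg2rad(30.0)
def intensity(a):                         # Malus's law behind an ideal polarizer
    return 0.5 * (1 + np.cos(2 * (theta - a)))
dolp, aolp = dolp_aolp(intensity(0), intensity(np.pi / 4),
                       intensity(np.pi / 2), intensity(3 * np.pi / 4))
print(round(float(dolp), 3), round(float(np.rad2deg(aolp)), 1))  # -> 1.0 30.0
```

Adding zero-mean noise to the four intensities and repeating the computation many times reproduces the paper's qualitative finding: the scatter of DoLP and AoLP shrinks as the SNR of the individual intensity measurements grows.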

Chen, Zhenyue; Wang, Xia; Pacheco, Shaun; Liang, Rongguang

2014-11-10

118

Human Activity Tracking from Moving Camera Stereo Data  

Microsoft Academic Search

We present a method for tracking human activity using observations from a moving narrow-baseline stereo camera. Range data are computed from the disparity between stereo image pairs. We propose a novel technique for calculating weighting scores from range data given body configuration hypotheses. We use a modified Annealed Particle Filter to recover the optimal tracking candidate from a

John Darby; Baihua Li; Nicholas Costen

2008-01-01

119

Camera Operator and Videographer  

ERIC Educational Resources Information Center

Television, video, and motion picture camera operators produce images that tell a story, inform or entertain an audience, or record an event. They use various cameras to shoot a wide range of material, including television series, news and sporting events, music videos, motion pictures, documentaries, and training sessions. Those who film or…

Moore, Pam

2007-01-01

120

Catadioptric Omnidirectional Camera  

Microsoft Academic Search

Conventional video cameras have limited fields of view that make them restrictive in a variety of vision applications. There are several ways to enhance the field of view of an imaging system. However, the entire imaging system must have a single effective viewpoint to enable the generation of pure perspective images from a sensed image. A new camera with a

Shree K. Nayar

1997-01-01

121

Security camera video authentication  

Microsoft Academic Search

The ability to authenticate images captured by a security camera, and localise any tampered areas, will increase the value of these images as evidence in a court of law. This paper outlines the challenges in security camera video authentication, and discusses the reasons why fingerprinting, a robust type of digital signature, provides a solution preferable to semi-fragile watermarking. A fingerprint

D. K. Roberts

2002-01-01

122

Liquid crystal polarization camera  

Microsoft Academic Search

Presents a fully automated system which unites CCD camera technology with liquid crystal technology to create a polarization camera capable of sensing the polarization of reflected light from objects at pixel resolution. As polarization affords a more general physical description of light than does intensity, it can therefore provide a richer set of descriptive physical constraints for the understanding of

L. B. Wolff; T. A. Mancini

1992-01-01

123

Liquid crystal polarization camera  

Microsoft Academic Search

We present a fully automated system which unites CCD camera technology with liquid crystal technology to create a polarization camera capable of sensing the partial linear polarization of reflected light from objects at pixel resolution. As polarization sensing not only measures intensity but also additional physical parameters of light, it can therefore provide a richer set of descriptive physical constraints

Lawrence B. Wolff; Todd A. Mancini; Philippe Pouliquen; Andreas G. Andreou

1997-01-01

124

Epipolar Geometry of Panoramic Cameras  

Microsoft Academic Search

This paper presents fundamental theory and design of central panoramic cameras. Panoramic cameras combine a convex hyperbolic or parabolic mirror with a perspective camera to obtain a large field of view. We show how to design a panoramic camera with a tractable geometry and we propose a simple calibration method. We derive the image formation function for such a camera. The main contribution of

Tomás Svoboda; Tomás Pajdla; Václav Hlavác

1998-01-01

125

Narrow-band PMD Measurements  

E-print Network

…by natural or induced birefringence in the optical transmission medium, which in turn causes polarization… This product note provides information about making narrow-band polarization measurements on narrow-band devices using the Agilent 8509B Lightwave Polarization Analyzer.

Park, Namkyoo

126

Dry imaging cameras  

PubMed Central

Dry imaging cameras are important hard copy devices in radiology. Using a dry imaging camera, multiformat images of digital modalities in radiology are created from a sealed unit of unexposed films. The functioning of a modern dry camera involves a blend of concurrent processes in areas of diverse sciences like computers, mechanics, thermal physics, optics, electricity and radiography. Broadly, hard copy devices are classified as laser-based and non-laser-based technology. When compared with the working knowledge and technical awareness of different modalities in radiology, the understanding of a dry imaging camera is often superficial and neglected. To fill this void, this article outlines the key features of a modern dry camera and the important issues that impact radiology workflow. PMID:21799589

Indrajit, IK; Alam, Aftab; Sahni, Hirdesh; Bhatia, Mukul; Sahu, Samaresh

2011-01-01

127

MODELING AND COMPENSATION OF GEOMETRIC DISTORTIONS OF MULTISPECTRAL CAMERAS WITH OPTICAL BANDPASS FILTER WHEELS  

Microsoft Academic Search

High-fidelity colour reproduction requires multispectral cameras for image acquisition, which, unlike RGB cameras, divide the visible electromagnetic spectrum into more than 3 channels. This can be achieved by successively placing narrow-band optical filters with different passbands between the object lens and the sensor of a standard b/w camera. The filters are arranged on a filter wheel, the rotation of which moves the

Johannes Brauers; Nils Schulte; Til Aach

128

Seasonal and vertical changes in leaf angle distribution for selected deciduous broadleaf tree species common to Europe  

NASA Astrophysics Data System (ADS)

Leaf inclination angle distribution is a key parameter in determining the transmission and reflection of radiation by vegetation canopies. It has previously been observed that leaf inclination angle may change gradually from more vertical in the upper canopy and in high light habitats to more horizontal in the lower canopy and in low light habitats [1]. Despite its importance, relatively few measurements of actual leaf angle distributions have been reported for different tree species. An even smaller number of studies have dealt with possible seasonal changes in leaf angle distribution [2]. In this study the variation of leaf inclination angle distributions was examined both temporally throughout the growing season and vertically at different heights in the trees. We report leaf inclination angle distributions for five deciduous broadleaf species found commonly in several parts of Europe: grey alder (Alnus incana), silver birch (Betula pendula Roth), chestnut (Castanea), Norway maple (Acer platanoides), and aspen (Populus tremula). The angles were measured using the leveled camera method [3], with data collected at several separate heights and four times during the period May-September 2013. The results generally indicate the greatest change in leaf inclination angles in spring, with the changes usually most pronounced at the top of the canopy. It should be noted, however, that whereas the temporal variation proved rather consistent across species, the vertical variation differed more between species. The leveled camera method was additionally tested for sensitivity to different users. Ten people were asked to measure the leaf angles of four different species. The results indicate the method is quite robust, providing coinciding distributions irrespective of the user and their level of previous experience with the method. However, caution must be exercised when measuring long, narrow leaves. References: [1] G.G. McMillen and J.H. McClendon, "Leaf angle: an adaptive feature of sun and shade leaves," Botanical Gazette, vol. 140, pp. 437-442, 1979. [2] J. Pisek, O. Sonnentag, A.D. Richardson, and M. Mõttus, "Is the spherical leaf inclination angle distribution a valid assumption for temperate and boreal broadleaf tree species?" Agricultural and Forest Meteorology, vol. 169, pp. 186-194, 2013. [3] Y. Ryu, O. Sonnentag, T. Nilson, R. Vargas, H. Kobayashi, R. Wenk, and D. Baldocchi, "How to quantify tree leaf area index in a heterogenous savanna ecosystem: a multi-instrument and multimodel approach," Agricultural and Forest Meteorology, vol. 150, pp. 63-76, 2010.

Raabe, Kairi; Pisek, Jan; Sonnentag, Oliver; Annuk, Kalju

2014-05-01

129

Educational Applications for Digital Cameras.  

ERIC Educational Resources Information Center

Discusses uses of digital cameras in education. Highlights include advantages and disadvantages, digital photography assignments and activities, camera features and operation, applications for digital images, accessory equipment, and comparisons between digital cameras and other digitizers. (AEF)

Cavanaugh, Terence; Cavanaugh, Catherine

1997-01-01

130

Characterization of Narrow Band Filters for Infrared Astronomy: The Brγ and H2 filters  

E-print Network

Characterization of Narrow Band Filters for Infrared Astronomy: the Brγ and H2 filters (L. Vanzi). This work characterizes narrow band filters used in infrared astronomy, mainly quantifying the effects of temperature and tilt angle. Keywords: infrared, narrow band filters, imaging. Abbreviations: IR, infrared; NIR, near infrared.

Testi, Leonardo

131

Copernican craters: Early results from the Lunar Reconnaissance Orbiter Camera  

NASA Astrophysics Data System (ADS)

The youngest (Copernican) craters on the Moon provide the best examples of original crater morphology and a record of the impact flux over the last ~1 Ga in the Earth-Moon system. The LRO Narrow Angle Cameras (NAC) provide 50 cm pixels from an altitude of 50 km. With changing incidence angle, global access, and very high data rates, these cameras provide unprecedented data on lunar craters. Stereo image pairs are being acquired for detailed topographic mapping. These data allow comparisons of relative ages of the larger young craters, some of which are tied to absolute radiometric ages from Apollo-returned samples. These relative ages, the crater populations at small diameters, and details of crater morphology including ejecta and melt morphologies, allow better delineation of recent lunar history and the formation and modification of impact craters. Crater counts may also reveal differences in the formation and preservation of small diameter craters as a function of target material (e.g., unconsolidated regolith versus solid impact melt). One key question: Is the current cratering rate constant or does it fluctuate? We will constrain the very recent cratering rate (at 10-100 m diameter) by comparing LROC images with those taken by Apollo nearly 40 years ago to determine the number of new impact craters. The current cratering rate and an assumption of constant cratering rate over time may or may not correctly predict the number of craters superimposed over radiometrically-dated surfaces such as South Ray, Cone, and North Ray craters, which range from 2-50 Ma and are not saturated by 10-100 m craters. If the prediction fails with realistic consideration of errors, then the present-day cratering rate must be atypical. Secondary craters complicate this analysis, but the resolution and coverage of LROC enables improved recognition of secondary craters. Of particular interest for the youngest Copernican craters is the possibility of self-cratering. 
LROC is providing the image quality needed to classify small craters by state of degradation (i.e., relative age); concentrations of craters with uniform size and age indicate secondary formation. Portion of LROC image M103703826LE showing a sparsely-cratered pond of impact melt on the floor of farside Copernican crater Necho (4.95 S, 123.6 E).

McEwen, A. S.; Hiesinger, H.; Thomas, P. C.; Robinson, M. S.; van der Bogert, C.; Ostrach, L.; Plescia, J. B.; Bray, V. J.; Tornabene, L. L.

2009-12-01

132

Ringfield lithographic camera  

DOEpatents

A projection lithography camera is presented with a wide ringfield optimized so as to make efficient use of extreme ultraviolet radiation from a large area radiation source (e.g., D_source ≈ 0.5 mm). The camera comprises four aspheric mirrors optically arranged on a common axis of symmetry. The camera includes an aperture stop that is accessible through a plurality of partial aperture stops to synthesize the theoretical aperture stop. Radiation from a mask is focused to form a reduced image on a wafer, relative to the mask, by reflection from the four aspheric mirrors. 11 figs.

Sweatt, W.C.

1998-09-08

133

Night Vision Camera  

NASA Technical Reports Server (NTRS)

PixelVision, Inc. developed the Night Video NV652 Back-illuminated CCD Camera, based on the expertise of a former Jet Propulsion Laboratory employee and a former employee of Scientific Imaging Technologies, Inc. The camera operates without an image intensifier, using back-illuminated and thinned CCD technology to achieve extremely low light level imaging performance. The advantages of PixelVision's system over conventional cameras include greater resolution and better target identification under low light conditions, lower cost and a longer lifetime. It is used commercially for research and aviation.

1996-01-01

134

Calibration of action cameras for photogrammetric purposes.  

PubMed

The use of action cameras for photogrammetry purposes is not widespread because, until recently, the images provided by the sensors, in either still or video capture mode, were not large enough to support analysis with the necessary photogrammetric accuracy. However, several manufacturers have recently produced and released new lightweight devices which are: (a) easy to handle, (b) capable of performing under extreme conditions and, more importantly, (c) able to provide both still images and video sequences of high resolution. In order to use the sensor of an action camera, a careful and reliable self-calibration must be applied prior to any photogrammetric procedure, a relatively difficult scenario because of the camera's short focal length and the wide-angle lens used to obtain the maximum possible image resolution. Special software, using functions of the OpenCV library, has been created to perform both the calibration and the production of undistorted scenes for each of the still and video image capturing modes of a novel action camera, the GoPro Hero 3, which can provide still images up to 12 Mp and video up to 8 Mp resolution. PMID:25237898
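A large part of what such a self-calibration estimates for a wide-angle lens is the radial distortion coefficients. A minimal sketch of the Brown radial model that standard calibration routines (including OpenCV's) fit, with illustrative coefficients rather than actual GoPro Hero 3 values:

```python
def distort(x, y, k1, k2):
    """Apply the Brown radial distortion model to normalized image
    coordinates: x_d = x * (1 + k1*r^2 + k2*r^4)."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

def undistort(xd, yd, k1, k2, iterations=20):
    """Invert the model by fixed-point iteration; there is no closed
    form, but the iteration converges quickly for mild distortion."""
    x, y = xd, yd
    for _ in range(iterations):
        r2 = x * x + y * y
        scale = 1.0 + k1 * r2 + k2 * r2 * r2
        x, y = xd / scale, yd / scale
    return x, y
```

Producing an "undistorted scene", as the abstract describes, amounts to applying such an inverse model (plus the calibrated camera matrix) to every pixel.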

Balletti, Caterina; Guerra, Francesco; Tsioukas, Vassilios; Vernier, Paolo

2014-01-01

135

Calibration of Action Cameras for Photogrammetric Purposes  

PubMed Central

The use of action cameras for photogrammetry purposes is not widespread because, until recently, the images provided by the sensors, in either still or video capture mode, were not large enough to support analysis with the necessary photogrammetric accuracy. However, several manufacturers have recently produced and released new lightweight devices which are: (a) easy to handle, (b) capable of performing under extreme conditions and, more importantly, (c) able to provide both still images and video sequences of high resolution. In order to use the sensor of an action camera, a careful and reliable self-calibration must be applied prior to any photogrammetric procedure, a relatively difficult scenario because of the camera's short focal length and the wide-angle lens used to obtain the maximum possible image resolution. Special software, using functions of the OpenCV library, has been created to perform both the calibration and the production of undistorted scenes for each of the still and video image capturing modes of a novel action camera, the GoPro Hero 3, which can provide still images up to 12 Mp and video up to 8 Mp resolution. PMID:25237898

Balletti, Caterina; Guerra, Francesco; Tsioukas, Vassilios; Vernier, Paolo

2014-01-01

136

What are the benefits of having multiple camera angles?  

Atmospheric Science Data Center

... can be interpreted (with appropriate models) to document the properties of the target, just as the more familiar spectral differences are ... or minimize the effect of sun glint over the ocean and other water surfaces, thereby enabling observations even when traditional sensors are ...

2014-12-08

137

Calibration Procedures in Mid Format Camera Setups  

NASA Astrophysics Data System (ADS)

A growing number of mid-format cameras are used for aerial surveying projects. To achieve a reliable and geometrically precise result in the photogrammetric workflow, awareness of the sensitive parts of the setup is important. The use of direct referencing systems (GPS/IMU), the mounting on a stabilizing camera platform and the specific characteristics of the mid-format camera make a professional setup with various calibration and misalignment operations necessary. An important part is a proper camera calibration. Aerial images over a well designed test field with 3D structures and/or different flight altitudes enable the determination of calibration values in the Bingo software; it will be demonstrated how such a calibration can be performed. The direct referencing device must be mounted to the camera in a solid and reliable way. Besides the mechanical work, especially in mounting the camera beside the IMU, two lever arms have to be measured with millimetre accuracy: the lever arm from the GPS antenna to the IMU's calibrated centre, and the lever arm from the IMU centre to the camera projection centre. The measurement with a total station is not a difficult task, but the definition of the right centres and the need for rotation matrices can cause serious accuracy problems. The benefit of small and medium format cameras is that smaller aircraft can also be used; in that case a gyro-based stabilized platform is recommended. As a consequence, the IMU must be mounted beside the camera on the stabilizer. The advantage is that the IMU can be used to control the platform; the drawback is that the IMU-to-GPS-antenna lever arm is floating. An additional data stream, the movement values of the stabilizer, must therefore be processed to correct the floating lever-arm distances. If post-processing of the GPS/IMU data, taking the floating lever arms into account, delivers the expected result, the lever arms between IMU and camera can be applied.
However, there is a misalignment (boresight angle) that must be evaluated by a photogrammetric process using advanced tools, e.g. in Bingo. Once all these parameters have been determined, the system is ready for projects without, or with only a few, ground control points. But what effect does directly applying the achieved direct orientation values have on the photogrammetric process, compared with an aerial triangulation (AT) based on proper tie-point matching? The paper aims to show the steps to be done by potential users and gives a quality estimate of the importance and influence of the various calibration and adjustment steps.
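The lever-arm reduction described above can be sketched in a few lines. This is a simplified illustration under stated assumptions: a yaw-only attitude (a real workflow chains roll, pitch and yaw), lever arms expressed in the body frame, and invented example vectors; function names are hypothetical:

```python
import math

def rot_z(yaw):
    """Body-to-mapping-frame rotation for a yaw-only attitude."""
    c, s = math.cos(yaw), math.sin(yaw)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def apply(r, v):
    """Multiply a 3x3 matrix by a 3-vector."""
    return [sum(r[i][j] * v[j] for j in range(3)) for i in range(3)]

def camera_centre(p_antenna, l_ant_imu, l_imu_cam, yaw):
    """Carry the GPS antenna position along the two measured lever
    arms (antenna->IMU, IMU->camera, both in the body frame) to the
    camera projection centre in the mapping frame."""
    body_offset = [a + b for a, b in zip(l_ant_imu, l_imu_cam)]
    offset = apply(rot_z(yaw), body_offset)
    return [p + o for p, o in zip(p_antenna, offset)]
```

Millimetre errors in either lever arm propagate directly into the projection-centre coordinates, which is why the abstract stresses the accuracy of these measurements.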

Pivnicka, F.; Kemper, G.; Geissler, S.

2012-07-01

138

Pinhole Camera Model  

NSDL National Science Digital Library

The Pinhole Camera Model demonstrates the operation of a pinhole camera. Light rays leaving the top and bottom of an object of height h pass through a pinhole and strike a flat screen. These rays travel in straight lines in accordance with the principles of geometric optics. Drag the object and observe the image on the camera screen. Simple geometry shows that the image is inverted and that the ratio of the image to object size (the magnification) is the same as the ratio of the image to object distance. The Pinhole Camera Model was developed using the Easy Java Simulations (EJS) modeling tool. It is distributed as a ready-to-run (compiled) Java archive. Double clicking the jar file will run the program if Java is installed. You can modify this simulation if you have EJS installed by right-clicking within the map and selecting "Open Ejs Model" from the pop-up menu item.
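The similar-triangles relation the simulation demonstrates can be written directly as code; a minimal sketch (the function name and sign convention are illustrative):

```python
def pinhole_image_height(object_height, object_distance, image_distance):
    """Similar triangles through the pinhole: |h_image| / |h_object|
    equals image_distance / object_distance. The negative sign
    records that the image is inverted."""
    return -object_height * image_distance / object_distance

# An object 2 m tall, 4 m in front of the pinhole, with the screen
# 1 m behind it, forms an inverted image 0.5 m tall:
# pinhole_image_height(2.0, 4.0, 1.0) -> -0.5
```

Moving the object closer (decreasing `object_distance`) grows the image, exactly as dragging the object in the simulation shows.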

Christian, Wolfgang

2012-04-20

139

Thermal Camera Pictures  

NSDL National Science Digital Library

This page includes pictures taken with an infrared camera. Though the pictures include very little explanation, they do demonstrate some properties of light, including reflection, as well as conduction of heat.

Falstad, Paul

2004-11-28

140

Advanced CCD camera developments  

SciTech Connect

Two charge coupled device (CCD) camera systems are introduced and discussed, with a brief description of the hardware involved and the data obtained in their various applications. The Advanced Development Group of the Defense Sciences Engineering Division has been actively designing, manufacturing, and fielding state-of-the-art CCD camera systems for over a decade. These systems were originally developed for the nuclear test program to record data from underground nuclear tests. Today, new and interesting applications for these systems have surfaced, and development is continuing in the area of advanced CCD camera systems, including a new CCD camera that will allow experimenters to replace film for x-ray imaging at the JANUS, USP, and NOVA laser facilities.

Condor, A. [Lawrence Livermore National Lab., CA (United States)

1994-11-15

141

Martian North Polar Cap Recession: 2000 Mars Orbiter Camera Observations  

Microsoft Academic Search

The wide-angle cameras of the Mars Orbiter Camera (MOC) on the Mars Global Surveyor have recorded the 2000 recession of the seasonal CO2 cap in the north polar region from LS=330° to 90°. In stark contrast to the asymmetric behavior of the south seasonal cap, the seasonal north cap remains relatively circular and uniform until mid-spring when the retreat reaches

Philip B James; Bruce A Cantor

2001-01-01

142

Divergent-ray projection method for measuring the flapping angle, lag angle, and torsional angle of a bumblebee wing  

NASA Astrophysics Data System (ADS)

A divergent-ray projection (DRP) method was developed for measuring the flapping angle, lag angle, and torsional angle of a bumblebee wing during beating motion. This new method can measure the spatial coordinates of an insect wing by digitizing the images that are projected by two divergent laser rays from different directions. The advantage of the DRP method is its ability to measure those three angles simultaneously using only one high-speed camera. The resolution of the DRP method can be changed easily by adjusting system parameters to meet the needs of different types of objects. The measurement results for these angles of a bumblebee wing prove the effectiveness of the DRP method in studying the flight performance of insects.

Zeng, Lijiang; Matsumoto, Hirokazu; Kawachi, Keiji

1996-11-01

143

Triangle Geometry: Angles  

NSDL National Science Digital Library

This interactive math site teaches students about angles and triangles. There are interactive activities for measuring angles, exploring types of angles, and adding angles. By using a Java applet and pictures, a proof of the Pythagorean Theorem is demonstrated.

Math Cove

2007-12-12

144

Image dissector camera system study  

NASA Technical Reports Server (NTRS)

Various aspects of a rendezvous and docking system using an image dissector detector, as compared to a GaAs detector, are discussed. An investigation into a gimbaled scanning system is also covered, and the measured video response curves from the image dissector camera are presented. Rendezvous will occur at ranges greater than 100 meters. The maximum range considered was 1000 meters. During docking, the range, range-rate, angle, and angle-rate to each reflector on the satellite must be measured. Docking range will be from 3 to 100 meters. The system consists of a CW laser diode transmitter and an image dissector receiver. The transmitter beam is amplitude modulated with three sine wave tones for ranging. The beam is coaxially combined with the receiver beam. Mechanical deflection of the transmitter beam, ±10 degrees in both X and Y, can be accomplished before or after it is combined with the receiver beam. The receiver will have a field-of-view (FOV) of 20 degrees and an instantaneous field-of-view (IFOV) of two milliradians (mrad) and will be electronically scanned in the image dissector. The increase in performance obtained from the GaAs photocathode is not needed to meet the present performance requirements.
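Ranging with amplitude-modulated sine-wave tones, as described above, recovers range from the phase shift each tone accumulates over the round trip. A hedged sketch of the basic single-tone relation (how the actual system combines its three tones to resolve ambiguity is not shown):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def tone_range(phase_shift, tone_freq_hz):
    """Range from the measured phase shift of one AM ranging tone.
    The round trip covers 2R, so R = (c / (2 f)) * (phase / 2 pi);
    a single tone is unambiguous only out to c / (2 f)."""
    return (C / (2.0 * tone_freq_hz)) * (phase_shift / (2.0 * math.pi))

# A 10 MHz tone has an unambiguous range of about 15 m, so a
# half-cycle (pi) phase shift corresponds to roughly 7.5 m.
```

Using several tones of different frequencies, as the system does, lets a coarse low-frequency tone resolve the cycle ambiguity of a fine high-frequency one.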

Howell, L.

1984-01-01

145

Deployable Wireless Camera Penetrators  

NASA Technical Reports Server (NTRS)

A lightweight, low-power camera dart has been designed and tested for context imaging of sampling sites and ground surveys from an aerobot or an orbiting spacecraft in a microgravity environment. The camera penetrators also can be used to image any line-of-sight surface, such as cliff walls, that is difficult to access. Tethered cameras to inspect the surfaces of planetary bodies use both power and signal transmission lines to operate. A tether adds the possibility of inadvertently anchoring the aerobot, and requires some form of station-keeping capability of the aerobot if extended examination time is required. The new camera penetrators are deployed without a tether, weigh less than 30 grams, and are disposable. They are designed to drop from any altitude with the boost in transmitting power currently demonstrated at approximately 100-m line-of-sight. The penetrators also can be deployed to monitor lander or rover operations from a distance, and can be used for surface surveys or for context information gathering from a touch-and-go sampling site. Thanks to wireless operation, the complexity of the sampling or survey mechanisms may be reduced. The penetrators may be battery powered for short-duration missions, or have solar panels for longer or intermittent duration missions. The imaging device is embedded in the penetrator, which is dropped or projected at the surface of a study site at 90° to the surface. Mirrors can be used in the design to image the ground or the horizon. Some of the camera features were tested using commercial "nanny" or "spy" camera components with the charge-coupled device (CCD) looking in a direction parallel to the ground. Figure 1 shows components of one camera that weighs less than 8 g and occupies a volume of 11 cm³. This camera could transmit a standard television signal, including sound, up to 100 m. Figure 2 shows the CAD models of a version of the penetrator. 
A low-volume array of such penetrator cameras could be deployed from an aerobot or a spacecraft onto a comet or asteroid. A system of 20 of these penetrators could be designed and built in a 1- to 2-kg mass envelope. Possible future modifications of the camera penetrators, such as the addition of a chemical spray device, would allow the study of simple chemical reactions of reagents sprayed at the landing site and looking at the color changes. Zoom lenses also could be added for future use.

Badescu, Mircea; Jones, Jack; Sherrit, Stewart; Wu, Jiunn Jeng

2008-01-01

146

Theoretical description of functionality, applications, and limitations of SO2 cameras for the remote sensing of volcanic plumes  

NASA Astrophysics Data System (ADS)

The SO2 camera is a novel device for the remote sensing of volcanic emissions using solar radiation scattered in the atmosphere as a light source for the measurements. The method is based on measuring the ultra-violet absorption of SO2 in a narrow wavelength window around 310 nm by employing a band-pass interference filter and a 2 dimensional UV-sensitive CCD detector. The effect of aerosol scattering can in part be compensated by additionally measuring the incident radiation around 325 nm, where the absorption of SO2 is about 30 times weaker, thus rendering the method applicable to optically thin plumes. For plumes with high aerosol optical densities, collocation of an additional moderate resolution spectrometer is desirable to enable a correction of radiative transfer effects. The ability to deliver spatially resolved images of volcanic SO2 distributions at a frame rate on the order of 1 Hz makes the SO2 camera a very promising technique for volcanic monitoring and for studying the dynamics of volcanic plumes in the atmosphere. This study gives a theoretical basis for the pertinent aspects of working with SO2 camera systems, including the measurement principle, instrument design, data evaluation and technical applicability. Several issues are identified that influence camera calibration and performance. For one, changes in the solar zenith angle lead to a variable light path length in the stratospheric ozone layer and therefore change the spectral distribution of scattered solar radiation incident at the Earth's surface. The varying spectral illumination causes a shift in the calibration of the SO2 camera's results. Secondly, the lack of spectral resolution inherent in the measurement technique leads to a non-linear relationship between measured weighted average optical density and the SO2 column density. 
Thirdly, as is the case with all remote sensing techniques that use scattered solar radiation as a light source, the radiative transfer between the sun and the instrument is variable, with both "radiative dilution" and multiple scattering occurring. These effects can lead to both over- and underestimation of the SO2 column density by more than an order of magnitude. As the accurate assessment of volcanic emissions depends on our ability to correct for these issues, recommendations for correcting the individual effects during data analysis are given. Aside from the above-mentioned intrinsic effects, the particular technical design of the SO2 camera can also greatly influence its performance, depending on the setup chosen. A general description of an instrument setup is given, and the advantages and disadvantages of certain specific instrument designs are discussed. Finally, several measurement examples are shown and possibilities to combine SO2 camera measurements with other remote sensing techniques are explored.
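The two-band aerosol compensation described above amounts to differencing the apparent absorbances measured through the 310 nm and 325 nm filters. A minimal numerical sketch with synthetic intensities (not instrument data; function names are illustrative):

```python
import math

def apparent_absorbance(i_plume, i_background):
    """tau = -ln(I_plume / I_background) for one filter band."""
    return -math.log(i_plume / i_background)

def so2_signal(i310, i0_310, i325, i0_325):
    """Two-band correction: aerosol extinction is roughly equal in
    both bands, while SO2 absorbs about 30 times more strongly at
    310 nm, so the difference isolates (most of) the SO2 term."""
    return (apparent_absorbance(i310, i0_310)
            - apparent_absorbance(i325, i0_325))

# A purely aerosol plume dims both bands by the same factor, so the
# corrected signal is ~0, while SO2 dims mainly the 310 nm band.
```

The remaining step, converting this differential optical density to an SO2 column density, requires the calibration the abstract discusses, precisely because the camera's broad filter passbands make that relationship non-linear.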

Kern, C.; Kick, F.; Lübcke, P.; Vogel, L.; Wöhrbach, M.; Platt, U.

2010-06-01

147

Theoretical description of functionality, applications, and limitations of SO2 cameras for the remote sensing of volcanic plumes  

NASA Astrophysics Data System (ADS)

The SO2 camera is a novel technique for the remote sensing of volcanic emissions using solar radiation scattered in the atmosphere as a light source for the measurements. The method is based on measuring the ultra-violet absorption of SO2 in a narrow wavelength window around 310 nm by employing a band-pass interference filter and a 2-D UV-sensitive CCD detector. The effect of aerosol scattering can be eliminated by additionally measuring the incident radiation around 325 nm where the absorption of SO2 is no longer significant, thus rendering the method applicable to optically opaque plumes. The ability to deliver spatially resolved images of volcanic SO2 distributions at a frame rate on the order of 1 Hz makes the SO2 camera a very promising technique for volcanic monitoring and for studying the dynamics of volcanic plumes in the atmosphere. This study gives a theoretical basis for the pertinent aspects of working with SO2 camera systems, including the measurement principle, instrument design, data evaluation and technical applicability. Several issues are identified that influence camera calibration and performance. For one, changes in the solar zenith angle lead to a variable light path length in the stratospheric ozone layer and therefore change the spectral distribution of scattered solar radiation incident at the Earth's surface. The thus varying spectral illumination causes a shift in the calibration of the SO2 camera's results. Secondly, the lack of spectral resolution inherent in the measurement technique leads to a non-linear relationship between measured weighted average optical density and the SO2 column density. In addition, as is the case with all remote sensing techniques that use scattered solar radiation as a light source, the radiative transfer between the sun and the instrument is variable, with both radiative dilution as well as multiple scattering occurring. 
These effects can lead to both over- and underestimation of the SO2 column density by more than an order of magnitude. As the accurate assessment of volcanic emissions depends on our ability to correct for these issues, recommendations for correcting the individual effects during data analysis are given. Aside from the above-mentioned intrinsic effects, the particular technical design of the SO2 camera can also greatly influence its performance, depending on the chosen setup. A general description of the instrument setup is given, and the advantages and disadvantages of certain specific instrument designs are discussed. Finally, several measurement examples are shown and possibilities to combine SO2 camera measurements with other remote sensing techniques are explored.

Kern, C.; Kick, F.; Lübcke, P.; Vogel, L.; Wöhrbach, M.; Platt, U.

2010-02-01

148

Lightweight, Compact, Long Range Camera Design  

NASA Astrophysics Data System (ADS)

The model 700 camera is the latest in a 30-year series of LOROP cameras developed by McDonnell Douglas Astronautics Company (MDAC) and their predecessor companies. The design achieves minimum size and weight and is optimized for low-contrast performance. The optical system includes a 66-inch focal length, f/5.6, apochromatic lens and three folding mirrors imaging on a 4.5-inch square format. A three-axis active stabilization system provides the capability for long exposure time and, hence, fine grain films can be used. The optical path forms a figure "4" behind the lens. In front of the lens is a 45° pointing mirror. This folded configuration contributed greatly to the lightweight and compact design. This sequential autocycle frame camera has three modes of operation with one, two, and three step positions to provide a choice of swath widths within the range of lateral coverage. The magazine/shutter assembly rotates in relationship with the pointing mirror and aircraft drift angle to maintain film format alignment with the flight path. The entire camera is angular rate stabilized in roll, pitch, and yaw. It also employs a lightweight, electro-magnetically damped, low-natural-frequency spring suspension for passive isolation from aircraft vibration inputs. The combined film transport and forward motion compensation (FMC) mechanism, which is operated by a single motor, is contained in a magazine that can, depending on accessibility which is installation dependent, be changed in flight. The design also stresses thermal control, focus control, structural stiffness, and maintainability. The camera is operated from a remote control panel. This paper describes the leading particulars and features of the camera as related to weight and configuration.

Shafer, Donald V.

1983-08-01

149

MODELING DISTORTION OF SUPER-WIDE-ANGLE LENSES FOR ARCHITECTURAL AND ARCHAEOLOGICAL APPLICATIONS  

Microsoft Academic Search

Mapping of architectural and archaeological objects often encounters limitations in imaging distances (interiors, narrow streets, excavations), usually tackled via large numbers of images or special camera platforms. This, however, seriously contradicts the benefits of simple, low-cost photogrammetric procedures. In these cases, furthermore, the use of digital cameras with the currently limited area of sensitive sensors may also be impracticable. In

G. E. Karras; G. Mountrakis; P. Patias; E. Petsa

1998-01-01

150

Early Experience & Multisensory Perceptual Narrowing  

PubMed Central

Perceptual narrowing is a reflection of early experience and contributes in key ways to perceptual and cognitive development. In general, findings have shown that unisensory perceptual sensitivity in early infancy is broadly tuned such that young infants respond to, and discriminate, native as well as non-native sensory inputs, whereas older infants only respond to native inputs. Recently, my colleagues and I discovered that perceptual narrowing occurs at the multisensory processing level as well. The present article reviews this new evidence and puts it in the larger context of multisensory perceptual development and the role that perceptual experience plays in it. Together, the evidence on unisensory and multisensory narrowing shows that early experience shapes the emergence of perceptual specialization and expertise. PMID:24435505

Lewkowicz, David J.

2014-01-01

151

Solid state television camera  

NASA Technical Reports Server (NTRS)

The design, fabrication, and tests of a solid state television camera using a new charge-coupled imaging device are reported. An RCA charge-coupled device arranged in a 512 by 320 format and directly compatible with EIA format standards was the sensor selected. This is a three-phase, sealed surface-channel array that has 163,840 sensor elements, which employs a vertical frame transfer system for image readout. Included are test results of the complete camera system, circuit description and changes to such circuits as a result of integration and test, maintenance and operation section, recommendations to improve the camera system, and a complete set of electrical and mechanical drawing sketches.

1976-01-01

152

Limits on neutrino oscillations in the Fermilab narrow band beam  

SciTech Connect

A search for neutrino oscillations was made using the Fermilab narrow-band neutrino beam and the 15 ft. bubble chamber. No positive signal for neutrino oscillations was observed. Limits were obtained on mixing angles and neutrino mass differences for νμ → νe, νμ → ντ, and νe → νe. 5 refs.

Brucker, E.B.; Jacques, P.F.; Kalelkar, M.; Koller, E.L.; Plano, R.J.; Stamer, P.E.; Baker, N.J.; Connolly, P.L.; Kahn, S.A.; Murtagh, M.J.

1986-01-01

153

Artificial human vision camera  

NASA Astrophysics Data System (ADS)

In this paper we present a real-time vision system modeling the human vision system. Our purpose is to draw inspiration from human vision bio-mechanics to improve robotic capabilities for tasks such as object detection and tracking. This work first describes the bio-mechanical discrepancies between human vision and classic cameras, and the retinal processing stage that takes place in the eye before the optic nerve. The second part describes our implementation of these principles in a 3-camera optical, mechanical and software model of the human eyes and an associated bio-inspired attention model.

Goudou, J.-F.; Maggio, S.; Fagno, M.

2014-10-01

154

The Beagle 2 stereo camera system  

NASA Astrophysics Data System (ADS)

The stereo camera system (SCS) was designed to provide wide-angle multi-spectral stereo imaging of the Beagle 2 landing site. Based on the Space-X micro-cameras, the primary objective was to construct a digital elevation model of the area in reach of the lander's robot arm. The SCS technical specifications and scientific objectives are described; these included panoramic 3-colour imaging to characterise the landing site; multi-spectral imaging to study the mineralogy of rocks and soils beyond the reach of the arm and solar observations to measure water vapour absorption and the atmospheric dust optical density. Also envisaged were stellar observations to determine the lander location and orientation, multi-spectral observations of Phobos & Deimos and observations of the landing site to monitor temporal changes.

Griffiths, A. D.; Coates, A. J.; Josset, J.-L.; Paar, G.; Hofmann, B.; Pullan, D.; Rüffer, P.; Sims, M. R.; Pillinger, C. T.

2005-12-01

155

Image Sensors Enhance Camera Technologies  

NASA Technical Reports Server (NTRS)

In the 1990s, a Jet Propulsion Laboratory team led by Eric Fossum researched ways of improving complementary metal-oxide semiconductor (CMOS) image sensors in order to miniaturize cameras on spacecraft while maintaining scientific image quality. Fossum s team founded a company to commercialize the resulting CMOS active pixel sensor. Now called the Aptina Imaging Corporation, based in San Jose, California, the company has shipped over 1 billion sensors for use in applications such as digital cameras, camera phones, Web cameras, and automotive cameras. Today, one of every three cell phone cameras on the planet feature Aptina s sensor technology.

2010-01-01

156

Do Speed Cameras Reduce Collisions?  

PubMed Central

We investigated the effects of speed cameras along a 26 mile segment in metropolitan Phoenix, Arizona. Motor vehicle collisions were retrospectively identified according to three time periods – before cameras were placed, while cameras were in place and after cameras were removed. A 14 mile segment in the same area without cameras was used for control purposes. Five confounding variables were eliminated. In this study, the placement or removal of interstate highway speed cameras did not independently affect the incidence of motor vehicle collisions. PMID:24406979

Skubic, Jeffrey; Johnson, Steven B.; Salvino, Chris; Vanhoy, Steven; Hu, Chengcheng

2013-01-01

157

Photogrammetric camera calibration  

USGS Publications Warehouse

Section 2 (Calibration) of the document "Recommended Procedures for Calibrating Photogrammetric Cameras and Related Optical Tests" from the International Archives of Photogrammetry, Vol. XIII, Part 4, is reviewed in the light of recent practical work, and suggestions for changes are made. These suggestions are intended as a basis for a further discussion. ?? 1984.

Tayman, W.P.; Ziemann, H.

1984-01-01

158

Face Fixing Cameras  

E-print Network

Broadcast Transcript: Beauty is in the eye of the beholder--or on the screen of a digital camera. We are our own worst critics, so few people are satisfied with the way they appear in photographs. A Japanese company recently capitalized on this fact...

Boyd, David

2011-06-22

159

Stereo from uncalibrated cameras  

Microsoft Academic Search

The problem of computing placement of points in 3-D space, given two uncalibrated perspective views, is considered. The main theorem shows that the placement of the points is determined only up to an arbitrary projective transformation of 3-space. Given additional ground control points, however, the location of the points and the camera parameters may be determined. The method is linear

Richard Hartley; Rajiv Gupta; Tom Chang

1992-01-01

160

Secure Digital Camera  

Microsoft Academic Search

In this paper, we propose a biometric solution to solve some of the significant problems associated with use of digital camera images as evidence in a court of law. We present a lossless watermarking solution to the problems associated with digital image integrity and the relationship to its chain of custody. The integrity of digital images as evidence rests on

Paul Blythe; Jessica Fridrich

161

Communities, Cameras, and Conservation  

ERIC Educational Resources Information Center

Communities, Cameras, and Conservation (CCC) is the most exciting and valuable program the author has seen in her 30 years of teaching field science courses. In this citizen science project, students and community volunteers collect data on mountain lions ("Puma concolor") at four natural areas and public parks along the Front Range of Colorado.…

Patterson, Barbara

2012-01-01

162

Scanning type scintillation camera  

SciTech Connect

The effective area of observation of a scanning-type scintillation camera is expanded relative to the actual area scanned by shifting the position of a spatial window back and forth in the direction of scanning so that the sum of the window velocity and the velocity of actual scanning represents a predetermined scanning velocity.

Nagasawa, Y.

1981-06-16

163

The LSST Camera System  

NASA Astrophysics Data System (ADS)

The LSST camera provides a 3.2 Gigapixel focal plane array, tiled by 189 4Kx4K CCD science sensors with 10 μm pixels. This pixel count is a direct consequence of sampling the 9.6 deg^2 field-of-view (0.64 m diameter) with 0.2 arcsec pixels (Nyquist sampling in the best expected seeing of 0.4 arcsec). The sensors are deep depleted, back-illuminated devices with a highly segmented architecture that enables the entire array to be read in 2 seconds. The detectors are grouped into 3x3 rafts, each containing its own dedicated front-end and back-end electronics boards. The rafts are mounted on a silicon carbide grid inside a vacuum cryostat with an intricate thermal control system; the entrance window of the cryostat is the third of the three refractive lenses in the camera. The other two lenses are mounted in an optics structure at the front of the camera body, which also contains a mechanical shutter, and a carousel assembly that holds five large optical filters (ugrizy). A sixth optical filter will also be fabricated and can replace any of the others via procedures accomplished during daylight hours. This poster will illustrate the current mechanical design of the camera, FEA and thermal analysis of the cryostat, and an overview of the data acquisition system and the performance characteristics of the filters.
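The pixel figures quoted above can be cross-checked with a few lines of arithmetic (a sketch, not LSST software; the 189-sensor, 4K x 4K, 9.6 deg^2 and 0.2 arcsec numbers come from the abstract itself):

```python
# Consistency check: sensor pixel count vs. pixels needed to sample the field.
sensor_pixels = 189 * 4096 * 4096       # 189 CCDs of 4096 x 4096 pixels
fov_arcsec2 = 9.6 * 3600**2             # 9.6 square degrees in square arcseconds
fov_pixels = fov_arcsec2 / 0.2**2       # pixels needed at 0.2 arcsec sampling

print(sensor_pixels)      # 3170893824, i.e. ~3.2 Gigapixels
print(round(fov_pixels))  # 3110400000 -- the array slightly overfills the field
```

The two numbers agree to within a few percent, consistent with the abstract's claim that the pixel count follows directly from the field-of-view and pixel scale.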

Gilmore, D. Kirk; Kahn, S.; Fouts, K.; LSST Camera Team

2009-01-01

164

Camera data sheet for pictorial electronic still cameras  

Microsoft Academic Search

A data sheet is presented outlining the performance and characteristics of a Kodak DCS 200mi camera. In addition to providing information on this camera, the format and content of the data sheet could serve as a guide in the organization and display of pertinent information on electronic still cameras in general. Such data sheets are already common in silver halide

Sabine Susstrunk; Jack M. Holm

1995-01-01

165

Estimation of Relative Camera Positions for Uncalibrated Cameras  

Microsoft Academic Search

This paper considers the determination of internal camera parameters from two views of a point set in three dimensions. A non-iterative algorithm is given for determining the focal lengths of the two cameras, as well as their relative placement, assuming all other internal camera parameters to be known. It is shown that this is all the information that

Richard I. Hartley

1992-01-01

166

Observation of planetary motion using a digital camera  

Microsoft Academic Search

A digital SLR camera with a standard lens (50 mm focal length, f/1.4) on a fixed tripod is used to obtain photographs of the sky which contain stars up to 8m apparent magnitude. The angle of view is large enough to ensure visual identification of the photograph with a large sky region in a stellar map. The resolution is sufficient

Jan-Peter Meyn

2008-01-01

167

3D Head Reconstruction Using Multi-camera Stream  

Microsoft Academic Search

Given information from many cameras, one can hope to get a complete 3D representation of an object. Pintavirooj and Sangworasil exploit this idea and present a system that sequentially records images from multiple view points to reconstruct a 3D shape of a static object of interest [1]. For instance, using a 60° angle of view on the image, they manage

Donghoon Kim; Rozenn Dahyot

2009-01-01

168

Allowing camera tilts for document navigation in the standard GUI: a discussion and an experiment  

Microsoft Academic Search

The current GUI is like a flight simulator whose camera points fixedly at a right angle to the document, thus preventing users from looking ahead while navigating. We argue that perspective viewing of usual planar documents can help navigation. We analyze the scale implosion problem that arises with tilted cameras and we report the data of a formal experiment on document

Yves Guiard; Olivier Chapuis; Yangzhou Du; Michel Beaudouin-lafon

2006-01-01

169

Accidental Pinhole and Pinspeck Cameras  

E-print Network

We identify and study two types of “accidental” images that can be formed in scenes. The first is an accidental pinhole camera image. The second class of accidental images are “inverse” pinhole camera images, formed by ...

Torralba, Antonio

170

Automated Camera Array Fine Calibration  

NASA Technical Reports Server (NTRS)

Using aerial imagery, the JPL FineCalibration (JPL FineCal) software automatically tunes a set of existing CAHVOR camera models for an array of cameras. The software finds matching features in the overlap region between images from adjacent cameras, and uses these features to refine the camera models. It is not necessary to take special imagery of a known target and no surveying is required. JPL FineCal was developed for use with an aerial, persistent surveillance platform.

Clouse, Daniel; Padgett, Curtis; Ansar, Adnan; Cheng, Yang

2008-01-01

171

Streak camera receiver definition study  

NASA Technical Reports Server (NTRS)

Detailed streak camera definition studies were made as a first step toward full flight qualification of a dual channel picosecond resolution streak camera receiver for the Geoscience Laser Altimeter and Ranging System (GLRS). The streak camera receiver requirements are discussed as they pertain specifically to the GLRS system, and estimates of the characteristics of the streak camera are given, based upon existing and near-term technological capabilities. Important problem areas are highlighted, and possible corresponding solutions are discussed.

Johnson, C. B.; Hunkler, L. T., Sr.; Letzring, S. A.; Jaanimagi, P.

1990-01-01

172

Visual Feedback Stabilization of Balancing Tasks with Camera Misalignment  

NASA Astrophysics Data System (ADS)

In this paper, we consider visual feedback stabilization which tolerates small camera misalignment. Specifically, a balancing task with a cart-pendulum system using camera image is examined. Such a task is known to rely heavily on the detection of the vertical direction and the angle measurement error due to the camera misalignment could be fatal for stabilization. From a mathematical model of the measurement error, the effect of the misalignment is naturally represented by affine perturbation to the coefficient matrix of the output equation. Motivated by this fact, a special type of robust dynamic output feedback stabilization against polytopic uncertainty is investigated. By solving the related BMI, one can design a controller which tolerates the camera misalignment to some extent. The result is verified via experiments.

Hirata, Kentaro; Mizuno, Takashi

173

Camera Trajectory fromWide Baseline Images  

NASA Astrophysics Data System (ADS)

Camera trajectory estimation, which is closely related to the structure from motion computation, is one of the fundamental tasks in computer vision. Reliable camera trajectory estimation plays an important role in 3D reconstruction, self localization, and object recognition. There are essential issues for a reliable camera trajectory estimation, for instance, the choice of the camera and its geometric projection model, camera calibration, image feature detection and description, and robust 3D structure computation. Most approaches rely on classical perspective cameras because of the simplicity of their projection models and the ease of their calibration. However, classical perspective cameras offer only a limited field of view, and thus occlusions and sharp camera turns may cause consecutive frames to look completely different when the baseline becomes longer. This makes image feature matching very difficult (or impossible), and camera trajectory estimation fails under such conditions. These problems can be avoided if omnidirectional cameras, e.g. a fish-eye lens converter, are used. The hardware which we are using in practice is a combination of a Nikon FC-E9 mounted via a mechanical adaptor onto a Kyocera Finecam M410R digital camera. The Nikon FC-E9 is a megapixel omnidirectional add-on converter with a 180° view angle which provides images of photographic quality. The Kyocera Finecam M410R delivers 2272×1704 images at 3 frames per second. The resulting combination yields a circular view of diameter 1600 pixels in the image. Since consecutive frames of the omnidirectional camera often share a common region in 3D space, image feature matching is often feasible. On the other hand, the calibration of these cameras is non-trivial and is crucial for the accuracy of the resulting 3D reconstruction. We calibrate omnidirectional cameras off-line using the state-of-the-art technique and Mičušík's two-parameter model, which links the radius of the image point r to the angle θ of its corresponding ray w.r.t. the optical axis as θ = ar/(1 + br²). After a successful calibration, we know the correspondence of the image points to the 3D optical rays in the coordinate system of the camera. The following steps aim at finding the transformation between the camera and the world coordinate systems, i.e. the pose of the camera in the 3D world, using 2D image matches. For computing 3D structure, we construct a set of tentative matches by detecting different affine covariant feature regions, including MSER, Harris Affine, and Hessian Affine, in the acquired images. These features are an alternative to popular SIFT features and work comparably in our situation. Parameters of the detectors are chosen to limit the number of regions to 1-2 thousand per image. The detected regions are assigned local affine frames (LAF) and transformed into standard positions w.r.t. their LAFs. Discrete Cosine Descriptors are computed for each region in the standard position. Finally, mutual distances of all regions in one image and all regions in the other image are computed as the Euclidean distances of their descriptors, and tentative matches are constructed by selecting the mutually closest pairs. As opposed to methods using short baseline images, simpler image features which are not affine covariant cannot be used, because the view point can change a lot between consecutive frames. Furthermore, feature matching has to be performed on the whole frame because no assumptions on the proximity of the consecutive projections can be made for wide baseline images. This makes the feature detection, description, and matching much more time-consuming than for short baseline images and limits the usage to low frame rate sequences when operating in real-time. Robust 3D structure can be computed by RANSAC, which searches for the largest subset of the set of tentative matches which is, within a predefined threshold ε, consistent with an epipolar geometry.
We use ordered sampling as suggested in to draw 5-tuples from the list of tentative matches ordered ascendin
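The two-parameter radial model described above can be sketched in a few lines of Python. This is an illustrative back-projection only: the calibration values a and b and the pixel coordinates are hypothetical, not the paper's, and a real implementation would also model the distortion center estimation.

```python
import math

# Hypothetical calibration constants for the model theta = a*r / (1 + b*r^2),
# which maps an image point's radius r (from the distortion center) to the
# angle theta between its optical ray and the optical axis.
a, b = 1.2e-3, 1.0e-8

def pixel_to_ray(u, v, cu, cv):
    """Back-project pixel (u, v) to a unit 3D ray, given distortion center (cu, cv)."""
    du, dv = u - cu, v - cv
    r = math.hypot(du, dv)
    if r == 0.0:
        return (0.0, 0.0, 1.0)          # on the optical axis
    theta = a * r / (1.0 + b * r * r)   # angle w.r.t. the optical axis
    s = math.sin(theta) / r             # scale the radial offset onto the unit sphere
    return (s * du, s * dv, math.cos(theta))

ray = pixel_to_ray(1400.0, 852.0, 1136.0, 852.0)
# the ray is unit length and tilted away from the axis for off-center pixels
assert abs(sum(c * c for c in ray) - 1.0) < 1e-12
```

After this step, every image point corresponds to a known 3D ray in the camera frame, which is exactly the prerequisite for the pose and epipolar-geometry computations the abstract goes on to describe.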

Havlena, M.; Torii, A.; Pajdla, T.

2008-09-01

174

Smart Cameras as Embedded Systems  

Microsoft Academic Search

Recent technological advances are enabling a new generation of smart cameras that represent a quantum leap in sophistication. While today's digital cameras capture images, smart cameras capture high-level descriptions of the scene and analyze what they see. These devices could support a wide variety of applications including human and animal detection, surveillance, motion analysis, and facial identification. Video processing has

Wayne Wolf; Burak Ozer; Lv Tiehan

2002-01-01

175

A very narrow spectral band  

Microsoft Academic Search

The power spectrum of the variable z in the Lorenz equations with sigma=10, b=8/3, and r=200 appeared in previous work to contain a line superposed on a continuum. Analysis of an extended solution into consecutive segments with rather similar initial states, each segment spanning at least seven maxima of z, shows that the apparent line is actually a narrow band.

Edward N. Lorenz

1984-01-01

176

A very narrow spectral band  

Microsoft Academic Search

The power spectrum of the variable z in the Lorenz equations with sigma=10, b=8/3, and r=200 appeared in previous work to contain a line superposed on a continuum. Analysis of an extended solution into consecutive segments with rather similar initial states, each segment spanning at least seven maxima of z, shows that the apparent line is actually a narrow band. Solutions spanning at least

Edward N. Lorenz

1984-01-01

177

NSTX Tangential Divertor Camera  

SciTech Connect

Strong magnetic field shear around the divertor x-point is numerically predicted to lead to strong spatial asymmetries in turbulence driven particle fluxes. To visualize the turbulence and associated impurity line emission near the lower x-point region, a new tangential observation port has been recently installed on NSTX. A reentrant sapphire window with a moveable in-vessel mirror images the divertor region from the center stack out to R 80 cm and views the x-point for most plasma configurations. A coherent fiber optic bundle transmits the image through a remotely selected filter to a fast camera, for example a 40500 frames/sec Photron CCD camera. A gas puffer located in the lower inboard divertor will localize the turbulence in the region near the x-point. Edge fluid and turbulent codes UEDGE and BOUT will be used to interpret impurity and deuterium emission fluctuation measurements in the divertor.

A.L. Roquemore; Ted Biewer; D. Johnson; S.J. Zweben; Nobuhiro Nishino; V.A. Soukhanovskii

2004-07-16

178

Combustion pinhole camera system  

DOEpatents

A pinhole camera system utilizing a sealed optical-purge assembly which provides optical access into a coal combustor or other energy conversion reactors. The camera system basically consists of a focused-purge pinhole optical port assembly, a conventional TV vidicon receiver, an external, variable density light filter which is coupled electronically to the vidicon automatic gain control (agc). The key component of this system is the focused-purge pinhole optical port assembly which utilizes a purging inert gas to keep debris from entering the port and a lens arrangement which transfers the pinhole to the outside of the port assembly. One additional feature of the port assembly is that it is not flush with the interior of the combustor.

Witte, Arvel B. (Rolling Hills, CA)

1984-02-21

179

The Dark Energy Camera  

NASA Astrophysics Data System (ADS)

The DES Collaboration has completed construction of the Dark Energy Camera (DECam), a 3 square degree, 570 Megapixel CCD camera which is now mounted at the prime focus of the Blanco 4-meter telescope at the Cerro Tololo Inter-American Observatory. DECam is comprised of 74 250-micron-thick fully depleted CCDs: 62 2k x 4k CCDs for imaging and 12 2k x 2k CCDs for guiding and focus. The camera also includes a filter set of u, g, r, i, z, and Y; a hexapod for focus and lateral alignment; and thermal management of the cage temperature. DECam will be used to perform the Dark Energy Survey with 30% of the telescope time over a 5 year period. During the remainder of the time, and after the survey, DECam will be available as a community instrument. An overview of the DECam design, construction and initial on-sky performance information will be presented.
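The quoted pixel budget can be verified directly from the CCD counts in the abstract (a one-off arithmetic sketch, not DES software):

```python
# 62 imaging CCDs of 2k x 4k plus 12 guide/focus CCDs of 2k x 2k,
# checked against the 570 Megapixel figure quoted for DECam.
imaging = 62 * 2048 * 4096
guide_focus = 12 * 2048 * 2048
total = imaging + guide_focus
print(total)  # 570425344, i.e. ~570 Megapixels
```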

Flaugher, Brenna; DES Collaboration

2013-01-01

180

Combustion pinhole camera system  

DOEpatents

A pinhole camera system is described utilizing a sealed optical-purge assembly which provides optical access into a coal combustor or other energy conversion reactors. The camera system basically consists of a focused-purge pinhole optical port assembly, a conventional TV vidicon receiver, an external, variable density light filter which is coupled electronically to the vidicon automatic gain control (agc). The key component of this system is the focused-purge pinhole optical port assembly which utilizes a purging inert gas to keep debris from entering the port and a lens arrangement which transfers the pinhole to the outside of the port assembly. One additional feature of the port assembly is that it is not flush with the interior of the combustor. 2 figs.

Witte, A.B.

1984-02-21

181

Mobile Phones Digital Cameras  

E-print Network


Suslick, Kenneth S.

182

Hemispherical Laue camera  

DOEpatents

A hemispherical Laue camera comprises a crystal sample mount for positioning a sample to be analyzed at the center of the sphere of a hemispherical, X-radiation sensitive film cassette, a collimator, a stationary or rotating sample mount and a set of standard spherical projection spheres. X-radiation generated from an external source is directed through the collimator to impinge onto the single crystal sample on the stationary mount. The diffracted beam is recorded on the hemispherical X-radiation sensitive film mounted inside the hemispherical film cassette in either transmission or back-reflection geometry. The distances travelled by X-radiation diffracted from the crystal to the hemispherical film are the same for all crystal planes which satisfy Bragg's Law. The recorded diffraction spots or Laue spots on the film thereby preserve both the symmetry information of the crystal structure and the relative intensities which are directly related to the relative structure factors of the crystal orientations. The diffraction pattern on the exposed film is compared with the known diffraction pattern on one of the standard spherical projection spheres for a specific crystal structure to determine the orientation of the crystal sample. By replacing the stationary sample support with a rotating sample mount, the hemispherical Laue camera can be used for crystal structure determination in a manner previously provided in conventional Debye-Scherrer cameras.

Li, James C. M. (Pittsford, NY); Chu, Sungnee G. (Rochester, NY)

1980-01-01

183

Gamma ray camera  

DOEpatents

A gamma ray camera for detecting rays emanating from a radiation source such as an isotope. The gamma ray camera includes a sensor array formed of a visible light crystal for converting incident gamma rays to a plurality of corresponding visible light photons, and a photosensor array responsive to the visible light photons in order to form an electronic image of the radiation therefrom. The photosensor array is adapted to record an integrated amount of charge proportional to the incident gamma rays closest to it, and includes a transparent metallic layer and a photodiode consisting of a p-i-n structure formed on one side of the transparent metallic layer, comprising an upper p-type layer, an intermediate layer and a lower n-type layer. In the preferred mode, the scintillator crystal is composed essentially of a cesium iodide (CsI) crystal, preferably doped with a predetermined amount of impurity, and the upper p-type layer, the intermediate layer, and the lower n-type layer are essentially composed of hydrogenated amorphous silicon (a-Si:H). The gamma ray camera further includes a collimator interposed between the radiation source and the sensor array, and a readout circuit formed on one side of the photosensor array.

Perez-Mendez, Victor (Berkeley, CA)

1997-01-01

184

Optical performance analysis of plenoptic camera systems  

NASA Astrophysics Data System (ADS)

Adding an array of microlenses in front of the sensor transforms the capabilities of a conventional camera to capture both spatial and angular information within a single shot. This plenoptic camera is capable of obtaining depth information and providing it for a multitude of applications, e.g. artificial re-focusing of photographs. Without the need of active illumination it represents a compact and fast optical 3D acquisition technique with reduced effort in system alignment. Since the extent of the aperture limits the range of detected angles, the observed parallax is reduced compared to common stereo imaging systems, which results in a decreased depth resolution. Besides, the gain of angular information implies a degraded spatial resolution. This trade-off requires a careful choice of the optical system parameters. We present a comprehensive assessment of possible degrees of freedom in the design of plenoptic systems. Utilizing a custom-built simulation tool, the optical performance is quantified with respect to particular starting conditions. Furthermore, a plenoptic camera prototype is demonstrated in order to verify the predicted optical characteristics.

Langguth, Christin; Oberdörster, Alexander; Brückner, Andreas; Wippermann, Frank; Bräuer, Andreas

2014-09-01

185

Star identification independence on the camera parameters  

NASA Astrophysics Data System (ADS)

Star identification is the key problem of satellite attitude determination from a star sensor. Since the star angular distance is directly used as the matching feature in traditional star identification, the precision and success rate of traditional methods are highly dependent on the calibrated accuracy of the star camera parameters. In this paper, an improved star identification algorithm is presented. The algorithm uses the interior angles of a triangle composed of observed stars as the matching feature. The triangles formed by the navigation stars and by the observed stars are homothetic, so the interior angles of the triangle are independent of the focal length f. Thus the method does not depend on the camera parameters, and no prior position information is needed. A Monte Carlo experiment shows that the probability of failed star identification is less than 6.63%. Generally, the time of the star identification process is restricted to 30 ms. In addition, the method works well even with a 50% error in f. Compared with the traditional algorithm, this algorithm has advantages in identification success rate and reliability.
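The homothety argument above — changing the focal length scales all image coordinates by the same factor, which leaves interior angles unchanged — can be illustrated with a small sketch. The coordinates and focal lengths here are hypothetical, chosen only to demonstrate the invariance:

```python
import math

def interior_angles(p1, p2, p3):
    """Interior angles (radians) of the triangle p1-p2-p3 in the image plane."""
    def ang(a, b, c):
        # angle at vertex a between rays a->b and a->c
        v1 = (b[0] - a[0], b[1] - a[1])
        v2 = (c[0] - a[0], c[1] - a[1])
        dot = v1[0] * v2[0] + v1[1] * v2[1]
        return math.acos(dot / (math.hypot(*v1) * math.hypot(*v2)))
    return (ang(p1, p2, p3), ang(p2, p1, p3), ang(p3, p1, p2))

# Pinhole projection maps a star direction (x, y) to image point (f*x, f*y),
# so two focal lengths give homothetic triangles with identical interior angles.
stars = [(0.01, 0.02), (-0.03, 0.01), (0.02, -0.04)]  # hypothetical directions
f1, f2 = 50.0, 75.0                                    # two assumed focal lengths
tri1 = [(f1 * x, f1 * y) for x, y in stars]
tri2 = [(f2 * x, f2 * y) for x, y in stars]
a1, a2 = interior_angles(*tri1), interior_angles(*tri2)
assert all(abs(u - v) < 1e-9 for u, v in zip(a1, a2))
```

This is why the interior angles survive a focal-length error that would corrupt angular-distance matching: the angular distances scale with f, while the triangle's interior angles do not.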

Su, Dezhi; Chen, Dong; Zhou, Mingyu; Wang, Kun

2014-09-01

186

Angled Layers in Super Resolution  

NASA Technical Reports Server (NTRS)

Researchers used a special imaging technique with the panoramic camera on NASA's Mars Exploration Rover Opportunity to get as detailed a look as possible at a target region near the eastern foot of 'Burns Cliff.' The intervening terrain was too difficult for driving the rover closer. The target is the boundary between two sections of layered rock. The layers in the lower section (left) run at a marked angle to the layers in the next higher section (right).

This view is the product of a technique called super resolution. It was generated from data acquired on sol 288 of Opportunity's mission (Nov. 14, 2004) from a position along the southeast wall of 'Endurance Crater.' Resolution slightly higher than normal for the panoramic camera was synthesized for this view by combining 17 separate images of this scene, each one 'dithered' or pointed slightly differently from the previous one. Computer manipulation of the individual images was then used to generate a new synthetic view of the scene in a process known mathematically as iterative deconvolution, but referred to informally as super resolution. Similar methods have been used to enhance the resolution of images from the Mars Pathfinder mission and the Hubble Space Telescope.
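A drastically simplified cousin of the dither-and-combine idea described above is shift-and-add: each low-resolution frame samples the scene at a known subpixel offset, so placing every frame's samples onto a finer grid and averaging populates pixels no single frame could resolve. The rover team's actual pipeline used iterative deconvolution; this toy function (hypothetical, with assumed integer-aligned dithers) only illustrates how dithered frames fill a finer grid:

```python
import numpy as np

def shift_and_add(frames, offsets, factor):
    """Deposit each low-res frame onto a grid `factor`x finer, shifted by its
    known subpixel dither (dy, dx) in low-res pixel units, then average."""
    h, w = frames[0].shape
    acc = np.zeros((h * factor, w * factor))
    cnt = np.zeros_like(acc)
    for img, (dy, dx) in zip(frames, offsets):
        # nearest hi-res bin for each low-res sample, shifted by the dither
        ys = (np.arange(h)[:, None] * factor + round(dy * factor)) % (h * factor)
        xs = (np.arange(w)[None, :] * factor + round(dx * factor)) % (w * factor)
        acc[ys, xs] += img
        cnt[ys, xs] += 1
    return np.where(cnt > 0, acc / np.maximum(cnt, 1), 0.0)

# Two dithered 4x4 frames at 2x upsampling land on disjoint hi-res bins.
frames = [np.ones((4, 4)), np.ones((4, 4))]
hi = shift_and_add(frames, [(0.0, 0.0), (0.5, 0.5)], 2)
```

With 17 dithered pointings, as in the Opportunity observation, most bins of the finer grid receive at least one sample, which is what makes the subsequent deconvolution well posed.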

2004-01-01

187

Adaptive compressive sensing camera  

NASA Astrophysics Data System (ADS)

We have embedded an Adaptive Compressive Sensing (ACS) algorithm in a Charge-Coupled-Device (CCD) camera, based on the simple observation that each pixel is a charge bucket whose charge comes from the Einstein photoelectric conversion effect. Applying the manufacturing design principle, we only allow altering each working component by a minimum of one step. We then simulated what such a camera could do for real-world persistent surveillance, taking into account diurnal, all-weather, and seasonal variations. The data storage savings are immense, and the order of magnitude of the saving is inversely proportional to the target angular speed. We designed two new CCD camera components. Owing to mature CMOS (complementary metal-oxide-semiconductor) technology, the on-chip Sample and Hold (SAH) circuitry can be designed as a dual Photon Detector (PD) analog circuit for change detection that predicts skipping or going forward at a sufficient sampling frame rate. For an admitted frame, there is a purely random sparse matrix [?], implemented at each bucket pixel level as the charge transport bias voltage toward its neighborhood buckets or not; if not, the charge goes to the ground drainage. Since the snapshot image is not a video, we could not apply the usual MPEG video compression and Huffman entropy codec, nor the powerful WaveNet wrapper, at the sensor level. We shall compare (i) pre-processing: FFT, thresholding of significant Fourier mode components, and inverse FFT to check PSNR; (ii) post-processing: image recovery done selectively by a CDT&D adaptive version of linear programming with L1 minimization and L2 similarity. For (ii) we need to determine, in new-frame selection by the SAH circuitry, the degree of information (d.o.i.) K(t), which dictates the purely random linear sparse combination of measurement data à la [?]M,N: M(t) = K(t) log N(t).

Hsu, Charles; Hsu, Ming K.; Cha, Jae; Iwamura, Tomo; Landa, Joseph; Nguyen, Charles; Szu, Harold

2013-05-01

188

Phoenix Robotic Arm Camera  

NASA Astrophysics Data System (ADS)

The Phoenix Robotic Arm Camera (RAC) is a variable-focus color camera mounted to the Robotic Arm (RA) of the Phoenix Mars Lander. It is designed to acquire both close-up images of the Martian surface and microscopic images (down to a scale of 23 μm/pixel) of material collected in the RA scoop. The mounting position at the end of the Robotic Arm allows the RAC to be actively positioned for imaging of targets not easily seen by the Stereo Surface Imager (SSI), such as excavated trench walls and targets under the Lander structure. Color information is acquired by illuminating the target with red, green, and blue light-emitting diodes. Digital terrain models (DTM) can be generated from RAC images acquired from different view points. This can provide high-resolution stereo information about fine details of the trench walls. The large stereo baseline possible with the arm can also provide a far-field DTM. The primary science objectives of the RAC are the search for subsurface soil/ice layering at the landing site and the characterization of scoop samples prior to delivery to other instruments on board Phoenix. The RAC shall also provide low-resolution panoramas in support of SSI activities and acquire images of the Lander deck for instrument and Lander check out. The camera design was inherited from the unsuccessful Mars Polar Lander mission (1999) and further developed for the (canceled) Mars Surveyor 2001 Lander (MSL01). Extensive testing and partial recalibration qualified the MSL01 RAC flight model for integration into the Phoenix science payload.

Keller, H. U.; Goetz, W.; Hartwig, H.; Hviid, S. F.; Kramm, R.; Markiewicz, W. J.; Reynolds, R.; Shinohara, C.; Smith, P.; Tanner, R.; Woida, P.; Woida, R.; Bos, B. J.; Lemmon, M. T.

2008-10-01

189

Mars Science Laboratory Engineering Cameras  

NASA Technical Reports Server (NTRS)

NASA's Mars Science Laboratory (MSL) Rover, which launched to Mars in 2011, is equipped with a set of 12 engineering cameras. These cameras are build-to-print copies of the Mars Exploration Rover (MER) cameras, which were sent to Mars in 2003. The engineering cameras weigh less than 300 grams each and use less than 3 W of power. Images returned from the engineering cameras are used to navigate the rover on the Martian surface, deploy the rover robotic arm, and ingest samples into the rover sample processing system. The navigation cameras (Navcams) are mounted to a pan/tilt mast and have a 45-degree square field of view (FOV) with a pixel scale of 0.82 mrad/pixel. The hazard avoidance cameras (Hazcams) are body-mounted to the rover chassis in the front and rear of the vehicle and have a 124-degree square FOV with a pixel scale of 2.1 mrad/pixel. All of the cameras utilize a frame-transfer CCD (charge-coupled device) with a 1024x1024 imaging region and red/near IR bandpass filters centered at 650 nm. The MSL engineering cameras are grouped into two sets of six: one set of cameras is connected to rover computer A and the other set is connected to rover computer B. The MSL rover carries 8 Hazcams and 4 Navcams.
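The quoted angular pixel scales convert to ground resolution by a small-angle multiplication; the 10 m range below is an arbitrary example distance, not a figure from the abstract:

```python
def ground_sample_m(pixel_scale_mrad, range_m):
    """Ground footprint of one pixel at a given range (small-angle approximation)."""
    return pixel_scale_mrad * 1e-3 * range_m

# Navcam: 0.82 mrad/pixel; Hazcam: 2.1 mrad/pixel (values from the abstract)
navcam_gsd_at_10m = ground_sample_m(0.82, 10.0)   # ~8 mm per pixel
hazcam_gsd_at_10m = ground_sample_m(2.1, 10.0)    # ~21 mm per pixel
```

The coarser Hazcam scale is the price of its much wider 124-degree field of view from a fixed body mount.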

Maki, Justin N.; Thiessen, David L.; Pourangi, Ali M.; Kobzeff, Peter A.; Lee, Steven W.; Dingizian, Arsham; Schwochert, Mark A.

2012-01-01

190

6.RP Security Camera  

NSDL National Science Digital Library

This is a task from the Illustrative Mathematics website that is one part of a complete illustration of the standard to which it is aligned. Each task has at least one solution and some commentary that addresses important aspects of the task and its potential use. Here are the first few lines of the commentary for this task: A shop owner wants to prevent shoplifting. He decides to install a security camera on the ceiling of his shop. Below is a picture of the shop floor pla...

2012-05-01

191

Neutron Imaging Camera  

NASA Technical Reports Server (NTRS)

The Neutron Imaging Camera (NIC) is based on the Three-dimensional Track Imager (3-DTI) technology developed at GSFC for gamma-ray astrophysics applications. The 3-DTI, a large-volume time-projection chamber, provides accurate, approximately 0.4 mm resolution, 3-D tracking of charged particles. The incident direction of fast neutrons, En > 0.5 MeV, is reconstructed from the momenta and energies of the proton and triton fragments resulting from ³He(n,p)³H interactions in the 3-DTI volume. The performance of the NIC from laboratory and accelerator tests is presented.
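The direction reconstruction rests on momentum conservation: with the ³He target at rest, the incident neutron momentum equals the vector sum of the proton and triton momenta. A minimal kinematic sketch (not the instrument's actual reconstruction code; the example momenta are made up):

```python
import numpy as np

def neutron_direction(p_proton, p_triton):
    """Unit vector of the incident neutron from the two fragment momenta.

    Momentum conservation with the 3He target at rest gives
    p_n = p_p + p_t; only the direction of the sum is needed here.
    """
    p_n = np.asarray(p_proton, float) + np.asarray(p_triton, float)
    return p_n / np.linalg.norm(p_n)
```

The fragment energies then give the neutron energy, which is why the TPC must measure both momenta and energies of the p and ³H tracks.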

Hunter, Stanley; deNolfo, G. A.; Barbier, L. M.; Link, J. T.; Son, S.; Floyd, S. R.; Guardala, N.; Skopec, M.; Stark, B.

2008-01-01

192

Universal ICT Picosecond Camera  

NASA Astrophysics Data System (ADS)

The paper reports on the design of an ICT camera operating in the mode of linear or three-frame image scan. The camera incorporates two tubes: the time-analyzing ICT PIM-107 [1] with an S-11 photocathode, and the brightness amplifier PMU-2V (gain about 10⁴) for the image shaped by the first tube. The camera is based on the streak camera AGAT-SF3 [2], with almost the same power sources but substantially modified pulse electronics. The design of tube PIM-107 is depicted schematically in the figure. The tube consists of cermet housing 1 and photocathode 2, made in a separate vacuum volume and introduced into the housing by means of a manipulator. In the direct vicinity of the photocathode there is an accelerating electrode made of a fine-structure grid. An electrostatic lens formed by focusing electrode 4 and anode diaphragm 5 produces a beam of electrons with a "remote crossover". The authors have suggested this term for an electron beam whose crossover lies 40 to 60 mm beyond the anode diaphragm plane, which guarantees high sensitivity of scan plates 6 with respect to multiaperture framing diaphragm 7. Behind every diaphragm aperture there is a pair of deflecting plates 8, shielded from compensation plates 10 by diaphragm 9. The electronic image produced by the photocathode is focused on luminescent screen 11. The tube is controlled by two saw-tooth voltages applied in antiphase across plates 6 and 10. Plates 6 serve to sweep the electron beam over the surface of diaphragm 7; the beam is either passed toward the screen or stopped by the diaphragm walls. In this manner three frames are obtained, the number corresponding to that of the diaphragm apertures. Plates 10 serve to compensate the streak sweep of the image on the screen. To avoid overlapping of frames, static potentials responsible for shifting the frames on the screen are applied to plates 8.
By changing the potentials applied to plates 8, one can control the spacing between frames and partially or fully overlap them. This control is independent of the frame repetition rate and frame duration, and determines only frame positioning on the screen. Since diaphragm 7 is located in the crossover region, where the electron trajectories cross, the frame is not decomposed into separate elements during its formation. The image is transferred onto the screen during practically the entire frame duration, increasing the aperture ratio of the tube compared to that in Ref. [3].

Lebedev, Vitaly B.; Syrtzev, V. N.; Tolmachyov, A. M.; Feldman, Gregory G.; Chernyshov, N. A.

1989-06-01

193

Uniform Lazy Narrowing Maria Alpuente1  

E-print Network

Uniform Lazy Narrowing María Alpuente1, Moreno Falaschi2, Pascual Julián3, and Germán Vidal1 narrowing [5] is currently considered the best lazy narrowing strategy. It generalizes Huet and Lévy's [15

Vidal, Germán

194

LRO Camera Imaging of Potential Landing Sites in the South Pole-Aitken Basin  

NASA Astrophysics Data System (ADS)

We show results of WAC (Wide Angle Camera) and NAC (Narrow Angle Camera) imaging of candidate landing sites within the South Pole-Aitken (SPA) basin of the Moon obtained by the Lunar Reconnaissance Orbiter during the first full year of operation. These images enable a greatly improved delineation of geologic units, determination of unit thicknesses and stratigraphy, and detailed surface characterization that has not been possible with previous data. WAC imaging encompasses the entire SPA basin, located within an area ranging from ~ 130-250 degrees east longitude and ~15 degrees south latitude to the South Pole, at different incidence angles, with the specific range of incidence dependent on latitude. The WAC images show morphology and surface detail at better than 100 m per pixel, with spatial coverage and quality unmatched by previous data sets. NAC images reveal details at the sub-meter pixel scale that enable new ways to evaluate the origins and stratigraphy of deposits. Key among new results is the capability to discern extents of ancient volcanic deposits that are covered by later crater ejecta (cryptomare) [see Petro et al., this conference] using new, complementary color data from Kaguya and Chandrayaan-1. Digital topographic models derived from WAC and NAC geometric stereo coverage show broad intercrater-plains areas where slopes are acceptably low for high-probability safe landing [see Archinal et al., this conference]. NAC images allow mapping and measurement of small, fresh craters that excavated boulders and thus provide information on surface roughness and depth to bedrock beneath regolith and plains deposits. We use these data to estimate deposit thickness in areas of interest for landing and potential sample collection to better understand the possible provenance of samples. Also, small regions marked by fresh impact craters and their associated boulder fields are readily identified by their bright ejecta patterns and marked as lander keep-out zones. 
We will show examples of LROC data including those for Constellation sites on the SPA rim and interior, a site between Bose and Alder Craters, sites east of Bhabha Crater, and sites on and near the “Mafic Mound” [see Pieters et al., this conference]. Together the LROC data and complementary products provide essential information for ensuring identification of safe landing and sampling sites within SPA basin that has never before been available for a planetary mission.

Jolliff, B. L.; Wiseman, S. M.; Gibson, K. E.; Lauber, C.; Robinson, M.; Gaddis, L. R.; Scholten, F.; Oberst, J.; LROC Science; Operations Team

2010-12-01

195

Accuracy Evaluation of Stereo Camera Systems with Generic Camera Models  

NASA Astrophysics Data System (ADS)

In the last decades, the consumer and industrial market for non-projective cameras has grown notably. This has led to the development of camera description models other than the pinhole model and to their employment in mostly homogeneous camera systems. Heterogeneous camera systems (combining, for instance, fisheye and catadioptric cameras) can also easily be envisioned for real applications. However, it has not been quite clear how accurate stereo vision with these cameras and models can be. In this paper, different accuracy aspects are addressed by analytical inspection, numerical simulation, and real image data evaluation. The analysis is generic and applies to any camera projection model, although only polynomial and rational projection models are used for distortion-free, catadioptric and fisheye lenses. Note that this differs from the polynomial and rational radial distortion models which have been addressed extensively in the literature. For single-camera analysis it turns out that point features towards the image sensor borders are significantly more accurate than in the center regions of the sensor. For heterogeneous two-camera systems it turns out that reconstruction accuracy decreases significantly towards the image borders, as different projective distortions occur.

Rueß, D.; Luber, A.; Manthey, K.; Reulke, R.

2012-07-01

196

Reflectance characteristics of the Viking lander camera reference test charts  

NASA Technical Reports Server (NTRS)

Reference test charts provide radiometric, colorimetric, and spatial resolution references for the Viking lander cameras on Mars. Reflectance measurements of these references are described, including the absolute bidirectional reflectance of the radiometric references and the relative spectral reflectance of both radiometric and colorimetric references. Results show that the bidirectional reflectance of the radiometric references is Lambertian to within ±7% for incidence angles between 20 deg and 60 deg, and that their spectral reflectance is constant with wavelength to within ±5% over the spectral range of the cameras. Estimated accuracy of the measurements is ±0.05 in relative spectral reflectance.

Wall, S. D.; Burcher, E. E.; Jabson, D. J.

1975-01-01

197

Digital synchroballistic schlieren camera for high-speed photography of bullets and rocket sleds  

NASA Astrophysics Data System (ADS)

A high-speed digital streak camera designed for simultaneous high-resolution color photography and focusing schlieren imaging is described. The camera uses a computer-controlled galvanometer scanner to achieve synchroballistic imaging through a narrow slit. Full color 20 megapixel images of a rocket sled moving at 480 m/s and of projectiles fired at around 400 m/s were captured, with high-resolution schlieren imaging in the latter cases, using conventional photographic flash illumination. The streak camera can achieve a line rate for streak imaging of up to 2.4 million lines/s.
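Synchroballistic imaging requires the line readout rate to match the velocity of the slit image: object speed times optical magnification, divided by the detector line pitch. The sketch below uses the abstract's 480 m/s sled speed, but the magnification and line pitch are assumed example values, not figures from the paper:

```python
def required_line_rate_hz(object_speed_mps, magnification, line_pitch_m):
    """Line rate at which the streak readout tracks the moving slit image.

    image speed = object speed * magnification; one line must be read out
    each time the image advances by one line pitch on the sensor.
    """
    return object_speed_mps * magnification / line_pitch_m

# 480 m/s sled, assumed 0.01x magnification and 5 micron line pitch
rate = required_line_rate_hz(480.0, 0.01, 5e-6)   # ~1e6 lines/s
```

With these assumed optics the required rate sits comfortably under the camera's stated 2.4 million lines/s ceiling, which is what lets the galvanometer scan stay synchronized with the target.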

Buckner, Benjamin D.; L'Esperance, Drew

2013-08-01

198

Application of narrow-band television to industrial and commercial communications  

NASA Technical Reports Server (NTRS)

The development of narrow-band systems for use in space systems is presented. Applications of the technology to future spacecraft requirements are discussed along with narrow-band television's influence in stimulating development within the industry. The transferral of the technology into industrial and commercial communications is described. Major areas included are: (1) medicine; (2) education; (3) remote sensing for traffic control; and (5) weather observation. Applications in data processing, image enhancement, and information retrieval are provided by the combination of the TV camera and the computer.

Embrey, B. C., Jr.; Southworth, G. R.

1974-01-01

199

MEMS digital camera  

NASA Astrophysics Data System (ADS)

MEMS technology uses photolithography and etching of silicon wafers to enable mechanical structures with less than 1 μm tolerance, important for the miniaturization of imaging systems. In this paper, we present the first silicon MEMS digital auto-focus camera for use in cell phones with a focus range of 10 cm to infinity. At the heart of the new silicon MEMS digital camera, a simple and low-cost electromagnetic actuator impels a silicon MEMS motion control stage on which a lens is mounted. The silicon stage ensures precise alignment of the lens with respect to the imager, and enables precision motion of the lens over a range of 300 μm with < 5 μm hysteresis and < 2 μm repeatability. Settling time is < 15 ms for a 200 μm step, and < 5 ms for a 20 μm step, enabling AF within 0.36 sec at 30 fps. The precise motion allows COTS optics to maintain MTF > 0.8 at 20 cy/mm up to 80% field over the full range of motion. Accelerated lifetime testing has shown that the alignment and precision of motion are maintained after 8,000 g shocks, thermal cycling from -40 °C to +85 °C, and operation over 20 million cycles.

Gutierrez, R. C.; Tang, T. K.; Calvet, R.; Fossum, E. R.

2007-02-01

200

Multispectral EO LOROP camera  

NASA Astrophysics Data System (ADS)

Results of an investigation into the concept of a multispectral (Vis/MWIR) electro-optical long-range oblique-looking digital camera are presented. The choice of the medium-wave band for the IR channel is discussed. An advanced EO camera would take advantage of available multi-element two-dimensional focal-plane-array sensors. Since such detectors are of the staring type, with integration times on the order of milliseconds, provisions must be made to keep the sensor footprint fixed on the ground during exposure. Precisely controlled forward-motion compensation, together with passive vibration isolation and excellent active three-axis stabilization, is therefore of paramount importance. In order to provide large ground coverage together with good geometrical resolution, the ground is paved with slightly overlapping sensor footprints by step-wise sweeping of the line of sight. Space constraints due to the existing pod geometry, as opposed to the desired sensitivity performance, particularly govern the optics design. An off-axis catoptric system has been chosen for the front objective as the most promising way to cope with the two widely separated wavebands and to have the different sizes of the respective detectors coincide with the instantaneous images of identical footprints of the two channels.

Hoefft, Jens-Rainer; Tietz, Traugott

1998-10-01

201

Polarization-sensitive spectral-domain optical coherence tomography using a multi-line single camera spectrometer.  

PubMed

We describe a polarization-sensitive spectral-domain optical coherence tomography technique based on a single-camera spectrometer that includes a multiplexed custom grating, camera lenses, and a high-speed three-line CCD camera. Two orthogonally polarized beams can be captured separately by two lines of the camera as a result of their vertically different incident angles. The system can provide imaging at full camera speed with increased measurable depth. The proposed optical coherence tomography system could distinguish normal muscle from cancerous tissue in the chest of a DsRed GFP mouse, and the OCT images were compared with those of in vivo confocal microscopy. PMID:21164725

Song, Cheol; Ahn, MyoungKi; Gweon, DaeGab

2010-11-01

202

Automatic feature extraction for panchromatic Mars Global Surveyor Mars Orbiter camera imagery  

NASA Astrophysics Data System (ADS)

The Mars Global Surveyor Mars Orbiter Camera (MOC) has produced tens of thousands of images, which contain a wealth of information about the surface of the planet Mars. Current manual analysis techniques are inadequate for the comprehensive analysis of such a large dataset, while development of handwritten feature extraction algorithms is laborious and expensive. This project investigates application of an automatic feature extraction approach to analysis of the MOC narrow angle panchromatic dataset, using an evolutionary computation software package called GENIE. GENIE uses a genetic algorithm to assemble feature extraction tools from low-level image operators. Each generated tool is evaluated against training data provided by the user. The best tools in each generation are allowed to 'reproduce' to produce the next generation, and the population of tools is permitted to evolve until it converges to a solution or reaches a level of performance specified by the user. Craters are one of the most scientifically interesting and most numerous features in the MOC data set, and present a wide range of shapes at many spatial scales. We now describe preliminary results on development of a crater finder algorithm using the GENIE software.
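The evolve-evaluate-reproduce loop described above can be sketched with a toy genetic algorithm. Everything here (the operator pool, the fake fitness against a fixed target pipeline, the population size and rates) is an illustrative stand-in for GENIE, which evolves real image-operator pipelines scored against user-supplied training data:

```python
import random

# Toy pool of image operators an individual can chain (stand-ins for
# GENIE's low-level image operators)
OPS = ["blur", "gradient", "threshold", "open", "close", "invert"]

# Hypothetical stand-in for scoring against training data: agreement
# with a fixed "ideal" pipeline
TARGET = ["gradient", "threshold", "close"]

def fitness(tool):
    """Fraction of positions where the tool matches the target pipeline."""
    return sum(a == b for a, b in zip(tool, TARGET)) / len(TARGET)

def evolve(pop_size=30, generations=40, seed=0):
    rng = random.Random(seed)
    pop = [[rng.choice(OPS) for _ in range(len(TARGET))] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]            # the best tools "reproduce"
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, len(TARGET))
            child = a[:cut] + b[cut:]             # one-point crossover
            if rng.random() < 0.2:                # occasional mutation
                child[rng.randrange(len(child))] = rng.choice(OPS)
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)
```

The population either converges to a high-fitness tool or stops when it reaches a user-specified performance level, mirroring the termination criteria the abstract describes.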

Plesko, Catherine S.; Brumby, Steven P.; Leovy, Conway B.

2002-01-01

203

In-flight calibration of the Dawn Framing Camera II: Flat fields and stray light correction  

NASA Astrophysics Data System (ADS)

The NASA Dawn spacecraft acquired thousands of images of asteroid Vesta during its year-long orbital tour, and is now on its way to asteroid Ceres. A method for calibrating images acquired by the onboard Framing Camera was described by Schröder et al. (Schröder et al. [2013]. Icarus 226, 1304). However, their method is only valid for point sources. In this paper we extend the calibration to images of extended sources like Vesta. For this, we devise a first-order correction for in-field stray light, which is known to plague images taken through the narrow band filters, and revise the flat fields that were acquired in an integrating sphere before launch. We used calibrated images of the Vesta surface to construct simple photometric models for all filters, that allow us to study how the spectrum changes with increasing phase angle (phase reddening). In combination with these models, our calibration method can be used to create near-seamless mosaics that are radiometrically accurate to a few percent. Such mosaics are provided in JVesta, the Vesta version of the JMARS geographic information system.
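The correction chain implied above (dark subtraction, a first-order in-field stray-light term, then flat-field division) can be sketched generically as follows; the function and array names are hypothetical, and this is the textbook scheme rather than the actual Dawn Framing Camera pipeline:

```python
import numpy as np

def calibrate_frame(raw, dark, flat, stray=None):
    """First-order radiometric correction of an extended-source frame.

    raw, dark, flat, stray: 2-D arrays of the same shape.
    Order: dark subtraction, optional in-field stray-light removal,
    then flat-field division.
    """
    frame = raw.astype(float) - dark
    if stray is not None:
        frame -= stray                        # first-order stray-light estimate
    safe_flat = np.where(flat > 0, flat, 1.0)  # guard against dead flat pixels
    return frame / safe_flat
```

Because stray light is additive and the flat field multiplicative, removing the stray-light estimate before dividing by the flat is what keeps mosaics radiometrically consistent to the few-percent level the abstract targets.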

Schröder, S. E.; Mottola, S.; Matz, K.-D.; Roatsch, T.

2014-05-01

204

The All Sky Camera Network  

NSDL National Science Digital Library

In 2001, the All Sky Camera Network came to life as an outreach program to connect the Denver Museum of Nature and Science (DMNS) exhibit Space Odyssey with Colorado schools. The network is comprised of cameras placed strategically at schools throughout Colorado to capture fireballs--rare events that produce meteorites. Students involved in the network participate in an authentic, inquiry-based experience by tracking meteor events. This article discusses the past, present, and future of the All Sky Camera Network.

Caldwell, Andy

2005-02-01

205

Performance evaluation of a pixellated Ge Compton camera  

NASA Astrophysics Data System (ADS)

An ongoing project is being carried out to develop a high purity germanium (HPGe) Compton camera for medical applications. The Compton camera offers many potential advantages over the conventional gamma camera. The camera reported in this paper comprises two pixellated germanium detector planes housed 9.6 cm apart in the same vacuum housing. The camera has 177 pixels, 152 in the scatter detector and 25 in the absorption detector. The pixels are 4 × 4 mm2 with a thickness of 4 mm in the scatter detector and 10 mm in the absorption detector. Images have been taken for a variety of test objects including point sources, a ring source and a Perspex phantom. The measured angular resolution is 9.4° ± 0.4° for a 662 keV gamma-ray source at 3 cm. Due to the limited number of readout modules a multiple-view technique was used to image the source distributions from different angles and simulate the pixel arrangement in the full camera.
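A Compton camera back-projects each event onto a cone whose half-angle follows from the two energy deposits via standard Compton kinematics, cos(theta) = 1 - me*c^2*(1/E2 - 1/(E1+E2)). A minimal sketch (the function name and example energies are illustrative, not from the paper):

```python
import math

ME_C2 = 511.0  # electron rest energy, keV

def compton_angle_deg(e_scatter_kev, e_absorb_kev):
    """Cone half-angle from the energies deposited in the two detector planes.

    e_scatter_kev: energy E1 left in the scatter detector
    e_absorb_kev:  energy E2 left in the absorption detector
    """
    e_total = e_scatter_kev + e_absorb_kev
    cos_t = 1.0 - ME_C2 * (1.0 / e_absorb_kev - 1.0 / e_total)
    return math.degrees(math.acos(cos_t))
```

Intersecting many such cones from a 662 keV source is what builds up the image, and the quoted 9.4° angular resolution reflects how precisely each cone angle and axis can be measured.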

Alnaaimi, M. A.; Royle, G. J.; Ghoggali, W.; Banoqitah, E.; Cullum, I.; Speller, R. D.

2011-06-01

206

Angles of Reflection  

NSDL National Science Digital Library

This interactive simulation shows what happens to light when it hits a mirror. The simluation allows the user to change the angle of the incoming or incident light wave and see the corresponding reflected angle.

Davidson, Michael W.; Tchourioukanov, Kirill I.

2006-06-15

207

Tacoma Narrows Bridge: Extreme History  

NSDL National Science Digital Library

Stretching across the southern portion of Puget Sound, the elegant Tacoma Narrows bridge is considered one of the finest suspension bridges in the United States. The current bridge is the second on the site, as it was constructed in 1950 to serve as a replacement to the famous "Galloping Gertie" bridge, which collapsed in a windstorm in the fall of 1940. Currently, the Washington State Department of Transportation is building a bridge to replace the existing structure, and it is anticipated that it will be completed in 2007. This site offers a host of materials on all three structures, including ample information on the construction of the bridges and their aesthetic appeal. Along with these materials, the site also provides a glossary of related terms, Weird Facts, and some information about the dog "Tubby", who perished when "Galloping Gertie" collapsed on that fateful fall day back in 1940.

208

Raskar, Camera Culture, MIT Media Lab Camera Culture  

E-print Network

Raskar, Camera Culture, MIT Media Lab. Camera Culture, Ramesh Raskar: Computational Light Transport, Computational Photography, Inverse problems. MIT Media Lab, Ramesh Raskar, http://raskar.info, raskar@mit.edu. Camera Culture: Creating new ways to capture and share visual information. MIT Media Lab, Ramesh Raskar, http

209

What's Your Angle?  

NSDL National Science Digital Library

In this activity, students devise procedures for using a protractor to measure the number of degrees in an angle, and use inductive reasoning to develop "angle sense." Then they describe circumstances and careers that require a working knowledge of angles and their measurements.

2010-01-01

210

A Motionless Camera  

NASA Technical Reports Server (NTRS)

Omniview, a motionless, noiseless, exceptionally versatile camera was developed for NASA as a receiving device for guiding space robots. The system can see in one direction and provide as many as four views simultaneously. Developed by Omniview, Inc. (formerly TRI) under a NASA Small Business Innovation Research (SBIR) grant, the system's image transformation electronics produce a real-time image from anywhere within a hemispherical field. Lens distortion is removed, and a corrected "flat" view appears on a monitor. Key elements are a high resolution charge coupled device (CCD), image correction circuitry and a microcomputer for image processing. The system can be adapted to existing installations. Applications include security and surveillance, teleconferencing, imaging, virtual reality, broadcast video and military operations. Omniview technology is now called IPIX. The company was founded in 1986 as TeleRobotics International, became Omniview in 1995, and changed its name to Interactive Pictures Corporation in 1997.

1994-01-01

211

Neutron Imaging Camera  

NASA Technical Reports Server (NTRS)

We describe the Neutron Imaging Camera (NIC) being developed for DTRA applications by NASA/GSFC and NSWC/Carderock. The NIC is based on the Three-dimensional Track Imager (3-DTI) technology developed at GSFC for gamma-ray astrophysics applications. The 3-DTI, a large-volume time-projection chamber, provides accurate, approximately 0.4 mm resolution, 3-D tracking of charged particles. The incident direction of fast neutrons, E(sub N) > 0.5 MeV, is reconstructed from the momenta and energies of the proton and triton fragments resulting from ³He(n,p)³H interactions in the 3-DTI volume. We present angular and energy resolution performance of the NIC derived from accelerator tests.

Hunter, Stanley D.; DeNolfo, Georgia; Floyd, Sam; Krizmanic, John; Link, Jason; Son, Seunghee; Guardala, Noel; Skopec, Marlene; Stark, Robert

2008-01-01

212

Narrow-band radiation wavelength measurement by processing digital photographs in RAW format  

SciTech Connect

The technique of measuring the mean wavelength of narrow-band radiation in the 455 - 625-nm range using the image of the emitting surface is presented. The data from the camera array unprocessed by the built-in processor (RAW format) are used. The method is applied for determining the parameters of response of holographic sensors. Depending on the wavelength and brightness of the image fragment, the mean square deviation of the wavelength amounts to 0.3 - 3 nm. (experimental techniques)

Kraiskii, A V; Mironova, T V; Sultanov, T T [P N Lebedev Physical Institute, Russian Academy of Sciences, Moscow (Russian Federation)

2012-12-31

213

UT 15-color dichroic-mirror camera and future prospects  

NASA Astrophysics Data System (ADS)

We describe the design and performance of a dichroic-mirror camera (DMC) which can take 15 narrow-band images simultaneously. We separate the wavelength range of 390 - 950 nm into 15 narrow bands with 14 dichroic mirrors. The detector of DMC is a mosaic CCD camera which has 15 CCDs (TI TC-215). When we mount DMC on the MAGNUM 2-m (F/9) telescope being built at Haleakala, Hawaii, the field of view becomes about 4.5 arcmin in diameter. The design of the optics shows that we can get an image size of about 0.13 arcsec r.m.s. or better (without atmosphere), though we use only two different kinds of lenses (the camera lens and the collimator lens). The system throughput of DMC as a function of wavelength is quantitatively estimated. Simulations using spectra of galaxies and QSOs show that DMC can reach a signal-to-noise ratio (S/N) greater than about 5 per band per object for galaxies (I_AB = 22) and QSOs (I_AB = 23) in images with 30 min - 1 hour exposures taken with a 2-m telescope. Future prospects for possible enhancements and applications of the dichroic-mirror system are also discussed. A DMC with a resolution of about 30 would be well suited to high-redshift supernova searches. To get higher resolution, a DMC combined with Fabry-Perots is an interesting possibility.

Doi, Mamoru; Furusawa, Hisanori; Nakata, Fumiaki; Okamura, Sadanori; Sekiguchi, Masaki; Shimasaku, Kazuhiro; Takeyama, Norihide

1998-07-01

214

Saturn's hydrogen aurora: Wide field and planetary camera 2 imaging from the Hubble Space Telescope  

Microsoft Academic Search

Wide Field and Planetary Camera 2/Hubble Space Telescope (WFPC2/HST) images of Saturn's far ultraviolet aurora reveal emissions confined to a narrow band of latitudes near Saturn's north and south poles. The aurorae are most prominent in the morning sector, with patterns that appear fixed in local time. The geographic distribution and vertical extent of the auroral emissions seen in these

John T. Trauger; John T. Clarke; Gilda E. Ballester; Robin W. Evans; Christopher J. Burrows; David Crisp; John S. Gallagher; Richard E. Griffiths; J. Jeff Hester; John G. Hoessel; Jon A. Holtzman; John E. Krist; Jeremy R. Mould; Raghvendra Sahai; Paul A. Scowen; Karl R. Stapelfeldt; Alan M. Watson

1998-01-01

215

Lightweight Video-Camera Head  

NASA Technical Reports Server (NTRS)

Compact, lightweight video camera head constructed by remounting lens and charge-coupled-device image detector from small commercial video camera in separate assembly. Useful in robotics, artificial vision, and vision guidance systems. Designed to be mounted on visor of helmet to monitor motions of eyes in experiments on vestibulo-ocular reflexes.

Proctor, David R.

1988-01-01

216

Bayesian Multi-Camera Surveillance  

Microsoft Academic Search

The task of multi-camera surveillance is to reconstruct the paths taken by all moving objects that are temporarily visible from multiple non-overlapping cameras. We present a Bayesian formalization of this task, where the optimal solution is the set of object paths with the highest posterior probability given the observed data. We show how to efficiently approximate the maximum a

Vera Kettnaker; Ramin Zabih

1999-01-01

217

The "All Sky Camera Network"  

ERIC Educational Resources Information Center

In 2001, the "All Sky Camera Network" came to life as an outreach program to connect the Denver Museum of Nature and Science (DMNS) exhibit "Space Odyssey" with Colorado schools. The network is comprised of cameras placed strategically at schools throughout Colorado to capture fireballs--rare events that produce meteorites. Meteorites have great…

Caldwell, Andy

2005-01-01

218

Airborne ballistic camera tracking systems  

NASA Technical Reports Server (NTRS)

An operational airborne ballistic camera tracking system was tested for operational and data reduction feasibility. The acquisition and data processing requirements of the system are discussed. Suggestions for future improvements are also noted. A description of the data reduction mathematics is outlined. Results from a successful reentry test mission are tabulated. The test mission indicated that airborne ballistic camera tracking systems are feasible.

Redish, W. L.

1976-01-01

219

The Eye of the Camera  

Microsoft Academic Search

This study addresses the effects of security cameras on prosocial behavior. Results from previous studies indicate that the presence of others can trigger helping behavior, arising from the need for approval of others. Extending these findings, the authors propose that security cameras can likewise trigger such approval-seeking behaviors by implying the presence of a watchful eye. Because people vary in

Thomas J. L. van Rompay; Dorette J. Vonk; Marieke L. Fransen

2009-01-01

220

Camera artifacts in IUE spectra  

NASA Technical Reports Server (NTRS)

This study of emission-line-mimicking features in the IUE cameras has produced an atlas of artifacts in high-dispersion images, with an accompanying table of prominent artifacts, a table of prominent artifacts in the raw images, and a median image of the sky background for each IUE camera.

Bruegman, O. W.; Crenshaw, D. M.

1994-01-01

221

Liquid-crystal polarization camera  

Microsoft Academic Search

We present a fully automated system which unites CCD camera technology with liquid crystal technology to create a polarization camera capable of sensing the polarization of reflected light from objects at pixel resolution. As polarization affords a more general physical description of light than does intensity, it can provide a richer set of descriptive physical constraints for the understanding

Lawrence B. Wolff; Todd A. Mancini

1992-01-01

222

SEOS frame camera applications study  

NASA Technical Reports Server (NTRS)

A research and development satellite is discussed which will provide opportunities for observation of transient phenomena that fall within the fixed viewing circle of the spacecraft. Possible applications of frame cameras for SEOS are evaluated. The computed lens characteristics for each camera are listed.

1974-01-01

223

THE DEATH OF THE CAMERA  

Microsoft Academic Search

In this paper I examine how Edward Branigan, in his new book Projecting a Camera: Language-Games in Film Theory (2006), uses Wittgenstein's later philosophy to describe the multiple, contradictory, literal and metaphorical meanings of fundamental concepts in film theory—such as ‘movement’, ‘point of view’, ‘camera’, ‘frame’ and ‘causality’. Towards the end of the paper I rationally reconstruct Branigan's main arguments

Warren Buckland

2006-01-01

224

Math Applications with Digital Cameras  

NSDL National Science Digital Library

Digital cameras are excellent tools for enhancing the math classroom. They may also help, under the current math reform movement, to move away from isolated problems in a drill-and-practice format toward one more rooted in authentic experiences and problem solving. Check this Web site for lesson ideas that combine digital cameras and math.

Cavanaugh, Terrance; Cavanaugh, Catherine

225

Advisory Surveillance Cameras Page 1 of 2  

E-print Network

Advisory -- Use of Cameras/Video Surveillance, May 2008. The advisory raises questions such as: how will recordings be produced and secured, and who will have access to the tape? At what will the camera be aimed, to ensure the cameras' presence doesn't create a false sense of security?

Liebling, Michael

226

Testing of the Apollo 15 Metric Camera System.  

NASA Technical Reports Server (NTRS)

Description of tests conducted (1) to assess the quality of Apollo 15 Metric Camera System data and (2) to develop production procedures for total block reduction. Three strips of metric photography over the Hadley Rille area were selected for the tests. These photographs were utilized in a series of evaluation tests culminating in an orbitally constrained block triangulation solution. Results show that film deformations up to 25 and 5 microns are present in the mapping and stellar materials, respectively. Stellar reductions can provide mapping camera orientations with an accuracy that is consistent with the accuracies of other parameters in the triangulation solutions. Pointing accuracies of 4 to 10 microns can be expected for the mapping camera materials, depending on variations in resolution caused by changing sun angle conditions.

Helmering, R. J.; Alspaugh, D. H.

1972-01-01

227

What convention is used for the illumination and view angles?  

Atmospheric Science Data Center

... from the direction of travel to local north. For both the Sun and cameras, azimuth describes the direction in which the photons are ... light, and near 180 degrees for backward scattered light. Sun and View angles are available in the MISR Geometric Parameters (MIB2GEOP) ...

2014-12-08

228

IMAX camera (12-IML-1)  

NASA Technical Reports Server (NTRS)

The IMAX camera system is used to record on-orbit activities of interest to the public. Because of the extremely high resolution of the IMAX camera, projector, and audio systems, the audience is afforded a motion picture experience unlike any other. IMAX and OMNIMAX motion picture systems were designed to create motion picture images of superior quality and audience impact. The IMAX camera is a 65 mm, single lens, reflex viewing design with a 15 perforation per frame horizontal pull across. The frame size is 2.06 x 2.77 inches. Film travels through the camera at a rate of 336 feet per minute when the camera is running at the standard 24 frames/sec.

1992-01-01

229

Performance of new low-cost 1/3" security cameras for meteor surveillance  

NASA Astrophysics Data System (ADS)

It has been almost 5 years since the CAMS (Cameras for All-sky Meteor Surveillance) system specifications were designed for video meteor surveillance. CAMS has been based on the relatively expensive black-and-white Watec WAT-902H2 Ultimate camera, which uses a 1/2" sensor. In this paper, we investigate whether new, lower-cost color cameras based on smaller 1/3" sensors can perform adequately for CAMS. We did not expect them to equal or outperform the sensitivity of the Watec 1/2" camera for the same field of view; the goal was to see whether they could perform within the tolerances of the sensitivity requirements of the CAMS project. Their lower cost brings deployment of meteor surveillance cameras within reach of amateur astronomers and makes it possible to deploy many more cameras to increase yield. The lens focal length is matched to the elevation angle of the camera to maintain an image scale and spatial resolution close to those of the standard CAMS camera and lens combination, crucial for obtaining sufficiently accurate orbital elements. An all-sky array based on 16 such cameras, operated from a single computer, was built and the performance of individual cameras was tested.
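
The focal-length matching described above can be sketched as follows, assuming a meteor layer near 90 km altitude, a flat-Earth slant range, and illustrative pixel pitch and target scale (none of these numeric values are from the paper):

```python
import math

def slant_range_km(elevation_deg, layer_height_km=90.0):
    """Distance to the meteor layer (assumed ~90 km altitude, flat-Earth
    approximation) along the camera's pointing direction."""
    return layer_height_km / math.sin(math.radians(elevation_deg))

def focal_length_for_scale(elevation_deg, pixel_pitch_um=4.65,
                           target_m_per_px=100.0):
    """Focal length (mm) that keeps the spatial scale at the meteor layer
    near target_m_per_px for a camera pointed at the given elevation."""
    r_m = slant_range_km(elevation_deg) * 1000.0
    pitch_m = pixel_pitch_um * 1e-6
    return r_m * pitch_m / target_m_per_px * 1000.0  # metres -> mm

# A camera aimed lower sees the layer farther away, so it needs a
# longer lens to hold the same scale (f grows as 1/sin(elevation)).
print(focal_length_for_scale(90.0), "mm at zenith")
print(focal_length_for_scale(30.0), "mm at 30 deg elevation")
```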

Samuels, Dave; Wray, James; Gural, Peter S.; Jenniskens, Peter

2014-02-01

230

Accuracy in fixing ship's positions by camera survey of bearings  

NASA Astrophysics Data System (ADS)

The paper presents the results of research on the possibility of fixing ship position coordinates from bearings on navigational marks surveyed with a CCD camera. Accuracy of the determination of ship position coordinates, expressed as the mean error, was assumed to be the basic criterion of this estimation. The first part of the paper describes the method of determining the resolution and the mean error of an angle measured with a camera, and the method of determining the mean error of position coordinates when two or more bearings are measured. Three software applications were developed for producing navigational sea charts with accuracy areas mapped onto them. The second part contains the results of a study of accuracy in fixing ship position coordinates, carried out in the Gulf of Gdansk, using bearings obtained with Rolleiflex and Sony cameras. The results are presented as diagrams of the mean error of angle measurement and as navigational charts with accuracy fields mapped on. The final part, based on the results obtained, discusses the applicability of CCD cameras to the automation of coastal navigation.
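
As a minimal illustration of fixing a position from two bearings (a simplified planar sketch, not the paper's method or its error model), the ship can be located by intersecting the reciprocal rays from two marks with known coordinates:

```python
import math

def fix_from_two_bearings(mark1, brg1_deg, mark2, brg2_deg):
    """Position fix from two ship-to-mark bearings (degrees clockwise
    from north) on marks with known (east, north) coordinates. The ship
    lies on the reciprocal ray from each mark; intersect the two rays."""
    u1 = (math.sin(math.radians(brg1_deg)), math.cos(math.radians(brg1_deg)))
    u2 = (math.sin(math.radians(brg2_deg)), math.cos(math.radians(brg2_deg)))
    # Solve mark1 - r1*u1 = mark2 - r2*u2 for r1 (2x2 linear system).
    a, b = -u1[0], u2[0]
    c, d = -u1[1], u2[1]
    det = a * d - b * c
    if abs(det) < 1e-12:
        raise ValueError("bearings are parallel: no unique fix")
    ex, ny = mark2[0] - mark1[0], mark2[1] - mark1[1]
    r1 = (ex * d - b * ny) / det
    return (mark1[0] - r1 * u1[0], mark1[1] - r1 * u1[1])

# Ship at the origin sights one mark due north (000) and one due east (090).
print(fix_from_two_bearings((0.0, 5.0), 0.0, (5.0, 0.0), 90.0))
```

In practice the mean error of such a fix grows as the angle between the two bearing lines departs from 90 degrees, which is what the paper's accuracy fields map out.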

Naus, Krzysztof; Wąż, Mariusz

2011-01-01

231

Observation of planetary motion using a digital camera  

NASA Astrophysics Data System (ADS)

A digital SLR camera with a standard lens (50 mm focal length, f/1.4) on a fixed tripod is used to obtain photographs of the sky which contain stars up to 8th apparent magnitude. The angle of view is large enough to ensure visual identification of the photograph with a large sky region in a stellar map. The resolution is sufficient to observe the motion of Saturn, and of the satellites of Jupiter, within 24 h.
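
The angle of view behind this setup follows from the thin-lens relation; a quick check, assuming a full-frame 36 x 24 mm sensor (the sensor size is an assumption, not stated in the abstract):

```python
import math

def angle_of_view_deg(sensor_mm, focal_mm):
    """Full angle of view for a thin-lens camera: 2 * atan(d / 2f)."""
    return 2 * math.degrees(math.atan(sensor_mm / (2 * focal_mm)))

# 50 mm lens on an assumed full-frame (36 x 24 mm) sensor:
h = angle_of_view_deg(36, 50)   # horizontal, roughly 40 degrees
v = angle_of_view_deg(24, 50)   # vertical, roughly 27 degrees
print(f"{h:.1f} x {v:.1f} degrees")
```

A field of tens of degrees is indeed wide enough to match a photograph against a large region of a stellar map by eye.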

Meyn, Jan-Peter

2008-09-01

232

Observation of Planetary Motion Using a Digital Camera  

ERIC Educational Resources Information Center

A digital SLR camera with a standard lens (50 mm focal length, f/1.4) on a fixed tripod is used to obtain photographs of the sky which contain stars up to 8[superscript m] apparent magnitude. The angle of view is large enough to ensure visual identification of the photograph with a large sky region in a stellar map. The resolution is sufficient to…

Meyn, Jan-Peter

2008-01-01

233

Mars Orbiter Camera Views the 'Face on Mars' - Best View from Viking  

NASA Technical Reports Server (NTRS)

Shortly after midnight Sunday morning (5 April 1998 12:39 AM PST), the Mars Orbiter Camera (MOC) on the Mars Global Surveyor (MGS) spacecraft successfully acquired a high resolution image of the 'Face on Mars' feature in the Cydonia region. The image was transmitted to Earth on Sunday, and retrieved from the mission computer data base Monday morning (6 April 1998). The image was processed at the Malin Space Science Systems (MSSS) facility 9:15 AM and the raw image immediately transferred to the Jet Propulsion Laboratory (JPL) for release to the Internet. The images shown here were subsequently processed at MSSS.

The picture was acquired 375 seconds after the spacecraft's 220th close approach to Mars. At that time, the 'Face', located at approximately 40.8° N, 9.6° W, was 275 miles (444 km) from the spacecraft. The 'morning' sun was 25° above the horizon. The picture has a resolution of 14.1 feet (4.3 meters) per pixel, making it ten times higher resolution than the best previous image of the feature, which was taken by the Viking Mission in the mid-1970s. The full image covers an area 2.7 miles (4.4 km) wide and 25.7 miles (41.5 km) long.

This Viking Orbiter image is one of the best Viking pictures of the area Cydonia where the 'Face' is located. Marked on the image are the 'footprint' of the high resolution (narrow angle) Mars Orbiter Camera image and the area seen in enlarged views (dashed box). See PIA01440-1442 for these images in raw and processed form.

Malin Space Science Systems and the California Institute of Technology built the MOC using spare hardware from the Mars Observer mission. MSSS operates the camera from its facilities in San Diego, CA. The Jet Propulsion Laboratory's Mars Surveyor Operations Project operates the Mars Global Surveyor spacecraft with its industrial partner, Lockheed Martin Astronautics, from facilities in Pasadena, CA and Denver, CO.

1998-01-01

234

Critical Heat Flux In Inclined Rectangular Narrow Long Channel  

SciTech Connect

In the TMI-2 accident, the lower part of the reactor pressure vessel was overheated and then rather rapidly cooled down, as was later identified in a vessel investigation project. This pointed to the possible feasibility of gap cooling. For this reason, several investigations were performed to determine the critical heat flux (CHF) from the standpoint of in-vessel retention. Experiments were conducted to investigate the general boiling phenomena and the triggering mechanism for the CHF in a narrow gap, using a 5 x 105 mm2 crevice-type heater assembly and demineralized water. The test parameters include a gap size of 5 mm and surface orientation angles from the downward-facing position (180°) to the vertical position (90°). The orientation angle affects the bubble layer and escape from the narrow gap. The CHF is lower than in a shorter channel, as compared with previous experiments having a heated length of 35 mm in the copper test section.

J. L. Rempe; S. W. Noh; Y. H. Kim; K. Y. Suh; F.B.Cheung; S. B. Kim

2005-05-01

235

21 CFR 886.1120 - Opthalmic camera.  

Code of Federal Regulations, 2011 CFR

...2011-04-01 2011-04-01 false Opthalmic camera. 886.1120 Section 886.1120 Food...Diagnostic Devices § 886.1120 Opthalmic camera. (a) Identification. An ophthalmic camera is an AC-powered device intended to...

2011-04-01

236

21 CFR 886.1120 - Ophthalmic camera.  

Code of Federal Regulations, 2014 CFR

...2014-04-01 2014-04-01 false Ophthalmic camera. 886.1120 Section 886.1120 Food...Diagnostic Devices § 886.1120 Ophthalmic camera. (a) Identification. An ophthalmic camera is an AC-powered device intended to...

2014-04-01

237

21 CFR 886.1120 - Opthalmic camera.  

Code of Federal Regulations, 2010 CFR

...2010-04-01 2010-04-01 false Opthalmic camera. 886.1120 Section 886.1120 Food...Diagnostic Devices § 886.1120 Opthalmic camera. (a) Identification. An ophthalmic camera is an AC-powered device intended to...

2010-04-01

238

21 CFR 886.1120 - Opthalmic camera.  

Code of Federal Regulations, 2012 CFR

...2012-04-01 2012-04-01 false Opthalmic camera. 886.1120 Section 886.1120 Food...Diagnostic Devices § 886.1120 Opthalmic camera. (a) Identification. An ophthalmic camera is an AC-powered device intended to...

2012-04-01

239

21 CFR 886.1120 - Ophthalmic camera.  

Code of Federal Regulations, 2013 CFR

...2013-04-01 2013-04-01 false Ophthalmic camera. 886.1120 Section 886.1120 Food...Diagnostic Devices § 886.1120 Ophthalmic camera. (a) Identification. An ophthalmic camera is an AC-powered device intended to...

2013-04-01

240

Imaging chlorophyll fluorescence with an airborne narrow-band multispectral camera for vegetation stress detection  

Microsoft Academic Search

Progress in assessing the feasibility for imaging fluorescence using the O2-A band with 1 nm full-width half-maximum (FWHM) bands centered at 757.5 and 760.5 nm is reported in this paper. Multispectral airborne data was acquired at 150 m above ground level in the thermal, visible and near infrared regions yielding imagery at 15 cm spatial resolution. Simultaneous field experiments conducted in olive, peach, and

P. J. Zarco-Tejada; J. A. J. Berni; L. Suárez; G. Sepulcre-Cantó; F. Morales; J. R. Miller

2009-01-01

241

Cameras for digital microscopy.  

PubMed

This chapter reviews the fundamental characteristics of charge-coupled devices (CCDs) and related detectors, outlines the relevant parameters for their use in microscopy, and considers promising recent developments in detector technology. Electronic imaging with a CCD involves three stages: interaction of a photon with the photosensitive surface, storage of the liberated charge, and readout or measurement of the stored charge. The most demanding applications in fluorescence microscopy may require as much as four orders of magnitude greater sensitivity. The image in the present-day light microscope is usually acquired with a CCD camera. The CCD is composed of a large matrix of photosensitive elements (often referred to as "pixels," shorthand for picture elements), which simultaneously capture an image over the entire detector surface. The light-intensity information for each pixel is stored as electronic charge and is converted to an analog voltage by a readout amplifier. This analog voltage is subsequently converted to a numerical value by a digitizer situated on the CCD chip, or very close to it. In complementary metal oxide semiconductor (CMOS) sensors, several (three to six) amplifiers are required for each pixel, and to date, uniform images with a homogeneous background have been a problem because of the inherent difficulty of balancing the gain in all of the amplifiers. CMOS sensors also exhibit relatively high noise associated with the requisite high-speed switching. Both of these deficiencies are being addressed, and sensor performance is nearing that required for scientific imaging. PMID:23931507
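
The readout chain described above ends with digitization of the stored charge; a toy sketch of that last step, with illustrative gain, full-well, and bit-depth values (not from the chapter):

```python
def digitize_pixel(photoelectrons, gain_e_per_dn=2.0, full_well=20000,
                   bit_depth=12):
    """Toy CCD readout: clip stored charge at the full-well capacity,
    convert electrons to digital numbers (DN) via the camera gain, and
    clip to the ADC range. All parameter values are illustrative."""
    e = min(photoelectrons, full_well)
    dn = round(e / gain_e_per_dn)
    return min(dn, 2 ** bit_depth - 1)

print(digitize_pixel(1000))    # mid-range signal
print(digitize_pixel(50000))   # saturated: clipped at full well, then ADC
```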

Spring, Kenneth R

2013-01-01

242

Overview of Neutrino Mixing Models and Their Mixing Angle Predictions  

E-print Network

An overview of neutrino-mixing models is presented with emphasis on the types of horizontal flavor and vertical family symmetries that have been invoked. Distributions for the mixing angles of many models are displayed. Ways to differentiate among the models and to narrow the list of viable models are discussed.

Carl H. Albright

2009-11-12

243

X-ray Pinhole Camera Measurements  

SciTech Connect

The development of the rod pinch diode [1] has led to high-resolution radiography for dynamic events such as explosive tests. Rod pinch diodes use a small diameter anode rod, which extends through the aperture of a cathode plate. Electrons borne off the aperture surface can self-insulate and pinch onto the tip of the rod, creating an intense, small x-ray source (Primary Pinch). This source has been utilized as the main diagnostic on numerous experiments that include high-value, single-shot events. In such applications there is an emphasis on machine reliability, x-ray reproducibility, and x-ray quality [2]. In tests with the baseline rod pinch diode, we have observed that an additional pinch (Secondary Pinch) occurs at the interface near the anode rod and the rod holder. This suggests that stray electrons exist that are not associated with the Primary Pinch. In this paper we present measurements on both pinches using an x-ray pinhole camera. The camera is placed downstream of the Primary Pinch at an angle of 60° with respect to the diode centerline. This diagnostic will be employed to diagnose x-ray reproducibility and quality. In addition, we will investigate the performance of hybrid diodes relating to the formation of the Primary and Secondary Pinches.

Nelson, D. S. [NSTec; Berninger, M. J. [NSTec; Flores, P. A. [NSTec; Good, D. E. [NSTec; Henderson, D. J. [NSTec; Hogge, K. W. [NSTec; Huber, S. R. [NSTec; Lutz, S. S. [NSTec; Mitchell, S. E. [NSTec; Howe, R. A. [NSTec; Mitton, C. V. [NSTec; Molina, I. [NSTec; Bozman, D. R. [SNL; Cordova, S. R. [SNL; Mitchell, D. R. [SNL; Oliver, B. V. [SNL; Ormond, E. C. [SNL

2013-07-01

244

Narrow gap electronegative capacitive discharges  

SciTech Connect

Narrow gap electronegative (EN) capacitive discharges are widely used in industry and have unique features not found in conventional discharges. In this paper, plasma parameters are determined over a range of decreasing gap length L from values for which an electropositive (EP) edge exists (2-region case) to smaller L-values for which the EN region connects directly to the sheath (1-region case). Parametric studies are performed at applied voltage V_rf = 500 V for pressures of 10, 25, 50, and 100 mTorr, and additionally at 50 mTorr for 1000 and 2000 V. Numerical results are given for a parallel plate oxygen discharge using a planar 1D3v (1 spatial dimension, 3 velocity components) particle-in-cell (PIC) code. New interesting phenomena are found for the case in which an EP edge does not exist. This 1-region case has not previously been investigated in detail, either numerically or analytically. In particular, attachment in the sheaths is important, and the central electron density n_e0 is depressed below the density n_esh at the sheath edge. The sheath oscillations also extend into the EN core, creating an edge region lying within the sheath and not characterized by the standard diffusion in an EN plasma. An analytical model is developed using minimal inputs from the PIC results, and compared to the PIC results for a base case at V_rf = 500 V and 50 mTorr, showing good agreement. Selected comparisons are made at the other voltages and pressures. A self-consistent model is also developed and compared to the PIC results, giving reasonable agreement.

Kawamura, E.; Lieberman, M. A.; Lichtenberg, A. J. [Department of Electrical Engineering and Computer Sciences, University of California, Berkeley, California 94720 (United States)]

2013-10-15

245

Solidifying the solid angle.  

PubMed

This paper evaluates the position of the solid angle in its application to modeling in electrocardiology. Particular attention is paid to the use of the solid angle for linking cardiac electric activity to the potentials observed on the body surface. In this application, the solid angle is a dominant factor in the expression of the sources during depolarization known as the uniform double layer. In the related equivalent double layer model, the contributions of the elementary sources are also expressed in terms of solid angles, their strength not being uniform. A recently developed theory allows the equivalent double layer to be applied to both depolarization and repolarization. PMID:12539116
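
The solid angle of a plane triangle seen from an observation point has a well-known closed form due to van Oosterom and Strackee, the kind of elementary building block used when summing double-layer source contributions over a triangulated heart surface (a minimal sketch; the paper itself is a review, not this specific code):

```python
import math

def triangle_solid_angle(r1, r2, r3):
    """Solid angle subtended at the origin by a plane triangle with
    vertices r1, r2, r3 (van Oosterom & Strackee formula)."""
    def dot(a, b): return sum(x * y for x, y in zip(a, b))
    def norm(a): return math.sqrt(dot(a, a))
    def cross(a, b):
        return (a[1]*b[2] - a[2]*b[1],
                a[2]*b[0] - a[0]*b[2],
                a[0]*b[1] - a[1]*b[0])
    n1, n2, n3 = norm(r1), norm(r2), norm(r3)
    numer = dot(r1, cross(r2, r3))
    denom = (n1*n2*n3 + dot(r1, r2)*n3 + dot(r1, r3)*n2 + dot(r2, r3)*n1)
    return 2 * math.atan2(numer, denom)

# One octant of the unit sphere subtends 4*pi / 8 = pi/2 steradians.
omega = triangle_solid_angle((1, 0, 0), (0, 1, 0), (0, 0, 1))
print(omega, math.pi / 2)
```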

van Oosterom, A

2002-01-01

246

Design of motion compensation mechanism of satellite remote sensing camera  

NASA Astrophysics Data System (ADS)

With the development of aerospace remote sensing technology, the ground resolution of remote sensing cameras is continually improving. Since there is relative motion between the camera and the ground target during exposure, the target image recorded on the recording medium is smeared and blurred. To improve the imaging quality and resolution of the camera, this image motion must be compensated. This paper studies image motion compensation methods for space cameras in order to reduce the effect of image motion on image quality. First, the cause of the drift angle and the principle of its adjustment are analyzed, and the composition and transmission principle of the image motion compensation mechanism are introduced. Second, the system adopts an 80C31 microcontroller as the drift angle controller, a stepping motor as the actuator, and an absolute photoelectric encoder as the drift angle measuring element. The control mathematical model of the image motion compensation mechanism is then derived, achieving closed-loop control of the drift angle position. Finally, the transmission precision of the mechanism is analyzed; the actual precision of the image motion compensation mechanism was measured experimentally and compared with the theoretical analysis. There are two major contributions in this paper. First, traditional image motion compensation mechanisms are large and heavy, which does not fit the trend toward miniaturized, lightweight space cameras; simply reducing the volume and mass of the mechanism, however, degrades its precision and stiffness. To address this problem, this paper presents an image motion compensation mechanism that combines small size and light weight with high precision and stiffness, making it applicable to small, high-resolution optical cameras.
Second, traditional mechanism control requires correction, fitting, and iteration of the control formula to obtain an optimal control mathematical model. The high-precision control formula derived in this paper achieves high-precision control without fitting, which also simplifies establishing the control mathematical model. The adjustment range of the designed image motion compensation mechanism is -5° to +5°. Choosing -5°, -4°, -3°, -2°, -1°, 0°, +1°, +2°, +3°, +4°, +5° as the expected values of the drift angle, ten groups of measured drift-angle adjustment data were obtained. The test results show that the precision of the drift angle control system is within 1', meeting the system requirement that the control precision be less than 3', and achieving high-precision image motion compensation.
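
The ±5° adjustment range above is consistent with the drift angle expected in low Earth orbit, where Earth's rotation adds a cross-track component to the ground-trace velocity; a rough check with illustrative velocities (not values from the paper):

```python
import math

def drift_angle_deg(cross_track_kms, along_track_kms):
    """Drift angle: the angle between the camera scan direction and the
    actual image-motion vector on the ground."""
    return math.degrees(math.atan2(cross_track_kms, along_track_kms))

# Illustrative LEO values: ~6.9 km/s along-track ground-trace speed and
# up to ~0.46 km/s cross-track motion from Earth's equatorial rotation.
beta = drift_angle_deg(0.46, 6.9)
print(round(beta, 2), "deg")   # comfortably inside the +/-5 deg range above
```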

Gu, Song; Yan, Yong; Xu, Kai; Jin, Guang

2011-08-01

247

BLAST Autonomous Daytime Star Cameras  

E-print Network

We have developed two redundant daytime star cameras to provide the fine pointing solution for the balloon-borne submillimeter telescope BLAST. The cameras are capable of providing a reconstructed pointing solution with high absolute accuracy. Each camera combines a 1 megapixel CCD with a 200 mm f/2 lens to image a 2 degree x 2.5 degree field of the sky. The instruments are autonomous. An internal computer controls the temperature, adjusts the focus, and determines a real-time pointing solution at 1 Hz. The mechanical details and flight performance of these instruments are presented.

Marie Rex; Edward Chapin; Mark J. Devlin; Joshua Gundersen; Jeff Klein; Enzo Pascale; Donald Wiebe

2006-05-01

248

Omnidirectional narrow bandpass filters based on one-dimensional superconductor-dielectric photonic crystal heterostructures  

NASA Astrophysics Data System (ADS)

Using the transfer matrix method, narrow passbands of the TE wave from one-dimensional superconductor-dielectric photonic crystal heterostructures are presented. Various superconductors within the two-fluid model are considered. Results show that by selecting proper widths for the superconductor and dielectric layers, and proper materials, a single narrow passband in the visible region can be obtained. The behavior of these passbands versus the temperature of the superconductors, the external magnetic field, and the incident angle is considered. We show that it is possible to obtain omnidirectional passbands by tuning the temperature, the dilation factor of the half part of a heterostructure, and the other parameters of the heterostructures. These tunable narrow passbands may be useful in the design of narrow-band or multichannel filters.
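
The transfer matrix method named above can be sketched for the simpler case of an ordinary lossless dielectric stack at normal incidence (the paper's superconducting layers would additionally need a two-fluid permittivity model; all layer values below are illustrative):

```python
import cmath
import math

def layer_matrix(n, d_nm, wavelength_nm):
    """Characteristic matrix of one lossless layer at normal incidence."""
    delta = 2 * math.pi * n * d_nm / wavelength_nm
    return [[cmath.cos(delta), 1j * cmath.sin(delta) / n],
            [1j * n * cmath.sin(delta), cmath.cos(delta)]]

def transmittance(layers, wavelength_nm, n_in=1.0, n_out=1.5):
    """Multiply the layer matrices in order and convert the total
    characteristic matrix to a power transmittance."""
    m = [[1, 0], [0, 1]]
    for n, d in layers:
        lm = layer_matrix(n, d, wavelength_nm)
        m = [[m[0][0]*lm[0][0] + m[0][1]*lm[1][0],
              m[0][0]*lm[0][1] + m[0][1]*lm[1][1]],
             [m[1][0]*lm[0][0] + m[1][1]*lm[1][0],
              m[1][0]*lm[0][1] + m[1][1]*lm[1][1]]]
    b = m[0][0] + m[0][1] * n_out
    c = m[1][0] + m[1][1] * n_out
    r = (n_in * b - c) / (n_in * b + c)
    return 1 - abs(r) ** 2

# 10-period quarter-wave stack designed for 500 nm (n = 2.0 / n = 1.5):
lam0 = 500.0
stack = [(2.0, lam0 / (4 * 2.0)), (1.5, lam0 / (4 * 1.5))] * 10
print(transmittance(stack, lam0))   # deep inside the stopband: near zero
```

Defects or heterostructure junctions inserted into such a stack open the narrow passbands inside the stopband that the paper exploits.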

Barvestani, Jamal

2015-01-01

249

A Three-Line Stereo Camera Concept for Planetary Exploration  

NASA Technical Reports Server (NTRS)

This paper presents a low-weight stereo camera concept for planetary exploration. The camera uses three CCD lines within the image plane of one single objective. Main features of the camera include: focal length 90 mm, FOV 18.5 deg, IFOV 78 µrad, convergence angles ±10 deg, radiometric dynamics 14 bit, weight 2 kg, and power consumption 12.5 W. From an orbit altitude of 250 km the ground pixel size is 20 m x 20 m and the swath width is 82 km. The CCD line data are buffered in the camera's internal mass memory of 1 Gbit. After radiometric correction and application-dependent preprocessing, the data are compressed and ready for downlink. Due to the aggressive application of advanced microelectronics and innovative optics, the low mass and power budgets of 2 kg and 12.5 W are achieved while still maintaining high performance. The design of the proposed light-weight camera is also general-purpose enough to be applicable to other planetary missions such as the exploration of Mars, Mercury, and the Moon. Moreover, it is an example of excellent international collaboration on advanced technology concepts developed at DLR, Germany, and NASA's Jet Propulsion Laboratory, USA.
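
The quoted figures are mutually consistent: the ground pixel size follows from IFOV times altitude, and the swath width from the full FOV. A quick check:

```python
import math

altitude_km = 250.0
ifov_urad = 78.0
fov_deg = 18.5

# Ground sampling distance from the instantaneous field of view:
gsd_m = altitude_km * 1000 * ifov_urad * 1e-6        # ~20 m, as quoted
# Swath width from the full field of view (flat-ground approximation):
swath_km = 2 * altitude_km * math.tan(math.radians(fov_deg / 2))  # ~82 km

print(f"GSD ~ {gsd_m:.1f} m, swath ~ {swath_km:.1f} km")
```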

Sandau, Rainer; Hilbert, Stefan; Venus, Holger; Walter, Ingo; Fang, Wai-Chi; Alkalai, Leon

1997-01-01

250

Person re-identification over camera networks using multi-task distance metric learning.  

PubMed

Person reidentification in a camera network is a valuable yet challenging problem to solve. Existing methods learn a common Mahalanobis distance metric using the data collected from different cameras and then exploit the learned metric for identifying people in the images. However, the cameras in a camera network have different settings, and the recorded images are seriously affected by variability in illumination conditions, camera viewing angles, and background clutter. Using a common metric to conduct person reidentification tasks on different camera pairs overlooks the differences in camera settings. On the other hand, it is very time-consuming to label people manually in images from surveillance videos; for example, in most existing person reidentification data sets, only one image of a person is collected from each of only two cameras. Therefore, directly learning a unique Mahalanobis distance metric for each camera pair is susceptible to over-fitting on the insufficient labeled data. In this paper, we reformulate person reidentification in a camera network as a multitask distance metric learning problem. The proposed method designs multiple Mahalanobis distance metrics to cope with the complicated conditions that exist in typical camera networks. These Mahalanobis distance metrics are different but related, and are learned with a joint regularization added to alleviate over-fitting. Furthermore, by extending this formulation, we present a novel multitask maximally collapsing metric learning (MtMCML) model for person reidentification in a camera network. Experimental results demonstrate that formulating person reidentification over camera networks as a multitask distance metric learning problem can improve performance, and our proposed MtMCML works substantially better than other current state-of-the-art person reidentification methods. PMID:24956368
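
The per-camera-pair Mahalanobis metrics with joint regularization can be illustrated with a toy sketch (2x2 metrics and a simple mean-coupling regularizer chosen for illustration; this is not the paper's exact MtMCML objective):

```python
def mahalanobis(x, y, m):
    """Squared Mahalanobis distance (x - y)^T M (x - y) for a 2x2 metric M."""
    d = (x[0] - y[0], x[1] - y[1])
    return (d[0] * (m[0][0] * d[0] + m[0][1] * d[1])
            + d[1] * (m[1][0] * d[0] + m[1][1] * d[1]))

def joint_regularizer(metrics):
    """Sum of squared elementwise deviations of each per-camera-pair
    metric from their mean -- the kind of coupling term that ties the
    per-pair metrics together and limits over-fitting."""
    k = len(metrics)
    mean = [[sum(m[i][j] for m in metrics) / k for j in range(2)]
            for i in range(2)]
    return sum((m[i][j] - mean[i][j]) ** 2
               for m in metrics for i in range(2) for j in range(2))

identity = [[1.0, 0.0], [0.0, 1.0]]
print(mahalanobis((1, 2), (4, 6), identity))    # prints 25.0: Euclidean when M = I
print(joint_regularizer([identity, identity]))  # prints 0.0: identical metrics
```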

Ma, Lianyang; Yang, Xiaokang; Tao, Dacheng

2014-08-01

251

Infants Experience Perceptual Narrowing for Nonprimate Faces  

ERIC Educational Resources Information Center

Perceptual narrowing--a phenomenon in which perception is broad from birth, but narrows as a function of experience--has previously been tested with primate faces. In the first 6 months of life, infants can discriminate among individual human and monkey faces. Though the ability to discriminate monkey faces is lost after about 9 months, infants…

Simpson, Elizabeth A.; Varga, Krisztina; Frick, Janet E.; Fragaszy, Dorothy

2011-01-01

252

Vision Sensors and Cameras  

NASA Astrophysics Data System (ADS)

Silicon charge-coupled-device (CCD) imagers have been and are a specialty market ruled by a few companies for decades. Based on CMOS technologies, active-pixel sensors (APS) began to appear in 1990 at the 1 µm technology node. These pixels allow random access, global shutters, and they are compatible with focal-plane imaging systems combining sensing and first-level image processing. The progress towards smaller features and towards ultra-low leakage currents has provided reduced dark currents and µm-size pixels. All chips offer Mega-pixel resolution, and many have very high sensitivities equivalent to ASA 12,800. As a result, HDTV video cameras will become a commodity. Because charge-integration sensors suffer from a limited dynamic range, significant processing effort is spent on multiple exposure and piece-wise analog-digital conversion to reach ranges >10,000:1. The fundamental alternative is log-converting pixels with an eye-like response. This offers a range of almost a million to 1, constant contrast sensitivity and constant colors, important features in professional, technical and medical applications. 3D retino-morphic stacking of sensing and processing on top of each other is being revisited with sub-100 nm CMOS circuits and with TSV technology. With sensor outputs directly on top of neurons, neural focal-plane processing will regain momentum, and new levels of intelligent vision will be achieved. The industry push towards thinned wafers and TSV enables backside-illuminated and other pixels with a 100% fill-factor. 3D vision, which relies on stereo or on time-of-flight, high-speed circuitry, will also benefit from scaled-down CMOS technologies both because of their size as well as their higher speed.

Hoefflinger, Bernd

253

Narrow band gap amorphous silicon semiconductors  

DOEpatents

Disclosed is a narrow band gap amorphous silicon semiconductor comprising an alloy of amorphous silicon and a band gap narrowing element selected from the group consisting of Sn, Ge, and Pb, with an electron donor dopant selected from the group consisting of P, As, Sb, Bi and N. The process for producing the narrow band gap amorphous silicon semiconductor comprises the steps of forming an alloy comprising amorphous silicon and at least one of the aforesaid band gap narrowing elements in amount sufficient to narrow the band gap of the silicon semiconductor alloy below that of amorphous silicon, and also utilizing sufficient amounts of the aforesaid electron donor dopant to maintain the amorphous silicon alloy as an n-type semiconductor.

Madan, A.; Mahan, A.H.

1985-01-10

254

Color processing in digital cameras  

Microsoft Academic Search

In seconds, a digital camera performs full-color rendering that includes color filter array interpolation, color calibration, anti-aliasing, infrared rejection, and white-point correction. This article describes the design decisions that make this processing possible.

J. Adams; K. Parulski; K. Spaulding

1998-01-01

255

An Inexpensive Digital Infrared Camera  

ERIC Educational Resources Information Center

Details are given for the conversion of an inexpensive webcam to a camera specifically sensitive to the near infrared (700-1000 nm). Some experiments and practical applications are suggested and illustrated. (Contains 9 figures.)

Mills, Allan

2012-01-01

256

National Park Service Web Cameras  

NSDL National Science Digital Library

The National Park Service (NPS) operates digital cameras at many parks in the lower 48 states, Alaska, and Hawaii to help educate the public on air quality issues. These cameras often show the effects of air pollution, especially visibility impairment. Because the cameras are typically located near air quality monitoring sites, their web pages display other information along with the photo, such as current levels of ozone, particulate matter, or sulfur dioxide, visual range, and weather conditions. The digital photos are usually updated every 15 minutes, while air quality data values are revised hourly. Charts of the last ten days of hourly weather, ozone, particulate matter, or sulfur dioxide data are also available. The cameras are accessible by clicking on an interactive map.

257

DIP ANGLE MEASURING DEVICE  

E-print Network

tude of the angle between the true horizon and the visible horizon known as the angle .... chred within the bearing by a support member 38 which is screwed onto the ... and the bearing. 55 to prevent a metal to metal contact, and a steel spring.

Boris J. Gavrisheff

258

What's My Angle?  

NSDL National Science Digital Library

This interactive module offers learners the opportunity to check their knowledge of angle measure and estimation, and the use of a protractor. There are ten activities that vary the tasks and the degree of precision. The site is designed for whiteboard demonstration as well, and it includes a tutorial on angle types and protractor use.

2011-01-01

259

Reading Angles in Maps  

ERIC Educational Resources Information Center

Preschool children can navigate by simple geometric maps of the environment, but the nature of the geometric relations they use in map reading remains unclear. Here, children were tested specifically on their sensitivity to angle. Forty-eight children (age 47:15-53:30 months) were presented with fragments of geometric maps, in which angle sections…

Izard, Véronique; O'Donnell, Evan; Spelke, Elizabeth S.

2014-01-01

260

Solid State Television Camera (CID)  

NASA Technical Reports Server (NTRS)

The design, development, and testing of a charge injection device (CID) camera using a 244x248 element array are described. A number of video signal processing functions are included which maximize the output video dynamic range while retaining the inherently good resolution response of the CID. Unique features of the camera include low-light-level performance, high S/N ratio, antiblooming, low geometric distortion, sequential scanning, and AGC.

Steele, D. W.; Green, W. T.

1976-01-01

261

The Dark Energy Camera (DECam)  

Microsoft Academic Search

We describe the Dark Energy Camera (DECam), which will be the primary instrument used in the Dark Energy Survey. DECam will be a 3 sq. deg. mosaic camera mounted at the prime focus of the Blanco 4m telescope at the Cerro Tololo Inter-American Observatory (CTIO). DECam includes a large mosaic CCD focal plane, a five element optical corrector, five filters (g,r,i,z,Y),

D. L. DePoy; T. Abbott; J. Annis; M. Antonik; M. Barceló; R. Bernstein; B. Bigelow; D. Brooks; E. Buckley-Geer; J. Campa; L. Cardiel; F. Castander; J. Castilla; H. Cease; S. Chappa; E. Dede; G. Derylo; H. T. Diehl; P. Doel; J. DeVicente; J. Estrada; D. Finley; B. Flaugher; E. Gaztanaga; D. Gerdes; M. Gladders; V. Guarino; G. Gutierrez; M. Haney; S. Holland; K. Honscheid; D. Huffman; I. Karliner; D. Kau; S. Kent; M. Kozlovsky; D. Kubik; K. Kuehn; S. Kuhlmann; K. Kuk; F. Leger; H. Lin; G. Martinez; M. Martinez; W. Merritt; J. Mohr; P. Moore; T. Moore; B. Nord; R. Ogando; J. Olsen; B. Onal; J. Peoples; T. Qian; N. Roe; E. Sanchez; V. Scarpine; R. Schmidt; R. Schmitt; M. Schubnell; K. Schultz; M. Selen; T. Shaw; V. Simaitis; J. Slaughter; C. Smith; H. Spinka; A. Stefanik; W. Stuermer; R. Talaga; G. Tarle; J. Thaler; D. Tucker; A. Walker; S. Worswick; A. Zhao

2008-01-01

262

Special Angle Pairs Discovery Activity  

NSDL National Science Digital Library

This lesson uses a discovery approach to identify the special angles formed when a set of parallel lines is cut by a transversal. During this lesson students identify the angle pair and the relationship between the angles. Students use this relationship and special angle pairs to make conjectures about which angle pairs are considered special angles.

Barbara Henry

2012-04-16

263

Teleconferencing system using virtual camera  

NASA Astrophysics Data System (ADS)

Teleconferencing systems are becoming more popular because of advances in image processing and broadband networks. Nevertheless, communicating with someone at a remote location through a teleconferencing system still presents problems because of the difficulty of establishing and maintaining eye contact. Eye contact is essential to a natural dialog. The purpose of our study is to make eye contact possible during dialog by using image processing alone, without special devices such as color markers, body-worn sensors, or IR cameras. The proposed teleconferencing system is composed of a computer, a display attached to the computer, and four cameras. We define a virtual camera as a camera that exists virtually in 3D space. By using the proposed method, we can acquire a front view of a person as if it were taken with the virtual camera. The image taken with the virtual camera is generated by extracting the same feature points among four face images. Feature point sets among the four face images are automatically corresponded by using Epipolar Plane Images (EPIs). Users can establish eye contact by acquiring the front face view, and they can also obtain various views of the scene because 3D points of the object can be extracted from the EPIs. Through these facilities, the proposed system provides better communication than previous systems. In this paper, we describe the concept and implementation, and evaluate the system from various perspectives.

Shibuichi, Daisuke; Tanaka, Tsukasa; Terashima, Nobuyoshi; Tominaga, Hideyoshi

2000-05-01

264

Method of rotation angle measurement in machine vision based on calibration pattern with spot array  

SciTech Connect

We propose a method of rotation angle measurement with high precision in machine vision. An area-scan CCD camera, an imaging lens, and a calibration pattern with a spot array make up the measurement device. The calibration pattern is installed on the rotating part, and the CCD camera is set at a certain distance from the rotating components. The coordinates of the spots on the calibration pattern are acquired from the image of the pattern captured by the CCD camera. At the initial position of the calibration pattern, the camera is calibrated with the spot array, and a mathematical model of the CCD camera's distortion error is built. With the equation of coordinate rotation measurement, the rotation angle of the spot array is detected. In a theoretical simulation, noise of different levels is added to the coordinates of the spot array. The experimental results show that the device can measure the rotation angle precisely with a noncontact method; the standard deviation of rotation angle measurement is smaller than 3 arcsec. The device can measure both microangles and large angles.
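The coordinate-rotation step of such a method can be sketched as a least-squares fit of a 2D rotation to corresponding spot centroids. This is a generic 2D Procrustes estimate under ideal (distortion-free) conditions, not the authors' exact algorithm, which also models lens distortion:

```python
import numpy as np

def rotation_angle_deg(pts_ref, pts_rot):
    """Estimate the in-plane rotation angle (degrees) between two
    corresponding 2D point sets, e.g. spot centroids imaged before and
    after rotation.  Least-squares (2D Procrustes) solution."""
    a = pts_ref - pts_ref.mean(axis=0)
    b = pts_rot - pts_rot.mean(axis=0)
    # The optimal angle is atan2 of the summed cross and dot products:
    num = np.sum(a[:, 0] * b[:, 1] - a[:, 1] * b[:, 0])  # cross terms
    den = np.sum(a[:, 0] * b[:, 0] + a[:, 1] * b[:, 1])  # dot terms
    return np.degrees(np.arctan2(num, den))

# Synthetic spot grid rotated by 12.5 degrees:
rng = np.random.default_rng(0)
spots = rng.uniform(-10, 10, size=(25, 2))
theta = np.radians(12.5)
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
rotated = spots @ R.T
print(round(rotation_angle_deg(spots, rotated), 6))  # -> 12.5
```

In practice the spot coordinates would first be corrected with the camera's distortion model before the angle fit, which is exactly why the paper calibrates the camera at the initial pattern position.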

Li Weimin; Jin Jing; Li Xiaofeng; Li Bin

2010-02-20

265

An accumulative x-ray streak camera with 280-fs resolution  

NASA Astrophysics Data System (ADS)

We demonstrated a significant improvement in the resolution of the x-ray streak camera by reducing the electron beam size in the deflection plates. This was accomplished by adding a slit in front of the focusing lens and the deflection plates. The temporal resolution reached 280 fs when the slit width was 5 mm. The camera was operated in an accumulative mode and tested by using a 25 fs laser with 2 kHz repetition rate and 1-2% RMS pulse energy stability. We conclude that deflection aberrations, which limit the resolution of the camera, can be appreciably reduced by eliminating the wide-angle electrons.

Shakya, Mahendra M.; Chang, Zenghu

2004-11-01

266

Inversion formulas for cone transforms arising in application of Compton cameras  

NASA Astrophysics Data System (ADS)

It has been suggested that a Compton camera should be used in single photon emission computed tomography because a conventional gamma camera has low efficiency. It brings about a cone transform, which maps a function onto the set of its surface integrals over cones determined by the detector position, the central axis, and the opening angle of the Compton camera. We provide inversion formulas using complete Compton data for three- and two-dimensional cases. Numerical simulations are presented to demonstrate the suggested algorithms in dimension two. Also, we discuss other inversions and the stability estimates of a cone transform with a fixed central axis.
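With the geometry the abstract describes (apex at the detector position, a central axis, and an opening angle), the cone transform can be written as a surface integral; the notation below is assumed for illustration, not taken from the paper:

```latex
% Cone transform of f with apex (detector position) u, central axis \beta
% (a unit vector), and half-opening angle \psi:
Cf(u,\beta,\psi) \;=\; \int_{\{x \,:\, (x-u)\cdot\beta \,=\, |x-u|\cos\psi\}} f(x)\,\mathrm{d}S(x)
```

Inversion then means recovering f from the values of Cf over the available detector positions, axes, and opening angles.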

Jung, Chang-Yeol; Moon, Sunghwan

2015-01-01

267

The European Photon Imaging Camera on XMM-Newton: The MOS cameras : The MOS cameras  

Microsoft Academic Search

The EPIC focal plane imaging spectrometers on XMM-Newton use CCDs to record the images and spectra of celestial X-ray sources focused by the three X-ray mirrors. There is one camera at the focus of each mirror; two of the cameras contain seven MOS CCDs, while the third uses twelve PN CCDs, defining a circular field of view of 30' diameter

M. J. L. Turner; A. Abbey; M. Arnaud; M. Balasini; M. Barbera; E. Belsole; P. J. Bennie; J. P. Bernard; G. F. Bignami; M. Boer; U. Briel; I. Butler; C. Cara; C. Chabaud; R. Cole; A. Collura; M. Conte; A. Cros; M. Denby; P. Dhez; G. Di Coco; J. Dowson; P. Ferrando; S. Ghizzardi; F. Gianotti; C. V. Goodall; L. Gretton; R. G. Griffiths; O. Hainaut; J. F. Hochedez; A. D. Holland; E. Jourdain; E. Kendziorra; A. Lagostina; R. Laine; N. La Palombara; M. Lortholary; D. Lumb; P. Marty; S. Molendi; C. Pigot; E. Poindron; K. A. Pounds; J. N. Reeves; C. Reppin; R. Rothenflug; P. Salvetat; J. L. Sauvageot; D. Schmitt; S. Sembay; A. D. T. Short; J. Spragg; J. Stephen; L. Strüder; A. Tiengo; M. Trifoglio; J. Trümper; S. Vercellone; L. Vigroux; G. Villa; M. J. Ward; S. Whitehead; E. Zonca

2001-01-01

268

A testbed for wide-field, high-resolution, gigapixel-class cameras.  

PubMed

The high resolution and wide field of view (FOV) of the AWARE (Advanced Wide FOV Architectures for Image Reconstruction and Exploitation) gigapixel class cameras present new challenges in calibration, mechanical testing, and optical performance evaluation. The AWARE system integrates an array of micro-cameras in a multiscale design to achieve gigapixel sampling at video rates. Alignment and optical testing of the micro-cameras is vital in compositing engines, which require pixel-level accurate mappings over the entire array of cameras. A testbed has been developed to automatically calibrate and measure the optical performance of the entire camera array. This testbed utilizes translation and rotation stages to project a ray into any micro-camera of the AWARE system. A spatial light modulator is projected through a telescope to form an arbitrary object space pattern at infinity. This collimated source is then reflected by an elevation stage mirror for pointing through the aperture of the objective into the micro-optics and eventually the detector of the micro-camera. Different targets can be projected with the spatial light modulator for measuring the modulation transfer function (MTF) of the system, fiducials in the overlap regions for registration and compositing, distortion mapping, illumination profiles, thermal stability, and focus calibration. The mathematics of the testbed mechanics are derived for finding the positions of the stages to achieve a particular incident angle into the camera, along with calibration steps for alignment of the camera and testbed coordinate axes. Measurement results for the AWARE-2 gigapixel camera are presented for MTF, focus calibration, illumination profile, fiducial mapping across the micro-camera for registration and distortion correction, thermal stability, and alignment of the camera on the testbed. PMID:23742532

Kittle, David S; Marks, Daniel L; Son, Hui S; Kim, Jungsang; Brady, David J

2013-05-01

269

A testbed for wide-field, high-resolution, gigapixel-class cameras  

NASA Astrophysics Data System (ADS)

The high resolution and wide field of view (FOV) of the AWARE (Advanced Wide FOV Architectures for Image Reconstruction and Exploitation) gigapixel class cameras present new challenges in calibration, mechanical testing, and optical performance evaluation. The AWARE system integrates an array of micro-cameras in a multiscale design to achieve gigapixel sampling at video rates. Alignment and optical testing of the micro-cameras is vital in compositing engines, which require pixel-level accurate mappings over the entire array of cameras. A testbed has been developed to automatically calibrate and measure the optical performance of the entire camera array. This testbed utilizes translation and rotation stages to project a ray into any micro-camera of the AWARE system. A spatial light modulator is projected through a telescope to form an arbitrary object space pattern at infinity. This collimated source is then reflected by an elevation stage mirror for pointing through the aperture of the objective into the micro-optics and eventually the detector of the micro-camera. Different targets can be projected with the spatial light modulator for measuring the modulation transfer function (MTF) of the system, fiducials in the overlap regions for registration and compositing, distortion mapping, illumination profiles, thermal stability, and focus calibration. The mathematics of the testbed mechanics are derived for finding the positions of the stages to achieve a particular incident angle into the camera, along with calibration steps for alignment of the camera and testbed coordinate axes. Measurement results for the AWARE-2 gigapixel camera are presented for MTF, focus calibration, illumination profile, fiducial mapping across the micro-camera for registration and distortion correction, thermal stability, and alignment of the camera on the testbed.

Kittle, David S.; Marks, Daniel L.; Son, Hui S.; Kim, Jungsang; Brady, David J.

2013-05-01

270

StartleCam: A Cybernetic Wearable Camera  

Microsoft Academic Search

StartleCam is a wearable video camera, computer, and sensing system which enables the camera to be controlled via both conscious and preconscious events involving the wearer. Traditionally, a wearer consciously hits record on the video camera, or runs a computer script to trigger the camera according to some pre-specified frequency. The system described here offers an additional option: images

Jennifer Healey; Rosalind W. Picard

1998-01-01

271

Campus Security Camera Issued: April 2009  

E-print Network

Campus Security Camera Policy. Issued: April 2009; Revised: November 2009. ... locations on the Colorado School of Mines campus. CCTV cameras, also known as security cameras, are utilized ... and management of security cameras and access to their recordings. There is also a need to standardize

272

Autoconfiguration of a Dynamic Nonoverlapping Camera Network  

Microsoft Academic Search

In order to monitor sufficiently large areas of interest for surveillance or any event detection, we need to look beyond stationary cameras and employ an automatically configurable network of nonoverlapping cameras. These cameras need not have an overlapping field of view and should be allowed to move freely in space. Moreover, features like zooming in/out, readily available in security cameras

Imran N. Junejo; Xiaochun Cao; Hassan Foroosh

2007-01-01

273

Multilayer dielectric narrow band mangin mirror  

NASA Astrophysics Data System (ADS)

The design of a multilayer stack of dielectric films for a narrow band mirror is developed using thin-film coating software. The proposed design is realized by physical vapor deposition (PVD) thin-film coating, and reflectance over a narrow spectral band is achieved. The thicknesses of the high- and low-refractive-index materials are controlled precisely, down to the nanometer level. The curved coated substrate is cemented to a matching K9 substrate, forming a Mangin mirror for a wavelength of 650 nm. Narrow band mirrors with reflectivity above 90% have been produced by properly stacking 21 layers, and the use of this type of mirror as an interference filter is discussed.
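The peak reflectance of such a quarter-wave stack can be estimated with the standard admittance formula. The sketch below assumes an (HL)^10 H structure for the 21 layers and typical TiO2/SiO2 refractive indices on K9-like glass; the actual materials are not stated in the abstract:

```python
def quarter_wave_reflectance(n_h, n_l, n_sub, n_pairs, n_inc=1.0):
    """Peak reflectance of a quarter-wave stack (HL)^N H at its design
    wavelength, using the standard admittance formula.  n_h/n_l are the
    high/low layer indices, n_sub the substrate, n_inc the incident medium."""
    y = (n_h / n_l) ** (2 * n_pairs) * n_h ** 2 / n_sub
    return ((n_inc - y) / (n_inc + y)) ** 2

# 21 layers = 10 high/low pairs plus a final high-index layer.
# Indices below are assumed (TiO2 ~2.35, SiO2 ~1.46, K9 glass ~1.52):
r = quarter_wave_reflectance(n_h=2.35, n_l=1.46, n_sub=1.52, n_pairs=10)
print(f"{r:.4f}")  # well above the 90% reflectivity reported
```

With these assumed indices the formula predicts reflectance very close to unity, consistent with the paper's >90% figure for a 21-layer stack.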

Ahmed, K.; Khan, A. N.; Rauf, A.; Gul, A.

2014-06-01

274

Photometric Calibration of Consumer Video Cameras  

NASA Technical Reports Server (NTRS)

Equipment and techniques have been developed to implement a method of photometric calibration of consumer video cameras for imaging of objects that are sufficiently narrow or sufficiently distant to be optically equivalent to point or line sources. Heretofore, it has been difficult to calibrate consumer video cameras, especially in cases of image saturation, because they exhibit nonlinear responses with dynamic ranges much smaller than those of scientific-grade video cameras. The present method not only takes this difficulty in stride but also makes it possible to extend effective dynamic ranges to several powers of ten beyond saturation levels. The method will likely be primarily useful in astronomical photometry. There are also potential commercial applications in medical and industrial imaging of point or line sources in the presence of saturation. This development was prompted by the need to measure brightnesses of debris in amateur video images of the breakup of the Space Shuttle Columbia. The purpose of these measurements is to use the brightness values to estimate relative masses of debris objects. In most of the images, the brightness of the main body of Columbia was found to exceed the dynamic ranges of the cameras. A similar problem arose a few years ago in the analysis of video images of Leonid meteors. The present method is a refined version of the calibration method developed to solve the Leonid calibration problem. In this method, one performs an end-to-end calibration of the entire imaging system, including not only the imaging optics and imaging photodetector array but also analog tape recording and playback equipment (if used) and any frame grabber or other analog-to-digital converter (if used).
To automatically incorporate the effects of nonlinearity and any other distortions into the calibration, the calibration images are processed in precisely the same manner as are the images of meteors, space-shuttle debris, or other objects that one seeks to analyze. The light source used to generate the calibration images is an artificial variable star comprising a Newtonian collimator illuminated by a light source modulated by a rotating variable neutral- density filter. This source acts as a point source, the brightness of which varies at a known rate. A video camera to be calibrated is aimed at this source. Fixed neutral-density filters are inserted in or removed from the light path as needed to make the video image of the source appear to fluctuate between dark and saturated bright. The resulting video-image data are analyzed by use of custom software that determines the integrated signal in each video frame and determines the system response curve (measured output signal versus input brightness). These determinations constitute the calibration, which is thereafter used in automatic, frame-by-frame processing of the data from the video images to be analyzed.
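The core of the calibration, building the measured-signal-versus-input-brightness response curve and inverting it, can be sketched as follows. The saturating response model and the interpolation-based inversion are illustrative assumptions, not the authors' software:

```python
import numpy as np

def build_response_curve(brightness, signal):
    """Given calibration frames of an artificial 'variable star' with
    known input brightness, and the integrated signal measured in each
    frame, return a function mapping measured signal back to brightness.
    A sketch of the end-to-end calibration idea; real data would need
    averaging of repeats and handling of the fully saturated regime."""
    order = np.argsort(signal)
    s, b = np.asarray(signal)[order], np.asarray(brightness)[order]
    return lambda measured: np.interp(measured, s, b)

# Synthetic nonlinear (saturating) camera response:
true_brightness = np.linspace(0.01, 5.0, 200)
measured_signal = 255 * (1 - np.exp(-true_brightness))  # saturates near 255
to_brightness = build_response_curve(true_brightness, measured_signal)

# Invert a measurement back to the input brightness that produced it:
print(round(float(to_brightness(255 * (1 - np.exp(-2.0)))), 3))  # -> 2.0
```

The same inverse mapping is then applied frame by frame to the science images, which is why the calibration frames must pass through exactly the same processing chain.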

Suggs, Robert; Swift, Wesley, Jr.

2007-01-01

275

Laser angle sensor development  

NASA Technical Reports Server (NTRS)

Electrical and optical parameters were developed for a two axis (pitch/roll) laser angle sensor. The laser source and detector were mounted in the plenum above the model. Two axis optical distortion measurements of flow characteristics in a 0.3 transonic cryogenic tunnel were made with a shearing interferometer. The measurement results provide a basis for estimating the optical parameters of the laser angle sensor. Experimental and analytical information was generated on model windows to cover the reflector. A two axis breadboard was assembled to evaluate different measurement concepts. The measurement results were used to develop a preliminary design of a laser angle sensor. Schematics and expected performance specifications are included.

Pond, C. R.; Texeira, P. D.

1980-01-01

276

An efficient correction method of wide-angle lens distortion for surveillance systems  

Microsoft Academic Search

Wide-angle or fish-eye lenses are popularly used for the surveillance system due to their large field of view (FOV). However, images obtained by wide-angle cameras tend to be nonlinearly distorted owing to lens optics. In this paper, we propose a novel framework to correct the wide-angle lens distortion for surveillance systems. Our approach is based on the FOV model, which
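The FOV (field-of-view) distortion model the abstract refers to, commonly attributed to Devernay and Faugeras, uses a single parameter ω. A minimal sketch of the forward and inverse radial mappings, with an assumed ω value, is:

```python
import math

def undistort_radius(r_d, omega):
    """FOV model: map a distorted radial distance r_d (normalized image
    coordinates) to its undistorted value, given the lens field-of-view
    parameter omega (radians)."""
    return math.tan(r_d * omega) / (2 * math.tan(omega / 2))

def distort_radius(r_u, omega):
    """Inverse mapping: undistorted radius back to distorted radius."""
    return math.atan(2 * r_u * math.tan(omega / 2)) / omega

omega = math.radians(150)  # assumed value for a strongly distorting lens
r_d = 0.3
r_u = undistort_radius(r_d, omega)
# The round trip through the inverse recovers the original radius:
print(round(distort_radius(r_u, omega), 12))  # -> 0.3
```

Correction of a full image applies this radial mapping to every pixel about the distortion center, then resamples; ω itself is estimated from the image (e.g. from lines that should be straight).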

Wonjun Kim; Changick Kim

2009-01-01

277

WIDE-FIELD ASTRONOMICAL MULTISCALE CAMERAS  

SciTech Connect

In order to produce sufficiently low aberrations with a large aperture, telescopes have a limited field of view. Because of this narrow field, large areas of the sky at a given time are unobserved. We propose several telescopes based on monocentric reflective, catadioptric, and refractive objectives that may be scaled to wide fields of view and achieve 1.1 arcsecond resolution, which in most locations is the practical seeing limit of the atmosphere. The reflective and Schmidt catadioptric objectives have relatively simple configurations and enable large fields to be captured at the expense of the obscuration of the mirror by secondary optics, a defect that may be managed by image plane design. The refractive telescope design does not have an obscuration, but the objective has substantial bulk. The refractive design is a 38 gigapixel camera which consists of a single monocentric objective and 4272 microcameras. Monocentric multiscale telescopes, with their wide fields of view, may observe phenomena that might otherwise go unnoticed, such as supernovae, glint from orbital space debris, and near-Earth objects.

Marks, Daniel L.; Brady, David J., E-mail: dbrady@ee.duke.edu [Department of Electrical and Computer Engineering and Fitzpatrick Institute for Photonics, Box 90291, Duke University, Durham, NC 27708 (United States)

2013-05-15

278

Secure Digital Camera  

E-print Network

Secure Digital Camera. Saraju P. Mohanty, VLSI Design ... Digital Rights Management (DRM) ... Secure Digital Camera (SDC), one of our digital integrated circuit solutions ... Our solution for DRM: the Secure Digital Camera (SDC).

Mohanty, Saraju P.

279

Secure Digital Camera  

E-print Network

Secure Digital Camera. Saraju P. Mohanty, VLSI Design ... Digital Rights Management (DRM) ... Secure Digital Camera (SDC), one of our digital integrated circuit solutions ... Our solution for DRM: the Secure Digital Camera (SDC).

Mohanty, Saraju P.

280

Wetting and Contact Angle  

NSDL National Science Digital Library

Students are presented with the concepts of wetting and contact angle. They are also introduced to the distinction between hydrophobic and hydrophilic surfaces. Students observe how different surfaces are used to maintain visibility under different conditions.

NSF CAREER Award and RET Program, Mechanical Engineering and Material Science

281

Reassessing Narrow Rings at Uranus and Neptune  

NASA Astrophysics Data System (ADS)

We outline questions surrounding narrow rings, their radial confinement, global modes, and arcs. The rings of Chariklo may prove a helpful analogue. Additional observations by telescopes and spacecraft, with continued modeling, may lead to new answers.

Tiscareno, M. S.

2014-07-01

282

THZ EMISSION SPECTROSCOPY OF NARROW BANDGAP SEMICONDUCTORS  

E-print Network

THz Emission Spectroscopy of Narrow Bandgap Semiconductors. By Ricardo Ascázubi. A thesis submitted ... (table-of-contents fragments: time-domain spectroscopy; optically excited THz emission processes; THz-TDS setup).

Wilke, Ingrid

283

The fly's eye camera system  

NASA Astrophysics Data System (ADS)

We introduce the Fly's Eye Camera System, an all-sky monitoring device intended to perform time-domain astronomy. This camera system will provide complementary data sets for other synoptic sky surveys such as LSST or Pan-STARRS. The effective field of view is obtained by 19 cameras arranged in a spherical mosaic. These individual cameras stand on a hexapod mount that is fully capable of sidereal tracking for the subsequent exposures. This platform has many advantages. First of all, it requires only one type of moving component and does not include unique parts. Hence this design not only eliminates problems implied by unique elements, but the redundancy of the hexapod also allows smooth operation even if one or two of the legs are stuck. In addition, it can calibrate itself by observed stars independently of both the geographical location (including the northern and southern hemispheres) and the polar alignment of the full mount. All mechanical elements and electronics were designed in-house at our institute, Konkoly Observatory. Currently, our instrument is in the testing phase with an operating hexapod and a reduced number of cameras.

Mészáros, L.; Pál, A.; Csépány, G.; Jaskó, A.; Vida, K.; Oláh, K.; Mezö, G.

2014-12-01

284

Development of filter exchangeable 3CCD camera for multispectral imaging acquisition  

NASA Astrophysics Data System (ADS)

There are many methods of acquiring multispectral images, but a dynamically band-selective, area-scan multispectral camera has not yet been developed. This research focused on the development of a filter-exchangeable 3CCD camera modified from a conventional 3CCD camera. The camera consists of an F-mount lens, an image splitter without dichroic coating, three bandpass filters, three image sensors, a filter-exchangeable frame, and an electric circuit for parallel image signal processing. In addition, firmware and application software were developed. Remarkable improvements compared to a conventional 3CCD camera are its redesigned image splitter and the filter-exchangeable frame. Computer simulation was required to visualize the path of rays inside the prism when redesigning the image splitter. The dimensions of the splitter were then determined by computer simulation, assuming BK7 glass and no dichroic coating, so that rays at all wavelengths reach all three film planes. The image splitter was verified with two narrow-band line lasers. The filter-exchangeable frame is designed so that bandpass filters can be swapped without displacing the image sensors on the film plane. The developed 3CCD camera was evaluated in an application detecting scab and bruise on Fuji apples. As a result, the filter-exchangeable 3CCD camera could provide meaningful functionality for various multispectral applications that require exchanging bandpass filters.

Lee, Hoyoung; Park, Soo Hyun; Kim, Moon S.; Noh, Sang Ha

2012-05-01

285

Hot Wax Sweeps Debris From Narrow Passages  

NASA Technical Reports Server (NTRS)

Safe and effective technique for removal of debris and contaminants from narrow passages involves entrainment of undesired material in thermoplastic casting material. Semisolid wax slightly below melting temperature pushed along passage by pressurized nitrogen to remove debris. Devised to clean out fuel passages in main combustion chamber of Space Shuttle main engine. Also applied to narrow, intricate passages in internal-combustion-engine blocks, carburetors, injection molds, and other complicated parts.

Ricklefs, Steven K.

1990-01-01

286

'Magic Angle Precession'  

SciTech Connect

An advanced and exact geometric description of nonlinear precession dynamics modeling very accurately natural and artificial couplings showing Lorentz symmetry is derived. In the linear description it is usually ignored that the geometric phase of relativistic motion couples back to the orbital motion providing for a non-linear recursive precession dynamics. The high coupling strength in the nonlinear case is found to be a gravitomagnetic charge proportional to the precession angle and angular velocity generated by geometric phases, which are induced by high-speed relativistic rotations and are relevant to propulsion technologies but also to basic interactions. In the quantum range some magic precession angles indicating strong coupling in a phase-locked chaotic system are identified, emerging from a discrete time dynamical system known as the cosine map showing bifurcations at special precession angles relevant to heavy nuclei stability. The 'Magic Angle Precession' (MAP) dynamics can be simulated and visualized by cones rolling in or on each other, where the apex and precession angles are indexed by spin, charge or precession quantum numbers, and corresponding magic angles. The most extreme relativistic warping and twisting effect is given by the Dirac spinor half spin constellation with 'Hyperdiamond' MAP, which resembles quark confinement.

Binder, Bernd [Quanics.com, Germany, 88679 Salem, P.O. Box 1247 (United States)], E-mail: binder@quanics.com

2008-01-21

287

Camera-on-a-Chip  

NASA Technical Reports Server (NTRS)

Jet Propulsion Laboratory's research on a second generation, solid-state image sensor technology has resulted in the Complementary Metal-Oxide Semiconductor (CMOS) Active Pixel Sensor, establishing an alternative to the Charge-Coupled Device (CCD). Photobit Corporation, the leading supplier of CMOS image sensors, has commercialized two products based on this technology: the PB-100 and PB-300. These devices are cameras on a chip, combining all camera functions. CMOS "active-pixel" digital image sensors offer several advantages over CCDs, a technology used in video and still-camera applications for 30 years. The CMOS sensors draw less energy, they use the same manufacturing platform as most microprocessors and memory chips, and they allow on-chip programming of frame size, exposure, and other parameters.

1999-01-01

288

Perceptual color characterization of cameras.  

PubMed

Color camera characterization, mapping outputs from the camera sensors to an independent color space, such as XYZ, is an important step in the camera processing pipeline. Until now, this procedure has been primarily solved by using a 3 × 3 matrix obtained via a least-squares optimization. In this paper, we propose to use the spherical sampling method, recently published by Finlayson et al., to perform a perceptual color characterization. In particular, we search for the 3 × 3 matrix that minimizes three different perceptual errors, one pixel based and two spatially based. For the pixel-based case, we minimize the CIE ΔE error, while for the spatial-based case, we minimize both the S-CIELAB error and the CID error measure. Our results demonstrate an improvement of approximately 3% for the ΔE error, 7% for the S-CIELAB error and 13% for the CID error measures. PMID:25490586

Vazquez-Corral, Javier; Connah, David; Bertalmío, Marcelo

2014-01-01
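The conventional least-squares baseline that the abstract above contrasts with can be sketched in a few lines. The patch data and the "true" matrix below are invented for illustration; the paper's perceptual (ΔE, S-CIELAB, CID) optimization is not reproduced here.

```python
import numpy as np

# Hypothetical data: camera RGB responses and ground-truth XYZ values for a
# set of 24 color patches (e.g. a color chart). All values are invented.
rng = np.random.default_rng(0)
M_true = np.array([[0.41, 0.36, 0.18],
                   [0.21, 0.72, 0.07],
                   [0.02, 0.12, 0.95]])
rgb = rng.uniform(0.0, 1.0, size=(24, 3))   # N x 3 camera responses
xyz = rgb @ M_true.T                        # N x 3 target XYZ values

# Least-squares estimate of the 3x3 characterization matrix, minimizing
# ||rgb @ X - xyz||^2 column by column.
X, *_ = np.linalg.lstsq(rgb, xyz, rcond=None)
M = X.T                                     # so that xyz_row ~= M @ rgb_row

assert np.allclose(M, M_true, atol=1e-8)    # exact data -> exact recovery
```

With noise-free synthetic data the fit recovers the matrix exactly; with real patch measurements the residual reflects sensor noise and metamerism.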

289

Perceptual Color Characterization of Cameras  

PubMed Central

Color camera characterization, mapping outputs from the camera sensors to an independent color space, such as XYZ, is an important step in the camera processing pipeline. Until now, this procedure has been primarily solved by using a 3 × 3 matrix obtained via a least-squares optimization. In this paper, we propose to use the spherical sampling method, recently published by Finlayson et al., to perform a perceptual color characterization. In particular, we search for the 3 × 3 matrix that minimizes three different perceptual errors, one pixel based and two spatially based. For the pixel-based case, we minimize the CIE ΔE error, while for the spatial-based case, we minimize both the S-CIELAB error and the CID error measure. Our results demonstrate an improvement of approximately 3% for the ΔE error, 7% for the S-CIELAB error and 13% for the CID error measures. PMID:25490586

Vazquez-Corral, Javier; Connah, David; Bertalmío, Marcelo

2014-01-01

290

Solder wetting kinetics in narrow V-grooves  

SciTech Connect

Experiments are performed to observe capillary flow in grooves cut into copper surfaces. Flow kinetics of two liquids, 1-heptanol and eutectic Sn-Pb solder, are modeled with modified Washburn kinetics and compared to flow data. It is shown that both liquids flow parabolically in narrow V-grooves, and the data scale as predicted by the modified Washburn model. The early portions of the flow kinetics are characterized by curvature in the length vs time relationship which is not accounted for in the modified Washburn model. This effect is interpreted in terms of a dynamic contact angle. It is concluded that under conditions of rapid flow, solder spreading can be understood as a simple fluid flow process. Slower kinetics, e.g. solder droplet spreading on flat surfaces, may be affected by subsidiary chemical processes such as reaction.

Yost, F.G.; Rye, R.R. [Sandia National Labs., Albuquerque, NM (United States)]; Mann, J.A. Jr. [Case Western Reserve Univ., Cleveland, OH (United States)]

1997-12-01
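The modified Washburn kinetics referred to above predict parabolic flow, L² ∝ t. A minimal sketch of recovering the Washburn coefficient from length-versus-time data follows; the coefficient is invented, not a measured solder value.

```python
import numpy as np

# Washburn-type parabolic flow in a groove: wetted length obeys L^2 = k * t.
# k below is an illustrative coefficient, not a measured solder value.
k = 2.5e-6                            # m^2/s, hypothetical
t = np.linspace(0.1, 10.0, 50)        # time samples, s
L = np.sqrt(k * t)                    # wetted length, m

# Recover k by a least-squares fit of L^2 against t (line through origin):
# minimize sum (L^2 - k*t)^2  =>  k = sum(L^2 * t) / sum(t^2).
k_fit = np.sum(L**2 * t) / np.sum(t**2)
assert abs(k_fit - k) / k < 1e-9
```

The early-time curvature the abstract mentions (attributed to a dynamic contact angle) would show up as a systematic residual in such a fit.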

291

ERICA PLUS: compact MWIR camera with 20x step zoom optics and advanced processing  

NASA Astrophysics Data System (ADS)

The development of a compact, high-performance MWIR step-zoom camera based on a 640x480 staring focal plane array (FPA) is described. The camera has a 20x magnification step zoom ranging from 24° x 20° for the wide field of view to 1.2° x 1° for the narrow field of view, and an aperture of F#4. The processing electronics is based on a flexible and expandable architecture. Special emphasis is placed on the solutions adopted for the design of this high-zoom-ratio, fast-optics FLIR and on the electronic architecture and algorithms for image processing. An overview of the performance is given.

Porta, A.; Romagnoli, M.; Lavacchini, P.; Olivieri, M.; Torrini, D.

2007-10-01

292

Irradiation induced changes in small angle grain boundaries in mosaic Cu thin films  

NASA Astrophysics Data System (ADS)

We studied the effect of irradiation on small angle grain boundaries in mosaic structured Cu thin films. The films showed a decrease in mosaic spread via a narrowing of the full width at half maximum in XRD rocking curves and a smaller minimum yield of RBS channeling after irradiation. These data indicate the irradiation decreased the misorientation angles between mosaic blocks separated by small angle grain boundaries. Mechanisms involving interactions between grain boundary dislocations and irradiation induced defects are discussed.

Fu, E. G.; Wang, Y. Q.; Zou, G. F.; Xiong, J.; Zhuo, M. J.; Wei, Q. M.; Baldwin, J. K.; Jia, Q. X.; Shao, L.; Misra, A.; Nastasi, M.

2012-07-01

293

The GRAVITY metrology system: narrow-angle astrometry via phase-shifting interferometry  

E-print Network

The VLTI instrument GRAVITY will provide very powerful astrometry by combining the light from four telescopes for two objects simultaneously. It will measure the angular separation between the two astronomical objects to a precision of 10 microarcseconds. This corresponds to a differential optical path difference (dOPD) between the targets of a few nanometers, and the paths within the interferometer have to be maintained stable to that level. For this purpose, the novel metrology system of GRAVITY will monitor the internal dOPDs by means of phase-shifting interferometry. We present the four-step phase-shifting concept of the metrology with emphasis on the method used for calibrating the phase shifts. The latter is based on a phase-step insensitive algorithm which unambiguously extracts phases in contrast to other methods that are strongly limited by non-linearities of the phase-shifting device. The main constraint of this algorithm is to introduce a robust ellipse fitting routine. Via this approach we are able t...

Lippa, M; Gillessen, S; Kok, Y; Weber, J; Eisenhauer, F; Pfuhl, O; Janssen, A; Haug, M; Haußmann, F; Kellner, S; Hans, O; Wieprecht, E; Ott, T; Burtscher, L; Genzel, R; Sturm, E; Hofmann, R; Huber, S; Huber, D; Senftleben, S; Pflüger, A; Greßmann, R; Perrin, G; Perraut, K; Brandner, W; Straubmeier, C; Amorim, A; Schöller, M

2015-01-01
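For context, the textbook four-step phase-shifting formula (with ideal quarter-wave steps) recovers a phase from four intensity samples. Note that GRAVITY's metrology uses a phase-step-insensitive ellipse-fitting algorithm precisely because real phase shifters deviate from these ideal steps; that algorithm is not reproduced here, and all numbers below are invented.

```python
import numpy as np

# Four-step phase shifting: intensities sampled at offsets 0, pi/2, pi, 3pi/2.
# A (offset), B (fringe contrast) and phi (phase to recover) are invented.
A, B = 1.0, 0.4
phi = 0.73                                       # radians
steps = np.array([0.0, np.pi / 2, np.pi, 3 * np.pi / 2])
I = A + B * np.cos(phi + steps)                  # four "measured" intensities

# I[3] - I[1] = 2B sin(phi),  I[0] - I[2] = 2B cos(phi)
phi_est = np.arctan2(I[3] - I[1], I[0] - I[2])
assert abs(phi_est - phi) < 1e-12
```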

294

Color decorrelation for the PHOBOS mission camera experiment  

NASA Astrophysics Data System (ADS)

The surface characteristics of Phobos are reexamined based on new images provided by the VSK-Fregat camera experiment together with modern processing techniques for color analysis. The VSK-Fregat camera provided a quasi-simultaneous recording of panchromatic high resolution images together with lower resolution two-channel spectral images. Contrast enhancement, geometrical coregistration, band ratioing, principal component analysis, and HSI-color transformations were all performed during image processing. It is concluded that at low phase angles the crater rims appear brighter and redder than the surrounding material and that a slightly reddish patch was discovered that cannot be explained simply by topographic or illumination effects. It is also concluded, however, that the bright crater rims and the slightly reddish surface patch cannot be attributed merely to detector noise or similar effects. Similar observations were made by other research teams and it is hypothesized that a decrease in particle size may be responsible for the reddish appearance in these areas.

Hauber, E.; Regner, P.; Schmidt, K.; Neukum, G.; Schwarz, G.

1991-02-01

295

A cubic gamma camera with an active collimator.  

PubMed

Mechanical collimation with photon absorption and electronic collimation using Compton scattering are combined to form a cubic gamma camera with an active collimator. The collimator is made active by constructing it with a uniformly redundant array of patterned Bi4Ge3O12 (BGO) scintillators, which not only attenuates incident radiation but also detects scattered radiation, in a gamma camera consisting of five planar CsI(Na) scintillators. The entire module forms a cubic structure that generates images on the basis of radiation interactions from every direction. The coverage angle for detecting scattered radiation is 2π, with a detection efficiency approximately 17 times higher than previous systems that comprised only one pair of detectors. PMID:24709608

Lee, Taewoong; Lee, Wonho

2014-08-01

296

Recent advances in digital camera optics  

NASA Astrophysics Data System (ADS)

The digital camera market has expanded enormously in the last ten years. The zoom lens for a digital camera is the key factor determining the camera body size and image quality. Its technologies have been based on several analog technological advances, including methods of aspherical lens manufacturing and mechanisms of image stabilization. Panasonic is one of the pioneers of both technologies. I will introduce previous trends in zoom lens optics as well as original optical technologies of the Panasonic digital camera "LUMIX", and in addition the optics in 3D camera systems. I would also like to consider future trends in digital cameras.

Ishiguro, Keizo

2012-10-01

297

Measuring Distances Using Digital Cameras  

ERIC Educational Resources Information Center

This paper presents a generic method of calculating accurate horizontal and vertical object distances from digital images taken with any digital camera and lens combination, where the object plane is parallel to the image plane or tilted in the vertical plane. This method was developed for a project investigating the size, density and spatial…

Kendal, Dave

2007-01-01
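The geometry underlying such distance calculations, for the simple case of an object plane parallel to the image plane, is the pinhole relation distance = focal length × object size / image size. A minimal sketch with invented numbers; the paper's own formulation for tilted object planes is not reproduced.

```python
# Pinhole relation for an object plane parallel to the image plane:
#   distance = focal_length * object_size / image_size
# All numbers are illustrative, not from the paper.
focal_length_mm = 50.0
object_height_m = 1.8      # known real-world object height
image_height_mm = 3.0      # measured height of the object on the sensor

distance_m = focal_length_mm * object_height_m / image_height_mm
assert abs(distance_m - 30.0) < 1e-9   # 50 * 1.8 / 3 = 30 m
```

Note that focal length and image size must be in the same units (millimeters here), so the result carries the units of the object height.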

298

Close-Range Camera Calibration  

Microsoft Academic Search

For highest accuracies it is necessary in close range photogrammetry to account for the variation of lens distortion within the photographic field. A theory to accomplish this is developed along with a practical method for calibrating radial and decentering distortion of close-range cameras. This method, the analytical plumb line method, is applied in an experimental investigation leading to confirmation of the validity of the theoretical

Duane C. Brown

1971-01-01

299

All-sky camera revitalized  

Microsoft Academic Search

Development and implementation of a low-cost all-sky camera (ASC) system is reported. The ASC system provides continuous unmanned recording for about a month before tapes are replaced. Radiometric calibration, geometric correction and projection of the image onto a geographic or geomagnetic coordinate system are performed by user-friendly software.

Israel Oznovich; Ronald Yee; Andreas Schiffler; Donald J. McEwen; George J. Sofko

1994-01-01

300

All-sky camera revitalized  

NASA Astrophysics Data System (ADS)

Development and implementation of a low-cost all-sky camera (ASC) system is reported. The ASC system provides continuous unmanned recording for about a month before tapes are replaced. Radiometric calibration, geometric correction and projection of the image onto a geographic or geomagnetic coordinate system are performed by user-friendly software.

Oznovich, Israel; Yee, Ronald; Schiffler, Andreas; McEwen, Donald J.; Sofko, George J.

1994-10-01

301

High speed multiwire photon camera  

NASA Technical Reports Server (NTRS)

An improved multiwire proportional counter camera having particular utility in the field of clinical nuclear medicine imaging. The detector utilizes direct coupled, low impedance, high speed delay lines, the segments of which are capacitor-inductor networks. A pile-up rejection test is provided to reject confused events otherwise caused by multiple ionization events occurring during the readout window.

Lacy, Jeffrey L. (Inventor)

1989-01-01

302

High speed multiwire photon camera  

NASA Technical Reports Server (NTRS)

An improved multiwire proportional counter camera having particular utility in the field of clinical nuclear medicine imaging. The detector utilizes direct coupled, low impedance, high speed delay lines, the segments of which are capacitor-inductor networks. A pile-up rejection test is provided to reject confused events otherwise caused by multiple ionization events occurring during the readout window.

Lacy, Jeffrey L. (Inventor)

1991-01-01

303

Gamma-ray camera flyby  

SciTech Connect

Animation based on an actual classroom demonstration of the prototype CCI-2 gamma-ray camera's ability to image a hidden radioactive source, a cesium-137 line source, in three dimensions. For more information see http://newscenter.lbl.gov/feature-stories/2010/06/02/applied-nuclear-physics/.

None

2010-01-01

304

New data on separation and position angle of selected binaries  

NASA Astrophysics Data System (ADS)

We report on a sample of the data acquired in May 2012 at the 31-inch NURO telescope at Anderson Mesa near Flagstaff, Arizona, pertaining to the separation and position angle of selected binary stars. A CCD camera coupled to the NURO telescope allows a simple and straightforward procedure for obtaining images of the binaries. Analysis of the images is straightforward, and both direct and software methodologies yield the separation and position angle of the binary stars. The data obtained are suitable for insertion in the Washington Double Star Catalog of the US Naval Observatory.

Muller, Rafael J.; Lopez, Andy J.; Torres, Brian S.; Mendoza, Lizyan; Vergara, Nelson; Cersosimo, Juan; Martinez, Luis

2015-01-01

305

Method for shaping and aiming narrow beams. [sonar mapping and target identification  

NASA Technical Reports Server (NTRS)

A sonar method and apparatus is described which utilizes a linear frequency chirp in a transmitter/receiver having a correlator to synthesize a narrow beamwidth pattern from otherwise broad-beam transducers when there is relative velocity between the transmitter/receiver and the target. The chirp is produced in a generator with a bandwidth, B, and time, T, chosen so that the time-bandwidth product, TB, is increased for a narrower angle. A replica of the chirp produced in the generator is time delayed and Doppler shifted for use as a reference in the receiver for correlation of received chirps from targets. This reference is Doppler shifted to select targets preferentially, thereby not only synthesizing a narrow beam but also aiming the beam in azimuth and elevation.

Heyser, R. C. (inventor)

1981-01-01
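The pulse-compression idea behind this patent abstract can be illustrated with a linear FM chirp and a correlator: matched correlation against a replica compresses the received pulse to a peak whose width scales as 1/B, so a larger time-bandwidth product TB gives a sharper response. Parameters below are invented, and the Doppler-shifted reference of the patent is omitted for brevity.

```python
import numpy as np

# Linear-frequency-modulated (LFM) chirp: instantaneous frequency sweeps
# from 0 to B Hz over duration T. All parameters are illustrative.
fs = 100_000.0                 # sample rate, Hz
T = 0.01                       # chirp duration, s
B = 20_000.0                   # swept bandwidth, Hz (TB = 200)
t = np.arange(int(T * fs)) / fs
chirp = np.cos(2 * np.pi * (B / (2 * T)) * t**2)

# Simulated echo: the chirp returns after a round-trip delay (in samples).
delay = 300
rx = np.concatenate([np.zeros(delay), chirp, np.zeros(200)])

# Correlate the received signal against the replica; the compressed pulse
# peaks at the true delay.
corr = np.correlate(rx, chirp, mode="valid")
assert int(np.argmax(corr)) == delay
```

In the patent's scheme the replica would additionally be Doppler shifted, so the correlator peaks only for targets at the selected relative velocity, which is what synthesizes and aims the narrow beam.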

306

Wide-angle catadioptric lens with a rectilinear projection scheme  

Microsoft Academic Search

A catadioptric wide-angle lens having a rectilinear projection scheme has been developed with a view to possible applications in the security-surveillance area. The lens has been designed for a miniature camera with a video graphics array-grade 1/3 in. color CCD sensor. The field of view of the lens is over 151°, and still distortion is under 1%. Furthermore, the modulation

Gyeong-Il Kweon; Seung Hwang-Bo; Geon-Hee Kim; Sun-Cheol Yang; Young-Hun Lee

2006-01-01

307

Discovery of a narrow line quasar  

NASA Technical Reports Server (NTRS)

A stellar object is reported which, while having X-ray and optical luminosities typical of quasars, has narrow permitted and forbidden emission lines over the observed spectral range. The narrow-line spectrum is high-excitation, the Balmer lines seem to be recombinational, and a redder optical spectrum than that of most quasars is exhibited, despite detection as a weak radio source. The object does not conform to the relationships between H-beta parameters and X-ray flux previously claimed for a large sample of the active galactic nuclei. Because reddish quasars with narrow lines, such as the object identified, may not be found by the standard techniques for the discovery of quasars, the object may be a prototype of a new class of quasars analogous to high-luminosity Seyfert type 2 galaxies. It is suggested that these objects cannot comprise more than 10% of all quasars.

Stocke, J.; Liebert, J.; Maccacaro, T.; Griffiths, R. E.; Steiner, J. E.

1982-01-01

308

Narrowing of intersensory speech perception in infancy  

PubMed Central

The conventional view is that perceptual/cognitive development is an incremental process of acquisition. Several striking findings have revealed, however, that the sensitivity to non-native languages, faces, vocalizations, and music that is present early in life declines as infants acquire experience with native perceptual inputs. In the language domain, the decline in sensitivity is reflected in a process of perceptual narrowing that is thought to play a critical role during the acquisition of a native-language phonological system. Here, we provide evidence that such a decline also occurs in infant response to multisensory speech. We found that infant intersensory response to a non-native phonetic contrast narrows between 6 and 11 months of age, suggesting that the perceptual system becomes increasingly more tuned to key native-language audiovisual correspondences. Our findings lend support to the notion that perceptual narrowing is a domain-general as well as a pan-sensory developmental process. PMID:19541648

Pons, Ferran; Lewkowicz, David J.; Soto-Faraco, Salvador; Sebastián-Gallés, Núria

2009-01-01

309

Future Planetary Surface Imager Development by the Beagle 2 Stereo Camera System Team  

NASA Astrophysics Data System (ADS)

The Stereo Camera System provided Beagle 2 with wide-angle multi-spectral stereo imaging (IFOV=0.043°). The SCS team plans to build on this design heritage to provide improved stereo capabilities to the Pasteur payload of the Aurora ExoMars rover.

Griffiths, A. D.; Coates, A. J.; Josset, J.-L.; Paar, G.

2004-03-01

310

PHOTOGRAMMETRIC MAPPING OF MEDITERRANEAN DEFENSE STRUCTURES USING AN AMATEUR DIGITAL CAMERA, GPS AND THEODOLITE  

Microsoft Academic Search

This work deals with the precise mapping of external faces of a structure called Tekes within the Medieval castle of Mytilene, Greece. The methodology using GPS and theodolite angle measurements to establish control on the faces of the structure is well described. Calibration procedures for an amateur digital camera which is used to make stereo photographs are presented. Final results

John N. Hatzopoulos; Christos Vasilakos

311

Measurement of the surface wavelength distribution of narrow-band radiation by a colorimetric method  

SciTech Connect

A method is suggested for determining the wavelength of narrow-band light from a digital photograph of a radiating surface. The digital camera used should be appropriately calibrated. The accuracy of the wavelength measurement is better than 1 nm. The method was tested on the yellow doublet of mercury spectrum and on the adjacent continuum of the incandescent lamp radiation spectrum. By means of the method suggested the homogeneity of holographic sensor swelling was studied in stationary and transient cases. (laser applications and other topics in quantum electronics)

Kraiskii, A V; Mironova, T V; Sultanov, T T [P N Lebedev Physical Institute, Russian Academy of Sciences, Moscow (Russian Federation)

2010-09-10

312

Boiling Visualization and Critical Heat Flux Phenomena In Narrow Rectangular Gap  

SciTech Connect

An experimental study was performed to investigate the pool boiling critical heat flux (CHF) on one-dimensional inclined rectangular channels with narrow gaps by changing the orientation of a copper test heater assembly. In a pool of saturated water at atmospheric pressure, the test parameters included gap sizes of 1, 2, 5, and 10 mm, and surface orientation angles from the downward-facing position (180 degrees) to the vertical position (90 degrees).

J. J. Kim; Y. H. Kim; S. J. Kim; S. W. Noh; K. Y. Suh; J. Rempe; F. B. Cheung; S. B. Kim

2004-12-01

313

Taxicab Angles and Trigonometry  

E-print Network

A natural analogue to angles and trigonometry is developed in taxicab geometry. This structure is then analyzed to see which, if any, congruent triangle relations hold. A nice application involving the use of parallax to determine the exact (taxicab) distance to an object is also discussed.

Thompson, Kevin

2011-01-01

314

Casting and Angling.  

ERIC Educational Resources Information Center

As part of a series of books and pamphlets on outdoor education, this manual consists of easy-to-follow instructions for fishing activities dealing with casting and angling. The manual may be used as a part of the regular physical education program in schools and colleges or as a club activity for the accomplished weekend fisherman or the…

Smith, Julian W.

315

Selection of video cameras for stroboscopic videolaryngoscopy.  

PubMed

Stroboscopic evaluation for the analysis of laryngeal function and disease has been reemphasized recently and its routine clinical use recommended. Many have found, however, that it is not always possible to obtain consistently satisfactory video images of stroboscopic laryngoscopy. The problem is related to the low intensity of the xenon light source during stroboscopy. The authors have tried many different video cameras available, along with the Brüel & Kjaer Rhino-Larynx Stroboscope type 4914, and two types of endoscopes (flexible and rigid). The cameras included 1) single tube camera, 2) single chip metal oxide semiconductors (MOS) solid-state camera, 3) single chip charge-coupled devices (CCD) solid-state camera, 4) three-tube camera, and 5) three-chip CCD camera. Currently available video cameras and their adaptability for stroboscopic videolaryngoscopy are discussed. PMID:3674656

Yanagisawa, E; Godley, F; Muta, H

1987-01-01

316

General linear cameras : theory and applications  

E-print Network

I present a General Linear Camera (GLC) model that unifies many previous camera models into a single representation. The GLC model describes all perspective (pinhole), orthographic, and many multiperspective (including ...

Yu, Jingyi, 1978-

2005-01-01

317

Lytro camera technology: theory, algorithms, performance analysis  

NASA Astrophysics Data System (ADS)

The Lytro camera is the first implementation of a plenoptic camera for the consumer market. We consider it a successful example of the miniaturization aided by the increase in computational power characterizing mobile computational photography. The plenoptic camera approach to radiance capture uses a microlens array as an imaging system focused on the focal plane of the main camera lens. This paper analyzes the performance of Lytro camera from a system level perspective, considering the Lytro camera as a black box, and uses our interpretation of Lytro image data saved by the camera. We present our findings based on our interpretation of Lytro camera file structure, image calibration and image rendering; in this context, artifacts and final image resolution are discussed.

Georgiev, Todor; Yu, Zhan; Lumsdaine, Andrew; Goma, Sergio

2013-03-01

318

21 CFR 892.1110 - Positron camera.  

Code of Federal Regulations, 2011 CFR

... Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES RADIOLOGY DEVICES Diagnostic Devices § 892.1110 Positron camera. (a) Identification. A positron camera is a...

2011-04-01

319

21 CFR 892.1110 - Positron camera.  

Code of Federal Regulations, 2013 CFR

... Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES RADIOLOGY DEVICES Diagnostic Devices § 892.1110 Positron camera. (a) Identification. A positron camera is a...

2013-04-01

320

21 CFR 892.1110 - Positron camera.  

Code of Federal Regulations, 2014 CFR

... Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES RADIOLOGY DEVICES Diagnostic Devices § 892.1110 Positron camera. (a) Identification. A positron camera is a...

2014-04-01

321

21 CFR 892.1110 - Positron camera.  

Code of Federal Regulations, 2010 CFR

... Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES RADIOLOGY DEVICES Diagnostic Devices § 892.1110 Positron camera. (a) Identification. A positron camera is a...

2010-04-01

322

21 CFR 892.1110 - Positron camera.  

Code of Federal Regulations, 2012 CFR

... Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES RADIOLOGY DEVICES Diagnostic Devices § 892.1110 Positron camera. (a) Identification. A positron camera is a...

2012-04-01

323

Apogee Imaging Systems Camera Installation Guide  

E-print Network

camera 4) Begin using your Apogee camera! 1.1 Install MaxIm DL/CCD Your Apogee camera may have included a copy of MaxIm DL/CCD image capture and processing software, developed by Diffraction Limited. If you will be using the MaxIm DL/CCD software, we recommend that you install it prior to setting up your Apogee camera

Kleinfeld, David

324

Camera-based interactions for augmented reality  

Microsoft Academic Search

We investigate camera-based interaction techniques suitable for generating simple, easy-to-use augmented reality applications. All the interaction techniques described are based on using only the camera as input device, so that users can interact with 3D content of the application by gesturing with camera movements. With two test applications, we have identified several camera-based interaction techniques that work well with interactive

Tatu Harviainen; Otto Korkalo; Charles Woodward

2009-01-01

325

Corrective Optics For Camera On Telescope  

NASA Technical Reports Server (NTRS)

Assembly of tilted, aspherical circularly symmetric mirrors used as corrective optical subsystem for camera mounted on telescope exhibiting both large spherical wave-front error and inherent off-axis astigmatism. Subsystem provides unobscured camera aperture and diffraction-limited camera performance, despite large telescope aberrations. Generic configuration applied in other optical systems in which aberrations deliberately introduced into telescopes and corrected in associated cameras. Concept of corrective optical subsystem provides designer with additional degrees of freedom used to optimize optical system.

Macenka, Steven A.; Meinel, Aden B.

1994-01-01

326

The Streak Camera Development at LLE  

SciTech Connect

The Diagnostic Development Group at the Laboratory for Laser Energetics has endeavored to build a stand-alone, remotely operated streak camera with comprehensive autofocus and self-calibration capability. Designated as the Rochester Optical Streak System (ROSS), it is a generic streak camera platform, capable of accepting a variety of streak tubes. The system performance is limited by the installed tube's electron optics, not by any camera subsystem. Moreover, the ROSS camera can be photometrically calibrated.

Jaanimagi, P.A.; Boni, R.; Butler, D.; Ghosh, S.; Donaldson, W.R.; Keck, R.L.

2005-03-31

327

Optimising Camera Traps for Monitoring Small Mammals  

PubMed Central

Practical techniques are required to monitor invasive animals, which are often cryptic and occur at low density. Camera traps have potential for this purpose, but may have problems detecting and identifying small species. A further challenge is how to standardise the size of each camera’s field of view so capture rates are comparable between different places and times. We investigated the optimal specifications for a low-cost camera trap for small mammals. The factors tested were 1) trigger speed, 2) passive infrared vs. microwave sensor, 3) white vs. infrared flash, and 4) still photographs vs. video. We also tested a new approach to standardise each camera’s field of view. We compared the success rates of four camera trap designs in detecting and taking recognisable photographs of captive stoats (Mustela erminea), feral cats (Felis catus) and hedgehogs (Erinaceus europaeus). Trigger speeds of 0.2–2.1 s captured photographs of all three target species unless the animal was running at high speed. The camera with a microwave sensor was prone to false triggers, and often failed to trigger when an animal moved in front of it. A white flash produced photographs that were more readily identified to species than those obtained under infrared light. However, a white flash may be more likely to frighten target animals, potentially affecting detection probabilities. Video footage achieved similar success rates to still cameras but required more processing time and computer memory. Placing two camera traps side by side achieved a higher success rate than using a single camera. Camera traps show considerable promise for monitoring invasive mammal control operations. Further research should address how best to standardise the size of each camera’s field of view, maximise the probability that an animal encountering a camera trap will be detected, and eliminate visible or audible cues emitted by camera traps. PMID:23840790

Glen, Alistair S.; Cockburn, Stuart; Nichols, Margaret; Ekanayake, Jagath; Warburton, Bruce

2013-01-01

328

Solid state replacement of rotating mirror cameras  

Microsoft Academic Search

Rotating mirror cameras have been the mainstay of mega-frame-per-second imaging for decades. There is still no electronic camera that can match a film-based rotary mirror camera for the combination of frame count, speed, resolution and dynamic range. The rotary mirror cameras are predominantly used in the range of 0.1 to 100 microseconds per frame, for 25 to

Alan M. Frank; Joseph M. Bartolick

2007-01-01

329

Interference-induced angle-independent acoustical transparency  

NASA Astrophysics Data System (ADS)

It is revealed that the Fano-like interference leads to the extraordinary acoustic transmission through a slab metamaterial of thickness much smaller than the wavelength, with each unit cell consisting of a Helmholtz resonator and a narrow subwavelength slit. More importantly, both the theoretical analysis and experimental measurement show that the angle-independent acoustical transparency can be realized by grafting a Helmholtz resonator and a quarter-wave resonator to the wall of a narrow subwavelength slit in each unit cell of a slit array. The observed phenomenon results from the interferences between the waves propagating in the slit, those re-radiated by the Helmholtz resonator, and those re-radiated by the quarter-wave resonator. The proposed design may find its applications in designing angle-independent acoustical filters and controlling the phase of the transmitted waves.

Qi, Lehua; Yu, Gaokun; Wang, Xinlong; Wang, Guibo; Wang, Ning

2014-12-01

330

Collaborative real-time scheduling of multiple PTZ cameras for multiple object tracking in video surveillance  

NASA Astrophysics Data System (ADS)

This paper proposes a multi-PTZ-camera control mechanism to acquire close-up imagery of human objects in a surveillance system. The control algorithm is based on the output of multi-camera, multi-target tracking. Three main concerns of the algorithm are (1) the imagery of the human object's face for biometric purposes, (2) the optimal video quality of the human objects, and (3) minimum hand-off time. Here, we define an objective function based on the expected capture conditions such as the camera-subject distance, pan-tilt angles of capture, face visibility and others. Such an objective function serves to effectively balance the number of captures per subject and quality of captures. In the experiments, we demonstrate the performance of the system, which operates in real time under real-world conditions on three PTZ cameras.

Liu, Yu-Che; Huang, Chung-Lin

2013-03-01

331

Self-Calibration of Stationary Cameras  

Microsoft Academic Search

A new practical method is given for the self-calibration of a camera. In this method, at least three images are taken from the same point in space with different orientations of the camera and calibration is computed from an analysis of point matches between the images. The method requires no knowledge of the orientations of the camera. Calibration is based

Richard I. Hartley

1997-01-01

332

Digital video camera workshop Sony VX2000  

E-print Network

Digital video camera workshop: Sony VX2000, Sony DSR-PDX10. Borrowing Eligibility · Currently · Completed quiz with score of 100%. Sony VX2000 and Panasonic AG-DVC7P · These are 3CCD cameras · Both · Firewire. Video Camera Operation · Installing the Battery · Sony VX2000 · Insert the battery with the arrow

333

Removing camera shake from a single photograph  

Microsoft Academic Search

Camera shake during exposure leads to objectionable image blur and ruins many photographs. Conventional blind deconvolution methods typically assume frequency-domain constraints on images, or overly simplified parametric forms for the motion path during camera shake. Real camera motions can follow convoluted paths, and a spatial domain prior can better maintain visually salient image characteristics. We introduce a method to

Robert Fergus; Barun Singh; Aaron Hertzmann; Sam T. Roweis; William T. Freeman

2006-01-01

334

Motion Tracking with an Active Camera  

Microsoft Academic Search

This work describes a method for real-time motion detection using an active camera mounted on a pan-tilt platform. Image mapping is used to align images of different viewpoints so that static camera motion detection can be applied. In the presence of camera position noise, the image mapping is inexact and compensation techniques fail. The use of morphological filtering of motion

Don Murray; Anup Basu

1994-01-01

335

A New Concept of Security Camera Monitoring  

Microsoft Academic Search

We present a novel framework for encoding images obtained by a security monitoring camera while protecting the privacy of moving objects in the images. We are motivated by the fact that although security monitoring cameras can deter crimes, they may infringe the privacy of the people and objects they record. Moving objects, whose privacy should

Kenichi YABUTA; Hitoshi KITAZAWA; Toshihisa TANAKA

336

Camera Self-Calibration: Theory and Experiments  

Microsoft Academic Search

The problem of finding the internal orientation of a camera (camera calibration) is extremely important for practical applications. In this paper a complete method for calibrating a camera is presented. In contrast with existing methods it does not require a calibration object with a known 3D shape. The new method requires only point matches from image sequences. It is shown,

Olivier D. Faugeras; Quang-tuan Luong; Stephen J. Maybank

1992-01-01

337

The Danish Faint Object Spectrograph and Camera  

E-print Network

Chapter 2: The Danish Faint Object Spectrograph and Camera. Camera linear field 30.7 × 30.7 mm; reduction ratio 0.58; on the Danish 1.54 m telescope (nominal values). The detector is operated at −100 °C; beware that the bias level is temperature

338

An auto-focusing CCD camera mount  

NASA Astrophysics Data System (ADS)

The traditional methods of focusing a CCD camera are time-consuming, difficult or, more importantly, indecisive. This paper describes a device designed to allow the observer to be confident that the camera will always be properly focused, by sensing a selected star image and automatically adjusting the camera's focal position.
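The closed-loop idea in the abstract (sense the star image, adjust focus) can be sketched as a scan-and-settle routine. The sharpness metric, the callback interface and the coarse-scan strategy below are illustrative assumptions, not details of Arbour's device.

```python
def autofocus(measure_sharpness, move_to, positions):
    """Scan candidate focal positions and settle on the sharpest.

    measure_sharpness(pos) -> float (higher = tighter star image),
    move_to(pos) drives the focuser. This does a single coarse scan;
    a real device would refine iteratively around the best position.
    """
    best_pos, best_val = None, float("-inf")
    for pos in positions:
        move_to(pos)
        val = measure_sharpness(pos)
        if val > best_val:
            best_pos, best_val = pos, val
    move_to(best_pos)  # settle at the sharpest position found
    return best_pos
```

In practice `measure_sharpness` might be the negated FWHM or the peak intensity of the selected star image.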

Arbour, R. W.

1994-08-01

339

Fundamental Matrix for Cameras with Radial Distortion  

Microsoft Academic Search

When deploying a heterogeneous camera network, or when using cheap zoom cameras like those in cell phones, it is not practical, if not impossible, to calibrate the radial distortion of each camera off-line using reference objects. It is rather desirable to have an automatic procedure without strong assumptions about the scene. In this paper, we present a new algorithm

João P. Barreto; Kostas Daniilidis

2005-01-01

340

Distributed Calibration of Smart Cameras John Jannotti  

E-print Network

-grained to allow fusion between overlapping camera views. This paper introduces Lighthouse, a distributed calibration system. Lighthouse finds matches between cameras, even between distant cameras, without centralizing observations. Lighthouse also contributes several advancements in the cooperative creation of GHTs, including boot

Jannotti, John

341

Cooling the dark energy camera instrument  

SciTech Connect

DECam, the camera for the Dark Energy Survey (DES), is undergoing general design and component testing. For an overview see DePoy et al. in these proceedings; for a description of the imager, see Cease et al. in these proceedings. The CCD instrument will be mounted at the prime focus of the CTIO Blanco 4 m telescope. The instrument temperature will be 173 K with a heat load of 113 W. In similar applications, cooling CCD instruments at the prime focus has been accomplished by three general methods: liquid nitrogen reservoirs have been constructed to operate in any orientation, pulse tube cryocoolers have been used when tilt angles are limited, and Joule-Thomson or Stirling cryocoolers have been used with smaller heat loads. Gifford-McMahon cooling has been used at the Cassegrain focus but not at the prime focus. For DES, the combined requirements of high heat load, temperature stability, low vibration, operation in any orientation, liquid nitrogen cost and limited available space led to the design of a pumped, closed-loop, circulating nitrogen system. At zenith the instrument will be twelve meters above the pump/cryocooler station. This cooling system is expected to have a 10,000 hour maintenance interval. This paper will describe the engineering basis, including the thermal model, unbalanced forces, cooldown time, and the single- and two-phase flow model.

Schmitt, R.L.; Cease, H.; /Fermilab; DePoy, D.; /Ohio State U.; Diehl, H.T.; Estrada, J.; Flaugher, B.; /Fermilab; Kuhlmann, S.; /Ohio State U.; Onal, Birce; Stefanik, A.; /Fermilab

2008-06-01

342

Normal Q-angle in an Adult Nigerian Population  

Microsoft Academic Search

The Q-angle has been studied among the adult Caucasian population with the establishment of reference values. Scientists are beginning to accept the concept of different human races. Physical variability exists between various African ethnic groups and Caucasians, as exemplified by differences in anatomic features such as a flat nose compared with a pointed nose, wide rather than narrow faces, and

Bade B. Omololu; Olusegun S. Ogunlade; Vinod K. Gopaldasani

2009-01-01

343

Combustion pinhole-camera system  

DOEpatents

A pinhole camera system is described utilizing a sealed optical-purge assembly which provides optical access into a coal combustor or other energy conversion reactors. The camera system basically consists of a focused-purge pinhole optical port assembly, a conventional TV vidicon receiver, an external, variable density light filter which is coupled electronically to the vidicon automatic gain control (agc). The key component of this system is the focused-purge pinhole optical port assembly which utilizes a purging inert gas to keep debris from entering the port and a lens arrangement which transfers the pinhole to the outside of the port assembly. One additional feature of the port assembly is that it is not flush with the interior of the combustor.

Witte, A.B.

1982-05-19

344

Electronographic cameras for space astronomy.  

NASA Technical Reports Server (NTRS)

Magnetically-focused electronographic cameras have been under development at the Naval Research Laboratory for use in far-ultraviolet imagery and spectrography, primarily in astronomical and optical-geophysical observations from sounding rockets and space vehicles. Most of this work has been with cameras incorporating internal optics of the Schmidt or wide-field all-reflecting types. More recently, we have begun development of electronographic spectrographs incorporating an internal concave grating, operating at normal or grazing incidence. We also are developing electronographic image tubes of the conventional end-window-photo-cathode type, for far-ultraviolet imagery at the focus of a large space telescope, with image formats up to 120 mm in diameter.

Carruthers, G. R.; Opal, C. B.

1972-01-01

345

Lazy Narrowing with Simplification Michael Hanus  

E-print Network

Lazy Narrowing with Simplification. Michael Hanus, Informatik II, RWTH Aachen, D-52056 Aachen. ... of two natural numbers which are represented by terms built from 0 and s: 0 + y → y (R1), s(x) + y → s(x + y) (R2). This paper is a revised version of papers that appeared in the proceedings of ESOP'94
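The two rewrite rules quoted in this fragment, R1: 0 + y → y and R2: s(x) + y → s(x + y), can be rendered directly on ground Peano terms. The sketch below shows only rewriting of variable-free terms; full narrowing, the paper's actual subject, additionally instantiates logic variables during rule application.

```python
# Peano terms: 0 is ("0",) and s(t) is ("s", t).
ZERO = ("0",)

def s(t):
    return ("s", t)

def add(x, y):
    """Evaluate x + y by the rewrite rules
    R1: 0 + y -> y and R2: s(x) + y -> s(x + y)."""
    if x == ZERO:        # R1
        return y
    if x[0] == "s":      # R2, recursing on the argument of s
        return s(add(x[1], y))
    raise ValueError("not a Peano numeral")

def to_int(t):
    """Count the s-constructors to read a numeral back as an int."""
    n = 0
    while t[0] == "s":
        n, t = n + 1, t[1]
    return n
```

For example, s(s(0)) + s(0) rewrites via R2, R2, then R1 to s(s(s(0))).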

Hanus, Michael

346

Policy message A narrow focus on conventional  

E-print Network

Policy message: A narrow focus on conventional sanitation technologies and top-down planning often prevents improvement of sanitation in poor settlements. Simple, affordable, effective tech studies featured here were conducted in: Lao PDR, Tanzania, and Nepal. Local solutions for sanitation, Urban

Richner, Heinz

347

Narrow-Band Applications of Communications Satellites.  

ERIC Educational Resources Information Center

This paper attempts to describe the advantages of "narrow-band" applications of communications satellites for education. It begins by discussing the general controversy surrounding the use of satellites in education, by placing the concern within the larger context of the general debate over the uses of new technologies in education, and by…

Cowlan, Bert; Horowitz, Andrew

348

The Advanced Camera for Surveys  

NSDL National Science Digital Library

The Johns Hopkins University describes the Advanced Camera for Surveys (ACS), which was installed in the Hubble Space Telescope in 2002 to "detect light from the ultraviolet to the near infrared." Users can view a photo gallery of the filters, detectors, optical bench, astronomers, and other aspects of ACS optical and mechanical components. While some parts of the website are restricted, scientists can find abstracts and full-text scientific papers, explanations of calibration, the coronagraph and other instruments, and press releases.

349

The PS1 Gigapixel Camera  

NASA Astrophysics Data System (ADS)

The world's largest and most advanced digital camera has been installed on the Pan-STARRS-1 (PS1) telescope on Haleakala, Maui. Built at the University of Hawaii at Manoa's Institute for Astronomy (IfA) in Honolulu, the gigapixel camera will capture images that will be used to scan the skies for killer asteroids, and to create the most comprehensive catalog of stars and galaxies ever produced. The CCD sensors at the heart of the camera were developed in collaboration with Lincoln Laboratory of the Massachusetts Institute of Technology. The image area, which is about 40 cm across, contains 60 identical silicon chips, each of which contains 64 independent imaging circuits. Each of these imaging circuits contains approximately 600 x 600 pixels, for a total of about 1.4 gigapixels in the focal plane. The CCDs themselves employ the innovative technology called "orthogonal transfer." Splitting the image area into about 4,000 separate regions in this way has three advantages: data can be recorded more quickly, saturation of the image by a very bright star is confined to a small region, and any defects in the chips affect only a small part of the image area. The CCD camera is controlled by an ultrafast 480-channel control system developed at the IfA. The individual CCD cells are grouped in 8 x 8 arrays on a single silicon chip called an orthogonal transfer array (OTA), which measures about 5 cm square. There are a total of 60 OTAs in the focal plane of each telescope.
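The pixel arithmetic in the abstract is internally consistent; treating the quoted counts as exact:

```python
otas = 60                     # orthogonal transfer arrays in the focal plane
cells_per_ota = 8 * 8         # independent imaging circuits per OTA
pixels_per_cell = 600 * 600   # approximate pixel count per circuit

regions = otas * cells_per_ota            # separately read regions
total_pixels = regions * pixels_per_cell  # total focal-plane pixel count
```

This reproduces both quoted figures: 3,840 regions ("about 4,000") and 1.3824e9 pixels ("about 1.4 gigapixels").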

Tonry, John L.; Isani, S.; Onaka, P.

2007-12-01

350

Graphic design of pinhole cameras  

NASA Technical Reports Server (NTRS)

The paper describes a graphic technique for the analysis and optimization of pinhole size and focal length. The technique is based on the use of the transfer function of optical elements described by Scott (1959) to construct the transfer function of a circular pinhole camera. This transfer function is the response of a component or system to a pattern of lines having a sinusoidally varying radiance at varying spatial frequencies. Some specific examples of graphic design are presented.

Edwards, H. B.; Chu, W. P.

1979-01-01

351

The MVACS Robotic Arm Camera  

NASA Astrophysics Data System (ADS)

The Robotic Arm Camera (RAC) is one of the key instruments newly developed for the Mars Volatiles and Climate Surveyor payload of the Mars Polar Lander. This lightweight instrument employs a front lens with variable focus range and takes images at distances from 11 mm (image scale 1:1) to infinity. Color images with a resolution of better than 50 µm can be obtained to characterize the Martian soil. Spectral information of nearby objects is retrieved through illumination with blue, green, and red lamp sets. The design and performance of the camera are described in relation to the science objectives and operation. The RAC uses the same CCD detector array as the Surface Stereo Imager and shares the readout electronics with this camera. The RAC is mounted at the wrist of the Robotic Arm and can characterize the contents of the scoop, the samples of soil fed to the Thermal Evolved Gas Analyzer, the Martian surface in the vicinity of the lander, and the interior of trenches dug out by the Robotic Arm. It can also be used to take panoramic images and to retrieve stereo information with an effective baseline surpassing that of the Surface Stereo Imager by about a factor of 3.

Keller, H. U.; Hartwig, H.; Kramm, R.; Koschny, D.; Markiewicz, W. J.; Thomas, N.; Fernades, M.; Smith, P. H.; Reynolds, R.; Lemmon, M. T.; Weinberg, J.; Marcialis, R.; Tanner, R.; Boss, B. J.; Oquest, C.; Paige, D. A.

2001-08-01

352

ANIR : Atacama Near-Infrared Camera for the 1.0-m miniTAO Telescope  

E-print Network

We have developed a near-infrared camera called ANIR (Atacama Near-InfraRed camera) for the University of Tokyo Atacama Observatory 1.0m telescope (miniTAO) installed at the summit of Cerro Chajnantor (5640 m above sea level) in northern Chile. The camera provides a field of view of 5'.1 $\\times$ 5'.1 with a spatial resolution of 0".298 /pixel in the wavelength range of 0.95 to 2.4 $\\mu$m. Taking advantage of the dry site, the camera is capable of hydrogen Paschen-$\\alpha$ (Pa$\\alpha$, $\\lambda=$1.8751 $\\mu$m in air) narrow-band imaging observations, at which wavelength ground-based observations have been quite difficult due to deep atmospheric absorption mainly from water vapor. We have been successfully obtaining Pa$\\alpha$ images of Galactic objects and nearby galaxies since the first-light observation in 2009 with ANIR. The throughputs at the narrow-band filters ($N1875$, $N191$) including the atmospheric absorption show larger dispersion (~10%) than those at broad-band filters (a few %), indicating that ...

Konishi, Masahiro; Tateuchi, Ken; Takahashi, Hidenori; Kitagawa, Yutaro; Kato, Natsuko; Sako, Shigeyuki; Uchimoto, Yuka K; Toshikawa, Koji; Ohsawa, Ryou; Yamamuro, Tomoyasu; Asano, Kentaro; Ita, Yoshifusa; Kamizuka, Takafumi; Komugi, Shinya; Koshida, Shintaro; Manabe, Sho; Matsunaga, Noriyuki; Minezaki, Takeo; Morokuma, Tomoki; Nakashima, Asami; Takagi, Toshinobu; Tanabé, Toshihiko; Uchiyama, Mizuho; Aoki, Tsutomu; Doi, Mamoru; Handa, Toshihiro; Kato, Daisuke; Kawara, Kimiaki; Kohno, Kotaro; Miyata, Takashi; Nakamura, Tomohiko; Okada, Kazushi; Soyano, Takao; Tamura, Yoichi; Tanaka, Masuo; Tarusawa, Ken'ichi; Yoshii, Yuzuru

2015-01-01

353

Correction of calculation method for boresight on aerial remote sensing camera  

NASA Astrophysics Data System (ADS)

The boresight of the aerial remote sensing camera (ARSC) needs to be transferred to the reference coordinate system of the satellite after assembly of the whole satellite. Because it is difficult to sight the boresight directly once the camera is fixed to the satellite, the boresight must be elicited via a cube before fixing, so that the cube coordinate system can be transferred to the reference coordinate system. The boresight of the camera is measured by a theodolite. The orientation of the boresight can be solved by measuring the angles at the four corners of the CCD (the top left, bottom left, top right, and bottom right corners), from which the spatial angle of the boresight can be obtained. The limitations of the traditional methods of processing the boresight measurement data have been analyzed using Matlab. The trace of motion of the theodolite is provided as it rotates horizontally through a vertical angle around the vertical axis and rotates vertically through a horizontal angle around the horizontal axis. Based on vector combination theory, the normalized vector of the boresight can be obtained, and from it the spatial angle of the boresight can be calculated. Finally, this paper shows two applications in actual measurement.
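The corner-angle procedure described in the abstract can be sketched as combining the four corner direction vectors into a single normalized boresight vector. The (azimuth, elevation) angle convention and the simple normalized-sum combination below are assumptions for illustration, not the paper's exact method.

```python
import math

def direction(az_deg, el_deg):
    """Unit vector from theodolite horizontal (azimuth) and
    vertical (elevation) angles, in degrees."""
    az, el = math.radians(az_deg), math.radians(el_deg)
    return (math.cos(el) * math.cos(az),
            math.cos(el) * math.sin(az),
            math.sin(el))

def boresight_from_corners(corner_angles):
    """Normalized sum of the four CCD-corner direction vectors,
    taken as the boresight direction."""
    vs = [direction(az, el) for az, el in corner_angles]
    sx, sy, sz = (sum(c) for c in zip(*vs))
    n = math.sqrt(sx * sx + sy * sy + sz * sz)
    return (sx / n, sy / n, sz / n)
```

For corners placed symmetrically about the optical axis, the lateral components cancel and the result points along the axis.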

Xing, Hui; Mu, Sheng-bo; Chen, Jia-yi

2012-10-01

354

On the absolute calibration of SO2 cameras  

NASA Astrophysics Data System (ADS)

Sulphur dioxide emission rate measurements are an important tool for volcanic monitoring and eruption risk assessment. The SO2 camera technique remotely measures volcanic emissions by analysing the ultraviolet absorption of SO2 in a narrow spectral window between 300 and 320 nm using solar radiation scattered in the atmosphere. The SO2 absorption is selectively detected by mounting band-pass interference filters in front of a two-dimensional, UV-sensitive CCD detector. One important step for correct SO2 emission rate measurements that can be compared with other measurement techniques is a correct calibration. This requires conversion from the measured optical density to the desired SO2 column density (CD). The conversion factor is most commonly determined by inserting quartz cells (cuvettes) with known amounts of SO2 into the light path. Another calibration method uses an additional narrow field-of-view Differential Optical Absorption Spectroscopy system (NFOV-DOAS), which measures the column density simultaneously in a small area of the camera's field-of-view. This procedure combines the very good spatial and temporal resolution of the SO2 camera technique with the more accurate column densities obtainable from DOAS measurements. This work investigates the uncertainty of results gained through the two commonly used, but quite different, calibration methods (DOAS and calibration cells). Measurements with three different instruments, an SO2 camera, a NFOV-DOAS system and an Imaging DOAS (I-DOAS), are presented. We compare the calibration-cell approach with the calibration from the NFOV-DOAS system. The respective results are compared with measurements from an I-DOAS to verify the calibration curve over the spatial extent of the image. The results show that calibration cells, while working fine in some cases, can lead to an overestimation of the SO2 CD by up to 60% compared with CDs from the DOAS measurements. 
Besides these errors of calibration, radiative transfer effects (e.g. light dilution, multiple scattering) can significantly influence the results of both instrument types. The measurements presented in this work were taken at Popocatépetl, Mexico, between 1 March 2011 and 4 March 2011. Average SO2 emission rates between 4.00 and 14.34 kg s-1 were observed.
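The calibration-cell conversion described above (measured optical density to SO2 column density) amounts to fitting a proportionality constant from cells of known column density. The sketch below assumes a simple linear model tau = k · CD and uses made-up cell values; the real calibration, as the abstract notes, is complicated by cell reflections and radiative transfer effects.

```python
def fit_calibration(cell_cds, cell_taus):
    """Least-squares slope k through the origin for tau = k * CD,
    from calibration cells with known SO2 column densities
    (e.g. molecules/cm^2) and their measured optical densities."""
    num = sum(cd * tau for cd, tau in zip(cell_cds, cell_taus))
    den = sum(cd * cd for cd in cell_cds)
    return num / den

def optical_density_to_cd(tau, k):
    """Convert a measured optical density to an SO2 column density."""
    return tau / k
```

Applying `optical_density_to_cd` pixel-by-pixel turns an optical-density image into a column-density map, which can then be integrated across the plume to obtain emission rates.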

Lübcke, P.; Bobrowski, N.; Illing, S.; Kern, C.; Alvarez Nieves, J. M.; Vogel, L.; Zielcke, J.; Delgado Granados, H.; Platt, U.

2013-03-01

355

Passive Millimeter Wave Camera (PMMWC) at TRW  

NASA Technical Reports Server (NTRS)

Engineers at TRW, Redondo Beach, California, inspect the Passive Millimeter Wave Camera, a weather-piercing camera designed to see through fog, clouds, smoke and dust. Operating in the millimeter wave portion of the electromagnetic spectrum, the camera creates visual-like video images of objects, people, runways, obstacles and the horizon. A demonstration camera (shown in photo) has been completed and is scheduled for checkout tests and flight demonstration. Engineer (left) holds a compact, lightweight circuit board containing 40 complete radiometers, including antenna, monolithic millimeter wave integrated circuit (MMIC) receivers and signal processing and readout electronics that forms the basis for the camera's 1040-element focal plane array.

1997-01-01

356

Passive Millimeter Wave Camera (PMMWC) at TRW  

NASA Technical Reports Server (NTRS)

Engineers at TRW, Redondo Beach, California, inspect the Passive Millimeter Wave Camera, a weather-piercing camera designed to 'see' through fog, clouds, smoke and dust. Operating in the millimeter wave portion of the electromagnetic spectrum, the camera creates visual-like video images of objects, people, runways, obstacles and the horizon. A demonstration camera (shown in photo) has been completed and is scheduled for checkout tests and flight demonstration. Engineer (left) holds a compact, lightweight circuit board containing 40 complete radiometers, including antenna, monolithic millimeter wave integrated circuit (MMIC) receivers and signal processing and readout electronics that forms the basis for the camera's 1040-element focal plane array.

1997-01-01

357

Ejs Brewster's Angle Model  

NSDL National Science Digital Library

The Ejs Brewster's Angle model displays the electric field of an electromagnetic wave incident on a change of index of refraction. The simulation allows an arbitrarily linearly polarized wave (in parallel and perpendicular components) to encounter the change of index of refraction. The initial electric field, incidence angle, and change of index of refraction can all be changed via sliders. You can modify this simulation if you have Ejs installed by right-clicking within the plot and selecting "Open Ejs Model" from the pop-up menu item. The Ejs Brewster's Angle model was created using the Easy Java Simulations (Ejs) modeling tool. It is distributed as a ready-to-run (compiled) Java archive. Double clicking the ejs_ehu_waves_brewster.jar file will run the program if Java is installed. Ejs is a part of the Open Source Physics Project and is designed to make it easier to access, modify, and generate computer models. Additional Ejs models for wave optics are available. They can be found by searching ComPADRE for Open Source Physics, OSP, or Ejs.

Aguirregabiria, Juan

2008-08-20

358

On the absolute calibration of SO2 cameras  

NASA Astrophysics Data System (ADS)

Sulphur dioxide emission flux measurements are an important tool for volcanic monitoring and eruption risk assessment. The SO2 camera technique remotely measures volcanic emissions by analysing the ultraviolet absorption of SO2 in a narrow spectral window between 305 nm and 320 nm using solar radiation scattered in the atmosphere. The SO2 absorption is selectively detected by mounting band-pass interference filters in front of a two-dimensional, UV-sensitive CCD detector. While this approach is simple and delivers valuable insights into the two-dimensional SO2 distribution, absolute calibration has proven to be difficult. An accurate calibration of the SO2 camera (i.e., conversion from optical density to SO2 column density, CD) is crucial to obtain correct SO2 CDs and flux measurements that are comparable to other measurement techniques and can be used for volcanological applications. The most common approach for calibrating SO2 camera measurements is based on inserting quartz cells (cuvettes) containing known amounts of SO2 into the light path. It has been found, however, that reflections from the windows of the calibration cell can considerably affect the signal measured by the camera. Another possibility for calibration relies on performing simultaneous measurements in a small area of the camera's field-of-view (FOV) by a narrow-field-of-view Differential Optical Absorption Spectroscopy (NFOV-DOAS) system. This procedure combines the very good spatial and temporal resolution of the SO2 camera technique with the more accurate column densities obtainable from DOAS measurements. This work investigates the uncertainty of results gained through the two commonly used, but quite different calibration methods (DOAS and calibration cells). Measurements with three different instruments, an SO2 camera, a NFOV-DOAS system and an Imaging DOAS (IDOAS), are presented. We compare the calibration-cell approach with the calibration from the NFOV-DOAS system. 
The respective results are compared with measurements from an IDOAS to verify the calibration curve over the spatial extent of the image. Our results show that calibration cells can lead to an overestimation of the SO2 CD by up to 60% compared with CDs from the DOAS measurements. Besides these calibration errors, radiative transfer effects (e.g. light dilution, multiple scattering) can significantly influence the results of both instrument types. These effects can lead to an even more significant overestimation or, depending on the measurement conditions, an underestimation of the true CD. Previous investigations found that possible errors can be more than an order of magnitude. However, the spectral information from the DOAS measurements makes it possible to correct for these radiative transfer effects. The measurements presented in this work were taken at Popocatépetl, Mexico, between 1 March 2011 and 4 March 2011. Average SO2 emission rates between 4.00 kg s-1 and 14.34 kg s-1 were observed.

Lübcke, P.; Bobrowski, N.; Illing, S.; Kern, C.; Alvarez Nieves, J. M.; Vogel, L.; Zielcke, J.; Delgado Granados, H.; Platt, U.

2012-09-01

359

Characterization of the Series 1000 Camera System  

SciTech Connect

The National Ignition Facility requires a compact network addressable scientific grade CCD camera for use in diagnostics ranging from streak cameras to gated x-ray imaging cameras. Due to the limited space inside the diagnostic, an analog and digital input/output option in the camera controller permits control of both the camera and the diagnostic by a single Ethernet link. The system consists of a Spectral Instruments Series 1000 camera, a PC104+ controller, and power supply. The 4k by 4k CCD camera has a dynamic range of 70 dB with less than 14 electron read noise at a 1MHz readout rate. The PC104+ controller includes 16 analog inputs, 4 analog outputs and 16 digital input/output lines for interfacing to diagnostic instrumentation. A description of the system and performance characterization is reported.

Kimbrough, J; Moody, J; Bell, P; Landen, O

2004-04-07

360

Characterization of the series 1000 camera system  

SciTech Connect

The National Ignition Facility requires a compact network addressable scientific grade charge coupled device (CCD) camera for use in diagnostics ranging from streak cameras to gated x-ray imaging cameras. Due to the limited space inside the diagnostic, an analog and digital input/output option in the camera controller permits control of both the camera and the diagnostic by a single Ethernet link. The system consists of a Spectral Instruments Series 1000 camera, a PC104+ controller, and power supply. The 4k by 4k CCD camera has a dynamic range of 70 dB with less than 14 electron read noise at a 1 MHz readout rate. The PC104+ controller includes 16 analog inputs, four analog outputs, and 16 digital input/output lines for interfacing to diagnostic instrumentation. A description of the system and performance characterization is reported.

Kimbrough, J.R.; Moody, J.D.; Bell, P.M.; Landen, O.L. [Lawrence Livermore National Laboratory, Livermore, California 94551-0808 (United States)

2004-10-01

361

Application of camera phones in telehaematology.  

PubMed

We investigated the use of camera phones for telehaematology. First, the minimum requirements for the camera phones to be used in telehaematology were investigated. A single image containing white cells, red cells and platelets was sent from a camera phone to 33 different camera phones. Nine of the camera phones were found to be unsuitable for telehaematology due to low display resolution or no zoom function of the image. Then we examined the agreement between a haematologist using a suitable camera phone for remote diagnosis and the blood film report made in the usual way. Blood samples were collected from nine patients who had conditions in which diagnostically important morphological abnormalities occurred. In seven of the nine cases, the telehaematology responses were similar to the documented blood film reports. We conclude that telehaematology using camera phones offers a quick and potentially valuable method of support for the diagnostic haematology laboratory. PMID:19815902

McLean, Richard; Jury, Corrine; Bazeos, Alexandra; Lewis, S Mitchell

2009-01-01

362

Measuring Non-spherical Airborne Dust with Space-based MISR Multi-angle Imaging.  

NASA Astrophysics Data System (ADS)

Some of the world's largest dust plumes emanate from Northern Eurasian deserts and are expected to increasingly affect Asian economies. Together with field experiments, satellite observations of dust outbreaks, placed into the context of large-scale dust transport modeling, can help understand the impact of mineral dust aerosols on past and present climate and climate predictions in North and Central Asia. Multi-angle instruments such as the Multi-angle Imaging SpectroRadiometer (MISR) provide independent constraints on aerosol properties based on sensitivity to the shape of the scattering phase function. We present an analysis of the MISR Standard Aerosol Retrieval algorithm, updated with new non-spherical dust models (Version 16 and higher). We compare the MISR products with coincident AERONET surface sun-photometer observations taken during the passage of dust fronts. Our analysis shows that during such events MISR retrieves Angstrom exponents characteristic of large particles, having little spectral variation in extinction over the MISR wavelength range (442, 550, 672 and 866 nm channels), as expected. Also, the retrieved fraction of non-spherical particles is very high; this quantity is not retrieved by satellite instruments having only nadir-viewing cameras. We assess whether MISR aerosol optical thickness (AOT), acquired at about 10:30 AM local time, can be used to represent daily mean AOT in dust climate forcing studies, by comparing MISR-retrieved AOT with AERONET daily-mean values. We also compare the effect of particle shape on MISR and MODIS dust retrievals, using co-located MISR, MODIS, and AERONET AOTs and Angstrom exponents. In most cases obtained for this study, MODIS had no retrievals due to sun-glint when MISR's narrower swath observed AERONET sites on islands surrounded by dark water. 
For the few coincident MISR-MODIS-AERONET dark-water, dusty-condition retrievals we obtained, the MISR retrievals were in better agreement with AERONET than those from MODIS. Over bright desert sites, MODIS AOTs at visible wavelengths were systematically higher than those of AERONET and MISR. MISR-derived aerosol type mixtures for these cases included non-spherical dust components with high frequency in retrievals over dark water, and slightly lower frequency over land. The frequency with which non-spherical dust models were selected by the algorithm also decreased in dusty regions affected by pollution. Both MISR and MODIS retrievals have a high failure rate over optically thick dust plumes.
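The Angstrom exponent used above can be computed from AOT at any two of the MISR wavelengths. A minimal sketch, assuming the standard power-law model tau ∝ lambda^(-alpha):

```python
import math

def angstrom_exponent(tau1, lam1_nm, tau2, lam2_nm):
    """Angstrom exponent from AOT at two wavelengths:
    alpha = -ln(tau1/tau2) / ln(lam1/lam2).
    Coarse dust gives alpha near 0 (little spectral variation in
    extinction); fine pollution particles give alpha around 1-2."""
    return -math.log(tau1 / tau2) / math.log(lam1_nm / lam2_nm)
```

Equal AOT at 442 nm and 866 nm (the extreme MISR channels) yields alpha = 0, the large-particle signature the abstract describes for dust events.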

Kalashnikova, O. V.; Diner, D. J.; Abdou, W.; Kahn, R.; Gaitley, B. J.; Gasso, S.

2004-12-01

363

40. CENTRAL PAVILION OF WEST FACADESLIGHTLY ANGLED, FRONTAL, NORMAL ANGLE ...  

Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

40. CENTRAL PAVILION OF WEST FACADE--SLIGHTLY ANGLED, FRONTAL, NORMAL ANGLE Copy photograph of photogrammetric plate LC-HABS-FS13-B-1974-839R. - St. Mary's Seminary, 600 North Paca Street, Baltimore, Independent City, MD

364

The Advanced Camera for the Hubble Space Telescope  

Microsoft Academic Search

The Advanced Camera for the Hubble Space Telescope has three cameras. The first, the Wide Field Camera, will be a high- throughput, wide field, 4096 X 4096 pixel CCD optical and I-band camera that is half-critically sampled at 500 nm. The second, the High Resolution Camera (HRC), is a 1024 X 1024 pixel CCD camera that is critically sampled at

G. D. Illingworth; Paul D. Feldman; David A. Golimowski; Zlatan Tsvetanov; Christopher J. Burrows; James H. Crocker; Pierre Y. Bely; George F. Hartig; Randy A. Kimble; Michael P. Lesser; Richard L. White; Tom Broadhurst; William B. Sparks; Robert A. Woodruff; Pamela Sullivan; Carolyn A. Krebs; Douglas B. Leviton; William Burmester; Sherri Fike; Rich Johnson; Robert B. Slusher; Paul Volmer

1997-01-01

365

First results from the Faint Object Camera - SN 1987A  

NASA Technical Reports Server (NTRS)

The first images of SN 1987A taken on day 1278 after outburst with the Faint Object Camera on board the Hubble Space Telescope are presented. The supernova is well detected and resolved spatially in three broadband ultraviolet exposures spanning the 1500-3800 A range and in a narrow-band image centered on the forbidden O III 5007 line. Simple uniform disk fits to the profiles of SN 1987A yield an average angular diameter of 170 ± 30 mas, corresponding to an average expansion velocity of 6000 km/s. The derived broadband ultraviolet fluxes, when corrected for interstellar absorption, indicate a blue ultraviolet spectrum corresponding to a color temperature near 13,000 K.
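The abstract's conversion from angular size to expansion velocity can be cross-checked with a short calculation. Note the 51.4 kpc LMC distance used below is an assumption for illustration; the abstract does not quote a distance.

```python
MAS_TO_RAD = 1e-3 / 206265.0   # milliarcseconds to radians
KPC_TO_KM = 3.086e16           # kilometres per kiloparsec
DAY_TO_S = 86400.0             # seconds per day

def expansion_velocity(diameter_mas, days_since_outburst, distance_kpc=51.4):
    """Average expansion velocity (km/s) implied by a uniform-disk angular
    diameter, assuming free expansion since outburst. The default LMC
    distance of 51.4 kpc is an assumed value, not from the record."""
    radius_km = 0.5 * diameter_mas * MAS_TO_RAD * distance_kpc * KPC_TO_KM
    return radius_km / (days_since_outburst * DAY_TO_S)

v = expansion_velocity(170, 1278)  # angular diameter and epoch quoted above
```

With the quoted 170 mas on day 1278 this yields roughly 5900 km/s, consistent with the 6000 km/s average expansion velocity reported.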

Jakobsen, P.; Albrecht, R.; Barbieri, C.; Blades, J. C.; Boksenberg, A.

1991-01-01

366

First light observations with TIFR near-infrared camera.  

NASA Astrophysics Data System (ADS)

The TIFR near-infrared camera (TIRCAM) is based on the SBRC InSb focal plane array (58×62 pixels) sensitive between 1-5 µm. TIRCAM had its first engineering run at the Gurusikhar 1.2 meter PRL telescope at Mount Abu during March-April 2001. The first light observations with TIRCAM were quite successful. Several infrared standard stars and the Trapezium Cluster in the Orion region were observed in the J, H and K bands. In a narrow band at 3.9 µm (nbL), some bright stars could be detected from the Gurusikhar site. The performance of TIRCAM is discussed in the light of preliminary observations in the nbL band.

Ojha, D. K.; Ghosh, S. K.; Verma, R. P.; Chakraborti, A.; Anandarao, B. G.

2002-12-01

367

Feasibility study for the application of the large format camera as a payload for the Orbiter program  

NASA Technical Reports Server (NTRS)

The large format camera (LFC) designed as a 30 cm focal length cartographic camera system that employs forward motion compensation in order to achieve the full image resolution provided by its 80 degree field angle lens is described. The feasibility of application of the current LFC design to deployment in the orbiter program as the Orbiter Camera Payload System was assessed and the changes that are necessary to meet such a requirement are discussed. Current design and any proposed design changes were evaluated relative to possible future deployment of the LFC on a free flyer vehicle or in a WB-57F. Preliminary mission interface requirements for the LFC are given.

1978-01-01

368

CCD Camera Lens Interface for Real-Time Theodolite Alignment  

NASA Technical Reports Server (NTRS)

Theodolites are a common instrument in the testing, alignment, and building of various systems, ranging from a single optical component to an entire instrument. They provide a precise way to measure horizontal and vertical angles. They can be used to align multiple objects at specific desired angles, or to reference a specific location or orientation of an object that has moved. Some systems tolerate only a small margin of error in the position of components, and a theodolite can assist with accurately measuring and/or minimizing that error. The technology is an adapter that attaches a CCD camera with lens to a Leica Wild T3000 Theodolite eyepiece, enabling viewing on a connected monitor and thus simultaneous use with multiple theodolites. This removes a substantial part of the human error by relying on the CCD camera and monitors. It also allows image recording of the alignment, and therefore provides a quantitative means to measure such error.

Wake, Shane; Scott, V. Stanley, III

2012-01-01

369

RESULTS OF A PERFORMANCE TEST OF A DUAL MID-FORMAT DIGITAL CAMERA SYSTEM  

E-print Network

The low weight and relatively low cost of medium format digital cameras have pushed the use of these units for aerial survey. These medium format cameras are often used as secondary sensors together with other aerial sensors like LiDAR systems. The rising number of pixels per camera leads to an increasing interest in medium format systems as main sensors, especially for smaller survey aircraft. While the number of pixels across the flight direction is not critical for capturing linear objects, like power lines or pipelines, the relatively small number of pixels compared to large format systems increases the necessary flying effort for photogrammetric blocks. In cases where larger blocks have to be flown efficiently, it is possible to combine two or more such medium format cameras (dual- or multi-head solutions). This combination of medium format cameras increases the possible image strip width and therefore reduces the flying time and distance for block projects. A performance test of a dual-head medium format digital camera system flown over the Vaihingen/Enz test field of the Institute for Photogrammetry of Stuttgart University is presented. The operated Dual-DigiCAM-H/39 consisted of two 39-Mpixel cameras. To increase the image width across the flight direction, the two cameras were mounted to look to the side at oblique angles of +14.8° and -14.8°, respectively. This configuration results in an effective image width of 13650 pixels. The dual camera system was operated together with a CCNS/AEROcontrol navigation and GPS/IMU system. The GPS/IMU trajectory was processed with different GPS methods and the resulting trajectories are compared. The overall system performance was evaluated based on the analysis of independent check point differences.

Jens Kremer; Michael Cramer

370

Side writing phenomena in narrow track recording  

Microsoft Academic Search

Edge writing phenomena in narrow track recording on both well-oriented and planar isotropic thin-film recording media are studied by numerical micromagnetic modeling. Multiple dibit transition pairs are simulated and statistical properties are analyzed. The magnetization patterns of the simulated dibits are studied as well as the magnetic pole density distributions. It is found that, for the well-oriented film, the magnetic

Jian-Gang Zhu; Xiao-Guang Ye; T. C. Arnoldussen

1992-01-01

371

Shapes and Angles  

NSDL National Science Digital Library

In this activity (page 7 of PDF), learners will identify the general two-dimensional geometric shape of the uppermost cross section of an impact crater. They will also draw connections between the general two-dimensional geometric shape of an impact crater and the projectile's angle of impact. There are two versions of this activity: Challenge, where students construct a launcher and create their own craters; and Non-Challenge, where students analyze pictures of craters. The Moon Math: Craters! guide follows a 5E approach, applying concepts of geometry, modeling, and data analysis to the NASA lunar spacecraft mission, LCROSS.

NASA

2012-05-08

372

The costovertebral angle.  

PubMed

Because the anatomy of the costovertebral angle is complex and often unfamiliar to the operating thoracic surgeon, surgical procedures performed in that area must be carried out by a surgical team rather than by individual surgeons, and such a team usually includes either an orthopedic surgeon or a neurosurgeon. This is the case, for example, with Pancoast's tumors invading the roots of the brachial plexus or the spine itself, where the help of an orthopedic surgeon is invaluable not only to achieve a complete resection but also to prevent catastrophic complications. PMID:18271164

Vallières, Eric

2007-11-01

373

Angles and Area  

NSDL National Science Digital Library

In this activity (page 10 of PDF), learners approximate the area of the uppermost cross section of an impact crater using a variety of square grids. They conclude which angle of impact results in the greatest area. There are two versions of this activity: Challenge, where students construct a launcher and create their own craters; and Non-Challenge, where students analyze pictures of craters. Includes a pre-lesson activity (p54). The Moon Math: Craters! guide follows a 5E approach, applying concepts of geometry, modeling, and data analysis to the NASA lunar spacecraft mission, LCROSS.
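The counting-squares idea behind this activity can be sketched numerically: lay a grid over the region and sum the cells whose centers fall inside it. The circular "crater" and grid size below are illustrative choices, not taken from the activity.

```python
def grid_area_estimate(inside, xmin, xmax, ymin, ymax, cell):
    """Approximate the area of a region by counting grid cells whose
    centers satisfy inside(x, y); each counted cell contributes
    cell * cell of area."""
    nx = int(round((xmax - xmin) / cell))
    ny = int(round((ymax - ymin) / cell))
    count = 0
    for i in range(nx):
        x = xmin + (i + 0.5) * cell
        for j in range(ny):
            y = ymin + (j + 0.5) * cell
            if inside(x, y):
                count += 1
    return count * cell * cell

# A circular crater rim of radius 1 should give an area close to pi;
# a finer grid gives a better approximation, as in the activity.
area = grid_area_estimate(lambda x, y: x * x + y * y <= 1.0,
                          -1.0, 1.0, -1.0, 1.0, 0.01)
```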

NASA

2012-05-08

374

Fruit Detectability Analysis for Different Camera Positions in Sweet-Pepper

PubMed Central

For robotic harvesting of sweet-pepper fruits in greenhouses, a sensor system is required to detect and localize the fruits on the plants. Due to the complex structure of the plant, most fruits are (partially) occluded when an image is taken from one viewpoint only. In this research the effect of multiple camera positions and viewing angles on fruit visibility and detectability was investigated. A recording device was built which allowed the camera to be placed under different azimuth and zenith angles and to be moved horizontally along the crop row. Fourteen camera positions were chosen and the fruit visibility in the recorded images was manually determined for each position. For images taken from one position only, with the criterion of maximum 50% occlusion per fruit, the fruit detectability (FD) was in no case higher than 69%. The best single positions were the front views and looking upwards with a zenith angle of 60°. The FD increased when a combination was made of multiple viewpoint positions. With a combination of the five favourite positions the maximum FD was 90%. PMID:24681670

Hemming, Jochen; Ruizendaal, Jos; Hofstee, Jan Willem; van Henten, Eldert J.

2014-01-01

375

Integrated calibration between digital camera and laser scanner from mobile mapping system for land vehicles  

NASA Astrophysics Data System (ADS)

The paper presents the concept of the lever arm and boresight angle, the design requirements of calibration sites, and an integrated method for calibrating the boresight angles of a digital camera and a laser scanner. Taking test data collected by Applanix's LandMark system as an example, the camera calibration method is introduced, based on stacking three consecutive stereo images and an OTF (on-the-fly) calibration using ground control points. The laser scanner boresight-angle calibration uses both a manual and an automatic method with ground control points. Integrated calibration between the digital camera and the laser scanner is introduced to improve the systemic precision of the two sensors. By analyzing the measured differences between ground control points and their corresponding image points in sequence images, object positions derived from the camera and images agree to within about 15 cm in relative error and 20 cm in absolute error. By comparing ground control points with their corresponding laser point clouds, the errors are less than 20 cm. From the analysis of these experiments, the mobile mapping system is an efficient and reliable system for rapidly generating high-accuracy, high-density road spatial data.
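The lever-arm/boresight relationship that the paper calibrates amounts to a rigid-body transform from the sensor frame into the vehicle body frame. A minimal sketch follows; the Z-Y-X angle convention and the numbers in the usage lines are illustrative assumptions, since the abstract does not specify them.

```python
import math

def rotation_matrix(roll, pitch, yaw):
    """Z-Y-X (yaw-pitch-roll) rotation matrix from boresight angles in
    radians. The angle sequence is an assumed convention; real systems
    document their own."""
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]

def sensor_to_body(point, boresight, lever_arm):
    """Map a sensor-frame point into the vehicle body frame: rotate by
    the boresight matrix, then translate by the lever arm."""
    R = rotation_matrix(*boresight)
    return [sum(R[i][k] * point[k] for k in range(3)) + lever_arm[i]
            for i in range(3)]

# With a zero boresight the transform reduces to adding the lever arm.
p = sensor_to_body([1.0, 2.0, 3.0], (0.0, 0.0, 0.0), [0.1, 0.2, 0.3])
```

Boresight calibration, as described above, estimates the three rotation angles (and the lever-arm offsets) so that camera or laser measurements of ground control points agree with their surveyed positions.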

Zhao, Guihua; Chen, Hong; Li, Xingquan; Zou, Xiaoliang

376

Optimum Projection Angle for Attaining Maximum Distance in a Soccer Punt Kick  

PubMed Central

To produce the greatest horizontal distance in a punt kick the ball must be projected at an appropriate angle. Here, we investigated the optimum projection angle that maximises the distance attained in a punt kick by a soccer goalkeeper. Two male players performed many maximum-effort kicks using projection angles of between 10° and 90°. The kicks were recorded by a video camera at 100 Hz and a 2D biomechanical analysis was conducted to obtain measures of the projection velocity, projection angle, projection height, ball spin rate, and foot velocity at impact. The player's optimum projection angle was calculated by substituting mathematical equations for the relationships between the projection variables into the equations for the aerodynamic flight of a soccer ball. The calculated optimum projection angles were in agreement with the players' preferred projection angles (40° and 44°). In projectile sports even a small dependence of projection velocity on projection angle is sufficient to produce a substantial shift in the optimum projection angle away from 45°. In the punt kicks studied here, the optimum projection angle was close to 45° because the projection velocity of the ball remained almost constant across all projection angles. This result is in contrast to throwing and jumping for maximum distance, where the projection velocity the athlete is able to achieve decreases substantially with increasing projection angle and so the optimum projection angle is well below 45°. Key points: The optimum projection angle that maximizes the distance of a punt kick by a soccer goalkeeper is about 45°. The optimum projection angle is close to 45° because the projection velocity of the ball is almost the same at all projection angles. This result is in contrast to throwing and jumping for maximum distance, where the optimum projection angle is well below 45° because the projection velocity the athlete is able to achieve decreases substantially with increasing projection angle. PMID:24149315
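The abstract's central point, that the optimum angle drops below 45° only when projection velocity falls with angle, can be illustrated even with a drag-free range model. This is a simplification (the paper uses a full aerodynamic ball-flight model), and the speed values and the 0.15 m/s-per-degree decline below are invented for illustration.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def optimum_angle(speed_at_angle):
    """Projection angle (degrees) maximising the drag-free range
    R = v^2 sin(2*theta) / g for launch and landing at the same height.
    speed_at_angle maps an angle in degrees to a projection speed (m/s)."""
    best_angle, best_range = 0.0, -1.0
    for i in range(10, 900):          # scan 1.0 to 89.9 deg in 0.1 deg steps
        a = i * 0.1
        v = speed_at_angle(a)
        r = v * v * math.sin(2.0 * math.radians(a)) / G
        if r > best_range:
            best_angle, best_range = a, r
    return best_angle

# Punt kick: speed roughly independent of angle -> optimum near 45 deg.
flat = optimum_angle(lambda a: 25.0)
# Throwing/jumping: speed falls as the angle steepens -> optimum well below 45.
drop = optimum_angle(lambda a: 25.0 - 0.15 * a)
```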

Linthorne, Nicholas P.; Patel, Dipesh S.

2011-01-01

377

An attentive multi-camera system  

NASA Astrophysics Data System (ADS)

Intelligent multi-camera systems that integrate computer vision algorithms are not error free, and thus both false positive and false negative detections need to be revised by a specialized human operator. Traditional multi-camera systems usually include a control center with a wall of monitors displaying videos from each camera of the network. Nevertheless, as the number of cameras increases, switching from one camera to another becomes hard for a human operator. In this work we propose a new method that dynamically selects and displays the content of one video camera from all the available contents in the multi-camera system. The proposed method is based on a computational model of human visual attention that integrates top-down and bottom-up cues. We believe that this is the first work that tries to use a model of human visual attention for the dynamic selection of the camera view of a multi-camera system. The proposed method has been evaluated in a given scenario and has demonstrated its effectiveness with respect to other methods and manually generated ground truth. The effectiveness has been evaluated in terms of the number of correct best-views generated by the method with respect to the camera views manually chosen by a human operator.
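A crude stand-in for the best-view selection idea: score each camera by frame-to-frame change (a bottom-up "motion saliency" proxy, far simpler than the paper's full attention model, which also uses top-down cues) and display the highest-scoring view. The camera names and toy frames below are invented.

```python
def motion_score(prev_frame, cur_frame):
    """Mean absolute pixel difference between consecutive frames:
    a crude bottom-up saliency proxy for 'something is happening'."""
    diffs = [abs(a - b) for a, b in zip(prev_frame, cur_frame)]
    return sum(diffs) / len(diffs)

def select_best_view(cameras):
    """Pick the camera id whose latest frame changed the most.
    cameras maps id -> (previous_frame, current_frame), each a flat
    list of pixel intensities."""
    return max(cameras, key=lambda cid: motion_score(*cameras[cid]))

# Hypothetical 4-pixel frames: cam2 shows by far the largest change.
views = {
    "cam1": ([10, 10, 10, 10], [10, 11, 10, 10]),
    "cam2": ([10, 10, 10, 10], [90, 80, 10, 10]),
}
best = select_best_view(views)
```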

Napoletano, Paolo; Tisato, Francesco

2014-03-01

378

Making Oatmeal Box Pinhole Cameras  

NSDL National Science Digital Library

This web site provides step-by-step directions for constructing a pinhole camera out of an oatmeal box and other common household items. Each step is supplemented with photos to show exactly how to build the apparatus so that it will actually take pictures. Also included are detailed procedures for shooting the photographs and developing them in an amateur darkroom. **NOTE: If performing this activity with children, follow safety procedures for using the photo developing agent. SEE THIS LINK for safety information on Kodak Dektol: http://www2.itap.purdue.edu/msds/docs/9735.pdf

Woodruff, Stewart

2009-05-28

379

Optimising camera traps for monitoring small mammals.  

PubMed

Practical techniques are required to monitor invasive animals, which are often cryptic and occur at low density. Camera traps have potential for this purpose, but may have problems detecting and identifying small species. A further challenge is how to standardise the size of each camera's field of view so capture rates are comparable between different places and times. We investigated the optimal specifications for a low-cost camera trap for small mammals. The factors tested were 1) trigger speed, 2) passive infrared vs. microwave sensor, 3) white vs. infrared flash, and 4) still photographs vs. video. We also tested a new approach to standardise each camera's field of view. We compared the success rates of four camera trap designs in detecting and taking recognisable photographs of captive stoats (Mustela erminea), feral cats (Felis catus) and hedgehogs (Erinaceus europaeus). Trigger speeds of 0.2-2.1 s captured photographs of all three target species unless the animal was running at high speed. The camera with a microwave sensor was prone to false triggers, and often failed to trigger when an animal moved in front of it. A white flash produced photographs that were more readily identified to species than those obtained under infrared light. However, a white flash may be more likely to frighten target animals, potentially affecting detection probabilities. Video footage achieved similar success rates to still cameras but required more processing time and computer memory. Placing two camera traps side by side achieved a higher success rate than using a single camera. Camera traps show considerable promise for monitoring invasive mammal control operations. Further research should address how best to standardise the size of each camera's field of view, maximise the probability that an animal encountering a camera trap will be detected, and eliminate visible or audible cues emitted by camera traps. PMID:23840790

Glen, Alistair S; Cockburn, Stuart; Nichols, Margaret; Ekanayake, Jagath; Warburton, Bruce

2013-01-01

380

Digital Camera Calibration Using Images Taken from AN Unmanned Aerial Vehicle  

NASA Astrophysics Data System (ADS)

For calibrating the camera, an accurate determination of the interior orientation parameters is needed. For more accurate results, the calibration images should be taken under conditions that are similar to the field samples. The aim of this work is the establishment of an efficient and accurate digital camera calibration method to be used in particular working conditions, as found in our UAV (Unmanned Aerial Vehicle) photogrammetric projects. The UAV used in this work was the md4-200, made by Microdrones. The microdrone is also equipped with a standard digital non-metric camera, the Pentax Optio A40. To find out the interior orientation parameters of the digital camera, two calibration methods were carried out: a lab calibration based on a flat pattern and a field calibration. To carry out the calibration, Photomodeler Scanner software was used in both cases. The lab calibration process was completely automatic, using a calibration grid. The focal length was fixed at the widest angle and the network included a total of twelve images with ±90° roll angles. In order to develop the field calibration, a flight plan was programmed including a total of twelve images. In the same way as in the lab calibration, the focal length was fixed at the widest angle. The field test used in the study was a flat surface located on the University of Almería campus on which a set of 67 target points were placed. The calibration field area was approximately 25 × 25 m and the flight altitude over ground was 50 m. After the software processing, the camera calibration parameter values were obtained. The paper presents the process, the results and the accuracy of these calibration methods. The field calibration method reduced the final total error obtained in the previous lab calibration. Furthermore, the overall RMSs obtained from both methods are similar. Therefore we will apply the field calibration results to all our photogrammetric projects in which the flight height will be close to 50 m.

Pérez, M.; Agüera, F.; Carvajal, F.

2011-09-01

381

Angle-resolved scattering spectroscopy of explosives using an external cavity quantum cascade laser  

SciTech Connect

We investigated angle-resolved scattering from solid explosive residues on a car door for non-contact sensing geometries. Illumination from a mid-infrared external cavity quantum cascade laser, tuning between 7 and 8 microns, was detected both with a sensitive single-point detector and with a hyperspectral imaging camera. Spectral scattering phenomena are discussed and possibilities for hyperspectral imaging at large scattering angles are outlined.

Suter, Jonathan D.; Bernacki, Bruce E.; Phillips, Mark C.

2012-04-01

382

Variable angle correlation spectroscopy  

SciTech Connect

In this dissertation, a novel nuclear magnetic resonance (NMR) technique, variable angle correlation spectroscopy (VACSY) is described and demonstrated with {sup 13}C nuclei in rapidly rotating samples. These experiments focus on one of the basic problems in solid state NMR: how to extract the wealth of information contained in the anisotropic component of the NMR signal while still maintaining spectral resolution. Analysis of the anisotropic spectral patterns from poly-crystalline systems reveal information concerning molecular structure and dynamics, yet in all but the simplest of systems, the overlap of spectral patterns from chemically distinct sites renders the spectral analysis difficult if not impossible. One solution to this problem is to perform multi-dimensional experiments where the high-resolution, isotropic spectrum in one dimension is correlated with the anisotropic spectral patterns in the other dimensions. The VACSY technique incorporates the angle between the spinner axis and the static magnetic field as an experimental parameter that may be incremented during the course of the experiment to help correlate the isotropic and anisotropic components of the spectrum. The two-dimensional version of the VACSY experiments is used to extract the chemical shift anisotropy tensor values from multi-site organic molecules, study molecular dynamics in the intermediate time regime, and to examine the ordering properties of partially oriented samples. The VACSY technique is then extended to three-dimensional experiments to study slow molecular reorientations in a multi-site polymer system.

Lee, Y. K. (Univ. of California, Berkeley, CA; Lawrence Berkeley Lab., CA, Chemical Biodynamics Div.)

1994-05-01

383

Limbus Impact on Off-angle Iris Degradation  

SciTech Connect

The accuracy of iris recognition depends on the quality of data capture and is negatively affected by several factors such as angle, occlusion, and dilation. Off-angle iris recognition is a new research focus in biometrics that tries to address several issues including corneal refraction, complex 3D iris texture, and blur. In this paper, we present an additional significant challenge that degrades the performance of off-angle iris recognition systems, called the limbus effect. The limbus is the region at the border of the cornea where the cornea joins the sclera. The limbus is a semitransparent tissue that occludes a side portion of the iris plane. The amount of occluded iris texture on the side nearest the camera increases as the image acquisition angle increases. Without considering the role of the limbus effect, it is difficult to design an accurate off-angle iris recognition system. To the best of our knowledge, this is the first work that investigates the limbus effect in detail from a biometrics perspective. Based on results from real images and simulated experiments with real iris texture, the limbus effect increases the Hamming distance score between frontal and off-angle iris images by 0.05 to 0.2, depending upon the limbus height.
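The Hamming distance score mentioned above is conventionally computed as a fraction over the mutually valid (unoccluded) bits of two iris codes, so limbus occlusion both removes bits and, when unmodeled, corrupts the ones that remain. A minimal sketch with toy bit lists (real iris codes run to thousands of bits and include rotation compensation):

```python
def hamming_distance(code_a, code_b, mask):
    """Fractional Hamming distance over unmasked bit positions; bits
    occluded by eyelids, eyelashes or (here) the limbus are excluded."""
    valid = [i for i, ok in enumerate(mask) if ok]
    if not valid:
        return 1.0  # nothing comparable: treat as a non-match
    disagree = sum(code_a[i] != code_b[i] for i in valid)
    return disagree / len(valid)

# Toy 8-bit codes: two bits differ, one of which is masked out,
# leaving 1 disagreement over 7 valid bits.
hd = hamming_distance([0, 1, 1, 0, 1, 0, 0, 1],
                      [0, 1, 0, 0, 1, 0, 1, 1],
                      [1, 1, 1, 1, 1, 1, 0, 1])
```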

Karakaya, Mahmut [ORNL]; Barstow, Del R [ORNL]; Santos-Villalobos, Hector J [ORNL]; Thompson, Joseph W [ORNL]; Bolme, David S [ORNL]; Boehnen, Chris Bensing [ORNL]

2013-01-01

384

Triangles: Finding Interior Angle Measures  

NSDL National Science Digital Library

In this lesson plan, students will start with a hands-on activity and then experiment with a GeoGebra-based computer model to investigate and discover the Triangle Angle Sum Theorem. Then they will use the Triangle Angle Sum Theorem to write and solve equations and find missing angle measures in a variety of examples.
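The equation-solving step in the lesson amounts to subtracting the two known interior angles from 180°; a one-function sketch of that rule (the 65°/35° example is illustrative, not from the lesson plan):

```python
def third_angle(a, b):
    """Triangle Angle Sum Theorem: the interior angles of a triangle
    total 180 degrees, so the missing angle is 180 - a - b."""
    missing = 180.0 - a - b
    if missing <= 0 or a <= 0 or b <= 0:
        raise ValueError("angles do not form a valid triangle")
    return missing

angle = third_angle(65.0, 35.0)  # -> 80.0
```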

2012-11-25

385

A grazing incidence x-ray streak camera for ultrafast, single-shot measurements  

SciTech Connect

An ultrafast x-ray streak camera has been realized using a grazing incidence reflection photocathode. X-rays are incident on a gold photocathode at a grazing angle of 20° and the photoemitted electrons are focused by a large-aperture magnetic solenoid lens. The streak camera has high quantum efficiency, 600 fs temporal resolution, and a 6 mm imaging length in the spectral direction. Its single-shot capability eliminates temporal smearing due to sweep jitter, and allows recording of the ultrafast dynamics of samples that undergo non-reversible changes.

Feng, Jun; Engelhorn, K.; Cho, B.I.; Lee, H.J.; Greaves, M.; Weber, C.P.; Falcone, R.W.; Padmore, H. A.; Heimann, P.A.

2010-02-18

386

The Dark Energy Camera (DECam)  

E-print Network

In this paper we describe the Dark Energy Camera (DECam), which will be the primary instrument used in the Dark Energy Survey. DECam will be a 3 sq. deg. mosaic camera mounted at the prime focus of the Blanco 4m telescope at the Cerro-Tololo International Observatory (CTIO). It consists of a large mosaic CCD focal plane, a five element optical corrector, five filters (g,r,i,z,Y), a modern data acquisition and control system and the associated infrastructure for operation in the prime focus cage. The focal plane includes 62 2K x 4K CCD modules (0.27"/pixel) arranged in a hexagon inscribed within the roughly 2.2 degree diameter field of view and 12 smaller 2K x 2K CCDs for guiding, focus and alignment. The CCDs will be 250 micron thick fully-depleted CCDs that have been developed at the Lawrence Berkeley National Laboratory (LBNL). Production of the CCDs and fabrication of the optics, mechanical structure, mechanisms, and control system for DECam are underway; delivery of the instrument to CTIO is scheduled for 2010.
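The quoted "3 sq. deg." field can be cross-checked from the numbers in the abstract, assuming "2K x 4K" means exactly 2048 x 4096 pixels:

```python
SCIENCE_CCDS = 62
PIXELS_PER_CCD = 2048 * 4096     # assuming "2K x 4K" is exact
PIXEL_SCALE = 0.27               # arcseconds per pixel, from the abstract

def mosaic_area_sq_deg():
    """Sky area covered by the science CCDs: total pixel count times the
    solid angle of one pixel, converted to square degrees."""
    area_arcsec2 = SCIENCE_CCDS * PIXELS_PER_CCD * PIXEL_SCALE ** 2
    return area_arcsec2 / 3600.0 ** 2

area = mosaic_area_sq_deg()  # about 2.9 sq deg, consistent with "3 sq. deg."
```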

K. Honscheid; D. L. DePoy; for the DES Collaboration

2008-10-20

387

High Speed Digital Camera Technology Review  

NASA Technical Reports Server (NTRS)

A High Speed Digital Camera Technology Review (HSD Review) is being conducted to evaluate the state-of-the-shelf in this rapidly progressing industry. Five HSD cameras supplied by four camera manufacturers participated in a Field Test during the Space Shuttle Discovery STS-128 launch. Each camera was also subjected to Bench Tests in the ASRC Imaging Development Laboratory. Evaluation of the data from the Field and Bench Tests is underway. Representatives from the imaging communities at NASA / KSC and the Optical Systems Group are participating as reviewers. A High Speed Digital Video Camera Draft Specification was updated to address Shuttle engineering imagery requirements based on findings from this HSD Review. This draft specification will serve as the template for a High Speed Digital Video Camera Specification to be developed for the wider OSG imaging community under OSG Task OS-33.

Clements, Sandra D.

2009-01-01

388

Television camera video level control system  

NASA Technical Reports Server (NTRS)

A video level control system is provided which generates a normalized video signal for a camera processing circuit. The video level control system includes a lens iris which provides a controlled light signal to a camera tube. The camera tube converts the light signal provided by the lens iris into electrical signals. A feedback circuit, in response to the electrical signals generated by the camera tube, provides feedback signals to the lens iris and the camera tube. This assures that a normalized video signal is provided in a first illumination range. An automatic gain control loop, which is also responsive to the electrical signals generated by the camera tube, operates in tandem with the feedback circuit. This assures that the normalized video signal is maintained in a second illumination range.
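The feedback behaviour described above can be sketched as a simple proportional gain loop driving the video level toward a normalized target. The loop gain, target, and scene level below are illustrative values, not taken from the patent.

```python
def agc_step(gain, level, target=1.0, rate=0.1):
    """One automatic-gain-control update: nudge the gain up when the
    measured video level is below target, down when it is above."""
    return gain * (1.0 + rate * (target - level))

# Simulate a scene whose raw signal is 0.5: the loop should settle at a
# gain of about 2 so that the output video level is normalized to 1.0.
scene = 0.5
gain = 1.0
for _ in range(200):
    gain = agc_step(gain, scene * gain)
level = scene * gain
```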

Kravitz, M.; Freedman, L. A.; Fredd, E. H.; Denef, D. E. (inventors)

1985-01-01

389

Spectrophotometric investigation of Phobos with the Rosetta OSIRIS-NAC camera and implications for its collisional capture  

NASA Astrophysics Data System (ADS)

The Martian satellite Phobos has been observed on 2007 February 24 and 25, during the pre- and post-Mars closest approach (CA) of the ESA Rosetta spacecraft Mars swing-by. The goal of the observations was the determination of the surface composition of different areas of Phobos, in order to obtain new clues regarding its nature and origin. Near-ultraviolet, visible and near-infrared (263.5-992.0 nm) images of Phobos's surface were acquired using the Narrow Angle Camera of the OSIRIS instrument onboard Rosetta. The six multi-wavelength sets of observations allowed a spectrophotometric characterization of different areas of the satellite, belonging respectively to the leading and trailing hemispheres of the anti-Mars hemisphere, and also of a section of its sub-Mars hemisphere. The pre-CA spectrophotometric data, obtained with a phase angle of 19°, have a spectral trend consistent within the error bars with the unresolved/disc-integrated measurements present in the literature. In addition, we detect an absorption band centred at 950 nm, which is consistent with the presence of pyroxene. The post-CA observations cover, from NUV to NIR, a portion of the surface (0° to 43°E of longitude) never studied before. The reflectance measured in our data does not fit the previous spectrophotometry above 650 nm. This difference can be due to two reasons. First, the area observed by OSIRIS in this phase is completely different from the areas covered by the other local spectra, and hence the spectrum may genuinely differ. Secondly, due to the totally different observation geometry (the phase angle ranges from 137° to 140°), the differences in spectral slope can be due to phase reddening. The comparison of our reflectance spectra, both pre- and post-CA, with those of D-type asteroids shows that the spectra of Phobos are all redder than the mean D-type spectrum, but within the spectral dispersion of other D-types.
To complement this result, we performed an investigation of the conditions needed to collisionally capture Phobos in a way similar to that proposed for the irregular satellites of the giant planets. Once put in the context of the current understanding of the evolution of the early Solar system, the coupled observational and dynamical results we obtained strongly argue for an early capture of Phobos, likely immediately after the formation of Mars.

Pajola, M.; Lazzarin, M.; Bertini, I.; Marzari, F.; Turrini, D.; Magrin, S.; La Forgia, F.; Thomas, N.; Küppers, M.; Moissl, R.; Ferri, F.; Barbieri, C.; Rickman, H.; Sierks, H.

2012-12-01

390

Comparative evaluation of consumer grade cameras and mobile phone cameras for close range photogrammetry  

NASA Astrophysics Data System (ADS)

The authors have been concentrating on developing convenient 3D measurement methods using consumer grade digital cameras, and concluded that consumer grade digital cameras are expected to become a useful photogrammetric device for various close range application fields. Meanwhile, mobile phone cameras with 10 megapixels have appeared on the market in Japan. In these circumstances, we are faced with the epoch-making question of whether mobile phone cameras are able to take the place of consumer grade digital cameras in close range photogrammetric applications. In order to evaluate the potential of mobile phone cameras in close range photogrammetry, a comparative evaluation between mobile phone cameras and consumer grade digital cameras is presented in this paper with respect to lens distortion, reliability, stability and robustness. The calibration tests for 16 mobile phone cameras and 50 consumer grade digital cameras were conducted indoors using a test target. Furthermore, the practicability of mobile phone cameras for close range photogrammetry was evaluated outdoors. This paper shows that mobile phone cameras have the ability to take the place of consumer grade digital cameras, and to develop the market in digital photogrammetric fields.

Chikatsu, Hirofumi; Takahashi, Yoji

2009-08-01

391

Through-the-lens camera control  

Microsoft Academic Search

In this paper we introduce through-the-lens camera control, a body of techniques that permit a user to manipulate a virtual camera by controlling and constraining features in the image seen through its lens. Rather than solving for camera parameters directly, constrained optimization is used to compute their time derivatives based on desired changes in user-defined controls. This effectively permits

Michael Gleicher; Andrew P. Witkin

1992-01-01

392

Tracking Across Multiple Cameras With Disjoint Views  

Microsoft Academic Search

Conventional tracking approaches assume proximity in space, time, and appearance of objects in successive observations. However, observations of objects are often widely separated in time and space when viewed from multiple non-overlapping cameras. To address this problem, we present a novel approach for establishing object correspondence across non-overlapping cameras. Our multi-camera tracking algorithm exploits the redundancy in

Omar Javed; Zeeshan Rasheed; Khurram Shafique

2003-01-01

393

A Flexible New Technique for Camera Calibration  

Microsoft Academic Search

We propose a flexible new technique to easily calibrate a camera. It only requires the camera to observe a planar pattern shown at a few (at least two) different orientations. Either the camera or the planar pattern can be freely moved. The motion need not be known. Radial lens distortion is modeled. The proposed procedure consists of a closed-form solution, followed by a nonlinear refinement based on the maximum likelihood
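The closed-form stage of such plane-based calibration starts from plane-to-image homographies. A minimal numpy sketch of direct linear transform (DLT) homography estimation (one ingredient of the technique, not the full calibration and not the author's code) might look like:

```python
import numpy as np

def estimate_homography(src, dst):
    """Estimate H (3x3, up to scale) with dst ~ H @ src via the DLT.

    src, dst : (N, 2) arrays of corresponding points, N >= 4, not all collinear.
    """
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        # Each correspondence contributes two linear constraints on h = vec(H)
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The solution is the right singular vector with the smallest singular value
    _, _, Vt = np.linalg.svd(np.asarray(rows, float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def apply_homography(H, pts):
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])
    proj = pts_h @ H.T
    return proj[:, :2] / proj[:, 2:3]

# Synthetic check: recover a known homography from a 3x3 grid of plane points
H_true = np.array([[1.2, 0.1, 5.0], [0.05, 0.9, -3.0], [1e-4, 2e-4, 1.0]])
grid = np.array([[x, y] for x in (0.0, 50.0, 100.0) for y in (0.0, 50.0, 100.0)])
H_est = estimate_homography(grid, apply_homography(H_true, grid))
```

In the full method, several such homographies (one per pattern orientation) constrain the intrinsic matrix in closed form before the nonlinear refinement.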

Zhengyou Zhang

2000-01-01

394

Multi-Camera Human Activity Monitoring  

Microsoft Academic Search

With the proliferation of security cameras, the approach taken to monitoring and placement of these cameras is critical. This paper presents original work in the area of multiple camera human activity monitoring. First, a system is presented that tracks pedestrians across a scene of interest and recognizes a set of human activities. Next, a framework is developed for the placement

Loren Fiore; Duc Fehr; Robert Bodor; Andrew Drenner; Guruprasad Somasundaram; Nikolaos Papanikolopoulos

2008-01-01

395

The Mars Science Laboratory Engineering Cameras  

NASA Astrophysics Data System (ADS)

NASA's Mars Science Laboratory (MSL) Rover is equipped with a set of 12 engineering cameras. These cameras are build-to-print copies of the Mars Exploration Rover cameras described in Maki et al. (J. Geophys. Res. 108(E12): 8071, 2003). Images returned from the engineering cameras will be used to navigate the rover on the Martian surface, deploy the rover robotic arm, and ingest samples into the rover sample processing system. The Navigation cameras (Navcams) are mounted to a pan/tilt mast and have a 45-degree square field of view (FOV) with a pixel scale of 0.82 mrad/pixel. The Hazard Avoidance Cameras (Hazcams) are body-mounted to the rover chassis in the front and rear of the vehicle and have a 124-degree square FOV with a pixel scale of 2.1 mrad/pixel. All of the cameras utilize a 1024×1024 pixel detector and red/near IR bandpass filters centered at 650 nm. The MSL engineering cameras are grouped into two sets of six: one set of cameras is connected to rover computer "A" and the other set is connected to rover computer "B". The Navcams and Front Hazcams each provide similar views from either computer. The Rear Hazcams provide different views from the two computers due to the different mounting locations of the "A" and "B" Rear Hazcams. This paper provides a brief description of the engineering camera properties, the locations of the cameras on the vehicle, and camera usage for surface operations.
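The quoted numbers can be sanity-checked with a back-of-the-envelope calculation (illustrative only, not from the paper): dividing the Hazcam's 124-degree FOV by the 1024-pixel detector width reproduces the stated ~2.1 mrad/pixel, while the same naive division for the Navcam gives ~0.77 mrad/pixel rather than 0.82, the difference presumably reflecting the actual lens mapping rather than uniform angular sampling:

```python
import math

def naive_pixel_scale(fov_deg, pixels):
    """Approximate angular pixel scale (rad/pixel) as total FOV / pixel count."""
    return math.radians(fov_deg) / pixels

hazcam = naive_pixel_scale(124.0, 1024)  # ~2.11e-3 rad/px, matches the quoted 2.1 mrad
navcam = naive_pixel_scale(45.0, 1024)   # ~0.77e-3 rad/px vs. the quoted 0.82 mrad
```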

Maki, J.; Thiessen, D.; Pourangi, A.; Kobzeff, P.; Litwin, T.; Scherr, L.; Elliott, S.; Dingizian, A.; Maimone, M.

2012-09-01

396

Electrostatic camera system functional design study  

NASA Technical Reports Server (NTRS)

A functional design study for an electrostatic camera system for application to planetary missions is presented. The electrostatic camera can produce and store a large number of pictures and provide for transmission of the stored information at arbitrary times after exposure. Preliminary configuration drawings and circuit diagrams for the system are illustrated. The camera system's size, weight, power consumption, and performance are characterized. Tradeoffs between system weight, power, and storage capacity are identified.

Botticelli, R. A.; Cook, F. J.; Moore, R. F.

1972-01-01

397

Stationary Camera Aims And Zooms Electronically  

NASA Technical Reports Server (NTRS)

Microprocessors select, correct, and orient portions of hemispherical field of view. Video camera pans, tilts, zooms, and provides rotations of images of objects of field of view, all without moving parts. Used for surveillance in areas where movement of camera conspicuous or constrained by obstructions. Also used for closeup tracking of multiple objects in field of view or to break image into sectors for simultaneous viewing, thereby replacing several cameras.

Zimmermann, Steven D.

1994-01-01

398

NARROW-K-BAND OBSERVATIONS OF THE GJ 1214 SYSTEM  

SciTech Connect

GJ 1214 is a nearby M dwarf star that hosts a transiting super-Earth-size planet, making this system an excellent target for atmospheric studies. Most studies find that the transmission spectrum of GJ 1214b is flat, which favors either a high mean molecular weight or cloudy/hazy hydrogen (H) rich atmosphere model. Photometry at short wavelengths (<0.7 μm) and in the K band can discriminate the most between these different atmosphere models for GJ 1214b, but current observations do not have sufficiently high precision. We present photometry of seven transits of GJ 1214b through a narrow K-band (2.141 μm) filter with the Wide Field Camera on the 3.8 m United Kingdom Infrared Telescope. Our photometric precision is typically 1.7 × 10⁻³ (for a single transit), comparable with other ground-based observations of GJ 1214b. We measure a planet-star radius ratio of 0.1158 ± 0.0013, which, along with other studies, also supports a flat transmission spectrum for GJ 1214b. Since this does not exclude a scenario where GJ 1214b has an H-rich envelope with heavy elements that are sequestered below a cloud/haze layer, we compare K-band observations with models of H₂ collision-induced absorption in an atmosphere for a range of temperatures. While we find no evidence for deviation from a flat spectrum (slope s = 0.0016 ± 0.0038), an H₂-dominated upper atmosphere (<60 mbar) cannot be excluded. More precise observations at <0.7 μm and in the K band, as well as a uniform analysis of all published data, would be useful for establishing more robust limits on atmosphere models for GJ 1214b.
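For context (a standard relation, not a computation from the paper), the measured radius ratio implies a transit depth of about 1.34%, since the fractional flux drop is approximately the square of the planet-star radius ratio:

```python
ratio = 0.1158      # measured planet-star radius ratio Rp/Rs
depth = ratio ** 2  # fractional flux drop during transit, ~0.0134 (1.34%)
ppm = depth * 1e6   # the same depth expressed in parts per million
```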

Colón, Knicole D.; Gaidos, Eric, E-mail: colonk@hawaii.edu [Department of Geology and Geophysics, University of Hawaii at Manoa, Honolulu, HI 96822 (United States)

2013-10-10

399

Mission report on the Orbiter Camera Payload System (OCPS) Large Format Camera (LFC) and Attitude Reference System (ARS)  

NASA Technical Reports Server (NTRS)

The Orbiter Camera Payload System (OCPS) is an integrated photographic system which is carried into earth orbit as a payload in the Space Transportation System (STS) Orbiter vehicle's cargo bay. The major component of the OCPS is a Large Format Camera (LFC), a precision wide-angle cartographic instrument that is capable of producing high resolution stereo photography of great geometric fidelity in multiple base-to-height (B/H) ratios. A secondary, supporting system to the LFC is the Attitude Reference System (ARS), which is a dual lens Stellar Camera Array (SCA) and camera support structure. The SCA is a 70-mm film system which is rigidly mounted to the LFC lens support structure and which, through the simultaneous acquisition of two star fields with each earth-viewing LFC frame, makes it possible to determine precisely the pointing of the LFC optical axis with reference to the earth nadir point. Other components complete the current OCPS configuration as a high precision cartographic data acquisition system. The primary design objective for the OCPS was to maximize system performance characteristics while maintaining a high level of reliability compatible with Shuttle launch conditions and the on-orbit environment. The full-up OCPS configuration was launched on a highly successful maiden voyage aboard the STS Orbiter vehicle Challenger on October 5, 1984, as a major payload aboard mission STS 41-G. This report documents the system design, the ground testing, the flight configuration, and an analysis of the results obtained during the Challenger mission STS 41-G.

Mollberg, Bernard H.; Schardt, Bruton B.

1988-01-01

400

Dependence of astigmatism, far-field pattern, and spectral envelope width on active layer thickness of gain guided lasers with narrow stripe geometry  

SciTech Connect

The effects of active layer thickness on the astigmatism, the angle of far-field pattern width parallel to the junction, and the spectral envelope width of a gain guided laser with a narrow stripe geometry have been investigated analytically and experimentally. It is concluded that a large level of astigmatism, a narrow far-field pattern width, and a rapid convergence of the spectral envelope width are inherent to the gain guided lasers with thin active layers.

Mamine, T.

1984-06-15

401

Wall Angle Effects on Nozzle Separation Stability  

NASA Astrophysics Data System (ADS)

The presence of asymmetric side loads due to unstable separation within over-expanded rocket nozzles is well documented. Although progress has been made in developing understanding of this phenomenon through numerical and experimental means, the causes of these side loads have yet to be fully explained. The hypothesis examined within this paper is that there is a relationship between the nozzle wall angle at the point of separation and the stability of the flow separation. This was examined through an experimental investigation of a series of subscale over-expanded conical nozzles with half-angles of 8.3°, 10.4°, 12.6° and 14.8°. All had overall area ratios of 16:1, with separation occurring at approximately half the nozzle length (i.e. an area ratio of 4:1) under an overall pressure ratio of approximately 7:1, using air as the working fluid. The structure of the exhaust flow was observed and analysed by use of an optimised Schlieren visualisation system, coupled with a high speed digital camera. The exhaust flows of the 12.6° and 14.8° nozzles were seen to be stable throughout the recorded test period of 10 seconds. However, a small number of large fluctuations in the jet angle were seen to be present within the flowfield of the 10.4° nozzle, occurring at apparently random intervals through the test period. The flowfield of the 8.3° nozzle demonstrated near continuous, large angle deviations in the jet, with flow patterns containing thickened shear layers and apparent reattachment to the wall, something not previously identified in conical nozzles. These results were used to design a truncated ideal contour with an exit angle of over 10 degrees, in order to assess the possibility of designing conventional nozzles that separate stably over a wide range of pressure ratios. These tests were successful, potentially providing a simpler, cheaper alternative to altitude compensating nozzle devices. However, more work determining the nature of the separation and its causes is required.
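The flow state at the quoted 4:1 separation area ratio can be estimated with the textbook isentropic area-Mach relation (not part of the authors' analysis; γ = 1.4 for air is assumed). The supersonic solution for A/A* = 4 is M ≈ 2.94:

```python
def area_ratio(mach, gamma=1.4):
    """Isentropic A/A* as a function of Mach number."""
    t = (2.0 / (gamma + 1.0)) * (1.0 + 0.5 * (gamma - 1.0) * mach * mach)
    return t ** ((gamma + 1.0) / (2.0 * (gamma - 1.0))) / mach

def supersonic_mach(target, lo=1.0001, hi=10.0):
    """Bisection on the supersonic branch, where A/A* increases with M."""
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if area_ratio(mid) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

mach_sep = supersonic_mach(4.0)  # ~2.94: ideal Mach number at the separation plane
```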

Aghababaie, A.; Taylor, N.

402

Heterodyne Interferometer Angle Metrology  

NASA Technical Reports Server (NTRS)

A compact, high-resolution angle measurement instrument has been developed that is based on a heterodyne interferometer. The common-path heterodyne interferometer metrology is used to measure displacements of a reflective target surface. In the interferometer setup, an optical mask is used to sample the measurement laser beam reflecting back from a target surface. Angular rotations around two orthogonal axes, in a plane perpendicular to the measurement-beam propagation direction, are determined simultaneously from the relative displacement measurements of the target surface. The device is used in a tracking telescope system where pitch and yaw measurements of a flat mirror were simultaneously performed with a sensitivity of 0.1 nrad per second and a measuring range of 0.15 mrad at a working distance on the order of a meter. The nonlinearity of the device was also measured to be less than one percent over the measurement range.
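The underlying geometry is simple: two displacement readings taken across a known baseline on the target give the tilt through the small-angle relation θ ≈ Δd/L. A sketch with made-up numbers (not the instrument's actual parameters):

```python
def tilt_angle(d1, d2, baseline):
    """Small-angle tilt (rad) from two displacement samples across a baseline (m)."""
    return (d1 - d2) / baseline

# Hypothetical: 15 nm differential displacement over a 10 cm baseline -> 150 nrad
theta = tilt_angle(15e-9, 0.0, 0.10)
```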

Hahn, Inseob; Weilert, Mark A.; Wang, Xu; Goullioud, Renaud

2010-01-01

403

Sun angle calculator  

NASA Technical Reports Server (NTRS)

A circular computer and system is disclosed for determining the sun angle relative to the horizon from any given place and at any time. The computer includes transparent, rotatably mounted discs on both sides of the circular disc member. Printed on one side of the circular disc member are outer and inner circular sets of indicia respectively representative of site longitude and Greenwich Mean Time. Printed on an associated one of the rotatable discs is a set of indicia representative of Solar Time. Printed on the other side of the circular disc member are parallel lines representative of latitude between diametral representations of North and South poles. Elliptical lines extending between the North and South poles are proportionally disposed on the surface to scale Solar Time in hours.
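The quantity the disc computes mechanically can be written in closed form (the standard solar-elevation formula, not taken from the patent): sin(alt) = sin(lat)·sin(δ) + cos(lat)·cos(δ)·cos(H), where δ is the solar declination and H the hour angle derived from Solar Time:

```python
import math

def solar_elevation(lat_deg, decl_deg, hour_angle_deg):
    """Solar elevation angle (degrees above the horizon)."""
    lat = math.radians(lat_deg)
    dec = math.radians(decl_deg)
    h = math.radians(hour_angle_deg)
    s = math.sin(lat) * math.sin(dec) + math.cos(lat) * math.cos(dec) * math.cos(h)
    return math.degrees(math.asin(s))

# At solar noon (H = 0) on the June solstice (decl ~ 23.44 deg) at latitude 40 N:
noon_alt = solar_elevation(40.0, 23.44, 0.0)  # ~73.44 degrees
```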

Flippin, A.; Schmitt, A. L. (inventors)

1976-01-01

404

Advanced High-Definition Video Cameras  

NASA Technical Reports Server (NTRS)

A product line of high-definition color video cameras, now under development, offers a superior combination of desirable characteristics, including high frame rates, high resolutions, low power consumption, and compactness. Several of the cameras feature a 3,840 × 2,160-pixel format with progressive scanning at 30 frames per second. The power consumption of one of these cameras is about 25 W. The size of the camera, excluding the lens assembly, is 2 by 5 by 7 in. (about 5.1 by 12.7 by 17.8 cm). The aforementioned desirable characteristics are attained at relatively low cost, largely by utilizing digital processing in advanced field-programmable gate arrays (FPGAs) to perform all of the many functions (for example, color balance and contrast adjustments) of a professional color video camera. The processing is programmed in VHDL so that application-specific integrated circuits (ASICs) can be fabricated directly from the program. ["VHDL" signifies VHSIC Hardware Description Language, a computing language used by the United States Department of Defense for describing, designing, and simulating very-high-speed integrated circuits (VHSICs).] The image-sensor and FPGA clock frequencies in these cameras have generally been much higher than those used in video cameras designed and manufactured elsewhere. Frequently, the outputs of these cameras are converted to other video-camera formats by use of pre- and post-filters.
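The quoted format implies a substantial raw pixel rate, which is why the heavy lifting is done in FPGAs. A back-of-the-envelope calculation (the bit depth is an assumption, not stated in the article):

```python
width, height, fps = 3840, 2160, 30
pixel_rate = width * height * fps           # 248,832,000 pixels per second
bits_per_pixel = 10                         # assumed value; not specified above
raw_bit_rate = pixel_rate * bits_per_pixel  # ~2.49 Gbit/s before any processing
```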

Glenn, William

2007-01-01

405

Intelligent thermal imaging camera with network interface  

NASA Astrophysics Data System (ADS)

In recent years, a significant increase in the usage of thermal imaging cameras can be observed in both the public and commercial sectors, due to the lower cost and expanding availability of uncooled microbolometer infrared radiation detectors. Devices present on the market vary in their parameters and output interfaces. However, all these thermographic cameras are only a source of an image, which is then analyzed in an external image processing unit. There is no possibility to run users' dedicated image processing algorithms on the thermal imaging camera itself. This paper presents a concept of realization, architecture, and hardware implementation of an "intelligent thermal imaging camera with network interface" utilizing modern technologies, standards, and approaches in one single device.

Sielewicz, Krzysztof M.; Kasprowicz, Grzegorz; Poźniak, Krzysztof T.; Romaniuk, R. S.

2011-10-01

406

Upgrading optical information of rotating mirror cameras.  

PubMed

To date, rotating mirror (RM) cameras still serve as indispensable imaging equipment for the diagnosis of microsecond transient processes due to their excellent characteristics. To upgrade the optical information capacity of these cameras, this paper presents a new optical acceleration principle to increase the framing frequency or scanning velocity; a new design theory, free of the principle errors found in the classical theories, which has been applied to the design of our simultaneous streak and framing rotating mirror camera with continuous access; and a new rotating mirror of novel structure, made of an aluminum alloy, which considerably reduces lateral deformation of the RM and improves the performance of the camera. PMID:25430114

Li, Jingzhen; Sun, Fengshan; Huang, Hongbin; Gong, Xiangdong; Chen, Hongyi; Lu, Xiaowei; Cai, Yi

2014-11-01

407

Investigating at the Moon With new Eyes: The Lunar Reconnaissance Orbiter Mission Camera (LROC)  

NASA Astrophysics Data System (ADS)

The Lunar Reconnaissance Orbiter Mission Camera (LROC) H. Hiesinger (1,2), M.S. Robinson (3), A.S. McEwen (4), E.P. Turtle (4), E.M. Eliason (4), B.L. Jolliff (5), M.C. Malin (6), and P.C. Thomas (7) (1) Brown Univ., Dept. of Geological Sciences, Providence RI 02912, Harald_Hiesinger@brown.edu, (2) Westfaelische Wilhelms-University, (3) Northwestern Univ., (4) LPL, Univ. of Arizona, (5) Washington Univ., (6) Malin Space Science Systems, (7) Cornell Univ. The Lunar Reconnaissance Orbiter (LRO) mission is scheduled for launch in October 2008 as a first step to return humans to the Moon by 2018. The main goals of the Lunar Reconnaissance Orbiter Camera (LROC) are to: 1) assess meter and smaller- scale features for safety analyses for potential lunar landing sites near polar resources, and elsewhere on the Moon; and 2) acquire multi-temporal images of the poles to characterize the polar illumination environment (100 m scale), identifying regions of permanent shadow and permanent or near permanent illumination over a full lunar year. In addition, LROC will return six high-value datasets such as 1) meter-scale maps of regions of permanent or near permanent illumination of polar massifs; 2) high resolution topography through stereogrammetric and photometric stereo analyses for potential landing sites; 3) a global multispectral map in 7 wavelengths (300-680 nm) to characterize lunar resources, in particular ilmenite; 4) a global 100-m/pixel basemap with incidence angles (60-80 degree) favorable for morphologic interpretations; 5) images of a variety of geologic units at sub-meter resolution to investigate physical properties and regolith variability; and 6) meter-scale coverage overlapping with Apollo Panoramic images (1-2 m/pixel) to document the number of small impacts since 1971-1972, to estimate hazards for future surface operations. 
LROC consists of two narrow-angle cameras (NACs) which will provide 0.5-m scale panchromatic images over a 5-km swath, a wide-angle camera (WAC) to acquire images at about 100 m/pixel in seven color bands over a 100-km swath, and a common Sequence and Compressor System (SCS). Each NAC has a 700-mm-focal-length optic that images onto a 5000-pixel CCD line-array, providing a cross-track field-of-view (FOV) of 2.86 degree. The NAC readout noise is better than 100 e-, and the data are sampled at 12 bits. Its internal buffer holds 256 MB of uncompressed data, enough for a full-swath image 25-km long or a 2x2 binned image 100-km long. The WAC has two 6-mm-focal-length lenses imaging onto the same 1000 x 1000 pixel, electronically shuttered CCD area-array, one imaging in the visible/near IR, and the other in the UV. Each has a cross-track FOV of 90 degree. From the nominal 50-km orbit, the WAC will have a resolution of 100 m/pixel in the visible, and a swath width of ˜100 km. The seven-band color capability of the WAC is achieved by color filters mounted directly over the detector, providing different sections of the CCD with different filters [1]. The readout noise is less than 40 e-, and, as with the NAC, pixel values are digitized to 12 bits and may be subsequently converted to 8-bit values. The total mass of the LROC system is about 12 kg; the total LROC power consumption averages 22 W (30 W peak). Assuming a downlink with lossless compression, LRO will produce a total of 20 TeraBytes (TB) of raw data. Production of higher-level data products will result in a total of 70 TB for Planetary Data System (PDS) archiving, 100 times larger than any previous missions. [1] Malin et al., JGR, 106, 17651-17672, 2001.
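The NAC numbers above are mutually consistent, as a quick check shows (an illustrative calculation, not from the abstract): a 2.86-degree FOV spread over 5000 pixels is about 10 µrad/pixel, which from the 50-km orbit gives the stated 0.5-m pixel scale and, for two side-by-side NACs, the ~5-km swath:

```python
import math

altitude_m = 50e3             # nominal LRO orbit altitude
fov_rad = math.radians(2.86)  # cross-track FOV of one NAC
pixels = 5000                 # CCD line-array length

ifov = fov_rad / pixels                     # ~1.0e-5 rad per pixel
ground_pixel = altitude_m * ifov            # ~0.5 m per pixel on the surface
swath_two_nacs = 2 * altitude_m * fov_rad   # ~5 km for the NAC pair
```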

Hiesinger, H.; Robinson, M. S.; McEwen, A. S.; Turtle, E. P.; Eliason, E. M.; Jolliff, B. L.; Malin, M. C.; Thomas, P. C.

408

Anomalous Wakefields in Ultra-Narrow Plasmas  

NASA Astrophysics Data System (ADS)

The properties of relativistic non-linear plasma wakes excited by lasers or particle beams in an ultra-narrow plasma (radius less than the plasma skin depth and the displacement amplitude of plasma electrons in the wake) are described. The restoring force on the oscillating electrons is reduced compared to oscillations in a homogeneous plasma resulting in a wave frequency that can be several times smaller than that of the usual wake at the plasma frequency. The circumstance described here arises commonly in current laser wakefield experiments with tightly focused lasers that self-ionize a gas in the low plasma density or resonant laser wakefield regime.
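For scale (a standard formula, not specific to this work), the homogeneous-plasma frequency that the narrow-channel wake undershoots is f_p = (1/2π)·√(n e²/(ε₀ m_e)); the density below is an illustrative value, with n = 10¹⁸ cm⁻³ giving roughly 9 THz:

```python
import math

E_CHARGE = 1.602176634e-19  # elementary charge, C
EPS0 = 8.8541878128e-12     # vacuum permittivity, F/m
M_E = 9.1093837015e-31      # electron mass, kg

def plasma_frequency_hz(n_per_m3):
    """Electron plasma frequency f_p = omega_p / (2 pi) for density n in m^-3."""
    omega_p = math.sqrt(n_per_m3 * E_CHARGE ** 2 / (EPS0 * M_E))
    return omega_p / (2.0 * math.pi)

f_p = plasma_frequency_hz(1e24)  # n = 1e18 cm^-3  ->  ~9e12 Hz
```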

Tarik, K.; Katsouleas, T.; Muggli, P.; Mori, W. B.; Gordon, D.

2001-10-01

409

Multispectral imaging with optical bandpass filters: tilt angle and position estimation  

NASA Astrophysics Data System (ADS)

Optical bandpass filters play a decisive role in multispectral imaging. Various multispectral cameras use this type of color filter for the sequential acquisition of different spectral bands. Practically unavoidable, small tilt angles of the filters with respect to the optical axis influence the imaging process: First, by tilting the filter, the center wavelength of the filter is shifted, causing color variations. Second, due to refractions of the filter, the image is distorted geometrically depending on the tilt angle. Third, reflections between sensor and filter glass may cause ghosting, i.e., a weak and shifted copy of the image, which also depends on the filter angle. A method to measure the filter position parameters from multispectral color components is thus highly desirable. We propose a method to determine the angle and position of an optical filter brought into the optical path in, e.g., filter-wheel multispectral cameras, with respect to the camera coordinate system. We determine the position and angle of the filter by presenting a calibration chart to the camera, which is always partly reflected by the highly reflective optical bandpass filter. The extrinsic parameters of the original and mirrored chart can then be estimated. We derive the angle and position of the filter from the coordinates of the charts. We compare the results of the angle measurements to a ground truth obtained from the settings of a high-precision rotation table and thus validate our measurement method. Furthermore, we simulate the refraction effect of the optical filter and show that the results match quite well with the reality, thus also confirming our method.
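The blue shift with tilt mentioned first can be quantified with the standard interference-filter relation λ(θ) = λ₀·√(1 − (sin θ / n_eff)²) (a textbook formula, not from the paper; the center wavelength and effective index below are made-up values):

```python
import math

def tilted_center_wavelength(lam0_nm, theta_deg, n_eff):
    """Center wavelength (nm) of an interference filter tilted by theta."""
    s = math.sin(math.radians(theta_deg)) / n_eff
    return lam0_nm * math.sqrt(1.0 - s * s)

# Hypothetical filter: 550 nm center, effective index 2.0, tilted by 10 degrees
shift = tilted_center_wavelength(550.0, 10.0, 2.0) - 550.0  # ~ -2 nm blue shift
```

Even a small residual tilt therefore produces a measurable color variation, which is one motivation for estimating the filter angle.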

Brauers, Johannes; Aach, Til

2009-01-01

410

Explosive Transient Camera (ETC) Program  

NASA Technical Reports Server (NTRS)

Since the inception of the ETC program, a wide range of new technologies was developed to support this astronomical instrument. The prototype unit was installed at ETC Site 1. The first partially automated observations were made and some major renovations were later added to the ETC hardware. The ETC was outfitted with new thermoelectrically-cooled CCD cameras and a sophisticated vacuum manifold, which, together, made the ETC a much more reliable unit than the prototype. The ETC instrumentation and building were placed under full computer control, allowing the ETC to operate as an automated, autonomous instrument with virtually no human intervention necessary. The first fully-automated operation of the ETC was performed, during which the ETC monitored the error region of the repeating soft gamma-ray burster SGR 1806-21.

Ricker, George

1991-01-01

411

Approximations to camera sensor noise  

NASA Astrophysics Data System (ADS)

Noise is present in all image sensor data. The Poisson distribution is said to model the stochastic nature of the photon arrival process, while it is common to approximate readout/thermal noise by additive white Gaussian noise (AWGN). Other sources of signal-dependent noise, such as Fano and quantization noise, also contribute to the overall noise profile. The question remains, however, of how best to model the combined sensor noise. Though additive Gaussian noise with signal-dependent noise variance (SD-AWGN) and Poisson corruption are two widely used models to approximate the actual sensor noise distribution, the justification given for these models is based on limited evidence. The goal of this paper is to provide a more comprehensive characterization of random noise. We conclude by presenting concrete evidence that the Poisson model is a better approximation to the real camera noise than SD-AWGN. We suggest further modifications to the Poisson model that may improve the noise model.
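One way to see why the two models differ even when their variances match (an illustrative simulation, not the paper's experiment) is skewness: Poisson noise at mean μ has skewness 1/√μ, while SD-AWGN is symmetric by construction:

```python
import numpy as np

def sample_skewness(x):
    """Biased sample skewness: third central moment over variance^1.5."""
    c = x - x.mean()
    return (c ** 3).mean() / (c ** 2).mean() ** 1.5

rng = np.random.default_rng(0)
mu, n = 5.0, 200_000

poisson = rng.poisson(mu, n).astype(float)
# SD-AWGN: Gaussian with the same signal-dependent variance (var = mu)
sd_awgn = mu + rng.normal(0.0, np.sqrt(mu), n)

# Variances agree (~mu), but only Poisson is skewed (~1/sqrt(5) = 0.447)
skew_p = sample_skewness(poisson)
skew_g = sample_skewness(sd_awgn)
```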

Jin, Xiaodan; Hirakawa, Keigo

2013-02-01

412

Gesture recognition on smart cameras  

NASA Astrophysics Data System (ADS)

Gesture recognition is a feature in human-machine interaction that allows more natural interaction without the use of complex devices. For this reason, several methods of gesture recognition have been developed in recent years. However, most real-time methods are designed to operate on a personal computer with high computing resources and memory. In this paper, we analyze relevant methods found in the literature in order to investigate the ability of smart cameras to execute gesture recognition algorithms. We elaborate two hand-gesture recognition pipelines. The first method is based on invariant moments extraction and the second on fingertip detection. The hand detection method used for both pipelines is based on skin color segmentation. The results obtained show that the un-optimized versions of the invariant moments method and the fingertip detection method can reach 10 fps on an embedded processor and use about 200 kB of memory.
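The invariant-moments idea behind the first pipeline can be sketched in a few lines of numpy (a minimal illustration of the moment machinery, not the authors' implementation): the normalized central moments η_pq yield Hu's first invariant φ₁ = η₂₀ + η₀₂, which is unchanged when the hand region is rotated:

```python
import numpy as np

def hu_phi1(img):
    """First Hu invariant phi1 = eta20 + eta02 of a 2-D intensity image."""
    y, x = np.indices(img.shape, dtype=float)
    m00 = img.sum()
    cx, cy = (x * img).sum() / m00, (y * img).sum() / m00
    mu20 = ((x - cx) ** 2 * img).sum()
    mu02 = ((y - cy) ** 2 * img).sum()
    # eta_pq = mu_pq / m00^(1 + (p+q)/2); here p+q = 2, so the divisor is m00^2
    return (mu20 + mu02) / m00 ** 2

# A small asymmetric binary blob standing in for a segmented hand mask
blob = np.zeros((32, 32))
blob[8:24, 10:14] = 1.0
blob[8:12, 14:22] = 1.0
```

Rotating the blob by 90 degrees leaves φ₁ unchanged, which is what makes such features usable for pose-independent matching.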

Dziri, Aziz; Chevobbe, Stephane; Darouich, Mehdi

2013-02-01

413

Reading Challenging Barcodes with Cameras.  

PubMed

Current camera-based barcode readers do not work well when the image has low resolution, is out of focus, or is motion-blurred. One main reason is that virtually all existing algorithms perform some sort of binarization, either by gray scale thresholding or by finding the bar edges. We propose a new approach to barcode reading that never needs to binarize the image. Instead, we use deformable barcode digit models in a maximum likelihood setting. We show that the particular nature of these models enables efficient integration over the space of deformations. Global optimization over all digits is then performed using dynamic programming. Experiments with challenging UPC-A barcode images show substantial improvement over other state-of-the-art algorithms. PMID:20617113

Gallo, Orazio; Manduchi, Roberto

2009-12-01

414

Wide dynamic range video camera  

NASA Technical Reports Server (NTRS)

A television camera apparatus is disclosed in which bright objects are attenuated to fit within the dynamic range of the system, while dim objects are not. The apparatus receives linearly polarized light from an object scene, the light being passed by a beam splitter and focused on the output plane of a liquid crystal light valve. The light valve is oriented such that, with no excitation from the cathode ray tube, all light is rotated 90 deg and focused on the input plane of the video sensor. The light is then converted to an electrical signal, which is amplified and used to excite the CRT. The resulting image is collected and focused by a lens onto the light valve which rotates the polarization vector of the light to an extent proportional to the light intensity from the CRT. The overall effect is to selectively attenuate the image pattern focused on the sensor.

Craig, G. D. (inventor)

1985-01-01

415

Traffic camera markup language (TCML)  

NASA Astrophysics Data System (ADS)

In this paper, we present a novel video markup language for articulating semantic traffic data from surveillance cameras and other sensors. The markup language includes three layers: sensor descriptions, traffic measurement, and application interface descriptions. The multi-resolution based video codec algorithm enables quality-of-service-aware video streaming according to the data traffic. A set of object detection APIs are developed using Convex Hull and Adaptive Proportion models and 3D modeling. It is found that our approach outperforms 3D modeling and Scale-Invariant Feature Transform (SIFT) algorithms in terms of robustness. Furthermore, our empirical data show that it is feasible to use TCML to facilitate real-time communication between an infrastructure and a vehicle for safer and more efficient traffic control.

Cai, Yang; Bunn, Andrew; Snyder, Kerry

2012-01-01

416

Illumination box and camera system  

DOEpatents

A hand portable, field-deployable thin-layer chromatography (TLC) unit and a hand portable, battery-operated unit for development, illumination, and data acquisition of the TLC plates contain many miniaturized features that permit a large number of samples to be processed efficiently. The TLC unit includes a solvent tank, a holder for TLC plates, and a variety of tool chambers for storing TLC plates, solvent, and pipettes. After processing in the TLC unit, a TLC plate is positioned in a collapsible illumination box, where the box and a CCD camera are optically aligned for optimal pixel resolution of the CCD images of the TLC plate. The TLC system includes an improved development chamber for chemical development of TLC plates that prevents solvent overflow.

Haas, Jeffrey S. (San Ramon, CA); Kelly, Fredrick R. (Modesto, CA); Bushman, John F. (Oakley, CA); Wiefel, Michael H. (La Honda, CA); Jensen, Wayne A. (Livermore, CA); Klunder, Gregory L. (Oakland, CA)

2002-01-01

417

2. Photocopied July 1971 from photostat Jordan Narrows Folder #1, ...  

Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

2. Photocopied July 1971 from photostat Jordan Narrows Folder #1, Engineering Department, Utah Power and Light Co., Salt Lake City, Utah. JORDAN NARROWS STATION. PLAN AND SECTION. - Salt Lake City Water & Electrical Power Company, Jordan Narrows Hydroelectric Plant, Jordan River, Riverton, Salt Lake County, UT

418

Narrow field electromagnetic sensor system and method  

DOEpatents

A narrow field electromagnetic sensor system and method of sensing a characteristic of an object provide the capability to realize a characteristic of an object such as density, thickness, or presence, for any desired coordinate position on the object. One application is imaging. The sensor can also be used as an obstruction detector or an electronic trip wire with a narrow field without the disadvantages of impaired performance when exposed to dirt, snow, rain, or sunlight. The sensor employs a transmitter for transmitting a sequence of electromagnetic signals in response to a transmit timing signal, a receiver for sampling only the initial direct RF path of the electromagnetic signal while excluding all other electromagnetic signals in response to a receive timing signal, and a signal processor for processing the sampled direct RF path electromagnetic signal and providing an indication of the characteristic of an object. Usually, the electromagnetic signal is a short RF burst and the obstruction must provide a substantially complete eclipse of the direct RF path. By employing time-of-flight techniques, a timing circuit controls the receiver to sample only the initial direct RF path of the electromagnetic signal while not sampling indirect path electromagnetic signals. The sensor system also incorporates circuitry for ultra-wideband spread spectrum operation that reduces interference to and from other RF services while allowing co-location of multiple electronic sensors without the need for frequency assignments.
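The range gating described above amounts to sampling at the one-way flight time of the direct RF path. A toy calculation (illustrative values, not from the patent):

```python
C = 299_792_458.0  # speed of light, m/s

def direct_path_delay(distance_m):
    """One-way delay of the direct RF path; the receiver samples only here."""
    return distance_m / C

# A 10 m transmitter-receiver separation puts the direct path at ~33.4 ns;
# any indirect (multipath) signal travels farther, arrives later, and falls
# outside the sample gate, which is why dirt, rain, or sunlight do not confuse it.
gate_time = direct_path_delay(10.0)
```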

McEwan, Thomas E. (Livermore, CA)

1996-01-01

419

Reinterpreting several narrow `resonances' as threshold cusps  

E-print Network

The threshold pbar-p peak in BES data for J/Psi to gamma-pbar-p may be fitted as a cusp. It arises from the well known threshold peak in pbar-p elastic scattering due to annihilation. Several similar examples are discussed. The PS185 data for pbar-p to Lambdabar-Lambda require an almost identical cusp at the Lambdabar-Lambda threshold. There is also a cusp at the Sigma-N threshold in Kminus-d to piminus-Lambda-p. Similar cusps are likely to arise at thresholds for all 2-body de-excitation processes, providing the interaction is attractive; likely examples are Lambda-pbar, Sigma-pbar, and Kbar-Lambda. The narrow peak observed by Belle at 3872 MeV in piplus-piminus-J/Psi may be a 1++ cusp due to the Dbar-D* threshold. The narrow Xi*(1862) observed by NA49 may be due to a threshold cusp in Sigma(1385)-Kbar coupled to Xi-pi and Sigma-Kbar. The relation of cusps to known resonances such as f0(980) is discussed.

D. V. Bugg

2004-06-25

420

Magnetoacoustic transport in narrow electron channels  

NASA Astrophysics Data System (ADS)

We develop a theory of the effect due to a small perpendicular magnetic field on the quantized acoustoelectric current induced by a surface acoustic wave (SAW) in a narrow electron channel. The quasi one-dimensional channel is formed in a piezoelectric GaAs/AlGaAs semiconductor structure by a split gate technique with the gate voltage beyond pinch-off. The current is the result of the trapping of electrons in the SAW induced moving quantum dots and the transfer of electrons residing in these dots through the channel. It has been observed recently (J. Cunningham, et al., Phys. Rev. B, 1999) that in small magnetic fields the acoustoelectric current oscillates as a function of magnetic field. Based on a simple model for the quantized acoustoelectric transport in a narrow channel (G. Gumbs et al., Phys. Rev. B, Rapid Commun., 60, N20, R13954, 1999) we develop a theory for these oscillations. The case when one electron is captured in the dot is considered, and the period, the amplitude, and the phase of the current oscillations as a function of the system's parameters are obtained and analyzed.

Aizin, Gregory; Gumbs, Godfrey; Pepper, M.

2000-03-01

421

Narrow linewidth operation of the RILIS titanium: Sapphire laser at ISOLDE/CERN  

NASA Astrophysics Data System (ADS)

A narrow linewidth operating mode for the Ti:sapphire laser of the CERN ISOLDE Resonance Ionization Laser Ion Source (RILIS) has been developed. This satisfies the laser requirements for the programme of in-source resonance ionization spectroscopy measurements and improves the selectivity for isomer separation using RILIS. A linewidth reduction from typically 10 GHz down to 1 GHz was achieved by the intra-cavity insertion of a second (thick) Fabry-Pérot etalon. Reliable operation during a laser scan was achieved through motorized control of the tilt angle of each etalon. A scanning, stabilization and mode cleaning procedure was developed and implemented in LabVIEW. The narrow linewidth operation was confirmed in a high resolution spectroscopy study of francium isotopes by the Collinear Resonance Ionization Spectroscopy experiment. The resulting laser scans demonstrate the suitability of the laser, in terms of linewidth, spectral purity and stability for high resolution in-source spectroscopy and isomer selective ionization using the RILIS.

Rothe, S.; Fedosseev, V. N.; Kron, T.; Marsh, B. A.; Rossel, R. E.; Wendt, K. D. A.

2013-12-01

422

Source camera identification based on CFA interpolation  

Microsoft Academic Search

In this work, we focus our interest on blind source camera identification problem by extending our results in the direction of (1). The interpolation in the color surface of an image due to the use of a color filter array (CFA) forms the basis of the paper. We propose to identify the source camera of an image based on traces

Sevinc Bayram; Husrev T. Sencar; Nasir D. Memon; Ismail Avcibas

2005-01-01

423

Operating the CCD Camera 1995 Edition  

E-print Network

…and even the focal position may vary. If you turn on the camera system first, then open up the roof… To exit from the ccd program (after you have finished taking pictures and transferring them to hard disk)… Taking a Picture = Operating the Electronics: to control the camera, type in commands at the asterisk…

Veilleux, Sylvain

424

Tracking human motion using multiple cameras  

Microsoft Academic Search

Presents a framework for tracking human motion in an indoor environment from sequences of monocular grayscale images obtained from multiple fixed cameras. Multivariate Gaussian models are applied to find the most likely matches of human subjects between consecutive frames taken by cameras mounted in various locations. Experimental results from real data show the robustness of the algorithm and its potential

Q. Cai; J. K. Aggarwal

1996-01-01

425

Ego-Motion and Omnidirectional Cameras  

Microsoft Academic Search

Recent research in image sensors has produced cameras with very large fields of view. An area of computer vision research which will benefit from this technology is the computation of camera motion (ego-motion) from a sequence of images. Traditional cameras suffer from the problem that the direction of translation may lie outside of the field of view, making

Joshua Gluckman; Shree K. Nayar

1998-01-01

426

Using vanishing points for camera calibration  

Microsoft Academic Search

In this article a new method for the calibration of a vision system which consists of two (or more) cameras is presented. The proposed method, which uses simple properties of vanishing points, is divided into two steps. In the first step, the intrinsic parameters of each camera, that is, the focal length and the location of the intersection between the

Bruno Caprile; Vincent Torre

1990-01-01

427

Controlled Impact Demonstration (CID) tail camera video  

NASA Technical Reports Server (NTRS)

The Controlled Impact Demonstration (CID) was a joint research project by NASA and the FAA to test a survivable aircraft impact using a remotely piloted Boeing 720 aircraft. The tail camera movie is one shot running 27 seconds. It shows the impact from the perspective of a camera mounted high on the vertical stabilizer, looking forward over the fuselage and wings.

1984-01-01

428

Mobile Robot Geometry Initialization from Single Camera  

E-print Network

…to achieve robot localization has been widely proposed in the area of Intelligent Spaces. Recently, an online approach that simultaneously obtains the robot's pose and its 3D structure using a single external camera has…

Paris-Sud XI, Université de

429

Catadioptric Camera Calibration Using Geometric Invariants  

Microsoft Academic Search

Central catadioptric cameras are imaging devices that use mirrors to enhance the field of view while preserving a single effective viewpoint. In this paper, we propose a novel method for the calibration of central catadioptric cameras using geometric invariants. Lines in space are projected into conics in the catadioptric image plane, as are spheres in space. We prove that

Xianghua Ying; Zhanyi Hu

2004-01-01

430

Photo annotation on a camera phone  

Microsoft Academic Search

In this paper we describe a system that allows users to annotate digital photos at the time of capture. The system uses camera phones with a lightweight client application and a server to store the images and metadata and assists the user in annotation on the camera phone by providing guesses about the location and content of the photos. By

Anita Wilhelm; Yuri Takhteyev; Risto Sarvas; Nancy A. Van House; Marc Davis

2004-01-01

431

USB Security Camera Software for Linux  

Microsoft Academic Search

USB security cameras have been developed for the security surveillance field; however, current video surveillance systems are too expensive for widespread use. This paper proposes a new approach in which the software is developed on a Linux system, with a USB camera as the video source. Network communication is realized using the TCP/IP protocol, and an embedded web server inside the system lets users access its resources through a browser

J. Weerachai; P. Siam; K. Narawith

2011-01-01

432

Cameras Monitor Spacecraft Integrity to Prevent Failures  

NASA Technical Reports Server (NTRS)

The Jet Propulsion Laboratory contracted Malin Space Science Systems Inc. to outfit Curiosity with four of its cameras using the latest commercial imaging technology. The company parlayed the knowledge gained from working with NASA to develop an off-the-shelf line of cameras, along with a digital video recorder, designed to help troubleshoot problems that may arise on satellites in space.

2014-01-01

433

Making a room-sized camera obscura  

NASA Astrophysics Data System (ADS)

We describe how to convert a room into a camera obscura as a project for introductory geometrical optics. The view for our camera obscura is a busy street scene set against a beautiful mountain skyline. We include a short video with project instructions, ray diagrams and delightful moving images of cars driving on the road outside.
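The geometry behind such a room-sized camera obscura is just similar triangles through the pinhole; a minimal sketch (the numbers in the usage note are illustrative, not from the paper):

```python
def image_size_m(object_size_m: float, object_distance_m: float,
                 room_depth_m: float) -> float:
    """Pinhole (camera obscura) imaging by similar triangles: rays through
    the pinhole map an object of height h at distance d to an inverted image
    of height h * (room depth) / d on the far wall."""
    return object_size_m * room_depth_m / object_distance_m
```

For example, a 3 m tall vehicle 30 m from the aperture projects a 0.5 m image onto the back wall of a 5 m deep room.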

Flynt, Halima; Ruiz, Michael J.

2015-01-01

434

Solid State Replacement of Rotating Mirror Cameras  

SciTech Connect

Rotating mirror cameras have been the mainstay of mega-frame per second imaging for decades. There is still no electronic camera that can match a film based rotary mirror camera for the combination of frame count, speed, resolution and dynamic range. The rotary mirror cameras are predominantly used in the range of 0.1 to 100 micro-seconds per frame, for 25 to more than a hundred frames. Electron tube gated cameras dominate the sub microsecond regime but are frame count limited. Video cameras are pushing into the microsecond regime but are resolution limited by the high data rates. An all solid state architecture, dubbed ''In-situ Storage Image Sensor'' or ''ISIS'', by Prof. Goji Etoh, has made its first appearance in the market, and its evaluation is discussed. Recent work at Lawrence Livermore National Laboratory has concentrated both on evaluation of the presently available technologies and on exploring the capabilities of the ISIS architecture. It is clear that, though there is presently no single-chip camera that can simultaneously match the rotary mirror cameras, the ISIS architecture has the potential to approach their performance.

Frank, A M; Bartolick, J M

2006-08-25

435

Design and implementation of ubiquitous smart cameras  

Microsoft Academic Search

Design aspects and software modelling for a ubiquitous real-time camera system are described in this paper. We propose a system architecture that uses a network of inexpensive cameras and performs video processing in-network. In general, ubiquitous systems have to perform spatial and temporal calibration in advance to determine the timing and coordination relationships between sensor nodes, and other application-specific design considerations, such as

Chang Hong Lin; Wayne Wolf; Andrew Dixon; Xenofon Koutsoukos; Janos Sztipanovits

2006-01-01

436

Design and Implementation of Ubiquitous Smart Cameras  

Microsoft Academic Search

Design aspects and software modelling for a ubiquitous real-time camera system are described in this paper. We propose a system architecture that uses a network of inexpensive cameras and performs video processing in-network. In general, ubiquitous systems have to perform spatial and temporal calibration in advance to determine the timing and coordination relationships between sensor nodes, and other application-specific

Chang Hong Lin; Wayne Wolf; Andrew Dixon; Xenofon D. Koutsoukos; Janos Sztipanovits

2006-01-01

437

Camera Surveillance as a Measure of Counterterrorism?  

Microsoft Academic Search

Camera surveillance has recently gained prominence in policy proposals on combating terrorism. We evaluate this instrument of counterterrorism as resting on the premise of a deterrence effect. Based on comparative arguments and previous evidence on crime, we expect camera surveillance to have a relatively smaller deterrent effect on terrorism than on other forms of crime. In particular, we emphasize opportunities for substitution (i.e., displacement effects),

Alois Stutzer; Michael Zehnder

2010-01-01

438

A control system for LAMOST CCD cameras  

NASA Astrophysics Data System (ADS)

32 scientific CCD cameras within the 16 low-dispersion spectrographs of LAMOST are used to record object spectra. This paper introduces the CCD Master system, designed for camera management and control and based on the UCAM controller. The layers of the Master, UDP, and CCD-end daemons are described in detail. The commands, statuses, user interface, and spectra viewer are discussed.

Deng, Xiaochao; Wang, Jian; Dong, Jian; Luo, Yu; Liu, Guangcao; Yuan, Hailong; Jin, Ge

2010-07-01

439

Hierarchical Depth Mapping from Multiple Cameras  

Microsoft Academic Search

We present a method to estimate a dense and sharp depth map using multiple cameras. A key issue in obtaining a sharp depth map is how to overcome the harmful influence of occlusion. Thus, we first propose an occlusion-overcoming strategy which selectively uses the depth information from multiple cameras. With a simple sort-and-discard technique, we resolve the occlusion problem considerably at a slight

Jong-il Park; Seiki Inoue

1997-01-01

440

High resolution RGB color line scan camera  

NASA Astrophysics Data System (ADS)

A color line scan camera family is described in this paper; it is available with either 6000, 8000 or 10000 pixels per color channel, utilizes off-the-shelf lenses, interfaces with currently available frame grabbers, includes on-board pixel-by-pixel offset correction, and is configurable and controllable via an RS232 serial port for computer-controlled or stand-alone operation. This line scan camera is based on an available 8000-element monochrome line scan camera designed by AOA for OEM use. The new color version includes improvements such as better packaging and additional user features which make the camera easier to use. The heart of the camera is a tri-linear CCD sensor with on-chip color balancing for maximum accuracy and pinned photodiodes for low-lag response. Each color channel is digitized to 12 bits, and all three channels are multiplexed together so that the resulting camera output video is either a 12- or 8-bit data stream at a rate of up to 24 megapixels/sec. Conversion from 12 to 8 bits, or user-defined gamma, is accomplished by on-board user-defined video look-up tables. The camera has two user-selectable operating modes: a low-speed, high-sensitivity mode or a high-speed, reduced-sensitivity mode. The intended uses of the camera include industrial inspection, digital archiving, document scanning, and graphic arts applications.
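The 12-to-8-bit conversion via a user-defined look-up table described above can be sketched as follows; the power-law (gamma) curve is a standard choice assumed here for illustration, not taken from the camera's documentation:

```python
def make_gamma_lut(gamma: float) -> list[int]:
    """Build a 4096-entry look-up table mapping 12-bit input codes (0..4095)
    to 8-bit output codes (0..255) through a power-law (gamma) curve.
    gamma = 1.0 reduces to a plain linear 12-to-8-bit conversion."""
    return [round(255 * (code / 4095) ** (1.0 / gamma)) for code in range(4096)]
```

In use, the camera would apply `lut[code]` to each digitized sample; uploading a new table changes the transfer curve without touching the digitization path.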

Lynch, Theodore E.; Huettig, Fred

1998-04-01

441

An electronic multiband camera film viewer.  

NASA Technical Reports Server (NTRS)

An electronic viewer for real-time viewing and processing of multiband camera imagery is described. The Multiband Camera Film Viewer (MCFV) is a high-resolution, 1000-line system scanning three channels of multiband imagery. The MCFV provides a calibrated output from each of the three channels for viewing in composite true color, analog false-color, and digitized, enhanced false color.

Roberts, L. H.

1972-01-01

442

Solid state replacement of rotating mirror cameras  

NASA Astrophysics Data System (ADS)

Rotating mirror cameras have been the mainstay of mega-frame per second imaging for decades. There is still no electronic camera that can match a film based rotary mirror camera for the combination of frame count, speed, resolution and dynamic range. The rotary mirror cameras are predominantly used in the range of 0.1 to 100 micro-seconds per frame, for 25 to more than a hundred frames. Electron tube gated cameras dominate the sub microsecond regime but are frame count limited. Video cameras are pushing into the microsecond regime but are resolution limited by the high data rates. An all solid state architecture, dubbed 'In-situ Storage Image Sensor' or 'ISIS', by Prof. Goji Etoh, has made its first appearance in the market, and its evaluation is discussed. Recent work at Lawrence Livermore National Laboratory has concentrated both on evaluation of the presently available technologies and on exploring the capabilities of the ISIS architecture. It is clear that, though there is presently no single-chip camera that can simultaneously match the rotary mirror cameras, the ISIS architecture has the potential to approach their performance.

Frank, Alan M.; Bartolick, Joseph M.

2007-01-01

443

A theory of self-calibration of a moving camera  

Microsoft Academic Search

There is a close connection between the calibration of a single camera and the epipolar transformation obtained when the camera undergoes a displacement. The epipolar transformation imposes two algebraic constraints on the camera calibration. If two epipolar transformations, arising from different camera displacements, are available then the compatible camera calibrations are parameterized by an algebraic curve of genus four. The

Stephen J. Maybank; Olivier D. Faugeras

1992-01-01

444

Wrist Camera Orientation for Effective Telerobotic Orbital Replaceable Unit (ORU) Changeout  

NASA Technical Reports Server (NTRS)

The Hydraulic Manipulator Testbed (HMTB) is the kinematic replica of the Flight Telerobotic Servicer (FTS). One use of the HMTB is to evaluate advanced control techniques for accomplishing robotic maintenance tasks on board the Space Station. Most maintenance tasks involve the direct manipulation of the robot by a human operator, for which high-quality visual feedback is important for precise control. An experiment was conducted in the Systems Integration Branch at the Langley Research Center to compare several configurations of the manipulator wrist camera for providing visual feedback during an Orbital Replaceable Unit changeout task. Several variables were considered, such as wrist camera angle, camera focal length, target location, and lighting. Each study participant performed the maintenance task by using eight combinations of the variables based on a Latin square design. The results of this experiment and conclusions based on the data collected are presented.

Jones, Sharon Monica; Aldridge, Hal A.; Vazquez, Sixto L.

1997-01-01

445

Generalization of the Euler Angles  

NASA Technical Reports Server (NTRS)

It is shown that the Euler angles can be generalized to axes other than members of an orthonormal triad. As first shown by Davenport, the three generalized Euler axes, hereafter: Davenport axes, must still satisfy the constraint that the first two and the last two axes be mutually perpendicular if these axes are to define a universal set of attitude parameters. Expressions are given which relate the generalized Euler angles, hereafter: Davenport angles, to the 3-1-3 Euler angles of an associated direction-cosine matrix. The computation of the Davenport angles from the attitude matrix and their kinematic equation are presented. The present work offers a more direct development of the Davenport angles than Davenport's original publication and offers additional results.
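The 3-1-3 Euler angles mentioned above can be illustrated with a round-trip computation from angles to a direction-cosine matrix and back; a minimal sketch, where the passive rotation convention and all function names are our assumptions:

```python
import math

def R1(a):  # elementary rotation about axis 1 (x)
    c, s = math.cos(a), math.sin(a)
    return [[1, 0, 0], [0, c, s], [0, -s, c]]

def R3(a):  # elementary rotation about axis 3 (z)
    c, s = math.cos(a), math.sin(a)
    return [[c, s, 0], [-s, c, 0], [0, 0, 1]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def dcm_313(phi, theta, psi):
    """Direction-cosine matrix of the 3-1-3 sequence, A = R3(psi) R1(theta) R3(phi)."""
    return matmul(R3(psi), matmul(R1(theta), R3(phi)))

def angles_313(A):
    """Recover (phi, theta, psi) from a 3-1-3 DCM, assuming 0 < theta < pi."""
    theta = math.acos(A[2][2])
    phi = math.atan2(A[2][0], -A[2][1])
    psi = math.atan2(A[0][2], A[1][2])
    return phi, theta, psi
```

The Davenport generalization replaces the x and z axes here with arbitrary unit vectors subject to the perpendicularity constraint the abstract states; the extraction step then no longer has this simple closed form.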

Bauer, Frank H. (Technical Monitor); Shuster, Malcolm D.; Markley, F. Landis

2002-01-01

446

The QUEST Large Area CCD Camera  

E-print Network

We have designed, constructed and put into operation a very large area CCD camera that covers the field of view of the 1.2 m Samuel Oschin Schmidt Telescope at the Palomar Observatory. The camera consists of 112 CCDs arranged in a mosaic of four rows with 28 CCDs each. The CCDs are 600 x 2400 pixel Sarnoff thinned, back illuminated devices with 13 um x 13 um pixels. The camera covers an area of 4.6 deg x 3.6 deg on the sky with an active area of 9.6 square degrees. This camera has been installed at the prime focus of the telescope, commissioned, and scientific quality observations on the Palomar-QUEST Variability Sky Survey were started in September of 2003. The design considerations, construction features, and performance parameters of this camera are described in this paper.

Charlie Baltay; David Rabinowitz; Peter Andrews; Anne Bauer; Nancy Ellman; William Emmet; Rebecca Hudson; Thomas Hurteau; Jonathan Jerke; Rochelle Lauer; Julia Silge; Andrew Szymkowiak; Brice Adams; Mark Gebhard; James Musser; Michael Doyle; Harold Petrie; Roger Smith; Robert Thicksten; John Geary

2007-02-21

447

A small-angle scatterometer  

Microsoft Academic Search

The design, analysis, and performance of a small-angle scatterometer are presented. A dye-cell Gaussian apodized aperture is utilized to reduce the small-angle diffraction background, so that the system scatter becomes the dominant noise in the instrument beam profile. The geometrical aberrations are graphically shown to have no significant effect on the higher-order diffraction rings of the beam profile. Small-angle scattered

Steven J. Wein; William L. Wolfe

1989-01-01

448

Development of high-speed video cameras  

NASA Astrophysics Data System (ADS)

Presented in this paper is an outline of the R&D activities on high-speed video cameras that have been carried out at Kinki University over more than ten years and are currently proceeding as an international cooperative project with the University of Applied Sciences Osnabruck and other organizations. Extensive market research has been done, (1) on users' requirements for high-speed multi-framing and video cameras, by questionnaires and hearings, and (2) on the current availability of cameras of this sort, by searches of journals and websites. Both support the necessity of developing a high-speed video camera of more than 1 million fps. A video camera of 4,500 fps with parallel readout was developed in 1991. A video camera with triple sensors was developed in 1996; the sensor is the same one developed for the previous camera. The frame rate is 50 million fps for triple-framing and 4,500 fps for triple-light-wave framing, including color image capturing. The idea of a video camera of 1 million fps with an ISIS, In-situ Storage Image Sensor, was first proposed in 1993 and has been continuously improved. A test sensor was developed in early 2000 and successfully captured images at 62,500 fps. Currently, the design of a prototype ISIS is under way and, hopefully, it will be fabricated in the near future. Epoch-making cameras by other groups in the history of high-speed video camera development are also briefly reviewed.

Etoh, Takeharu G.; Takehara, Kohsei; Okinaka, Tomoo; Takano, Yasuhide; Ruckelshausen, Arno; Poggemann, Dirk

2001-04-01

449

Sensitivity of seismic waves to structure: Wide-angle broad-band sensitivity packets  

E-print Network

…domain as the sensitivity beams, and in the time domain as the sensitivity packets. The sensitivity packets are mostly represented by narrow-band Gaussian sensitivity packets studied in the previous paper

Cerveny, Vlastislav

450

8.G Find the Angle  

NSDL National Science Digital Library

This is a task from the Illustrative Mathematics website that is one part of a complete illustration of the standard to which it is aligned. Each task has at least one solution and some commentary that addresses important aspects of the task and its potential use. Here are the first few lines of the commentary for this task: In triangle $\Delta ABC$, point $M$ is the point of intersection of the bisectors of angles $\angle BAC$, $\angle ABC$, and $\angle ACB$. The measure o...

451

A Secure Digital Camera (SDC) for Real…

E-print Network

Slide excerpt: proposes a Secure Digital Camera (SDC) for real-time DRM, including a low-power watermarking chip for the SDC.

Mohanty, Saraju P.

452

A Secure Digital Camera (SDC) for Real…

E-print Network

Slide excerpt: proposes a Secure Digital Camera (SDC) for real-time DRM, including a low-power watermarking chip for the SDC.

Mohanty, Saraju P.

453

Performance Analysis of a Low-Cost Triangulation-Based 3d Camera: Microsoft Kinect System  

NASA Astrophysics Data System (ADS)

Recent technological advancements have made active imaging sensors popular for 3D modelling and motion tracking. The 3D coordinates of signalised targets are traditionally estimated by matching conjugate points in overlapping images. Current 3D cameras can acquire point clouds at video frame rates from a single exposure station. In the area of 3D cameras, Microsoft and PrimeSense have collaborated and developed an active 3D camera based on the triangulation principle, known as the Kinect system. This off-the-shelf system costs less than 150 USD and has drawn a lot of attention from the robotics, computer vision, and photogrammetry disciplines. In this paper, the prospect of using the Kinect system for precise engineering applications was evaluated. The geometric quality of the Kinect system as a function of the scene (i.e. variation of depth, ambient light conditions, incidence angle, and object reflectivity) and the sensor (i.e. warm-up time and distance averaging) was analysed quantitatively. This system's potential in human body measurements was tested against a laser scanner and a 3D range camera. A new calibration model for simultaneously determining the exterior orientation parameters, interior orientation parameters, boresight angles, lever arm, and object-space feature parameters was developed, and the effectiveness of this calibration approach was explored.
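The triangulation principle underlying such a sensor can be sketched with the standard stereo depth-from-disparity relation; the focal length and baseline in the usage note are illustrative assumptions, not the Kinect's actual calibration values:

```python
def depth_from_disparity(focal_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    """Triangulation: a feature observed with disparity d (pixels) between
    the two views of a pair separated by baseline b (metres), imaged with
    focal length f (pixels), lies at depth Z = f * b / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px
```

With f = 580 px and b = 0.075 m, a 58-pixel disparity corresponds to 0.75 m depth; halving the disparity doubles the depth, which is one reason depth error in triangulation sensors grows with range.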

Chow, J. C. K.; Ang, K. D.; Lichti, D. D.; Teskey, W. F.

2012-07-01

454

Modeling of scintillation camera systems.  

PubMed

Despite their widespread use, the satisfactory modeling of scintillation camera systems has remained difficult. Although the resolving time and deadtime T of a nonparalyzable counter are identical and also invariable, a distinction needs to be made between the fixed resolving time tau_0 and the variable deadtime tau of a paralyzable counter. It is shown here that tau = tau_0(e^n - 1)/n, where n = N·tau_0 = N/N_max is the normalized input rate and N the absolute input rate. The normalized output rate, r = R·tau_0, where R is the absolute output rate, has a maximum value r_max = 1/e ≈ 0.368 at the input rate n_max = 1, where tau = tau_0(e - 1) ≈ 1.718·tau_0. It is also shown that the response of a system of nonparalyzable and paralyzable components at all input rates is determined by just the dominant nonparalyzable and paralyzable components in the system, the response at any particular input rate being that of the component with the higher of the two deadtimes T or tau. A system can be purely paralyzable (k_T = T/tau_0 ≤ 1), combined paralyzable/nonparalyzable (1 < k_T ≤ 1.718), or essentially nonparalyzable (k_T > 1.718), the combined paralyzable/nonparalyzable system having a lower nonparalyzable (T > tau) and an upper paralyzable (tau > T) operating range separated by a threshold input rate n_t = ln(1 + k_T·n_t) at which tau = T. A highly accurate and explicit expression for n_t has also been derived. In the essentially nonparalyzable case, the system operates as nonparalyzable all the way up to the system's peak response point, which may occur at or above n_max = 1. A two-component system with k_T > 1 can also be described mathematically as nonparalyzable using r = n/(1 + k_tau·n), where k_tau = tau/tau_0 = k_T for n ≤ n_t and k_tau = (e^n - 1)/n for n ≥ n_t, or as paralyzable using r = n·e^(-n·k_0) with k_0 = [ln(1 + k_T·n)]/n for n ≤ n_t and k_0 = 1 for n ≥ n_t. These alternative descriptions will be of considerable importance in the measurement of T and tau_0 for such systems. The model described is able to account fully for the three different operating modes possible with scintillation camera systems. PMID:10435541
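The counter response formulas in the abstract are easy to check numerically; a minimal sketch (the function names are ours):

```python
import math

def paralyzable_rate(n: float) -> float:
    """Normalized output rate r = n * e**(-n) of a paralyzable counter,
    where n = N * tau0 is the normalized input rate (with k0 = 1)."""
    return n * math.exp(-n)

def nonparalyzable_rate(n: float, k_T: float) -> float:
    """Normalized output rate r = n / (1 + k_T * n) of a nonparalyzable
    counter with deadtime T = k_T * tau0."""
    return n / (1.0 + k_T * n)

def deadtime_ratio(n: float) -> float:
    """Variable deadtime of the paralyzable counter: tau/tau0 = (e**n - 1)/n."""
    return (math.exp(n) - 1.0) / n
```

At n = 1 the paralyzable rate peaks at 1/e ≈ 0.368 and the deadtime ratio equals e − 1 ≈ 1.718, matching the values quoted in the abstract.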

Woldeselassie, T

1999-07-01

455

Wide-angle camera with multichannel architecture using microlenses on a curved surface.  

PubMed

We propose a multichannel imaging system that combines the principles of an insect's compound eye and the human eye. The optical system enables a reduction in the track length of the imaging device to achieve miniaturization. The multichannel structure is achieved by a curved microlens array, and a Hypergon lens is used as the main lens to simulate the human eye, achieving a large field of view (FOV). With this architecture, each microlens of the array transmits a segment of the overall FOV. The partial images are recorded in separate channels and stitched together to form the final image of the whole FOV by image processing. The design is 2.7 mm thick, with 59 channels; the 100°×80° full FOV is optimized using ZEMAX ray-tracing software on an image plane. The image plane size is 4.53 mm × 3.29 mm. Given the recent progress in the fabrication of microlenses, this imaging system has the potential to be commercialized in the near future. PMID:24921135

Liang, Wei-Lun; Shen, Hui-Kai; Su, Guo-Dung J

2014-06-10

456

Active Brownian motion in a narrow channel  

E-print Network

We review recent advances in rectification control of artificial microswimmers, also known as Janus particles, diffusing along narrow, periodically corrugated channels. The swimmer self-propulsion mechanism is modeled so as to incorporate a nonzero torque (propulsion chirality). We first summarize the effects of chirality on the autonomous current of microswimmers freely diffusing in channels of different geometries. In particular, left-right and upside-down asymmetric channels are shown to exhibit different transport properties. We then report new results on the dependence of the diffusivity of chiral microswimmers on the channel geometry and their own self-propulsion mechanism. The self-propulsion torque turns out to play a key role as a transport control parameter.

Xue Ao; Pulak Kumar Ghosh; Yunyun Li; Gerhard Schmid; Peter Hänggi; Fabio Marchesoni

2014-09-17

457

Active Brownian motion in a narrow channel  

NASA Astrophysics Data System (ADS)

We review recent advances in rectification control of artificial microswimmers, also known as Janus particles, diffusing along narrow, periodically corrugated channels. The swimmer self-propulsion mechanism is modeled so as to incorporate a nonzero torque (propulsion chirality). We first summarize the effects of chirality on the autonomous current of microswimmers freely diffusing in channels of different geometries. In particular, left-right and upside-down asymmetric channels are shown to exhibit different transport properties. We then report new results on the dependence of the diffusivity of chiral microswimmers on the channel geometry and their own self-propulsion mechanism. The self-propulsion torque turns out to play a key role as a transport control parameter.

Ao, X.; Ghosh, P. K.; Li, Y.; Schmid, G.; Hänggi, P.; Marchesoni, F.

2014-12-01

458

Modeling of a slanted-hole collimator in a compact endo-cavity gamma camera.  

NASA Astrophysics Data System (ADS)

Having the ability to take an accurate 3D image of a tumor greatly helps doctors diagnose it and then create a treatment plan for a patient. One way to accomplish molecular imaging is to inject a radioactive tracer into a patient and then measure the gamma rays emitted from regions with high uptake of the tracer, viz., the cancerous tissues. In large, expensive PET- or SPECT-imaging systems, the 3D imaging is easily accomplished by rotating the gamma-ray detectors and then employing software to reconstruct the 3D images from the multiple 2D projections at different angles of view. However, this method is impractical in a very compact imaging system due to anatomical considerations, e.g., the transrectal gamma camera under development at Brookhaven National Laboratory (BNL) for detection of intra-prostatic tumors. The camera uses pixelated cadmium zinc telluride (CdZnTe or CZT) detectors with a matched parallel-hole collimator. Our research investigated the possibility of using a collimator with slanted holes to create 3D pictures of a radioactive source. The underlying concept is to take 2D projection images at different angles of view by adjusting the slant angle of the collimator, then using the 2D projection images to reconstruct the 3D image. To do this, we first simulated the response of a pixelated CZT detector to radiation sources placed in the field of view of the camera. Then, we formulated an algorithm to use the simulation results as prior knowledge and estimate the distribution of a shaped source from its 2D projection images. From the results of the simulation, we measured the spatial resolution of the camera as ~7 mm at a depth of 13.85 mm when using a detector with 2.46-mm pixel pitch and a collimator with a 60° slant angle.

Kamuda, Mark; Cui, Yonggang; Lall, Terry; Ionson, Jim; Camarda, Giuseppe S.; Hossain, Anwar; Yang, Ge; Roy, Utpal N.; James, Ralph B.

2013-09-01

459

A Survey of Martian Dust Devil Activity Using Mars Global Surveyor Mars Orbiter Camera Images  

NASA Astrophysics Data System (ADS)

We present results from an orbital survey of Martian dust devils using the Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) wide- and narrow-angle (WA and NA) images. The survey includes all available imaging data (mapping and pre-mapping orbits) through mission phase E06. Due to the large volume of data, we have concentrated on surveying limited regions, selected variously on the basis of previously reported dust devil or dust storm activity, of past or planned lander observations, and of predictions from numerical atmospheric models. Our study regions to date include: Amazonis Planitia (25-45N, 145-165W), Sinus Meridiani (10S-10N, 10E-10W), Chryse Planitia (10-30N, 30-60W), Solis Planum (15-45S, 75-105W), Hellas Planitia (15-60S, 265-315W), Casius (45-65N, 255-285W), Utopia Planitia (25-45N, 225-255W), Sinai Planum (10-20S, 60-100W), and Mare Cimmerium (10-45S, 180-220W). We have compiled statistics on dust devil activity in three categories: dust devils observed in NA images, dust devils observed in WA images, and dust devil tracks observed in NA images. For each region and each category, we have compiled statistics for four seasonal date bins, centered on the equinoxes and solstices: Ls=45-135 (northern summer solstice), Ls=135-225 (northern autumn equinox), Ls=225-315 (northern winter solstice), and Ls=315-45 (northern spring equinox). Our survey has highlighted great spatial variability in dust devil activity, with the Amazonis Planitia region being by far the dominant location for activity. This region is additionally characterized by a large size range of dust devils, including individual devils up to several km in height. Other regions in which dust devils have been frequently imaged include Utopia, Solis, and Sinai. Numerous dust devil tracks were observed in Casius and Cimmerium, but with very few accompanying dust devils.
This suggests that dust devils occur at local times other than that of the MGS orbit (~2 p.m.). Our seasonal statistics suggest a very strong preference for Amazonis and Solis dust devil activity to occur in the northern autumn season. Conversely, Utopia shows dust devil activity that is relatively constant, except in the northern spring period. The observations will be presented and compared with numerical model predictions. Initial results from this survey have already been used to define target regions for very high resolution simulations of dust devil development using the Caltech/Cornell Mars MM5 model.
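The four seasonal bins used above are simple intervals in areocentric solar longitude Ls, with the spring bin wrapping through 0°/360°. A small helper mirroring that binning (the function name and label strings are our own):

```python
# Bin an areocentric solar longitude Ls (degrees) into the survey's
# four seasonal categories; the spring bin wraps through 0/360.
def ls_bin(ls):
    ls = ls % 360.0
    if 45.0 <= ls < 135.0:
        return "northern summer solstice (Ls=45-135)"
    if 135.0 <= ls < 225.0:
        return "northern autumn equinox (Ls=135-225)"
    if 225.0 <= ls < 315.0:
        return "northern winter solstice (Ls=225-315)"
    return "northern spring equinox (Ls=315-45)"

print(ls_bin(350.0))  # → northern spring equinox (Ls=315-45)
```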

Fisher, J.; Richardson, M. I.; Ewald, S. P.; Toigo, A. D.; Wilson, R. J.

2002-12-01

460

Diffusion in narrow channels on curved manifolds  

NASA Astrophysics Data System (ADS)

In this work, we derive a general effective diffusion coefficient to describe the two-dimensional (2D) diffusion in a narrow and smoothly asymmetric channel of varying width, embedded on a curved surface, for the simple diffusion of non-interacting, point-like particles under no external field. To this end, we extend the generalization of the Kalinay-Percus projection method [J. Chem. Phys. 122, 204701 (2005); Phys. Rev. E 74, 041203 (2006)] for asymmetric channels, introduced by L. Dagdug and I. Pineda [J. Chem. Phys. 137, 024107 (2012)], to project the anisotropic two-dimensional diffusion equation on a curved manifold into an effective one-dimensional generalized Fick-Jacobs equation that is modified according to the curvature of the surface. For this purpose we construct the whole expansion, writing the marginal concentration as a perturbation series. The lowest order in the perturbation parameter, which corresponds to the Fick-Jacobs equation, contains an additional term that accounts for the curvature of the surface. We explicitly obtain the first-order correction for the invariant effective concentration, which is defined as the correct marginal concentration in one variable, and we obtain the first approximation to the effective diffusion coefficient analogous to Bradley's coefficient [Phys. Rev. E 80, 061142 (2009)] as a function of the metric elements of the surface. In a straightforward manner, we study the perturbation series up to nth order and derive the full effective diffusion coefficient for two-dimensional diffusion in a narrow asymmetric channel, with modifications according to the metric terms. This expression, the main result of our work, is D(ξ) = [D_0 / w'(ξ)] √(g_1/g_2) {arctan[√(g_2/g_1)(y_0'(ξ) + w'(ξ)/2)] - arctan[√(g_2/g_1)(y_0'(ξ) - w'(ξ)/2)]}, where w(ξ) is the channel width, y_0(ξ) its midline, and g_1, g_2 the metric elements of the surface.
Finally, we present two examples of symmetric surfaces, namely, the sphere and the cylinder, and we study certain specific channel configurations on these surfaces.
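The closed-form coefficient above is straightforward to evaluate numerically. A minimal sketch (the function and argument names are ours): in the flat, straight, symmetric limit (g_1 = g_2 = 1, y_0' = 0, w' small) it recovers D_0, as a Fick-Jacobs-type coefficient should.

```python
import math

def effective_D(D0, wp, y0p, g1=1.0, g2=1.0):
    """Effective diffusion coefficient D(xi) at one channel coordinate.
    wp = w'(xi) (width derivative, assumed nonzero here),
    y0p = y0'(xi) (midline derivative),
    g1, g2 = metric elements of the surface."""
    r = math.sqrt(g2 / g1)
    return (D0 / wp) * math.sqrt(g1 / g2) * (
        math.atan(r * (y0p + wp / 2.0)) - math.atan(r * (y0p - wp / 2.0))
    )

# Flat symmetric channel with unit width slope: D = 2*atan(1/2) < D0.
print(round(effective_D(1.0, 1.0, 0.0), 3))  # → 0.927
```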

Chacón-Acosta, Guillermo; Pineda, Inti; Dagdug, Leonardo

2013-12-01

461

Mitsubishi Electric Research Labs (MERL) Computational Cameras Amit Agrawal  

E-print Network

Computational Cameras: Exploiting Spatial-Angular-Temporal Tradeoffs in Photography. Amit Agrawal, Mitsubishi Electric Research Labs (MERL), Cambridge, MA, USA.

Agrawal, Amit

462

16 CFR 3.45 - In camera orders.  

Code of Federal Regulations, 2013 CFR

...document in question on which in camera or otherwise confidential...or rejected, be placed in camera only after finding that...or corporation requesting in camera treatment or after finding...an individual's Social Security number, taxpayer...

2013-01-01

463

16 CFR 3.45 - In camera orders.  

Code of Federal Regulations, 2011 CFR

...document in question on which in camera or otherwise confidential...or rejected, be placed in camera only after finding that...or corporation requesting in camera treatment or after finding...an individual's Social Security number, taxpayer...

2011-01-01

464

16 CFR 3.45 - In camera orders.  

Code of Federal Regulations, 2010 CFR

...document in question on which in camera or otherwise confidential...or rejected, be placed in camera only after finding that...or corporation requesting in camera treatment or after finding...an individual's Social Security number, taxpayer...

2010-01-01

465

16 CFR 3.45 - In camera orders.  

Code of Federal Regulations, 2014 CFR

...document in question on which in camera or otherwise confidential...or rejected, be placed in camera only after finding that...or corporation requesting in camera treatment or after finding...an individual's Social Security number, taxpayer...

2014-01-01

466

16 CFR 3.45 - In camera orders.  

Code of Federal Regulations, 2012 CFR

...document in question on which in camera or otherwise confidential...or rejected, be placed in camera only after finding that...or corporation requesting in camera treatment or after finding...an individual's Social Security number, taxpayer...

2012-01-01

467

Design and analysis of a two-dimensional camera array  

E-print Network

I present the design and analysis of a two-dimensional camera array for virtual studio applications. It is possible to substitute conventional cameras and motion control devices with a real-time, light field camera array. ...

Yang, Jason C. (Jason Chieh-Sheng), 1977-

2005-01-01

468

True three-dimensional camera  

NASA Astrophysics Data System (ADS)

An imager that can measure the distance from each pixel to the point on the object that is in focus at the pixel is described. This is accomplished by short photo-conducting lightguides at each pixel. In the eye the rods and cones are the fiber-like lightguides. The device uses ambient light that is only coherent in spherical shell-shaped light packets of thickness of one coherence length. Modern semiconductor technology permits the construction of lightguides shorter than a coherence length of ambient light. Each of the frequency components of the broad band light arriving at a pixel has a phase proportional to the distance from an object point to its image pixel. Light frequency components in the packet arriving at a pixel through a convex lens add constructively only if the light comes from the object point in focus at this pixel. The light in packets from all other object points cancels. Thus the pixel receives light from one object point only. The lightguide has contacts along its length. The lightguide charge carriers are generated by the light patterns. These light patterns, and thus the photocurrent, shift in response to the phase of the input signal. Thus, the photocurrent is a function of the distance from the pixel to its object point. Applications include autonomous vehicle navigation and robotic vision. Another application is a crude teleportation system consisting of a camera and a three-dimensional printer at a remote location.
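The coherence argument above can be checked numerically: broadband spectral components with phase 2πfδ/c average to nearly zero unless the path mismatch δ is much smaller than the coherence length c/Δf. This toy phasor sum is our own illustration, not a model of the authors' device.

```python
import numpy as np

c = 3.0e8                                       # speed of light, m/s
freqs = np.linspace(4.0e14, 7.5e14, 2000)       # visible-band frequencies, Hz

def packet_amplitude(delta):
    """Mean phasor magnitude over the band for path mismatch delta (m)."""
    return abs(np.exp(2j * np.pi * freqs * delta / c).mean())

# Zero mismatch: fully constructive. 100-micron mismatch: far beyond the
# ~1-micron coherence length c/bandwidth, so the components cancel.
print(packet_amplitude(0.0) > 0.99, packet_amplitude(1e-4) < 0.1)
```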

Kornreich, Philipp; Farell, Bart

2013-01-01

469

Practical intraoperative stereo camera calibration.  

PubMed

Many of the currently available stereo endoscopes employed during minimally invasive surgical procedures have shallow depths of field. Consequently, focus settings are adjusted from time to time in order to achieve the best view of the operative workspace. This invalidates any prior calibration and presents a significant problem for image guidance applications, as they typically rely on the calibrated camera parameters for a variety of geometric tasks, including triangulation, registration and scene reconstruction. While recalibration can be performed intraoperatively, this invariably results in a major disruption to workflow and represents a genuine barrier to the widespread adoption of image guidance technologies. The novel solution described herein constructs a model of the stereo endoscope across the continuum of focus settings, thereby reducing the number of degrees of freedom to one, such that a single view of reference geometry determines the calibration uniquely. No special hardware or access to proprietary interfaces is required, and the method is ready for evaluation during human cases. A thorough quantitative analysis indicates that the resulting intrinsic and extrinsic parameters lead to calibrations as accurate as those derived from multiple pattern views. PMID:25485437
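The one-degree-of-freedom idea can be sketched generically: offline, fit each intrinsic parameter as a smooth function of a scalar focus setting; intraoperatively, a single view only has to recover that scalar. The quadratic model, function names, and synthetic numbers below are our own illustrative assumptions, not the paper's actual parameterization.

```python
import numpy as np

def fit_focus_model(settings, focal_lengths, deg=2):
    """Least-squares polynomial mapping focus setting -> focal length."""
    return np.polyfit(settings, focal_lengths, deg)

def recover_setting(model, observed_f, grid=np.linspace(0.0, 1.0, 1001)):
    """Pick the focus setting whose predicted focal length best matches
    the one estimated from a single view of reference geometry."""
    pred = np.polyval(model, grid)
    return grid[np.argmin(np.abs(pred - observed_f))]

# Offline calibration at a few focus settings (synthetic ground truth).
s = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
f = 800.0 + 120.0 * s + 40.0 * s**2
model = fit_focus_model(s, f)
# A single intraoperative focal-length estimate pins down the setting.
print(round(recover_setting(model, np.polyval(model, 0.6)), 2))  # → 0.6
```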

Pratt, Philip; Bergeles, Christos; Darzi, Ara; Yang, Guang-Zhong

2014-01-01

470

Cloud Computing with Context Cameras  

E-print Network

We summarize methods and plans to monitor and calibrate photometric observations with our autonomous, robotic network of 2m, 1m and 40cm telescopes. These are sited globally to optimize our ability to observe time-variable sources. Wide field "context" cameras are aligned with our network telescopes and cycle every 2 minutes through BVriz filters, spanning our optical range. We measure instantaneous zero-point offsets and transparency (throughput) against calibrators in the 5-12m range from the all-sky Tycho2 catalog, and periodically against primary standards. Similar measurements are made for all our science images, with typical fields of view of 0.5 degrees. These are matched against Landolt, Stetson and Sloan standards, and against calibrators in the 10-17m range from the all-sky APASS catalog. Such measurements provide pretty good instantaneous flux calibration, often to better than 5%, even in cloudy conditions. Zero-point and transparency measurements can be used to characterize, monitor and inter-comp...
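The zero-point bookkeeping described above can be sketched generically: the instantaneous zero point is the robust offset between catalog magnitudes of field calibrators and instrumental magnitudes, and its drop relative to a photometric-night reference measures transparency loss. All numbers and names below are invented for illustration, not values from the authors' pipeline.

```python
import numpy as np

def zero_point(counts_per_sec, catalog_mags):
    """Median offset between catalog and instrumental magnitudes,
    where the instrumental magnitude is -2.5*log10(count rate)."""
    inst = -2.5 * np.log10(np.asarray(counts_per_sec))
    return float(np.median(np.asarray(catalog_mags) - inst))

counts = [1.0e4, 2.5e3, 6.3e2]       # detected calibrator count rates
cat = [12.0, 13.5, 15.0]             # calibrator catalog magnitudes
zp = zero_point(counts, cat)
zp_ref = 22.2                        # hypothetical photometric-night zero point
print(round(zp, 2), round(zp_ref - zp, 2))   # transparency loss in magnitudes
```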

Pickles, A J

2013-01-01

471

Smart Camera Technology Increases Quality  

NASA Technical Reports Server (NTRS)

When it comes to real-time image processing, everyone is an expert. People begin processing images at birth and rapidly learn to control their responses through the real-time processing of the human visual system. The human eye captures an enormous amount of information in the form of light images. In order to keep the brain from becoming overloaded with all the data, portions of an image are processed at a higher resolution than others, such as a traffic light changing colors. In the same manner, image processing products strive to extract the information stored in light in the most efficient way possible. Digital cameras available today capture millions of pixels' worth of information from incident light. However, at frame rates of more than a few frames per second, existing digital interfaces are overwhelmed. All the user can do is store several frames to memory until that memory is full; subsequent information is then lost. New technology pairs existing digital interface technology with an off-the-shelf complementary metal oxide semiconductor (CMOS) imager to provide more than 500 frames per second of specialty image processing. The result is a cost-effective detection system unlike any other.

2004-01-01

472

Visualization of explosion phenomena using a high-speed video camera with an uncoupled objective lens by fiber-optic  

NASA Astrophysics Data System (ADS)

Visualization of explosion phenomena is very important and essential for evaluating the performance of explosives. The phenomena, however, generate blast waves and fragments from casings, so the visualizing equipment must be protected from any form of impact. In the tests described here, the front lens was separated from the camera head by means of a fiber-optic cable so that the camera, a Shimadzu Hypervision HPV-1, could be used in severe blast environments, including the filming of explosions. It was possible to obtain clear images of the explosion that were not inferior to images taken with the lens directly coupled to the camera head. This confirms that the system is very useful for visualizing dangerous events, e.g., at an explosion site, and for visualizations at angles that would be unachievable under normal circumstances.

Tokuoka, Nobuyuki; Miyoshi, Hitoshi; Kusano, Hideaki; Hata, Hidehiro; Hiroe, Tetsuyuki; Fujiwara, Kazuhito; Yasushi, Kondo

2008-11-01

473

Autoconfiguration of a dynamic nonoverlapping camera network.  

PubMed

In order to monitor sufficiently large areas of interest for surveillance or any event detection, we need to look beyond stationary cameras and employ an automatically configurable network of nonoverlapping cameras. These cameras need not have an overlapping field of view and should be allowed to move freely in space. Moreover, features like zooming in/out, readily available in security cameras these days, should be exploited in order to focus on any particular area of interest if needed. In this paper, a practical framework is proposed to self-calibrate dynamically moving and zooming cameras and determine their absolute and relative orientations, assuming that their relative position is known. A global linear solution is presented for self-calibrating each zooming/focusing camera in the network. After self-calibration, it is shown that only one automaticall