These are representative sample records from Science.gov related to your search topic.
For comprehensive and current results, perform a real-time search at Science.gov.
1

Two Years of Digital Terrain Model Production Using the Lunar Reconnaissance Orbiter Narrow Angle Camera  

NASA Astrophysics Data System (ADS)

One of the primary objectives of the Lunar Reconnaissance Orbiter Camera (LROC) is to gather stereo observations with the Narrow Angle Camera (NAC). These stereo observations are used to generate digital terrain models (DTMs). The NAC has a pixel scale of 0.5 to 2.0 meters but was not designed for stereo observations and thus requires the spacecraft to roll off-nadir to acquire these images. Slews interfere with the data collection of the other instruments, so opportunities are currently limited to four per day. Arizona State University has produced DTMs from 95 stereo pairs for 11 Constellation Project (CxP) sites (Aristarchus, Copernicus crater, Gruithuisen domes, Hortensius domes, Ina D-caldera, Lichtenberg crater, Mare Ingenii, Marius hills, Reiner Gamma, South Pole-Aitken Rim, Sulpicius Gallus) as well as 30 other regions of scientific interest (including: Bhabha crater, highest and lowest elevation points, Highland Ponds, Kugler Anuchin, Linne Crater, Planck Crater, Slipher crater, Sears Crater, Mandel'shtam Crater, Virtanen Graben, Compton/Belkovich, Rumker Domes, King Crater, Luna 16/20/23/24 landing sites, Ranger 6 landing site, Wiener F Crater, Apollo 11/14/15/17, fresh craters, impact melt flows, Larmor Q crater, Mare Tranquillitatis pit, Hansteen Alpha, Moore F Crater, and Lassell Massif). To generate DTMs, the USGS ISIS software and SOCET SET from BAE Systems are used. To increase the absolute accuracy of the DTMs, data obtained from the Lunar Orbiter Laser Altimeter (LOLA) are used to coregister the NAC images and define the geodetic reference frame. NAC DTMs have been used in the examination of several sites, e.g., Compton-Belkovich, Marius Hills and Ina D-caldera [1-3]. LROC will continue to acquire high-resolution stereo images throughout the science phase of the mission and any extended mission opportunities, thus providing a vital dataset for scientific research as well as future human and robotic exploration. [1] B.L. Jolliff (2011) Nature Geoscience, in press. [2] Lawrence et al. (2011) LPSC XLII, Abst 2228. [3] Garry et al. (2011) LPSC XLII, Abst 2605.
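
As an illustration of the coregistration idea mentioned in the record above (tying NAC DTMs to the LOLA-defined reference frame), the following Python sketch estimates and removes a vertical bias between a gridded DTM and sparse altimeter returns. It is not the LROC team's production pipeline, which uses ISIS and SOCET SET; the helper names and the toy data are assumptions made here for illustration.

import numpy as np

def sample_dtm(dtm, x, y):
    # Bilinearly sample a gridded DTM at fractional pixel coordinates (x, y).
    x0, y0 = np.floor(x).astype(int), np.floor(y).astype(int)
    fx, fy = x - x0, y - y0
    return (dtm[y0, x0] * (1 - fx) * (1 - fy) + dtm[y0, x0 + 1] * fx * (1 - fy)
            + dtm[y0 + 1, x0] * (1 - fx) * fy + dtm[y0 + 1, x0 + 1] * fx * fy)

def vertical_offset_to_lola(dtm, lola_x, lola_y, lola_z):
    # Median elevation difference (altimeter minus DTM) at the laser spots.
    return np.median(lola_z - sample_dtm(dtm, lola_x, lola_y))

# Toy example: a tilted-plane DTM sitting 5 m below three synthetic returns.
dtm = np.fromfunction(lambda r, c: 0.10 * r + 0.05 * c, (100, 100))
lx, ly = np.array([10.3, 40.7, 80.2]), np.array([20.1, 55.4, 90.8])
lz = sample_dtm(dtm, lx, ly) + 5.0
dtm_adjusted = dtm + vertical_offset_to_lola(dtm, lx, ly, lz)   # shifts the DTM up by ~5 m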

Burns, K.; Robinson, M. S.; Speyerer, E.; LROC Science Team

2011-12-01

2

Narrow Angle movie  

NASA Technical Reports Server (NTRS)

This brief three-frame movie of the Moon was made from three Cassini narrow-angle images as the spacecraft passed by the Moon on the way to its closest approach with Earth on August 17, 1999. The purpose of this particular set of images was to calibrate the spectral response of the narrow-angle camera and to test its 'on-chip summing mode' data compression technique in flight. From left to right, they show the Moon in the green, blue and ultraviolet regions of the spectrum in 40, 60 and 80 millisecond exposures, respectively. All three images have been scaled so that the brightness of Crisium basin, the dark circular region in the upper right, is the same in each image. The spatial scale in the blue and ultraviolet images is 1.4 miles per pixel (2.3 kilometers). The original scale in the green image (which was captured in the usual manner and then reduced in size by 2x2 pixel summing within the camera system) was 2.8 miles per pixel (4.6 kilometers). It has been enlarged for display to the same scale as the other two. The imaging data were processed and released by the Cassini Imaging Central Laboratory for Operations (CICLOPS) at the University of Arizona's Lunar and Planetary Laboratory, Tucson, AZ.

Photo Credit: NASA/JPL/Cassini Imaging Team/University of Arizona

Cassini, launched in 1997, is a joint mission of NASA, the European Space Agency and Italian Space Agency. The mission is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Space Science, Washington DC. JPL is a division of the California Institute of Technology, Pasadena, CA.

1999-01-01

3

High-resolution topomapping of candidate MER landing sites with Mars Orbiter Camera narrow-angle images  

USGS Publications Warehouse

We analyzed narrow-angle Mars Orbiter Camera (MOC-NA) images to produce high-resolution digital elevation models (DEMs) in order to provide topographic and slope information needed to assess the safety of candidate landing sites for the Mars Exploration Rovers (MER) and to assess the accuracy of our results by a variety of tests. The mapping techniques developed also support geoscientific studies and can be used with all present and planned Mars-orbiting scanner cameras. Photogrammetric analysis of MOC stereopairs yields DEMs with 3-pixel (typically 10 m) horizontal resolution, vertical precision consistent with ~0.22 pixel matching errors (typically a few meters), and slope errors of 1-3°. These DEMs are controlled to the Mars Orbiter Laser Altimeter (MOLA) global data set and consistent with it at the limits of resolution. Photoclinometry yields DEMs with single-pixel (typically ~3 m) horizontal resolution and submeter vertical precision. Where the surface albedo is uniform, the dominant error is 10-20% relative uncertainty in the amplitude of topography and slopes after "calibrating" photoclinometry against a stereo DEM to account for the influence of atmospheric haze. We mapped portions of seven candidate MER sites and the Mars Pathfinder site. Safety of the final four sites (Elysium, Gusev, Isidis, and Meridiani) was assessed by mission engineers by simulating landings on our DEMs of "hazard units" mapped in the sites, with results weighted by the probability of landing on those units; summary slope statistics show that most hazard units are smooth, with only small areas of etched terrain in Gusev crater posing a slope hazard.
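
The "0.22 pixel matching error" figure above maps onto vertical precision through a standard stereo rule of thumb, sketched below in Python. The parallax-to-height ratio used here is an assumed, typical value chosen only for illustration; it is not quoted in the record.

# Rule of thumb: sigma_z ~= sigma_match (pixels) * GSD / (parallax/height ratio)
sigma_match_px = 0.22    # matching error from the record, in pixels
gsd_m = 3.0              # single-pixel ground sample distance (~3 m per the record)
parallax_height = 0.25   # assumed stereo convergence (p/h), illustrative only

sigma_z_m = sigma_match_px * gsd_m / parallax_height
print(f"expected vertical precision ~ {sigma_z_m:.1f} m")   # ~2.6 m, i.e. "a few meters"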

Kirk, R.L.; Howington-Kraus, E.; Redding, B.; Galuszka, D.; Hare, T.M.; Archinal, B.A.; Soderblom, L.A.; Barrett, J.M.

2003-01-01

4

Methane Band and Continuum Band Imaging of Titan's Atmosphere Using Cassini ISS Narrow Angle Camera Pictures from the CURE/Cassini Imaging Project  

NASA Astrophysics Data System (ADS)

The study of Titan's atmosphere, which bears resemblance to early Earth's, may help us understand more of our own. Constructing a Monte Carlo model of Titan's atmosphere is helpful to achieve this goal. Methane (MT) and continuum band (CB) images of Titan taken by the CURE/Cassini Imaging Project, using the Cassini Narrow Angle Camera (NAC), were analyzed. They were scheduled by Cassini Optical Navigation. Images were obtained at phase angles of 53°, 112°, 161°, and 165°. They comprise 22 images in total, taken through the MT1 (center wavelength 619 nm), MT2 (727 nm), MT3 (889 nm), CB1 (635 nm), CB2 (751 nm), and CB3 (938 nm) filters. They were reduced with previously written scripts using the National Optical Astronomy Observatory Image Reduction and Analysis Facility scientific analysis suite. Corrections for horizontal and vertical banding and for cosmic ray hits were made. The MT images were registered with corresponding CB images to ensure that subsequently measured flux ratios came from the same parts of the atmosphere. Preliminary DN limb-to-limb scans and loci of the haze layers will be presented. Accurate estimates of the sub-spacecraft points on each picture will be presented. Flux ratios (FMT/FCB=Q0) along the scans and total absorption coefficients along the lines of sight from the spacecraft through the pixels (and into Titan) will also be presented.
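
A minimal Python sketch of the flux-ratio measurement described above: dividing a methane-band frame by its co-registered continuum-band frame along a limb-to-limb scan. The actual reduction was done with IRAF scripts; the function and array names here are illustrative assumptions.

import numpy as np

def flux_ratio_scan(mt_img, cb_img, row, min_cb=1.0):
    # Q0 = F_MT / F_CB along one row of two co-registered frames, masking
    # pixels where the continuum-band signal is too weak to divide safely.
    mt = np.asarray(mt_img, dtype=float)[row]
    cb = np.asarray(cb_img, dtype=float)[row]
    q0 = np.full_like(cb, np.nan)
    on_disk = cb > min_cb
    q0[on_disk] = mt[on_disk] / cb[on_disk]
    return q0

# Synthetic test: a uniform disk whose methane-band flux is 30% of continuum.
yy, xx = np.mgrid[0:128, 0:128]
disk = ((xx - 64) ** 2 + (yy - 64) ** 2) < 40 ** 2
print(np.nanmean(flux_ratio_scan(30.0 * disk, 100.0 * disk, row=64)))   # ~0.3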

Shitanishi, Jennifer; Gillam, S. D.

2009-05-01

5

Wide angle pinhole camera  

NASA Technical Reports Server (NTRS)

Hemispherical refracting element gives pinhole camera 180 degree field-of-view without compromising its simplicity and depth-of-field. Refracting element, located just behind pinhole, bends light coming in from sides so that it falls within image area of film. In contrast to earlier pinhole cameras that used water or other transparent fluids to widen field, this model is not subject to leakage and is easily loaded and unloaded with film. Moreover, by selecting glass with different indices of refraction, field at film plane can be widened or reduced.

Franke, J. M.

1978-01-01

6

Peripapillary Schisis in Glaucoma Patients With Narrow Angles and  

E-print Network

Peripapillary Schisis in Glaucoma Patients With Narrow Angles and Increased Intraocular Pressure ... cases of peripapillary retinal schisis in patients with glaucoma without evidence of optic nerve pits ... patient was followed over time. RESULTS: The first patient, diagnosed with narrow angle glaucoma ...

Srinivasan, Vivek J.

7

Optimum Camera Angle for Optic Flow-Based Centering Response  

E-print Network

a camera angle of less than 90° is beneficial. In this paper we present both empirical and analytical ... of an autonomous helicopter through urban environments [10], and our investigation therefore incorporates ... with a simulated autonomous helicopter in a 3D urban canyon. In this context, we define the optimum camera angle

Sukhatme, Gaurav S.

8

Viscogonioplasty in narrow angle glaucoma: a randomized controlled trial  

PubMed Central

Purpose: To determine the effect of viscogonioplasty and cataract extraction on intraocular pressure in patients with narrow angle glaucoma. Methods: This was a double-masked randomized controlled trial involving 50 eyes (25 cases and 25 controls) from 38 consecutive patients. All patients underwent phacoemulsification with or without viscogonioplasty. The main outcome measures were intraocular pressure post-treatment and number of glaucoma medications post-treatment. Results: Cases had a greater reduction in intraocular pressure than controls, with a mean intraocular pressure (standard deviation) at 12 months of 13.7 (±2.89) mmHg compared with 16.2 (±3.55) mmHg in controls (P = 0.009). Cases had a greater reduction in mean number of antiglaucoma medications than controls at 12-month review, with 13 of 25 eyes (52%) of cases controlled without any antiglaucoma therapy versus 9 of 25 (36%) of the controls (P = 0.005). Conclusions: Viscogonioplasty combined with cataract extraction has a significantly greater effect than cataract extraction alone on lowering intraocular pressure in patients with poorly controlled narrow angle glaucoma and should therefore be considered as a treatment option for patients with this condition. PMID:21191443

Varma, Deepali; Adams, Wendy; Bunce, Catey; Phelan, Peter; Fraser, Scott

2010-01-01

9

WIDE-ANGLE, NARROW-ANGLE, AND IMAGING BASELINES OF OPTICAL LONG-BASELINE INTERFEROMETERS  

SciTech Connect

For optical interferometers, the baseline is typically defined as the vector joining two perfectly identical telescopes. However, when the telescopes are naturally different or when the requirements on the baseline vector challenge the telescope perfection, the baseline definition depends on how the interferometer is used. This is where the notions of wide-angle, narrow-angle, and imaging baselines come into play. This article explores this variety of baselines, with the purpose of presenting a coherent set of definitions, describing how they relate to each other, and suggesting baseline metrology requirements. Ultimately, this work aims at supporting upcoming long-baseline optical interferometers with narrow-angle astrometry and phase-referenced imaging capabilities at the microarcsecond level.

Woillez, J. [W. M. Keck Observatory, 65-1120 Mamalahoa Highway, Kamuela, HI 96743 (United States)]; Lacour, S., E-mail: jwoillez@keck.hawaii.edu, E-mail: sylvestre.lacour@obspm.fr [Observatoire de Paris, Place Jules Janssen, F-92195 Meudon (France)]

2013-02-10

10

High Prevalence of Narrow Angles among Chinese-American Glaucoma and Glaucoma Suspect Patients  

PubMed Central

Purpose To evaluate the prevalence of gonioscopically narrow angles in a Chinese-American population with glaucoma or glaucoma suspicion. Patients and Methods Charts from all Chinese-American patients seen in a comprehensive ophthalmology clinic in the Chinatown district of San Francisco in 2002 were reviewed. One eye from each patient with glaucoma or glaucoma suspicion that met inclusion criteria was included (n=108). Data were collected for gender, age, race (self-declared), refraction (spherical equivalent), intraocular pressure (IOP), gonioscopy and vertical cup-to-disk ratio (CDR). Results Sixty percent (n=65) of Chinese-American eyes with glaucoma or glaucoma suspicion had gonioscopically narrow angles (Shaffer grade ≤2 in three or more quadrants). Those with narrow angles were significantly older (P=0.004) than their open angle counterparts, but the two groups did not differ in terms of gender, refraction, IOP or CDR (all, P≥0.071). In a multivariate model including age, gender and refraction as predictors of angle grade (open or narrow), only age was a significant predictor of angle grade (P=0.004). Conclusions A large proportion of Chinese-Americans in our study population with glaucoma or glaucoma suspicion had gonioscopically narrow angles. In multivariate analysis, patients with narrow angles were older than those with open angles but did not differ from them in terms of gender or refraction. Continued evaluation of angle closure glaucoma risk among Chinese-Americans is needed. PMID:19826385

Seider, Michael I; Pekmezci, Melike; Han, Ying; Sandhu, Simi; Kwok, Shiu Y; Lee, Roland Y; Lin, Shan C

2009-01-01

11

Characterizing Geometric Distortion of the Lunar Reconnaissance Orbiter Wide Angle Camera  

NASA Astrophysics Data System (ADS)

Each month the Lunar Reconnaissance Orbiter (LRO) Wide Angle Camera (WAC) provides 100 m scale images of nearly the entire Moon, each time under a different range of lighting conditions [1]. Pre-flight calibration efforts provided a baseline for correcting the geometric distortion present in the WAC. However, residual errors of 1-2 pixels existed with this original model. In-flight calibration enables the derivation of a precise correction for geometric distortion to provide sub-pixel map projection accuracy. For the in-flight calibration, we compared WAC images to high-resolution (0.5 - 2.0 meter scale) images provided by the Narrow Angle Camera (NAC). Since the NAC has a very narrow field of view (2.86°), its geometric accuracy is well characterized. The additions of the WAC-derived 100 m/pixel digital terrain model (GLD100) [2] and the refined ephemeris provided by LOLA [3] have improved our efforts to remove small distortion artifacts in the WAC camera model. Since the NAC field of view is always in the same cross-track location in the WAC frame, NAC and WAC images of the same regions, under similar lighting conditions, were map projected. Hundreds of NAC (truth image) and WAC images were then co-registered using an automatic registration algorithm in ISIS [4]. This output was fed into a second ISIS program (fplanemap) that converted the registration offsets to focal plane coordinates for the distorted (original) and undistorted (corrected location derived from the truth image) pixel [4]. With this dataset, offsets in the WAC distortion model were identified and accounted for with a new 2D Taylor series function that has been added to the existing radial model. This technique improves the accurate placement of each pixel across the sensor in target space. We have applied this correction to the 643 nm band and will derive the coefficients for the remaining bands. Once this study is complete, a new camera model, instrument kernel (IK), and frames kernel (FK) will be released through NAIF [5] and USGS.
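
The record describes fitting residual focal-plane offsets with a 2D Taylor-series term added to the radial distortion model. A generic least-squares version of that fitting step is sketched below; it is not the ISIS fplanemap workflow, and the synthetic residuals are invented for illustration.

import numpy as np

def fit_2d_poly(x, y, residual, order=2):
    # Least-squares fit of a 2D polynomial (Taylor-series style) to one
    # component of the registration residuals. Returns coefficients for the
    # terms x**i * y**j with i + j <= order.
    cols = [(x ** i) * (y ** j)
            for i in range(order + 1) for j in range(order + 1 - i)]
    A = np.column_stack(cols)
    coeffs, *_ = np.linalg.lstsq(A, residual, rcond=None)
    return coeffs

# Synthetic registration residuals carrying a known quadratic signature.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 2000)
y = rng.uniform(-1, 1, 2000)
dx = 0.05 * x * y - 0.02 * x ** 2 + rng.normal(0, 0.005, x.size)
print(fit_2d_poly(x, y, dx))    # recovers the xy and x^2 coefficients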

Speyerer, E.; Wagner, R.; Robinson, M. S.; Becker, K. J.; Anderson, J.; Thomas, P. C.

2011-12-01

12

13. 22'X34' original vellum, VariableAngle Launcher, 'SIDEVIEW CAMERA CAR TRACK ...  

Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

13. 22'X34' original vellum, Variable-Angle Launcher, 'SIDEVIEW CAMERA CAR TRACK DETAILS' drawn at 1/4'=1'-0' (BUORD Sketch # 208078, PAPW 908). - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

13

10. 22'X34' original blueprint, VariableAngle Launcher, 'SIDE VIEW CAMERA CARSTEEL ...  

Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

10. 22'X34' original blueprint, Variable-Angle Launcher, 'SIDE VIEW CAMERA CAR-STEEL FRAME AND AXLES' drawn at 1/2'=1'-0'. (BUORD Sketch # 209124). - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

14

3-D RECONSTRUCTION FROM NARROW-ANGLE RADIOGRAPHS  

SciTech Connect

To detect and characterize potential defects in pipes, inspections are carried out with the help of non-destructive examination (NDE) techniques including X- or γ-radiography. Should a defect be detected, one may be asked to prove that the component still withstands the mechanical constraints. In such cases of expertise, the use of a 3-D reconstruction processing technique can be very useful. One characteristic of such applications is that, in general, the number and angles of projections are very limited and the data are very noisy, so classical tomography algorithms cannot solve the problem. In this work, we study two reconstruction methods that take the specificity of radiographic inspection into account through two different means: a reconstruction technique based on an a priori model (Markov-Potts), and a binary technique, called 'BLMR', that constrains the solution to be either 0 or 1. This paper focuses on first results obtained on simulated data and on real data corresponding to a mock-up with several electro-dynamically manufactured cylindrical defects.

Fournier, L.; Chatellier, L.; Peureux, P. [Electricite de France, Research and Development Division, 6 Quai Watier, BP 49, 78401 Chatou (France); Mohammad-Djafari, A. [Laboratoire des Signaux et Systemes, Unite mixte de recherche 8506 (CNRS-SUPELEC-UPS), SUPELEC, Plateau de Moulon 3 rue Joliot Curie, 91192 Gif-sur-Yvette (France); Idier, J. [Institut de Recherche en Communications et en Cybernetique de Nantes (IRCCyN), 1 rue de la Noee, BP92101, 44321 Nantes Cedex 03 (France)

2008-02-28

15

Improved iris localization by using wide and narrow field of view cameras for iris recognition  

NASA Astrophysics Data System (ADS)

Biometrics is a method of identifying individuals by their physiological or behavioral characteristics. Among other biometric identifiers, iris recognition has been widely used for various applications that require a high level of security. When a conventional iris recognition camera is used, the size and position of the iris region in a captured image vary according to the X, Y positions of a user's eye and the Z distance between a user and the camera. Therefore, the searching area of the iris detection algorithm is increased, which can inevitably decrease both the detection speed and accuracy. To solve these problems, we propose a new method of iris localization that uses wide field of view (WFOV) and narrow field of view (NFOV) cameras. Our study is new as compared to previous studies in the following four ways. First, the device used in our research acquires three images, one each of the face and both irises, using one WFOV and two NFOV cameras simultaneously. The relation between the WFOV and NFOV cameras is determined by simple geometric transformation without complex calibration. Second, the Z distance (between a user's eye and the iris camera) is estimated based on the iris size in the WFOV image and anthropometric data of the size of the human iris. Third, the accuracy of the geometric transformation between the WFOV and NFOV cameras is enhanced by using multiple matrices of the transformation according to the Z distance. Fourth, the searching region for iris localization in the NFOV image is significantly reduced based on the detected iris region in the WFOV image and the matrix of geometric transformation corresponding to the estimated Z distance. Experimental results showed that the performance of the proposed iris localization method is better than that of conventional methods in terms of accuracy and processing time.
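
A minimal sketch of the Z-distance step described above, using the pinhole-camera relation between the anthropometric iris diameter and its apparent size in the wide-field-of-view image. The focal length and iris diameter values below are illustrative assumptions, not the paper's calibration.

WFOV_FOCAL_LENGTH_PX = 1200.0    # assumed WFOV focal length, in pixels
HUMAN_IRIS_DIAMETER_MM = 11.7    # assumed anthropometric mean iris diameter

def estimate_z_mm(iris_diameter_px: float) -> float:
    # Similar triangles of the pinhole model: Z ~= f_px * D_iris / d_px.
    return WFOV_FOCAL_LENGTH_PX * HUMAN_IRIS_DIAMETER_MM / iris_diameter_px

print(estimate_z_mm(35.0))   # ~401 mm for a 35-pixel iris in the WFOV image

# The estimated Z then selects which pre-computed WFOV-to-NFOV transformation
# matrix to apply, restricting the iris search region in the NFOV image.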

Kim, Yeong Gon; Shin, Kwang Yong; Park, Kang Ryoung

2013-10-01

16

A switchable light field camera architecture with Angle Sensitive Pixels and dictionary-based sparse coding  

E-print Network

We propose a flexible light field camera architecture that is at the convergence of optics, sensor electronics, and applied mathematics. Through the co-design of a sensor that comprises tailored, Angle Sensitive Pixels and ...

Hirsch, Matthew Waggener

17

Wide-Angle, Wide-Band Camera for Remote Imaging  

NASA Technical Reports Server (NTRS)

Improved ultraviolet-to-infrared camera design combines high resolution and relatively wide field of view in remote-imaging system. Although design intended for satellite-borne system to give information on such Earth features as vegetation, pollution, and land formation mineral deposits, optical principle also useful in ground-based or airborne high-resolution television for imaging objects at great distances.

Atcheson, P. D.

1985-01-01

18

Wide-Angle, Reflective Strip-Imaging Camera  

NASA Technical Reports Server (NTRS)

Proposed camera images thin, striplike portion of field of view of 180 degrees wide. Hemispherical concave reflector forms image onto optical fibers, which transfers it to strip of photodetectors or spectrograph. Advantages include little geometric distortion, achromatism, and ease of athermalization. Uses include surveillance of clouds, coarse mapping of terrain, measurements of bidirectional reflectance distribution functions of aerosols, imaging spectrometry, oceanography, and exploration of planets.

Vaughan, Arthur H.

1992-01-01

19

Alternative approach to precision narrow-angle astrometry for Antarctic long baseline interferometry  

NASA Astrophysics Data System (ADS)

The conventional approach to high-precision narrow-angle astrometry using a long baseline interferometer is to directly measure the fringe packet separation of a target and a nearby reference star. This is done by means of a technique known as phase-referencing, which requires a network of dual beam combiners and laser metrology systems. Using an alternative approach that does not rely on phase-referencing, the narrow-angle astrometry of several close binary stars (with separation less than 2''), as described in this paper, was carried out by observing the fringe packet crossing event of the binary systems. Such an event occurs twice every sidereal day when the line joining the two stars of the binary is perpendicular to the projected baseline of the interferometer. Observation of these events is well suited for an interferometer in Antarctica. Proof of concept observations were carried out at the Sydney University Stellar Interferometer (SUSI) with targets selected according to its geographical location. Narrow-angle astrometry using this indirect approach has achieved sub-100 micro-arcsecond precision.

Kok, Yitping; Ireland, Michael J.; Rizzuto, Aaron C.; Tuthill, Peter G.; Robertson, J. Gordon; Warrington, Benjamin A.; Tango, William J.

2014-07-01

20

Accuracy and repeatability of joint angles measured using a single camera markerless motion capture system.  

PubMed

Markerless motion capture systems have been developed in an effort to evaluate human movement in a natural setting. However, the accuracy and reliability of these systems remain understudied. Therefore, the goals of this study were to quantify the accuracy and repeatability of joint angles using a single camera markerless motion capture system and to compare the markerless system performance with that of a marker-based system. A jig was placed in multiple static postures with marker trajectories collected using a ten-camera motion analysis system. Depth and color image data were simultaneously collected from a single Microsoft Kinect camera, which was subsequently used to calculate virtual marker trajectories. A digital inclinometer provided a measure of ground-truth for sagittal and frontal plane joint angles. Joint angles were calculated with marker data from both motion capture systems using successive body-fixed rotations. The sagittal and frontal plane joint angles calculated from the marker-based and markerless system agreed with inclinometer measurements by <0.5°. The systems agreed with each other by <0.5° for sagittal and frontal plane joint angles and <2° for transverse plane rotation. Both systems showed a coefficient of reliability <0.5° for all angles. These results illustrate the feasibility of a single camera markerless motion capture system to accurately measure lower extremity kinematics and provide a first step in using this technology to discern clinically relevant differences in the joint kinematics of patient populations. PMID:24315287
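
The joint angles above are computed from marker data with successive body-fixed rotations. A generic Python sketch of that decomposition (an X-Y'-Z'' Cardan sequence, a common but here assumed convention) is shown below.

import numpy as np

def cardan_xyz_angles(R):
    # Decompose R = Rx(a) @ Ry(b) @ Rz(c) into successive body-fixed angles.
    b = np.arcsin(R[0, 2])
    a = np.arctan2(-R[1, 2], R[2, 2])
    c = np.arctan2(-R[0, 1], R[0, 0])
    return np.degrees([a, b, c])

def rot_x(t):
    c, s = np.cos(t), np.sin(t)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(t):
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(t):
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

# Round-trip check: 20 deg flexion, -5 deg adduction, 12 deg internal rotation.
a, b, c = np.radians([20.0, -5.0, 12.0])
print(cardan_xyz_angles(rot_x(a) @ rot_y(b) @ rot_z(c)))   # ~[20, -5, 12]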

Schmitz, Anne; Ye, Mao; Shapiro, Robert; Yang, Ruigang; Noehren, Brian

2014-01-22

21

A Solid-State, Simultaneous Wide Angle - Detailed View Video Surveillance Camera  

Microsoft Academic Search

We have developed a simultaneously wide-angle and detailed-view surveillance camera. For the purpose of surveillance, detailed views for suspicious objects are needed. Conventional motorized zoom cameras, however, are fragile and provide only a small region-of-interest at a time. We propose a system which is capable of obtaining wide-range views as well as detailed views of multiple regions of interest

Ryutaro Oi; Marcus A. Magnor; Kiyoharu Aizawa

2003-01-01

22

On-Orbit Geometric Calibration of the Lunar Reconnaissance Orbiter Wide Angle Camera  

NASA Astrophysics Data System (ADS)

Lunar Reconnaissance Orbiter (LRO) is equipped with a single Wide Angle Camera (WAC) [1] designed to collect monochromatic and multispectral observations of the lunar surface. Cartographically accurate image mosaics and stereo image based terrain models require that the position of each pixel in a given image be known relative to a corresponding point on the lunar surface with a high degree of accuracy and precision. The Lunar Reconnaissance Orbiter Camera (LROC) team initially characterized the WAC geometry prior to launch at the Malin Space Science Systems calibration facility. After lunar orbit insertion, the LROC team recognized spatially varying geometric offsets between color bands. These misregistrations made analysis of the color data problematic and showed that refinements to the pre-launch geometric analysis were necessary. The geometric parameters that define the WAC optical system were characterized from statistics gathered from co-registering over 84,000 image pairs. For each pair, we registered all five visible WAC bands to a precisely rectified Narrow Angle Camera (NAC) image (accuracy <15 m) [2] to compute key geometric parameters. In total, we registered 2,896 monochrome and 1,079 color WAC observations to nearly 34,000 NAC observations and collected over 13.7 million data points across the visible portion of the WAC CCD. Using the collected statistics, we refined the relative pointing (yaw, pitch and roll), effective focal length, principal point coordinates, and radial distortion coefficients. This large dataset also revealed spatial offsets between bands after orthorectification due to chromatic aberrations in the optical system. As white light enters the optical system, the light bends at different magnitudes as a function of wavelength, causing a single incident ray to disperse in a spectral spread of color [3,4]. This lateral chromatic aberration effect, also known as 'chromatic difference in magnification' [5], introduces variation to the effective focal length for each WAC band. Secondly, tangential distortions caused by minor decentering in the optical system altered the derived exterior orientation parameters for each 14-line WAC band. We computed the geometric parameter sets separately for each band to characterize the lateral chromatic aberrations and the decentering components in the WAC optical system. With this approach, we eliminated the need for additional tangential terms in the distortion model, thus reducing the number of computations during image orthorectification and therefore expediting the orthorectification process. We undertook a similar process for refining the geometry for the UV bands (321 and 360 nm), except we registered each UV band to orthorectified visible bands of the same WAC observation (the visible bands have resolutions 4 times greater than the UV). The resulting 7-band camera model with refined geometric parameters enables map projection with sub-pixel accuracy. References: [1] Robinson et al. (2010) Space Sci. Rev. 150, 81-124 [2] Wagner et al. (2013) Lunar Sci Forum [3] Mahajan, V.N. (1998) Optical Imaging and Aberrations [4] Fiete, R.D. (2013), Manual of Photogrammetry, pp. 359-450 [5] Brown, D.C. (1966) Photogrammetric Eng. 32, 444-462.
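
One effect discussed above, lateral chromatic aberration, appears to first order as a per-band magnification (effective focal length) change. The sketch below estimates that scale from matched focal-plane coordinates by least squares; it is a simplified stand-in for the full LROC camera-model refinement, and the numbers are synthetic.

import numpy as np

def band_scale_factor(ref_xy, band_xy):
    # Isotropic scale s minimising ||band_xy - s * ref_xy||^2 over matched
    # focal-plane positions of the same features seen in two bands.
    ref = np.asarray(ref_xy, float).ravel()
    band = np.asarray(band_xy, float).ravel()
    return float(ref @ band / (ref @ ref))

# Synthetic test: the second band is magnified by 0.2% relative to the first.
rng = np.random.default_rng(1)
ref = rng.uniform(-3000, 3000, (500, 2))                 # focal-plane coords
band = 1.002 * ref + rng.normal(0, 0.5, ref.shape)
print(band_scale_factor(ref, band))                      # ~1.002 = f_band / f_ref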

Speyerer, E. J.; Wagner, R.; Robinson, M. S.

2013-12-01

23

Calibration of a trinocular system formed with wide angle lens cameras.  

PubMed

To obtain 3D information of large areas, wide angle lens cameras are used to reduce the number of cameras as much as possible. However, since the images are highly distorted, errors in point correspondences increase and 3D information could be erroneous. To increase the number of data from images and to improve the 3D information, trinocular sensors are used. In this paper a calibration method for a trinocular sensor formed with wide angle lens cameras is proposed. First, pixel locations in the images are corrected using a set of constraints which define the image formation in a trinocular system. Once pixel locations are corrected, the lens distortion and the trifocal tensor are computed. PMID:23262716

Ricolfe-Viala, Carlos; Sanchez-Salmeron, Antonio-Jose; Valera, Angel

2012-12-01

24

GRAVITY: the VLTI 4-beam combiner for narrow-angle astrometry and interferometric imaging  

E-print Network

GRAVITY is the second generation Very Large Telescope Interferometer instrument for precision narrow-angle astrometry and interferometric imaging in the Near Infra-Red (NIR). It shall provide precision astrometry of order 10 microarcseconds, and imaging capability at a few milliarcsecond resolution, and hence will revolutionise dynamical measurements of celestial objects. GRAVITY is currently in the last stages of its integration and tests in Garching at MPE, and will be delivered to the VLT Interferometer (VLTI) in 2015. We present here the instrument, with a particular focus on the components making use of fibres: integrated optics beam combiners, polarisation rotators, fibre differential delay lines, and the metrology.

Blind, N; Gillessen, S; Kok, Y; Lippa, M; Perrin, G; Dembet, R; Fedou, P; Lacour, S; Perraut, K; Jocou, L; Burtscher, L; Hans, O; Haug, M; Haussmann, F; Huber, S; Janssen, A; Kellner, S; Ott, T; Pfuhl, O; Sturm, E; Weber, J; Wieprecht, E; Amorim, A; Brandner, W; Straubmeier, C

2015-01-01

25

Research on influence of space-borne stereo mapping camera's time synchronization precision on attitude angle errors  

NASA Astrophysics Data System (ADS)

Mapping precision of a space-borne stereo mapping camera is primarily determined by the attitude angle errors of the satellite. Time synchronization errors of the space-borne stereo mapping camera will introduce extra attitude angle errors. In this paper, a model of the space-borne stereo mapping camera was established in Satellite Tool Kit (STK) to obtain the regularity of the attitude angles changing with time. The influence of the camera's time synchronization precision on attitude angle errors was analyzed by combining this time variation of the attitude angles with sampling theory. As a result, a digital model of the extra attitude angle errors and time synchronization errors of the space-borne stereo mapping camera was put forward. In validation experiments, real attitude angle data of a stereo mapping satellite were collected and the extra attitude angle errors caused by specific time synchronization errors of the camera were obtained. Results of the experiments and analysis indicated that the extra attitude angle error caused by a specific time synchronization error could be reduced from 0.01939 arc second to 0.00003879 arc second as the time synchronization precision was optimized from 1 ms to 20 µs.
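
To first order, the extra attitude angle error scales as the attitude rate times the synchronization error, which is why tightening the synchronization by a given factor reduces the error by roughly the same factor. The attitude rate in the sketch below is an assumed value used for illustration, not one taken from the record.

ARCSEC_PER_DEG = 3600.0

def extra_attitude_error_arcsec(attitude_rate_deg_per_s, sync_error_s):
    # delta_theta ~= omega * delta_t: rotation mis-attributed to an image
    # because the attitude was sampled delta_t too early or too late.
    return attitude_rate_deg_per_s * ARCSEC_PER_DEG * sync_error_s

for dt in (1e-3, 20e-6):                       # 1 ms vs 20 us synchronization
    print(dt, extra_attitude_error_arcsec(0.005, dt))
# With an assumed 0.005 deg/s rate, ~0.018 arcsec at 1 ms shrinks to ~0.00036 arcsec at 20 us.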

Wu, Xingxing; Liu, Jinguo; Long, Kehui; Zhou, Huaide; Zhang, Boyan

2014-11-01

26

Comparison of Scheimpflug imaging and spectral domain anterior segment optical coherence tomography for detection of narrow anterior chamber angles  

Microsoft Academic Search

Purpose: To compare the performance of anterior chamber volume (ACV) and anterior chamber depth (ACD) obtained using Scheimpflug imaging with angle opening distance (AOD500) and trabecular-iris space area (TISA500) obtained using spectral domain anterior segment optical coherence tomography (SD-ASOCT) in detecting narrow angles classified using gonioscopy. Methods: In this prospective, cross-sectional observational study, 265 eyes of 265 consecutive patients underwent sequential Scheimpflug imaging,

D S Grewal; G S Brar; R Jain; S P S Grewal

2011-01-01

27

Lunar Reconnaissance Orbiter Camera (LROC) Instrument Overview  

Microsoft Academic Search

The Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) and Narrow Angle Cameras (NACs) are on the NASA Lunar Reconnaissance Orbiter (LRO). The WAC is a 7-color push-frame camera (100 and 400 m/pixel visible and UV, respectively), while the two NACs are monochrome narrow-angle linescan imagers (0.5 m/pixel). The primary mission of LRO is to obtain measurements of the Moon that

M. S. Robinson; S. M. Brylow; M. Tschimmel; D. Humm; S. J. Lawrence; P. C. Thomas; B. W. Denevi; E. Bowman-Cisneros; J. Zerr; M. A. Ravine; M. A. Caplinger; F. T. Ghaemi; J. A. Schaffner; M. C. Malin; P. Mahanti; A. Bartels; J. Anderson; T. N. Tran; E. M. Eliason; A. S. McEwen; E. Turtle; B. L. Jolliff; H. Hiesinger

2010-01-01

28

Phase-Referenced Interferometry and Narrow-Angle Astrometry with SUSI  

NASA Astrophysics Data System (ADS)

The Sydney University Stellar Interferometer (SUSI) now incorporates a new beam combiner, called the Microarcsecond University of Sydney Companion Astrometry instrument (MUSCA), for the purpose of high precision differential astrometry of bright binary stars. Operating in the visible wavelength regime where photon-counting and post-processing fringe tracking is possible, MUSCA will be used in tandem with SUSI's primary beam combiner, Precision Astronomical Visible Observations (PAVO), to record high spatial resolution fringes and thereby measure the separation of fringe packets of binary stars. In its current phase of development, the dual beam combiner configuration has successfully demonstrated for the first time a dual-star phase-referencing operation in visible wavelengths. This paper describes the beam combiner optics and hardware, the network of metrology systems employed to measure every non-common path between the two beam combiners and also reports on a recent narrow-angle astrometric observation of δ Orionis A (HR 1852) as the project enters its on-sky testing phase.

Kok, Y.; Ireland, M. J.; Tuthill, P. G.; Robertson, J. G.; Warrington, B. A.; Rizzuto, A. C.; Tango, W. J.

29

A New Approach to Micro-arcsecond Astrometry with SIM Allowing Early Mission Narrow Angle Measurements of Compelling Astronomical Targets  

NASA Technical Reports Server (NTRS)

The Space Interferometry Mission (SIM) is capable of detecting and measuring the mass of terrestrial planets around stars other than our own. It can measure the mass of black holes and the visual orbits of radio and x-ray binary sources. SIM makes possible a new level of understanding of complex astrophysical processes. SIM achieves its high precision in the so-called narrow-angle regime. This is defined by a 1 degree diameter field in which the position of a target star is measured with respect to a set of reference stars. The observation is performed in two parts: first, SIM observes a grid of stars that spans the full sky. After a few years, repeated observations of the grid allow one to determine the orientation of the interferometer baseline. Second, throughout the mission, SIM periodically observes in the narrow-angle mode. Every narrow-angle observation is linked to the grid to determine the precise attitude and length of the baseline. The narrow angle process demands patience. It is not until five years after launch that SIM achieves its ultimate accuracy of 1 microarcsecond. The accuracy is degraded by a factor of approx. 2 at mid-mission. Our work proposes a technique for narrow angle astrometry that does not rely on the measurement of grid stars. This technique, called Gridless Narrow Angle Astrometry (GNAA) can obtain microarcsecond accuracy and can detect extra-solar planets and other exciting objects with a few days of observation. It can be applied as early as during the first six months of in-orbit calibration (IOC). The motivations for doing this are strong. First, and obviously, it is an insurance policy against a catastrophic mid-mission failure. Second, at the start of the mission, with several space-based interferometers in the planning or implementation phase, NASA will be eager to capture the public's imagination with interferometric science. Third, early results and a technique that can duplicate those results throughout the mission will give the analysts important experience in the proper use and calibration of SIM.

Shaklan, Stuart; Pan, Xiaopei

2004-01-01

30

COMPARISON OF PHOTOGRAMMETRIC APPLICATIONS BASED ON NARROW ANGLE LINE SCANNERS WITH TRADITIONAL PHOTOGRAMMETRIC METHODS  

Microsoft Academic Search

Within the Pegasus project, digital line scanner images taken from an elevation of 20 km, with a ground sampling distance (GSD) of 20 cm, will become available. The photogrammetric use of such images is compared with that of traditional aerial photos, as well as digital aerial cameras and high-resolution space sensors. Line scanner images usually should be combined with direct sensor orientation.

K. Jacobsen

31

Small-angle approximation to the transfer of narrow laser beams in anisotropic scattering media  

NASA Technical Reports Server (NTRS)

The broadening and the detected signal power of a laser beam traversing an anisotropic scattering medium were examined using the small-angle approximation to the radiative transfer equation, in which photons suffering large-angle deflections are neglected. To obtain tractable answers, simple Gaussian and non-Gaussian functions for the scattering phase functions are assumed. Two other approximate approaches employed in the field to further simplify the small-angle approximation solutions are described, and the results obtained by one of them are compared with those obtained using the small-angle approximation. An exact method for obtaining the contribution of each higher-order scattering to the radiance field is examined, but no results are presented.

Box, M. A.; Deepak, A.

1981-01-01

32

Automatic screening of Narrow Anterior Chamber Angle and Angle-Closure Glaucoma based on slit-lamp image analysis by using Support Vector Machine.  

PubMed

At present, Van Herick's method is a standard technique used to screen for a Narrow Anterior Chamber Angle (NACA) and Angle-Closure Glaucoma (ACG). It can identify a patient who suffers from NACA and ACG by considering the width of the peripheral anterior chamber depth (PACD) and the corneal thickness. However, the screening result of this method often varies among ophthalmologists. Therefore, an automatic screening of NACA and ACG based on slit-lamp image analysis using a Support Vector Machine (SVM) is proposed. The SVM automatically generates the classification model, which is used to classify the result as angle-closure likely or angle-closure unlikely, and it is shown that this can improve the accuracy of the screening result. To develop the classification model, the width of the PACD and the corneal thickness at many positions are measured and selected as features. A statistical analysis is also used in the PACD and corneal thickness estimation in order to reduce the error from reflection on the cornea. In this study, the generated models, evaluated using 5-fold cross validation, are found to give a better result than classification by Van Herick's method. PMID:24111078
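
A minimal scikit-learn sketch of the classification step described above: an SVM over PACD and corneal-thickness features, scored with 5-fold cross-validation. The synthetic data, feature layout, and kernel choice are placeholders and do not reproduce the paper's dataset or model.

import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n = 200
# Columns: PACD at three limbal positions (mm) plus corneal thickness (mm).
X_open = rng.normal([0.35, 0.33, 0.34, 0.54], 0.05, size=(n, 4))
X_narrow = rng.normal([0.15, 0.14, 0.16, 0.55], 0.05, size=(n, 4))
X = np.vstack([X_open, X_narrow])
y = np.array([0] * n + [1] * n)    # 0 = angle-closure unlikely, 1 = likely

model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
print(cross_val_score(model, X, y, cv=5).mean())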

Theeraworn, C; Kongprawechnon, W; Kondo, T; Bunnun, P; Nishihara, A; Manassakorn, A

2013-01-01

33

Motional narrowing of 13C dipolar magic-angle-spinning nuclear magnetic resonance spectra  

NASA Astrophysics Data System (ADS)

The effects of molecular motions on the dipolar coupled 13C magic-angle-spinning (MAS) nuclear magnetic resonance (NMR) spectra of alkane-urea inclusion compounds and bis(ethylammonium) hexachlorostannate were investigated. The contact time dependences of 13C cross-polarization (CP) MAS NMR spectra were measured for the tridecane-urea inclusion compounds at the Hartmann-Hahn condition and at the off-Hartmann-Hahn condition. Rotational echoes were observed at multiples of the rotational period both in the dipolar dephasing experiments and in the CP experiments. These rotational echo shapes were analyzed by the presently proposed theory.

Ohe, Hiroshi; Kubo, Atsushi; Katada, Motomi; Sano, Hirotoshi

1990-09-01

34

The measurement and modelling of light scattering by phytoplankton cells at narrow forward angles  

NASA Astrophysics Data System (ADS)

A procedure has been devised for measuring the angular dependence of light scattering from suspensions of phytoplankton cells at forward angles from 0.25° to 8°. The cells were illuminated with a spatially-filtered laser beam and the angular distribution of scattered light measured by tracking a photodetector across the Fourier plane of a collecting lens using a stepper-motor driven stage. The procedure was calibrated by measuring scattering from latex bead suspensions with known size distributions. It was then used to examine the scattering from cultures of the unicellular algae Isochrysis galbana (4 µm × 5 µm), Dunaliella primolecta (6 µm × 7 µm) and Rhinomonas reticulata (5 µm × 11 µm). The results were compared with the predictions of Mie theory. Excellent agreement was obtained for spherical particles. A suitable choice of spherical-equivalent scattering parameters was required to enable reasonable agreement within the first diffraction lobe for ellipsoidal particles.

MacCallum, Iain; Cunningham, Alex; McKee, David

2004-07-01

35

Observations of Comet 9P\\/Tempel 1 around the Deep Impact event by the OSIRIS cameras onboard Rosetta  

Microsoft Academic Search

The OSIRIS cameras on the Rosetta spacecraft observed Comet 9P/Tempel 1 from 5 days before to 10 days after it was hit by the Deep Impact projectile. The Narrow Angle Camera (NAC) monitored the cometary dust in 5 different filters. The Wide Angle Camera (WAC) observed through

Horst Uwe Keller; Michael Küppers; Sonia Fornasier; Pedro J. Gutiérrez; Stubbe F. Hviid; Laurent Jorda; Jörg Knollenberg; Stephen C. Lowry; Miriam Rengel; Ivano Bertini; Rainer Kramm; Ekkehard Kührt; Luisa-Maria Lara; Holger Sierks; Cesare Barbieri; Philippe Lamy; Hans Rickman; Rafael Rodrigo; Michael F. A'Hearn; Björn J. R. Davidsson; Marco Fulle; Fritz Gliem; Olivier Groussin; José J. Lopez Moreno; Francesco Marzari; Angel Sanz; Camino Bajo de Huétor; Chung Li; G. Galilei

2006-01-01

36

Numerical simulations of the bending of narrow-angle-tail radio jets by ram pressure or pressure gradients  

SciTech Connect

Three-dimensional numerical hydrodynamic simulations are used to study the bending of radio jets. The simulations are compared with observations of jets in narrow-angle-tail radio sources. Two mechanisms for the observed bending are considered: direct bending of quasi-continuous jets by ram pressure from intergalactic gas and bending by pressure gradients in the interstellar gas of the host galaxy, the pressure gradients themselves being the result of ram pressure by intergalactic gas. It is shown that the pressure gradients are much less effective in bending jets, implying that the jets have roughly 30 times lower momentum fluxes if they are bent by this mechanism. Ram-pressure bending produces jets with kidney-shaped cross sections; when observed from the side, these jets appear to have diffuse extensions on the downstream side. On the other hand, pressure-gradient bending causes the jets to be densest near their upstream side. 31 references.

Soker, N.; Sarazin, C.L.; O'Dea, C.P.

1988-04-01

37

Numerical simulations of the bending of narrow-angle-tail radio jets by ram pressure or pressure gradients  

NASA Technical Reports Server (NTRS)

Three-dimensional numerical hydrodynamic simulations are used to study the bending of radio jets. The simulations are compared with observations of jets in narrow-angle-tail radio sources. Two mechanisms for the observed bending are considered: direct bending of quasi-continuous jets by ram pressure from intergalactic gas and bending by pressure gradients in the interstellar gas of the host galaxy, the pressure gradients themselves being the result of ram pressure by intergalactic gas. It is shown that the pressure gradients are much less effective in bending jets, implying that the jets have roughly 30 times lower momentum fluxes if they are bent by this mechanism. Ram-pressure bending produces jets with 'kidney-shaped' cross sections; when observed from the side, these jets appear to have diffuse extensions on the downstream side. On the other hand, pressure-gradient bending causes the jets to be densest near their upstream side.

Soker, Noam; Sarazin, Craig L.; O'Dea, Christopher P.

1988-01-01

38

Angles  

NSDL National Science Digital Library

Play these games to determine the best angles for success! Alien Angles: Set the angle to rescue the alien. Space Angles: Target the angle to shoot the alien spaceship. Mini Golf: Knowing the angles will help you get the ball in the hole. ...

Mr Clark

2012-10-31

39

Angles  

NSDL National Science Digital Library

This set of eight interactive activities lets the user explore angles from many different perspectives. Activities include (1) visualizing the size of an angle; (2) examining objects that will stand or fall with right and non-right angles; (3) identifying obtuse, right, acute and straight angles; (4) guessing angle measures with different levels of precision; (5) exploring regular shapes and their angle measures; (6) studying angles in a fractal tree that is drawn with user inputs of the same angle measure between the branches at each stage; (7) exploring angle measures through firing a cannon (8) drawing with a Logo activity.

Jo Edkins

2007-01-01

40

Early direct-injection, low-temperature combustion of diesel fuel in an optical engine utilizing a 15-hole, dual-row, narrow-included-angle nozzle.  

SciTech Connect

Low-temperature combustion of diesel fuel was studied in a heavy-duty, single-cylinder optical engine employing a 15-hole, dual-row, narrow-included-angle nozzle (10 holes x 70° and 5 holes x 35°) with 103-µm-diameter orifices. This nozzle configuration provided the spray targeting necessary to contain the direct-injected diesel fuel within the piston bowl for injection timings as early as 70° before top dead center. Spray-visualization movies, acquired using a high-speed camera, show that impingement of liquid fuel on the piston surface can result when the in-cylinder temperature and density at the time of injection are sufficiently low. Seven single- and two-parameter sweeps around a 4.82-bar gross indicated mean effective pressure load point were performed to map the sensitivity of the combustion and emissions to variations in injection timing, injection pressure, equivalence ratio, simulated exhaust-gas recirculation, intake temperature, intake boost pressure, and load. High-speed movies of natural luminosity were acquired by viewing through a window in the cylinder wall and through a window in the piston to provide quasi-3D information about the combustion process. These movies revealed that advanced combustion phasing resulted in intense pool fires within the piston bowl, after the end of significant heat release. These pool fires are a result of fuel-films created when the injected fuel impinged on the piston surface. The emissions results showed a strong correlation with pool-fire activity. Smoke and NOx emissions rose steadily as pool-fire intensity increased, whereas HC and CO showed a dramatic increase with near-zero pool-fire activity.

Gehrke, Christopher R. (Caterpillar Inc.); Radovanovic, Michael S. (Caterpillar Inc.); Milam, David M. (Caterpillar Inc.); Martin, Glen C.; Mueller, Charles J.

2008-04-01

41

CALIBRATION OF A WIDE-ANGLE DIGITAL CAMERA SYSTEM FOR NEAR REAL TIME SCENARIOS  

Microsoft Academic Search

ABSTRACT: Near real time monitoring of natural disasters, mass events, and large traffic disasters with airborne SAR and optical sensors will be the focus of several projects in research and development at the German Aerospace Center (DLR) in the next years. For these projects, new airborne camera systems are applied and tested. An important part of the sensor suite plays

F. Kurz; R. Müller; M. Stephani; P. Reinartz; M. Schroeder

42

A Wide-Angle Camera for the Mobile Asteroid Surface Scout (MASCOT) on Hayabusa-2  

NASA Astrophysics Data System (ADS)

JAXA's Hayabusa-2 mission, an asteroid sample return mission, is scheduled for launch in December 2014, for a rendezvous with the C-type asteroid 1999 JU3 in 2018. MASCOT, the Mobile Asteroid Surface Scout [1], is a small lander, designed to deliver ground truth for the orbiter remote measurements, support the selection of sampling sites, and provide context for the returned samples. MASCOT's main objective is to investigate the landing site's geomorphology, the internal structure, texture and composition of the regolith (dust, soil and rocks), and the thermal, mechanical, and magnetic properties of the surface. MASCOT comprises a payload of four scientific instruments: camera, radiometer, magnetometer and hyper-spectral microscope. The camera (MASCOT CAM) was designed and built by DLR's Institute of Planetary Research, together with Airbus DS Germany.

Schmitz, N.; Koncz, A.; Jaumann, R.; Hoffmann, H.; Jobs, D.; Kachlicki, J.; Michaelis, H.; Mottola, S.; Pforte, B.; Schroeder, S.; Terzer, R.; Trauthan, F.; Tschentscher, M.; Weisse, S.; Ho, T.-M.; Biele, J.; Ulamec, S.; Broll, B.; Kruselburger, A.; Perez-Prieto, L.

2014-04-01

43

Lunar Reconnaissance Orbiter Camera (LROC) instrument overview  

USGS Publications Warehouse

The Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) and Narrow Angle Cameras (NACs) are on the NASA Lunar Reconnaissance Orbiter (LRO). The WAC is a 7-color push-frame camera (100 and 400 m/pixel visible and UV, respectively), while the two NACs are monochrome narrow-angle linescan imagers (0.5 m/pixel). The primary mission of LRO is to obtain measurements of the Moon that will enable future lunar human exploration. The overarching goals of the LROC investigation include landing site identification and certification, mapping of permanently polar shadowed and sunlit regions, meter-scale mapping of polar regions, global multispectral imaging, a global morphology base map, characterization of regolith properties, and determination of current impact hazards.

Robinson, M.S.; Brylow, S.M.; Tschimmel, M.; Humm, D.; Lawrence, S.J.; Thomas, P.C.; Denevi, B.W.; Bowman-Cisneros, E.; Zerr, J.; Ravine, M.A.; Caplinger, M.A.; Ghaemi, F.T.; Schaffner, J.A.; Malin, M.C.; Mahanti, P.; Bartels, A.; Anderson, J.; Tran, T.N.; Eliason, E.M.; McEwen, A.S.; Turtle, E.; Jolliff, B.L.; Hiesinger, H.

2010-01-01

44

Angle-of-incidence effects in the spectral performance of the infrared array camera of the Spitzer Space Telescope  

NASA Astrophysics Data System (ADS)

The Infrared Array Camera (IRAC) on board the Spitzer Space Telescope uses two dichroic beamsplitters, four interference filters, and four detector arrays to acquire images in four different channels with nominal wavelengths of 3.6, 4.5, 5.8, and 8 µm for channels 1 through 4, respectively. A ray-tracing analysis of the IRAC optical system indicates a distribution of angles that is position-dependent at each optical element and the focal-plane arrays. For the band-pass filters in channels 1 and 2, the angle distribution relative to the filter surface normal is 0-28°, whereas for channels 3 and 4, the distribution is from 30° to 58°. Since these angle variations will cause changes in the center-band wavelengths for these interference filters that needed to be accounted for, we performed spectral performance measurements as a function of the angle of incidence on witness samples corresponding to each of the four filters and the two beamsplitters in the IRAC instrument. These measurements were done in the 2-10 µm wavelength range and at the temperature of 5 K, which is near the operating temperature of IRAC. Based on these filter measurements, we also performed an analysis of the pass-band wavelength distributions as a function of position on the instrument focal-plane array detectors. This information is necessary to attain the highest possible photometric accuracy when using IRAC for astronomical observations.
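
The angle dependence of interference-filter pass-bands is commonly approximated by the standard blue-shift relation sketched below; the effective index used here is an assumed, illustrative value, and the record above relies on measured witness-sample curves rather than this simple model.

import numpy as np

def center_wavelength_um(lambda0_um, theta_deg, n_eff=1.7):
    # lambda(theta) = lambda_0 * sqrt(1 - (sin(theta) / n_eff)**2)
    s = np.sin(np.radians(theta_deg)) / n_eff
    return lambda0_um * np.sqrt(1.0 - s ** 2)

# Approximate shift at the extreme incidence angles quoted for two channels.
print(center_wavelength_um(3.6, 28.0))   # channel 1 at 28 deg -> ~3.46 um
print(center_wavelength_um(5.8, 58.0))   # channel 3 at 58 deg -> ~5.03 um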

Quijada, Manuel A.; Marx, Catherine T.; Arendt, Richard G.; Moseley, Samuel H.

2004-10-01

45

Angles  

NSDL National Science Digital Library

This interactive Flash applet enables students, using estimation and measurement skills, to investigate angles. Teachers can use this page for demonstrating how to read a protractor, and the protractor can be hidden to give students practice in estimating angle measures. The size of the angle can be controlled or chosen randomly.

Dan Bunker

2011-01-01

46

Angles  

NSDL National Science Digital Library

This Java applet enables students to investigate acute, obtuse, and right angles. The student decides to work with one or two transversals and a pair of parallel lines. Angle measure is given for one angle. The student answers a short series of questions about the size of other angles, identifying relationships such as vertical and adjacent angles and alternate interior and alternate exterior angles. In addition to automatically checking the student's answers, the applet can keep score of correct answers. From the activity page, What, How, and Why buttons open pages that explain the activity's purpose, function, and how the mathematics fits into the curriculum. Supplemental resources include lesson plans and a handout with a grid for showing the relationship between all possible angles that occur when parallel lines are cut by a transversal. Copyright 2005 Eisenhower National Clearinghouse

Shodor Education Foundation

2004-01-01

47

Angles  

NSDL National Science Digital Library

This lesson is designed to introduce students to different types of angles including acute, obtuse, and right. The lesson also introduces ways to compare angles such as alternate interior, corresponding, and many others. This lesson provides links to discussions and activities related to angles as well as suggested ways to integrate them into the lesson. Finally, the lesson provides links to follow-up lessons designed for use in succession with the current one.

2011-05-23

48

On an assessment of surface roughness estimates from lunar laser altimetry pulse-widths for the Moon from LOLA using LROC narrow-angle stereo DTMs.  

NASA Astrophysics Data System (ADS)

Neumann et al. [1] proposed that laser altimetry pulse-widths could be employed to derive "within-footprint" surface roughness, as opposed to surface roughness estimated between laser altimetry pierce-points, such as the example for Mars [2] and more recently from the 4-pointed star-shaped LOLA (Lunar Reconnaissance Orbiter Laser Altimeter) onboard NASA-LRO [3]. Since 2009, LOLA has been collecting extensive global laser altimetry data with a 5 m footprint and ~25 m between the 5 points in a star shape. In order to assess how accurately surface roughness (defined as simple RMS after slope correction) derived from LROC matches surface roughness derived from LOLA footprints, publicly released LROC-NA (LRO Camera Narrow Angle) 1 m Digital Terrain Models (DTMs) were employed to measure the surface roughness directly within each 5 m footprint. A set of 20 LROC-NA DTMs were examined. Initially the match-up between the LOLA and LROC-NA orthorectified images (ORIs) is assessed visually to ensure that the co-registration is better than the LOLA footprint resolution. For each LOLA footprint, the pulse-width geolocation is then retrieved and this is used to "cookie-cut" the surface roughness and slopes derived from the LROC-NA DTMs. The investigation, which includes data from a variety of different landforms, shows little, if any, correlation between surface roughness estimated from DTMs and LOLA pulse-widths at sub-footprint scale. In fact, a perceptible correlation between LOLA and LROC-DTMs appears only at baselines of 40-60 m for surface roughness and 20 m for slopes. [1] Neumann et al. Mars Orbiter Laser Altimeter pulse width measurements and footprint-scale roughness. Geophysical Research Letters (2003) vol. 30 (11), paper 1561. DOI: 10.1029/2003GL017048 [2] Kreslavsky and Head. Kilometer-scale roughness of Mars: results from MOLA data analysis. J Geophys Res (2000) vol. 105 (E11) pp. 26695-26711. [3] Rosenburg et al. Global surface slopes and roughness of the Moon from the Lunar Orbiter Laser Altimeter. Journal of Geophysical Research (2011) vol. 116, paper E02001. DOI: 10.1029/2010JE003716 [4] Chin et al. Lunar Reconnaissance Orbiter Overview: The Instrument Suite and Mission. Space Science Reviews (2007) vol. 129 (4) pp. 391-419
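
A minimal Python sketch of the "cookie-cut" roughness measurement described above: elevations inside one 5 m footprint are extracted from the DTM, a best-fit plane is removed (the slope correction), and the RMS of the residuals is reported. Grid handling and names are illustrative choices, not the paper's code.

import numpy as np

def footprint_roughness(dtm, cx, cy, radius_px):
    # RMS of the plane-detrended elevations inside a circular footprint.
    yy, xx = np.mgrid[0:dtm.shape[0], 0:dtm.shape[1]]
    mask = (xx - cx) ** 2 + (yy - cy) ** 2 <= radius_px ** 2
    x, y, z = xx[mask].astype(float), yy[mask].astype(float), dtm[mask]
    A = np.column_stack([x, y, np.ones_like(x)])
    coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)      # best-fit plane
    return float(np.sqrt(np.mean((z - A @ coeffs) ** 2)))

# Toy 1 m/post DTM: a regional 2 deg slope plus 0.2 m of small-scale relief.
rng = np.random.default_rng(2)
dtm = np.fromfunction(lambda r, c: 0.035 * c, (200, 200)) + rng.normal(0, 0.2, (200, 200))
print(footprint_roughness(dtm, cx=100, cy=100, radius_px=2.5))   # ~0.2 m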

Muller, Jan-Peter; Poole, William

2013-04-01

49

JOURNAL OF GEOPHYSICAL RESEARCH, VOL. 97, NO. E5, PAGES 7699-7718, MAY 25, 1992 Mars Observer Camera  

E-print Network

The Mars Observer camera (MOC) is a three-component system (one narrow-angle and two wide-angle cameras) designed to take high spatial resolution pictures of the surface of Mars

Winfree, Erik

50

Angles  

NSDL National Science Digital Library

In this activity, students practice comparing angles when a transversal intersects two parallel lines. This activity allows students to explore the vocabulary used when comparing angles (e.g., alternate, same-side, interior, corresponding, etc.). This activity includes supplemental materials, including background information about the topics covered, a description of how to use the application, and exploration questions for use with the java applet.

2010-01-01

51

Quasi-null lens optical system for the fabrication of an oblate convex ellipsoidal mirror: application to the Wide Angle Camera of the Rosetta space mission  

Microsoft Academic Search

The design of a quasi-null lens system for the fabrication of an aspheric oblate convex ellipsoidal mirror is presented. The performance and tolerance of the system have been analyzed. The system has been applied successfully for the fabrication of the primary mirror of the Wide Angle Camera (WAC), the imaging system onboard the Rosetta, the European Space Agency cornerstone mission

Maria-Guglielmina Pelizzo; Vania da Deppo; Giampiero Naletto; Roberto Ragazzoni; Andrea Novi

2006-01-01

52

Post-trial anatomical frame alignment procedure for comparison of 3D joint angle measurement from magnetic/inertial measurement units and camera-based systems.  

PubMed

Magnetic and inertial measurement units (MIMUs) have been widely used as an alternative to traditional camera-based motion capture systems for 3D joint kinematics measurement. Since these sensors do not directly measure position, a pre-trial anatomical calibration, either with the assistance of a special protocol/apparatus or with another motion capture system, is required to establish the transformation matrices between the local sensor frame and the anatomical frame (AF) of each body segment on which the sensors are attached. Because the axes of AFs are often used as the rotational axes in the joint angle calculation, any difference in the AF determination will cause discrepancies in the calculated joint angles. Therefore, a direct comparison of joint angles between MIMU systems and camera-based systems is less meaningful, because the calculated joint angles contain a systematic error due to the differences in AF determination. To solve this problem, a new post-trial AF alignment procedure is proposed. By correcting the AF misalignments, the joint angle differences caused by the difference in AF determination are eliminated, and the remaining discrepancies are mainly from the measurement accuracy of the systems themselves. Lower limb joint angles from 30 walking trials were used to validate the effectiveness of the proposed AF alignment procedure. This technique could serve as a new means for calibrating magnetic/inertial sensor-based motion capture systems and correcting for AF misalignment in scenarios where joint angles are compared directly. PMID:25340557
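One generic way to remove a constant frame misalignment between two systems is an orthogonal Procrustes fit over matched orientation measurements; the sketch below illustrates that idea under assumed array shapes and is not the paper's published procedure.

import numpy as np

def estimate_af_misalignment(R_cam, R_mimu):
    """Estimate a constant rotation R_corr such that R_cam ~= R_corr @ R_mimu.

    R_cam, R_mimu : arrays of shape (N, 3, 3), segment-frame orientations of
    the same trials as seen by the camera system and by the MIMU system.
    Solved as an orthogonal Procrustes problem over all frames.
    """
    M = np.zeros((3, 3))
    for Rc, Rm in zip(R_cam, R_mimu):
        M += Rc @ Rm.T
    U, _, Vt = np.linalg.svd(M)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])  # keep a proper rotation
    return U @ D @ Vt

# Synthetic check: a known 10-degree offset about z is recovered exactly.
rng = np.random.default_rng(0)
c, s = np.cos(np.radians(10)), np.sin(np.radians(10))
R_true = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
R_mimu = np.stack([np.linalg.qr(rng.normal(size=(3, 3)))[0] for _ in range(20)])
R_cam = np.einsum('ij,njk->nik', R_true, R_mimu)
print(np.allclose(estimate_af_misalignment(R_cam, R_mimu), R_true))
# The corrected MIMU frames would then be R_corr applied to R_mimu before
# computing joint angles.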

Li, Qingguo; Zhang, Jun-Tian

2014-11-01

53

7. VAL CAMERA CAR, DETAIL OF 'FLARE' OR TRAJECTORY CAMERA ...  

Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

7. VAL CAMERA CAR, DETAIL OF 'FLARE' OR TRAJECTORY CAMERA INSIDE CAMERA CAR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

54

6. VAL CAMERA CAR, DETAIL OF COMMUNICATION EQUIPMENT INSIDE CAMERA ...  

Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

6. VAL CAMERA CAR, DETAIL OF COMMUNICATION EQUIPMENT INSIDE CAMERA CAR WITH CAMERA MOUNT IN FOREGROUND. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

55

2. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK ...  

Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

2. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH CAMERA STATION ABOVE LOOKING WEST TAKEN FROM RESERVOIR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

56

Ultra-narrow angle-tunable Fabry-Perot bandpass interference filter for use as tuning element in infrared lasers  

NASA Astrophysics Data System (ADS)

We have developed a bandpass infrared interference filter with sufficiently narrow bandwidth to be potentially suitable for tuning a self-stabilizing external-cavity quantum-cascade laser (ECQCL) in single-mode operation and describe the process parameters for fabrication of such filters with central wavelengths in the 3-12 μm range. The filter has a passband width of 6 nm, or 0.14%, with peak transmission of 55% and a central wavelength of approximately 4.0 μm. It can be tuned by more than 4% by tilting it with respect to the incident beam and offers orders of magnitude larger angular dispersion than diffraction gratings. We compare filters with single-cavity and coupled-cavity Fabry-Perot designs.
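The angle-tuning behavior described above follows the standard thin-film relation for a tilted interference filter; the effective spacer index in the sketch below is an assumed value, not a parameter from the paper.

import numpy as np

def tilted_passband_center(lambda0_um, theta_deg, n_eff=2.0):
    """Approximate center wavelength of a tilted bandpass interference filter.

    Uses the standard tuning relation
        lambda(theta) = lambda0 * sqrt(1 - (sin(theta) / n_eff)**2),
    where n_eff is the effective refractive index of the spacer (assumed here).
    """
    theta = np.radians(theta_deg)
    return lambda0_um * np.sqrt(1.0 - (np.sin(theta) / n_eff) ** 2)

# A ~4.0 um filter tilted to 30 degrees shifts to ~3.87 um (about 3%),
# consistent in magnitude with the >4% tuning range quoted above.
print(tilted_passband_center(4.0, 30.0))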

Kischkat, Jan; Peters, Sven; Semtsiv, Mykhaylo P.; Wegner, Tristan; Elagin, Mikaela; Monastyrskyi, Grygorii; Flores, Yuri; Kurlov, Sergii; Masselink, W. Ted

2014-11-01

57

Camera Calibration with Super-Wide-Angle and Low-Distortion Lens Using Higher Degree Polynomial Model  

E-print Network

The recoverable content of this record describes calibration of a camera with a super-wide-angle, low-distortion lens (Theia Technologies LLC) using OpenCV 2.4: the pinhole model maps a 3D point (X, Y, Z) to image coordinates (x, y) through the intrinsic parameters fx, fy, cx, cy as x = fx X/Z + cx, y = fy Y/Z + cy, and lens distortion is handled with a higher-degree polynomial extension of the Brown model. Graduate School of Systems and Information Engineering, University of Tsukuba.
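A minimal sketch of the pinhole projection with a Brown-style radial polynomial follows; the intrinsic values and distortion coefficients are illustrative placeholders, not calibration results from this paper.

import numpy as np

def project_pinhole(points_xyz, fx, fy, cx, cy, k=(0.0, 0.0, 0.0)):
    """Project 3D camera-frame points with the pinhole model plus a Brown-style
    radial polynomial (k1, k2, k3 are illustrative coefficients).

        x = fx * X/Z + cx,   y = fy * Y/Z + cy   (before distortion)
    """
    X, Y, Z = points_xyz.T
    xn, yn = X / Z, Y / Z                       # normalized image coordinates
    r2 = xn ** 2 + yn ** 2
    k1, k2, k3 = k
    radial = 1 + k1 * r2 + k2 * r2 ** 2 + k3 * r2 ** 3
    u = fx * xn * radial + cx
    v = fy * yn * radial + cy
    return np.column_stack([u, v])

pts = np.array([[0.1, -0.05, 2.0], [0.4, 0.2, 3.0]])
print(project_pinhole(pts, fx=800, fy=800, cx=640, cy=480, k=(-0.25, 0.07, 0.0)))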

Ohya, Akihisa

58

Mars Global Surveyor Mars Orbiter Camera Image Gallery  

NSDL National Science Digital Library

This site from Malin Space Science Systems provides access to all of the images acquired by the Mars Orbiter Camera (MOC) during the Mars Global Surveyor mission through March 2005. MOC consists of several cameras: A narrow angle system that provides grayscale high resolution views of the planet's surface (typically, 1.5 to 12 meters/pixel), and red and blue wide angle cameras that provide daily global weather monitoring, context images to determine where the narrow angle views were actually acquired, and regional coverage to monitor variable surface features such as polar frost and wind streaks. Ancillary data for each image is provided and instructions regarding gallery usage are also available on the site.

Malin Space Science Systems

59

Quasi-null lens optical system for the fabrication of an oblate convex ellipsoidal mirror: application to the Wide Angle Camera of the Rosetta space mission.  

PubMed

The design of a quasi-null lens system for the fabrication of an aspheric oblate convex ellipsoidal mirror is presented. The performance and tolerance of the system have been analyzed. The system has been applied successfully for the fabrication of the primary mirror of the Wide Angle Camera (WAC), the imaging system onboard the Rosetta, the European Space Agency cornerstone mission dedicated to the exploration of a comet. The WAC is based on an off-axis two-mirror configuration, in which the primary mirror is an oblate convex ellipsoid with a significant conic constant. PMID:16892112

Pelizzo, Maria-Guglielmina; Da Deppo, Vania; Naletto, Giampiero; Ragazzoni, Roberto; Novi, Andrea

2006-08-20

60

Quasi-null lens optical system for the fabrication of an oblate convex ellipsoidal mirror: application to the Wide Angle Camera of the Rosetta space mission  

NASA Astrophysics Data System (ADS)

The design of a quasi-null lens system for the fabrication of an aspheric oblate convex ellipsoidal mirror is presented. The performance and tolerance of the system have been analyzed. The system has been applied successfully for the fabrication of the primary mirror of the Wide Angle Camera (WAC), the imaging system onboard the Rosetta, the European Space Agency cornerstone mission dedicated to the exploration of a comet. The WAC is based on an off-axis two-mirror configuration, in which the primary mirror is an oblate convex ellipsoid with a significant conic constant.

Pelizzo, Maria-Guglielmina; da Deppo, Vania; Naletto, Giampiero; Ragazzoni, Roberto; Novi, Andrea

2006-08-01

61

7. VAL CAMERA STATION, INTERIOR VIEW OF CAMERA MOUNT, COMMUNICATION ...  

Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

7. VAL CAMERA STATION, INTERIOR VIEW OF CAMERA MOUNT, COMMUNICATION EQUIPMENT AND STORAGE CABINET. - Variable Angle Launcher Complex, Camera Stations, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

62

3. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK ...  

Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

3. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH THE VAL TO THE RIGHT, LOOKING NORTHEAST. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

63

1. VARIABLE-ANGLE LAUNCHER CAMERA CAR, VIEW OF CAMERA CAR AND ...  

Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

1. VARIABLE-ANGLE LAUNCHER CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH CAMERA STATION ABOVE LOOKING NORTH TAKEN FROM RESERVOIR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

64

A small X-ray corona of the narrow-angle tail radio galaxy NGC 1265 soaring through the Perseus cluster  

E-print Network

A deep Chandra observation of NGC 1265 (3C 83.1B), the prototype of the narrow-angle-tailed (NAT) radio galaxies, reveals a small cool X-ray thermal corona (~ 0.6 keV) embedded in the hot ICM of the Perseus cluster (~ 6.7 keV). The corona is asymmetric, with a sharp edge (~ 2.2'', or 0.8 kpc from the nucleus) to the south and an extension to the north (at least ~ 8'' from the nucleus), which is interpreted as the action of ram pressure, since static ICM confinement alone cannot explain it. We estimate that the corona is moving to the south with a velocity of ~ 2.4 - 4.2 times the local sound speed. The presence of the sharp edge for this small corona indicates that transport processes are largely suppressed by the magnetic field there. The magnetic field around the corona also suppresses heat conduction by at least a factor of ~ 60 across the corona boundary. We conclude that it is unrealistic to study the interaction of small X-ray coronae with the hot ICM without considering the role that the magnetic field plays, a factor not included in current simulations. An absorbed (N_H=1.5-3x10^22 cm^-2) nucleus is also detected, which is unusual for FR I radio galaxies. Weak X-ray emission from three inner radio knots in the jets is also detected. Indentations at the east and west of the corona indicate interaction between the jets and the X-ray corona. Narrow jets carry great amounts of energy out of the central AGN and release it outside the corona, preserving the tiny and vulnerable corona. This case shows that the inner kpc core of the corona of a massive galaxy can survive both high-speed stripping and powerful AGN feedback. Thus, the cooling of X-ray coronae potentially provides fuel to the central SMBH in rich environments where the amount of galactic cold gas is at a minimum.

M. Sun; D. Jerius; C. Jones

2005-07-14

65

Narrow-Angle Astrometry with the Space Interferometry Mission: The Search for Extrasolar Planets. I. Detection and Characterization of Single Planets  

NASA Astrophysics Data System (ADS)

A decade after the publication of the Hipparcos Catalogue, the Space Interferometry Mission (SIM) will be capable of making selected high-precision astrometric measurements about 3 orders of magnitude more accurate than the Hipparcos survey. We present results from a detailed set of end-to-end numerical simulations of SIM narrow-angle astrometric measurements and data analysis to illustrate the enormous potential that SIM has for the discovery and characterization of planets outside the solar system. Utilizing a template observing scenario, we quantify SIM sensitivity to single planets orbiting single normal nearby stars as a function of measurement errors and properties of the planet: SIM will detect over 95% of the planets with periods between a few days and the 5 yr nominal mission lifetime that produce astrometric signatures ~2.2 times larger than the single-measurement accuracy. We provide accuracy estimates of full-orbit reconstruction and planet mass determination: at twice the discovery limit, orbital elements will be determined with a typical accuracy of 20%-30%; the astrometric signature must be ~10 and ~15 times the minimum signal required for detection to derive mass and inclination-angle estimates accurate to 10%. We quantify the impact of different observing strategies on the boundaries for secure detection and accurate orbit estimation: the results scale with the square root of both the number of observations and the number of reference stars. We investigate the SIM discovery space to gauge the instrument's ability to detect very low mass planets: around the nearest stars, SIM will find planets as small as Earth, if they are present. Some of these might be orbiting inside the parent star's habitable zone. Extrasolar planets figure prominently among SIM scientific goals: our results reaffirm the importance of high-precision astrometric measurements as a unique complement to spectroscopic surveys based on radial velocity. For example, establishing the existence of rocky, perhaps habitable planets would constitute both a fundamental test of theoretical models and progress toward understanding the formation and evolution of planetary systems. Such discoveries would also provide the Terrestrial Planet Finder with prime targets to investigate with direct spectroscopy in terms of the potential for life.
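The astrometric signature referred to above is the wobble of the host star; a minimal worked example under the usual mass-ratio approximation is sketched below (the Jupiter-analog numbers are illustrative, not simulation results from the paper).

def astrometric_signature_uas(m_planet_mjup, m_star_msun, a_au, d_pc):
    """Peak astrometric wobble of the host star, in micro-arcseconds.

    alpha [arcsec] = (M_p / M_*) * a[AU] / d[pc], with the planet mass
    converted to solar masses (1 M_sun ~ 1047.6 M_jup).
    """
    m_planet_msun = m_planet_mjup / 1047.6
    return (m_planet_msun / m_star_msun) * (a_au / d_pc) * 1e6

# A Jupiter analog around a solar-mass star at 10 pc produces ~500 uas;
# per the text, detection requires the signature to be ~2.2x the
# single-measurement accuracy.
print(f"{astrometric_signature_uas(1.0, 1.0, 5.2, 10.0):.0f} uas")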

Sozzetti, A.; Casertano, S.; Brown, R. A.; Lattanzi, M. G.

2002-11-01

66

Miniaturized fundus camera  

NASA Astrophysics Data System (ADS)

We present a miniaturized version of a fundus camera. The camera is designed for use in screening for retinopathy of prematurity (ROP). There, as well as in other applications, a small, lightweight digital camera system can be extremely useful. We present a small wide-angle digital camera system. The handpiece is significantly smaller and lighter than in all other systems. The electronics is truly portable, fitting in a standard board case. The camera is designed to be offered at a compatible price. Data from tests on young rabbits' eyes are presented. The development of the camera system is part of a telemedicine project screening for ROP. Telemedicine is a natural application for this camera system, exploiting both of its advantages: portability and digital imaging.

Gliss, Christine; Parel, Jean-Marie A.; Flynn, John T.; Pratisto, Hans S.; Niederer, Peter F.

2003-07-01

67

Wide Angle Movie  

NASA Technical Reports Server (NTRS)

This brief movie illustrates the passage of the Moon through the Saturn-bound Cassini spacecraft's wide-angle camera field of view as the spacecraft passed by the Moon on the way to its closest approach with Earth on August 17, 1999. From beginning to end of the sequence, 25 wide-angle images (with a spatial image scale of about 14 miles (23 kilometers) per pixel) were taken over the course of 7 and 1/2 minutes through a series of narrow and broadband spectral filters and polarizers, ranging from the violet to the near-infrared regions of the spectrum, to calibrate the spectral response of the wide-angle camera. The exposure times range from 5 milliseconds to 1.5 seconds. Two of the exposures were smeared and have been discarded and replaced with nearby images to make a smooth movie sequence. All images were scaled so that the brightness of Crisium basin, the dark circular region in the upper right, is approximately the same in every image. The imaging data were processed and released by the Cassini Imaging Central Laboratory for Operations (CICLOPS) at the University of Arizona's Lunar and Planetary Laboratory, Tucson, AZ.

Photo Credit: NASA/JPL/Cassini Imaging Team/University of Arizona

Cassini, launched in 1997, is a joint mission of NASA, the European Space Agency and Italian Space Agency. The mission is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Space Science, Washington DC. JPL is a division of the California Institute of Technology, Pasadena, CA.

1999-01-01

68

Angles, Angles and More Angles!  

NSDL National Science Digital Library

Test Your Angle Knowledge! Angles Telescope Star Gazing: Help Diget fill up his scrapbook of stars by using his telescope and pointing it at each planet during the night! But make sure you hurry before the sun comes up! Shoot The Space Ship Angles Game: Try to figure out which angle you need to use to shoot down the aliens' spaceship! ...

Miss Smith

2011-03-23

69

Optimum camera placement considering camera specification for security monitoring  

Microsoft Academic Search

We present an optimum camera placement algorithm. We are motivated by the fact that the installation of security cameras is increasing rapidly. From the system cost point of view, it is desirable to observe all the area of interest by the smallest number of cameras. We propose a method for deciding optimum camera placement automatically, considering camera specifications such as visual distance, visual angle, and resolution. Moreover, to reduce the number of cameras, we divide the
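The placement problem described above is a coverage optimization; a simple greedy stand-in (not the authors' algorithm) that respects a visual-distance and visual-angle constraint is sketched below, with all geometry and thresholds as illustrative assumptions.

import math

def covered(cam, point, max_dist, half_fov_deg):
    """True if `point` lies within the camera's visual distance and visual angle.
    cam = (x, y, heading_deg); point = (x, y)."""
    dx, dy = point[0] - cam[0], point[1] - cam[1]
    if math.hypot(dx, dy) > max_dist:
        return False
    bearing = math.degrees(math.atan2(dy, dx))
    diff = (bearing - cam[2] + 180) % 360 - 180
    return abs(diff) <= half_fov_deg

def greedy_placement(candidates, targets, max_dist=15.0, half_fov_deg=30.0):
    """Pick cameras one at a time, each covering the most still-uncovered targets."""
    uncovered, chosen = set(range(len(targets))), []
    while uncovered:
        best = max(candidates, key=lambda c: sum(
            covered(c, targets[i], max_dist, half_fov_deg) for i in uncovered))
        gain = {i for i in uncovered if covered(best, targets[i], max_dist, half_fov_deg)}
        if not gain:
            break  # remaining targets cannot be covered by any candidate
        chosen.append(best)
        uncovered -= gain
    return chosen

cams = [(0, 0, a) for a in range(0, 360, 45)]
pts = [(5, 0), (0, 7), (-6, -1), (3, 3)]
print(greedy_placement(cams, pts))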

Kenichi Yabuta; Hitoshi Kitazawa

2008-01-01

70

Camera Obscura  

NSDL National Science Digital Library

Before photography was invented there was the camera obscura, useful for studying the sun, as an aid to artists, and for general entertainment. What is a camera obscura and how does it work? Camera is Latin for room; obscura is Latin for dark. The resource links to "The Magic Mirror of Life," images of a French drawing camera with supplies, and drawing camera obscuras with a lens at the top. Read the first three paragraphs of this article; under the portion Early Observations and Use in Astronomy you will find the answers to the ...

Mr. Engelman

2008-10-28

71

Critical Heat Flux in Inclined Rectangular Narrow Gaps  

SciTech Connect

In light of the TMI-2 accident, in which the reactor vessel lower head survived the attack by molten core material, the in-vessel retention strategy was suggested to benefit from cooling the debris through a gap between the lower head and the core material. The GAMMA 1D (Gap Apparatus Mitigating Melt Attack One Dimensional) tests were conducted to investigate the critical heat flux (CHF) in narrow gaps with varying surface orientations. The CHF in an inclined gap, especially in the case of a downward-facing narrow gap, is dictated by bubble behavior because the departing bubbles are squeezed. The orientation angle affects the bubble layer and the escape of bubbles from the narrow gap. The test parameters include gap sizes of 1, 2, 5 and 10 mm with an open periphery, and orientation angles ranging from fully downward-facing (180°) to vertical (90°). The 15 × 35 mm copper test section was electrically heated by a thin film resistor on the back. The heater assembly was installed at the tip of a rotating arm in a heated water pool at atmospheric pressure. The bubble behavior was photographed with a high-speed camera through the Pyrex glass spacer. It was observed that the CHF decreased as the surface inclination angle increased and as the gap size decreased in most cases. However, opposing results were obtained at certain surface orientations and gap sizes. Transition angles, at which the CHF changed steeply, were also detected, which is consistent with the existing literature. A semi-empirical CHF correlation was developed for inclined narrow rectangular channels through dimensional analysis. The correlation provides best-estimate CHF values for realistically assessing the thermal margin to failure of the lower head during a severe accident involving relocation of the core material.

Jeong J. Kim; Yong H. Kim; Seong J. Kim; Sang W. Noh; Kune Y. Suh; Joy L. Rempe; Fan-Bill Cheung; Sang B. Kim

2004-06-01

72

6. VAL CAMERA STATION, VIEW FROM INTERIOR OUT OF WINDOW ...  

Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

6. VAL CAMERA STATION, VIEW FROM INTERIOR OUT OF WINDOW OPENING TOWARD VAL FIRING RANGE LOOKING SOUTHEAST WITH CAMERA MOUNT IN FOREGROUND. - Variable Angle Launcher Complex, Camera Stations, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

73

8. VAL CAMERA CAR, CLOSEUP VIEW OF 'FLARE' OR TRAJECTORY ...  

Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

8. VAL CAMERA CAR, CLOSE-UP VIEW OF 'FLARE' OR TRAJECTORY CAMERA ON SLIDING MOUNT. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

74

84. VIEW FROM CAMERA TOWER LOOKING SOUTHWEST SHOWING VAL FIRING ...  

Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

84. VIEW FROM CAMERA TOWER LOOKING SOUTHWEST SHOWING VAL FIRING RANGE WITH OVERHEAD CAMERA AND CABLES, Date unknown, circa 1949. - Variable Angle Launcher Complex, Variable Angle Launcher, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

75

Imaging Narrow Angle The Voyager Spacecraft  

E-print Network

) Radioisotope Thermoelectric Generator (3) Planetary Radio Astronomy and Plasma Wave Antenna (2) Optical independent systems -- a high-energy telescope system (HET), a low-energy telescope system (LET), and an electron telescope (TET). Dr. Cummings participated in the design of the LETs and had the parts built

Waliser, Duane E.

76

The DSLR Camera  

NASA Astrophysics Data System (ADS)

Cameras have developed significantly in the past decade; in particular, digital single-lens reflex cameras (DSLRs) have appeared. As a consequence we can buy cameras with ever higher pixel counts, and mass production has led to a great reduction in prices. CMOS sensors used for imaging are increasingly sensitive, and the electronics in the cameras allows images to be taken with much less noise. The software background is developing in a similar way: intelligent programs are created for post-processing and other supplementary work. Nowadays we can find a digital camera in almost every household, and most of these cameras are DSLRs. These can be used very well for astronomical imaging, which is nicely demonstrated by the amount and quality of the spectacular astrophotos appearing in different publications. These examples also show how much post-processing software contributes to the rise in the standard of the pictures. To sum up, the DSLR camera serves as a cheap alternative to the CCD camera, with somewhat weaker technical characteristics. In the following, I will introduce how we can measure the main parameters (position angle and separation) of double stars, based on the methods, software and equipment I use. Others can easily adapt these to their own circumstances.
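The two double-star parameters named above can be computed from pixel coordinates once the image scale and orientation are calibrated; the sketch below uses an assumed chip orientation convention and an illustrative plate scale, not the author's own software or reduction.

import math

def separation_and_pa(primary_xy, secondary_xy, plate_scale_arcsec,
                      north_angle_deg=0.0):
    """Separation (arcsec) and position angle (deg, North through East) of a
    double star from pixel coordinates on a DSLR frame.

    Assumes +y on the chip points toward celestial North rotated by
    `north_angle_deg`, with East 90 deg counter-clockwise from North (toward
    -x for a non-mirrored image); in practice the rotation is calibrated
    against a pair with a known position angle.
    """
    dx = secondary_xy[0] - primary_xy[0]
    dy = secondary_xy[1] - primary_xy[1]
    sep = math.hypot(dx, dy) * plate_scale_arcsec
    pa = (math.degrees(math.atan2(-dx, dy)) - north_angle_deg) % 360
    return sep, pa

# Example: components 18 pixels apart at 0.85 arcsec/pixel.
print(separation_and_pa((512.0, 384.0), (530.0, 402.0), 0.85))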

Berkó, Ernő; Argyle, R. W.

77

Camera Animation  

NSDL National Science Digital Library

A general discussion of the use of cameras in computer animation. This section includes principles of traditional film techniques and suggestions for the use of a camera during an architectural walkthrough. This section includes html pages, images and one video.

78

MIST: An operational ground surveillance camera  

Microsoft Academic Search

An aerial high-resolution wide-angle camera system capable of 3D mapping was developed, with a possible application in assisting DHS border security and military reconnaissance. The goals of this system were to survey vast areas with a wide-angle camera system and provide precise coordinate information. This system was designed with a UAV in mind as the flight vehicle,

Jay P. Wilhelm; Mridul Gautam; Shai Ronen

2011-01-01

79

Publications Angling, Angling Records,  

E-print Network

The 1982 edition of "World Record Game Fishes," published by the International Game Fish Association, 3000 East Las Olas Boulevard, Fort Lauderdale, FL 33316, continues to grow as an important reference work for the serious angler

80

Dynamics of an oscillating bubble in a narrow gap  

E-print Network

The complex dynamics of a single bubble of a few millimeters in size oscillating inside a narrow fluid-filled gap between two parallel plates is studied using high-speed videography. Two synchronized high-speed cameras ...

Azam, Fahad Ibn

81

Radial Velocities of Narrow-Line Region Gas of Markarian 3  

NASA Astrophysics Data System (ADS)

We present preliminary measurements of radial velocities for narrow-line region (NLR) gas in the Seyfert 2 galaxy Markarian 3. The measurements were obtained from a Space Telescope Imaging Spectrograph (STIS) slitless spectrum taken in January 2000. The radial velocity results are shown in the form of plots, 2D velocity maps, and color maps. The STIS observations were compared with archival Wide Field Planetary Camera 2 (WFPC 2) and Faint Object Camera (FOC) [O III] images of Mrk 3 in order to determine the offsets of the various knots of NLR gas in the dispersion direction, and in turn, the radial velocities from those offsets. The measurements are part of a larger campaign to measure radial velocities in 9 other Seyfert galaxies, and compare the results with models of biconical and disk geometries at various inclination angles.
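Converting a knot's offset in the dispersion direction into a radial velocity follows the usual Doppler relation; the dispersion value and rest wavelength in the sketch below are illustrative assumptions, not parameters of the STIS observation described above.

C_KM_S = 299792.458  # speed of light, km/s

def radial_velocity_kms(offset_pixels, dispersion_angstrom_per_pix,
                        rest_wavelength_angstrom=5006.84):
    """Radial velocity of an emission-line knot from its offset in the
    dispersion direction of a slitless spectrum.

    v = c * d_lambda / lambda_rest, with d_lambda = offset * dispersion.
    """
    d_lambda = offset_pixels * dispersion_angstrom_per_pix
    return C_KM_S * d_lambda / rest_wavelength_angstrom

# A knot displaced by -3.2 pixels at an assumed 0.55 A/pixel dispersion
# corresponds to roughly -105 km/s.
print(f"{radial_velocity_kms(-3.2, 0.55):.0f} km/s")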

Ruiz, J. R.; Crenshaw, D. M.; Kraemer, S. B.

2000-05-01

82

Space Camera  

NASA Technical Reports Server (NTRS)

Nikon's F3 35mm camera was specially modified for use by Space Shuttle astronauts. The modification work produced a spinoff lubricant. Because lubricants in space have a tendency to migrate within the camera, Nikon conducted extensive development to produce nonmigratory lubricants; variations of these lubricants are used in the commercial F3, giving it better performance than conventional lubricants. Another spinoff is the coreless motor which allows the F3 to shoot 140 rolls of film on one set of batteries.

1983-01-01

83

5. VAL CAMERA CAR, DETAIL OF HOIST AT SIDE OF ...  

Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

5. VAL CAMERA CAR, DETAIL OF HOIST AT SIDE OF BRIDGE AND ENGINE CAR ON TRACKS, LOOKING NORTHEAST. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

84

Electronic Still Camera  

NASA Technical Reports Server (NTRS)

A handheld, programmable, digital camera is disclosed that supports a variety of sensors and has program control over the system components to provide versatility. The camera uses a high performance design which produces near film quality images from an electronic system. The optical system of the camera incorporates a conventional camera body that was slightly modified, thus permitting the use of conventional camera accessories, such as telephoto lenses, wide-angle lenses, auto-focusing circuitry, auto-exposure circuitry, flash units, and the like. An image sensor, such as a charge coupled device ('CCD') collects the photons that pass through the camera aperture when the shutter is opened, and produces an analog electrical signal indicative of the image. The analog image signal is read out of the CCD and is processed by preamplifier circuitry, a correlated double sampler, and a sample and hold circuit before it is converted to a digital signal. The analog-to-digital converter has an accuracy of eight bits to insure accuracy during the conversion. Two types of data ports are included for two different data transfer needs. One data port comprises a general purpose industrial standard port and the other a high speed/high performance application specific port. The system uses removable hard disks as its permanent storage media. The hard disk receives the digital image signal from the memory buffer and correlates the image signal with other sensed parameters, such as longitudinal or other information. When the storage capacity of the hard disk has been filled, the disk can be replaced with a new disk.

Holland, S. Douglas (inventor)

1992-01-01

85

Electronic still camera  

NASA Astrophysics Data System (ADS)

A handheld, programmable, digital camera is disclosed that supports a variety of sensors and has program control over the system components to provide versatility. The camera uses a high performance design which produces near film quality images from an electronic system. The optical system of the camera incorporates a conventional camera body that was slightly modified, thus permitting the use of conventional camera accessories, such as telephoto lenses, wide-angle lenses, auto-focusing circuitry, auto-exposure circuitry, flash units, and the like. An image sensor, such as a charge coupled device ('CCD') collects the photons that pass through the camera aperture when the shutter is opened, and produces an analog electrical signal indicative of the image. The analog image signal is read out of the CCD and is processed by preamplifier circuitry, a correlated double sampler, and a sample and hold circuit before it is converted to a digital signal. The analog-to-digital converter has an accuracy of eight bits to insure accuracy during the conversion. Two types of data ports are included for two different data transfer needs. One data port comprises a general purpose industrial standard port and the other a high speed/high performance application specific port. The system uses removable hard disks as its permanent storage media. The hard disk receives the digital image signal from the memory buffer and correlates the image signal with other sensed parameters, such as longitudinal or other information. When the storage capacity of the hard disk has been filled, the disk can be replaced with a new disk.

Holland, S. Douglas

1992-09-01

86

Infrared Camera  

NASA Technical Reports Server (NTRS)

A sensitive infrared camera that observes the blazing plumes from Space Shuttle or expendable rocket lift-offs is capable of scanning for fires, monitoring the environment and providing medical imaging. The hand-held camera uses highly sensitive arrays of infrared photodetectors known as quantum-well infrared photodetectors (QWIPs). QWIPs were developed by the Jet Propulsion Laboratory's Center for Space Microelectronics Technology in partnership with Amber, a Raytheon company. In October 1996, QWIP detectors pointed out hot spots of the destructive fires speeding through Malibu, California. Night vision, early warning systems, navigation, flight control systems, weather monitoring, security and surveillance are among the duties for which the camera is suited. Medical applications are also expected.

1997-01-01

87

Security Cameras  

NSDL National Science Digital Library

Using the real world example of security cameras, this lesson has students explore properties of polygons. Using this example, students will be able to discover a formula as related to polygons. An activity sheet and student questions are included. This material is intended for students from grades 9-12 and should require 1 class period to complete.

2010-12-16

88

Nikon Camera  

NASA Technical Reports Server (NTRS)

The Nikon FM compact has a simplification feature derived from cameras designed for easy yet accurate use in a weightless environment. The innovation is a plastic-cushioned advance lever that advances the film and simultaneously switches on a built-in light meter. With a turn of the lens aperture ring, a glowing signal in the viewfinder confirms correct exposure.

1980-01-01

89

CCD Camera  

DOEpatents

A CCD camera capable of observing a moving object which has varying intensities of radiation emanating therefrom and which may move at varying speeds is shown wherein there is substantially no overlapping of successive images and wherein the exposure times and scan times may be varied independently of each other.

Roth, Roger R. (Minnetonka, MN)

1983-01-01

90

CCD Camera  

DOEpatents

A CCD camera capable of observing a moving object which has varying intensities of radiation emanating therefrom and which may move at varying speeds is shown wherein there is substantially no overlapping of successive images and wherein the exposure times and scan times may be varied independently of each other. 7 figs.

Roth, R.R.

1983-08-02

91

Camera Projector  

NSDL National Science Digital Library

In this activity (posted on March 14, 2011), learners follow the steps to construct a camera projector to explore lenses and refraction. First, learners use relatively simple materials to construct the projector. Then, learners discover that lenses project images upside down and backwards. They explore this phenomenon by creating their own slides (must be drawn upside down and backwards to appear normally). Use this activity to also introduce learners to spherical aberration and chromatic aberration.

Oakland Discovery Center

2011-01-01

92

Viewing angle changeable display  

NASA Astrophysics Data System (ADS)

A viewing-angle-changeable display can change its viewing angle as needed: in a public place the display can have a narrow viewing angle for privacy, while in a private place it can have a wide viewing angle for convenient operation and a better viewing experience. This article proposes a novel adjustable optical transmission device that realizes viewing-angle switching for LCDs using the guest-host effect of liquid crystals. The key idea is to insert a special device between the backlight and the LCD, through which the backlight becomes either collimated or scattered, giving either a narrow or a wide viewing angle. The device is an adjustable transmission cell (ATC), which is essentially a black guest-host LC cell; this ATC is the main focus of the invention. The ATC consists of a polarizer sheet and a special guest-host liquid crystal device filled with a two-phase dye (referred to as GH-LC in this report) to achieve the viewing-angle change in the LCD. When an electric field is applied to the ATC, only near-axis light can pass through within a relatively small angle, while scattered light is absorbed sequentially by the GH-LC and the polarizer sheet. When no field is applied, the cell behaves like a normal polarizer, and scattered light can pass through the cell and polarizer in the normal way. The paper describes the principle and structure of the device, applies an electric field to samples to measure the electro-optical properties, and combines theoretical and experimental work to characterize the viewing-angle behavior of the display.

Leng, Jinbi; Huang, Ziqiang; Yang, Wenjun; Chen, Xiaoxi

2010-10-01

93

LRO Camera Imaging of Constellation Sites  

NASA Astrophysics Data System (ADS)

One of the top priorities for Lunar Reconnaissance Orbiter Camera (LROC) imaging during the "exploration" phase of the mission is thorough coverage of 50 sites selected to represent a wide variety of terrain types and geologic features that are of interest for human exploration. These sites, which are broadly distributed around the Moon and include locations at or near both poles, will provide the Constellation Program with data for a set of targets that represent a diversity of scientific and resource opportunities, thus forming a basis for planning for scientific exploration, resource development, and mission operations including traverse and habitation zone planning. Identification of the Constellation targets is not intended to be a site-selection activity. Sites include volcanic terrains (surfaces with young and old basalt flows, pyroclastic deposits, vents, fissures, domes, low shields, rilles, wrinkle ridges, and lava tubes), impact craters and basins (crater floors, central peaks, terraces and walls; impact-melt and ejecta deposits, basin ring structures; and antipodal terrain), and contacts of geologic features in areas of complex geology. Sites at the poles represent different lighting conditions and include craters with areas of permanent shadow. Sites were also chosen that represent typical feldspathic highlands terrain, areas in the highlands with anomalous compositions, and unusual features such as magnetic anomalies. These sites were reviewed by the Lunar Exploration Analysis Group (LEAG). These sites all have considerable scientific and exploration interest and were derived from previous studies of potential lunar landing sites, supplemented with areas that capitalize on discoveries from recent orbital missions. Each site consists of nested regions of interest (ROI), including 10×10 km, 20×20 km, and 40×40 km areas. Within the 10×10 and 20×20 ROIs, the goal is to compile a set of narrow-angle-camera (NAC) observations for a controlled mosaic, photometric and geometric stereo, and images taken at low and high sun to enhance morphology and albedo, respectively. These data will provide the basis for topographic maps, digital elevation models, and slope and boulder hazard maps that could be used to establish landing or habitation zones. Within the 40×40 ROIs, images will be taken to achieve the best possible high-resolution mosaics. All ROIs will have wide-angle-camera context images covering the sites and surrounding areas. At the time of writing (prior to the end of the LRO commissioning phase), over 500 individual NAC frames have been acquired for 47 of the 50 sites. Because of the polar orbit, the majority of repeat coverage occurs for the polar and high latitude sites. Analysis of the environment for several representative Constellation site ROIs will be presented.

Gruener, J.; Jolliff, B. L.; Lawrence, S.; Robinson, M. S.; Plescia, J. B.; Wiseman, S. M.; Li, R.; Archinal, B. A.; Howington-Kraus, A. E.

2009-12-01

94

Calibration of the Lunar Reconnaissance Orbiter Camera  

NASA Astrophysics Data System (ADS)

The Lunar Reconnaissance Orbiter Camera (LROC) onboard the NASA Lunar Reconnaissance Orbiter (LRO) spacecraft consists of three cameras: the Wide-Angle Camera (WAC) and two identical Narrow Angle Cameras (NAC-L, NAC-R). The WAC is a push-frame imager with 5 visible wavelength filters (415 to 680 nm) at a spatial resolution of 100 m/pixel and 2 UV filters (315 and 360 nm) with a resolution of 400 m/pixel. In addition to multicolor imaging, the WAC can operate in monochrome mode to provide a global large-incidence-angle basemap and a time-lapse movie of the illumination conditions at both poles. The WAC has a highly linear response, a read noise of 72 e- and a full well capacity of 47,200 e-. The signal-to-noise ratio in each band is 140 in the worst case. There are no out-of-band leaks and the spectral response of each filter is well characterized. Each NAC is a monochrome pushbroom scanner, providing images with a resolution of 50 cm/pixel from a 50-km orbit. A single NAC image has a swath width of 2.5 km and a length of up to 26 km. The NACs are mounted to acquire side-by-side imaging for a combined swath width of 5 km. The NAC is designed to fully characterize future human and robotic landing sites in terms of topography and hazard risks. The North and South poles will be mapped at a 1-meter scale poleward of 85.5° latitude. Stereo coverage can be provided by pointing the NACs off-nadir. The NACs are also highly linear. Read noise is 71 e- for NAC-L and 74 e- for NAC-R, and the full well capacity is 248,500 e- for NAC-L and 262,500 e- for NAC-R. The focal lengths are 699.6 mm for NAC-L and 701.6 mm for NAC-R; the system MTF is 28% for NAC-L and 26% for NAC-R. The signal-to-noise ratio is at least 46 (terminator scene) and can be higher than 200 (high sun scene). Both NACs exhibit a straylight feature, which is caused by out-of-field sources and is of a magnitude of 1-3%. However, as this feature is well understood it can be greatly reduced during ground processing. All three cameras were calibrated in the laboratory under ambient conditions. Future thermal vacuum tests will characterize critical behaviors across the full range of lunar operating temperatures. In-flight tests will check for changes in response after launch and provide key data for meeting the requirements of 1% relative and 10% absolute radiometric calibration.
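The read-noise and SNR figures quoted above are tied together by the usual CCD noise model; the sketch below illustrates that relation with assumed signal levels, which are not instrument specifications.

import math

def ccd_snr(signal_e, read_noise_e, dark_e=0.0):
    """Single-pixel signal-to-noise ratio under the standard CCD noise model:
    SNR = S / sqrt(S + dark + read_noise**2), all quantities in electrons."""
    return signal_e / math.sqrt(signal_e + dark_e + read_noise_e ** 2)

# With the 71 e- NAC-L read noise quoted above, a ~4,500 e- terminator signal
# gives SNR ~ 46, and a bright scene near full well gives SNR well above 200
# (illustrative signal levels only).
print(round(ccd_snr(4_500, 71)), round(ccd_snr(248_500, 71)))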

Tschimmel, M.; Robinson, M. S.; Humm, D. C.; Denevi, B. W.; Lawrence, S. J.; Brylow, S.; Ravine, M.; Ghaemi, T.

2008-12-01

95

Pinhole Cameras: For Science, Art, and Fun!  

ERIC Educational Resources Information Center

A pinhole camera is a camera without a lens. A tiny hole replaces the lens, and light is allowed to come in for a short amount of time by means of a hand-operated shutter. The pinhole allows only a very narrow beam of light to enter, which reduces confusion due to scattered light on the film. This results in an image that is focused, reversed, and…

Button, Clare

2007-01-01

96

The optimal locations of surveillance cameras on straight lanes  

Microsoft Academic Search

In this paper, we investigate the optimal locations of surveillance cameras on straight lanes. As is known, inappropriate placement of surveillance cameras will result in dead angles and dead space, and installing more cameras than necessary will waste limited resources. The considered location problem aims to minimize the maximal detection failure probability of points on a straight lane subject to a limited

Y.-C. Hsieh; Y.-C. Lee; P.-S. You

2011-01-01

97

5. VAL CAMERA STATION, VIEW FROM INTERIOR OUT OF WINDOW ...  

Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

5. VAL CAMERA STATION, VIEW FROM INTERIOR OUT OF WINDOW OPENING TOWARD VAL FIRING RANGE LOOKING EAST WITH VARIABLE ANGLE LAUNCHER IN BACKGROUND. - Variable Angle Launcher Complex, Camera Stations, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

98

Omnidirectional underwater camera design and calibration.  

PubMed

This paper presents the development of an underwater omnidirectional multi-camera system (OMS) based on a commercially available six-camera system, originally designed for land applications. A full calibration method is presented for the estimation of both the intrinsic and extrinsic parameters, which is able to cope with wide-angle lenses and non-overlapping cameras simultaneously. This method is valid for any OMS in both land and water applications. For underwater use, a customized housing is required, which often leads to strong image distortion due to refraction among the different media. This phenomenon makes the basic pinhole camera model invalid for underwater cameras, especially when using wide-angle lenses, and requires the explicit modeling of the individual optical rays. To address this problem, a ray tracing approach has been adopted to create a field-of-view (FOV) simulator for underwater cameras. The simulator allows for the testing of different housing geometries and optics for the cameras to ensure complete hemisphere coverage in underwater operation. This paper describes the design and testing of a compact custom housing for a commercial off-the-shelf OMS camera (Ladybug 3) and presents the first results of its use. A proposed three-stage calibration process allows for the estimation of all of the relevant camera parameters. Experimental results are presented, which illustrate the performance of the calibration method and validate the approach. PMID:25774707
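The refraction that breaks the pinhole model at a flat housing port follows Snell's law; the sketch below traces a single ray through an air/glass/water interface with typical index values assumed, not the housing design of the paper.

import math

def refract_flat_port(theta_air_deg, n_air=1.0, n_glass=1.49, n_water=1.33):
    """In-water ray angle for a ray leaving the lens at `theta_air_deg` from the
    port normal and crossing a flat air/glass/water interface (Snell's law).
    Index values are typical assumptions (acrylic port, seawater).
    The glass layer only offsets the ray laterally; the exit angle depends on
    the air and water indices."""
    s = n_air * math.sin(math.radians(theta_air_deg))
    if s > n_water:          # ray cannot be transmitted into the water
        return None
    return math.degrees(math.asin(s / n_water))

# A 40 deg in-air ray bends to ~28.9 deg in water: flat ports shrink the
# field of view, which is why a FOV simulator like the one above is useful.
print(f"{refract_flat_port(40.0):.1f} deg")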

Bosch, Josep; Gracias, Nuno; Ridao, Pere; Ribas, David

2015-01-01

99

Cydonia: Wide Angle Color Image  

NASA Technical Reports Server (NTRS)

[figure removed for brevity, see original site]

Although the resolution of the MOC wide angle cameras is too low to tell much about the geomorphology of the Cydonia region, the images from the red and blue wide angle cameras provide us with two types of information that are of interest in their own right: color and stereoscopic data. Above are a color view and a stereoscopic anaglyph rendition of Geodesy Campaign images acquired by MGS MOC in May 1999. To view the stereo image, you need red/blue '3-D' glasses.

2000-01-01

100

Angle Sums  

NSDL National Science Digital Library

"Examine the angles in a triangle, quadrilateral, pentagon, hexagon, heptagon or octagon and find a relationship between the number of sides and the sum of the interior angles." (Source: 2000-2012 National Council of Teachers of Mathematics)

Illuminations National Council of Teachers of Mathematics

2010-05-20

101

Characterization of previously unidentified lunar pyroclastic deposits using Lunar Reconnaissance Orbiter Camera (LROC) data  

USGS Publications Warehouse

We used a Lunar Reconnaissance Orbiter Camera (LROC) global monochrome Wide-angle Camera (WAC) mosaic to conduct a survey of the Moon to search for previously unidentified pyroclastic deposits. Promising locations were examined in detail using LROC multispectral WAC mosaics, high-resolution LROC Narrow Angle Camera (NAC) images, and Clementine multispectral (ultraviolet-visible or UVVIS) data. Out of 47 potential deposits chosen for closer examination, 12 were selected as probable newly identified pyroclastic deposits. Potential pyroclastic deposits were generally found in settings similar to previously identified deposits, including areas within or near mare deposits adjacent to highlands, within floor-fractured craters, and along fissures in mare deposits. However, a significant new finding is the discovery of localized pyroclastic deposits within floor-fractured craters Anderson E and F on the lunar farside, isolated from other known similar deposits. Our search confirms that most major regional and localized low-albedo pyroclastic deposits have been identified on the Moon down to ~100 m/pix resolution, and that additional newly identified deposits are likely to be either isolated small deposits or additional portions of discontinuous, patchy deposits.

Gustafson, J. Olaf; Bell, James F.; Gaddis, Lisa R.R.; Hawke, B. Ray Ray; Giguere, Thomas A.

2012-01-01

102

Caught on Camera.  

ERIC Educational Resources Information Center

Describes the benefits of and rules to be followed when using surveillance cameras for school security. Discusses various camera models, including indoor and outdoor fixed position cameras, pan-tilt zoom cameras, and pinhole-lens cameras for covert surveillance. (EV)

Milshtein, Amy

2002-01-01

103

Camera Motion Style Transfer  

Microsoft Academic Search

When depicting both virtual and physical worlds, the viewer's impression of presence in these worlds is strongly linked to camera motion. Plausible and artist-controlled camera movement can substantially increase scene immersion. While physical camera motion exhibits subtle details of position, rotation, and acceleration, these details are often missing for virtual camera motion. In this work, we analyze camera movement using

Christian Kurz; Tobias Ritschel; Elmar Eisemann; T. Thormählen; Hans-Peter Seidel

2010-01-01

104

System Synchronizes Recordings from Separated Video Cameras  

NASA Technical Reports Server (NTRS)

A system of electronic hardware and software for synchronizing recordings from multiple, physically separated video cameras is being developed, primarily for use in multiple-look-angle video production. The system, the time code used in the system, and the underlying method of synchronization upon which the design of the system is based are denoted generally by the term "Geo-TimeCode(TradeMark)." The system is embodied mostly in compact, lightweight, portable units (see figure) denoted video time-code units (VTUs) - one VTU for each video camera. The system is scalable in that any number of camera recordings can be synchronized. The estimated retail price per unit would be about $350 (in 2006 dollars). The need for this or another synchronization system external to video cameras arises because most video cameras do not include internal means for maintaining synchronization with other video cameras. Unlike prior video-camera-synchronization systems, this system does not depend on continuous cable or radio links between cameras (however, it does depend on occasional cable links lasting a few seconds). Also, whereas the time codes used in prior video-camera-synchronization systems typically repeat after 24 hours, the time code used in this system does not repeat for slightly more than 136 years; hence, this system is much better suited for long-term deployment of multiple cameras.
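The quoted repeat interval of slightly more than 136 years matches a 32-bit seconds counter (2^32 s ≈ 136.2 yr); the sketch below packs such a counter with a frame index as an illustrative guess at a non-repeating time code, not the actual Geo-TimeCode format.

import struct
import time

FRAME_RATE = 30  # assumed frame rate for illustration

def make_timecode(t=None, frame=0):
    """Pack a sync time code as a 32-bit seconds counter plus a frame index."""
    seconds = int(time.time() if t is None else t)
    return struct.pack(">IB", seconds & 0xFFFFFFFF, frame % FRAME_RATE)

def read_timecode(blob):
    seconds, frame = struct.unpack(">IB", blob)
    return seconds, frame

tc = make_timecode(1_700_000_000, frame=12)
print(read_timecode(tc))            # (1700000000, 12)
print(2 ** 32 / (365.25 * 86400))   # ~136.2 years between repeats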

Nail, William; Nail, William L.; Nail, Jasper M.; Le, Doung T.

2009-01-01

105

Pre-hibernation performances of the OSIRIS cameras onboard the Rosetta spacecraft  

NASA Astrophysics Data System (ADS)

Context. The ESA cometary mission Rosetta was launched in 2004. In the past years and until the spacecraft hibernation in June 2011, the two cameras of the OSIRIS imaging system (Narrow Angle and Wide Angle Camera, NAC and WAC) observed many different sources. On 20 January 2014 the spacecraft successfully exited hibernation to start observing the primary scientific target of the mission, comet 67P/Churyumov-Gerasimenko. Aims: A study of the past performance of the cameras is now mandatory to determine whether the system has been stable over time and to derive, if necessary, additional analysis methods for the future precise calibration of the cometary data. Methods: The instrumental responses and filter passbands were used to estimate the efficiency of the system. A comparison with acquired images of specific calibration stars was made, and a refined photometric calibration was computed, both for the absolute flux and for the reflectivity of small bodies of the solar system. Results: We found the instrumental performance to be stable to within ±1.5% from 2007 to 2010, with no evidence of an aging effect on the optics or detectors. The efficiency of the instrumentation is as expected in the visible range, but lower than expected in the UV and IR ranges. A photometric calibration implementation is discussed for the two cameras. Conclusions: The calibration derived from the pre-hibernation phases of the mission will be checked as soon as possible after the awakening of OSIRIS and will be continuously monitored until the end of the mission in December 2015. A list of additional calibration sources has been determined that are to be observed during the forthcoming phases of the mission to ensure better coverage across the wavelength range of the cameras and to study the possible dust contamination of the optics.
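One common form of the absolute-flux calibration mentioned above is a per-filter factor tying the catalogue flux of a standard star to its measured count rate; the units and numbers below are illustrative, not OSIRIS values.

def absolute_cal_factor(star_flux, counts, exposure_s):
    """Per-filter absolute calibration factor: catalogue stellar flux divided by
    the background-subtracted instrumental count rate (flux units per DN/s).
    Example inputs are placeholders, not instrument data."""
    count_rate = counts / exposure_s
    return star_flux / count_rate

# Monitoring this factor over repeated standard-star visits is one way to
# track the +/-1.5% stability quoted above.
print(absolute_cal_factor(3.2e-12, 85_000, 0.5))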

Magrin, S.; La Forgia, F.; Da Deppo, V.; Lazzarin, M.; Bertini, I.; Ferri, F.; Pajola, M.; Barbieri, M.; Naletto, G.; Barbieri, C.; Tubiana, C.; Küppers, M.; Fornasier, S.; Jorda, L.; Sierks, H.

2015-02-01

106

Multispectral calibration to enhance the metrology performance of C-mount camera systems  

NASA Astrophysics Data System (ADS)

Low cost monochrome camera systems based on CMOS sensors and C-mount lenses have been successfully applied to a wide variety of metrology tasks. For high accuracy work such cameras are typically equipped with ring lights to image retro-reflective targets as high contrast image features. Whilst algorithms for target image measurement and lens modelling are highly advanced, including separate RGB channel lens distortion correction, target image circularity compensation and a wide variety of detection and centroiding approaches, less effort has been directed towards optimising physical target image quality by considering optical performance in narrow wavelength bands. This paper describes an initial investigation to assess the effect of wavelength on camera calibration parameters for two different camera bodies and the same `C-mount' wide angle lens. Results demonstrate the expected strong influence on principal distance, radial and tangential distortion, and also highlight possible trends in principal point, orthogonality and affinity parameters which are close to the parameter estimation noise level from the strong convergent self-calibrating image networks.

Robson, S.; MacDonald, L.; Kyle, S. A.; Shortis, M. R.

2014-06-01

107

Angle Practice!  

NSDL National Science Digital Library

How well do you know your angles? Check out these games and put your knowledge to the test! They will stump you if you don't pay close attention to the different angles they give you! Alien Angles! - Use the protractor to guess where the alien has flown away to. If you pick the right spot, you can save all the aliens! Squirt the Dog! Angle practice - Move the hose using different measures of angles to try and squirt the dog. To what degree? - Think you're ready to challenge yourself? Check out ...

Ms. Hume

2012-11-02

108

Bacterial motion in narrow capillaries.  

PubMed

Motile bacteria often have to pass through small tortuous pores in soil or tissue of higher organisms. However, their motion in this prevalent type of niche is not fully understood. Here, we modeled it with narrow glass capillaries and identified a critical radius (Rc) for bacterial motion. Near the surface of capillaries narrower than that, the swimming trajectories are helices. In larger capillaries, they swim in distorted circles. Under non-slip conditions, the peritrichous Escherichia coli swam in left-handed helices with an Rc of ~10 μm near the glass surface. However, slipping could occur in the fast monotrichous Pseudomonas fluorescens when a speed threshold was exceeded, and thus both left-handed and right-handed helices were executed in glass capillaries. In natural non-cylindrical pores, the near-surface trajectories would be spirals and twisted loops. Engaging in such motions reduces the bacterial migration rate. For a given pore size, the run length and the tumbling angle of the bacterium determine the probability and duration of their near-surface motion. Shear flow and chemotaxis potentially enhance it. Based on these observations, the puzzling previous observations of bacterial migration in porous environments can be interpreted. PMID:25764548

Ping, Liyan; Wasnik, Vaibhav; Emberly, Eldon

2015-02-01

109

Angles (elementary)  

NSDL National Science Digital Library

This lesson is designed to introduce students to acute, obtuse, and right angles. This lesson provides links to discussions and activities related to angles as well as suggested ways to integrate them into the lesson. Finally, the lesson provides links to follow-up lessons designed for use in succession with the current one.

2011-05-23

110

Geolocating Static Cameras  

Microsoft Academic Search

A key problem in widely distributed camera networks is geolocating the cameras. This paper considers three scenarios for camera localization: localizing a camera in an unknown environment, adding a new camera in a region with many other cameras, and localizing a camera by finding correlations with satellite imagery. We find that simple summary statistics (the time course of principal component

Nathan Jacobs; Scott Satkin; Nathaniel Roman; Richard Speyer; Robert Pless

2007-01-01

111

Two-Camera Acquisition and Tracking of a Flying Target  

NASA Technical Reports Server (NTRS)

A method and apparatus have been developed to solve the problem of automated acquisition and tracking, from a location on the ground, of a luminous moving target in the sky. The method involves the use of two electronic cameras: (1) a stationary camera having a wide field of view, positioned and oriented to image the entire sky; and (2) a camera that has a much narrower field of view (a few degrees wide) and is mounted on a two-axis gimbal. The wide-field-of-view stationary camera is used to initially identify the target against the background sky. So that the approximate position of the target can be determined, pixel locations on the image-detector plane in the stationary camera are calibrated with respect to azimuth and elevation. The approximate target position is used to initially aim the gimballed narrow-field-of-view camera in the approximate direction of the target. Next, the narrow-field-of-view camera locks onto the target image, and thereafter the gimbals are actuated as needed to maintain lock and thereby track the target with precision greater than that attainable by use of the stationary camera.
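The pixel-to-azimuth/elevation calibration step described above amounts to a lookup in a calibrated grid; the sketch below interpolates such a table and hands the coarse pointing to the gimballed camera, with the table format and values being illustrative assumptions, not the system's actual calibration product.

import numpy as np

def pixel_to_azel(px, py, cal):
    """Approximate azimuth/elevation for a pixel in the all-sky camera by
    bilinear interpolation in a calibration lookup table.

    cal : dict with 1D node arrays 'px', 'py' and 2D arrays 'az', 'el' holding
          the calibrated azimuth/elevation at each grid node.
    """
    def interp2(grid):
        i = int(np.clip(np.searchsorted(cal['px'], px) - 1, 0, len(cal['px']) - 2))
        j = int(np.clip(np.searchsorted(cal['py'], py) - 1, 0, len(cal['py']) - 2))
        tx = (px - cal['px'][i]) / (cal['px'][i + 1] - cal['px'][i])
        ty = (py - cal['py'][j]) / (cal['py'][j + 1] - cal['py'][j])
        g = grid[j:j + 2, i:i + 2]
        return (g[0, 0] * (1 - tx) * (1 - ty) + g[0, 1] * tx * (1 - ty)
                + g[1, 0] * (1 - tx) * ty + g[1, 1] * tx * ty)
    return interp2(cal['az']), interp2(cal['el'])

# Coarse az/el from the wide-field camera is used to pre-point the gimballed
# narrow-field camera before its own tracking loop locks on.
cal = {'px': np.array([0.0, 50.0, 100.0]), 'py': np.array([0.0, 50.0, 100.0]),
       'az': np.tile(np.array([0.0, 45.0, 90.0]), (3, 1)),
       'el': np.tile(np.array([10.0, 40.0, 70.0])[:, None], (1, 3))}
az, el = pixel_to_azel(25.0, 50.0, cal)
print(f"slew narrow-FOV camera to az={az:.1f} deg, el={el:.1f} deg")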

Biswas, Abhijit; Assad, Christopher; Kovalik, Joseph M.; Pain, Bedabrata; Wrigley, Chris J.; Twiss, Peter

2008-01-01

112

Determining Camera Gain in Room Temperature Cameras  

SciTech Connect

James R. Janesick provides a method for determining the amplification of a CCD or CMOS camera when only the raw images are accessible. However, the equation that is provided ignores the contribution of dark current. For CCD or CMOS cameras that are cooled well below room temperature this is not a problem; for room-temperature cameras, however, the technique needs adjustment. This article describes the adjustment made to the equation and a test of the method.
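A minimal sketch of the underlying photon-transfer idea, extended with a dark-frame correction in the spirit of the adjustment described above. The exact published equation may differ; the frame-pair differencing used here is one common variant, and all names are illustrative.

```python
import numpy as np

def camera_gain(flat1, flat2, dark1, dark2):
    """Estimate gain (electrons per DN) from two flat frames and two dark frames.

    Photon-transfer reasoning: for a shot-noise-limited signal the variance in
    electrons equals the mean in electrons, so gain ~ mean(DN) / variance(DN).
    Differencing frame pairs removes fixed-pattern noise; subtracting the dark
    statistics removes the dark-current contribution highlighted in the article.
    The published correction may take a different form -- this is only a sketch.
    """
    flat1, flat2 = flat1.astype(float), flat2.astype(float)
    dark1, dark2 = dark1.astype(float), dark2.astype(float)

    signal = 0.5 * (flat1.mean() + flat2.mean()) - 0.5 * (dark1.mean() + dark2.mean())
    var_flat = np.var(flat1 - flat2) / 2.0
    var_dark = np.var(dark1 - dark2) / 2.0

    return signal / (var_flat - var_dark)
```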

Joshua Cogliati

2010-12-01

113

Experience with duplex bearings in narrow angle oscillating applications  

NASA Technical Reports Server (NTRS)

Duplex ball bearings are matched pairs on which the abutting faces of the rings have been accurately ground so that when the rings are clamped together, a controlled amount of interference (preload) exists across the balls. These bearings are vulnerable to radial temperature gradients, blocking in oscillation and increased sensitivity to contamination. These conditions decrease the service life of these bearings. It was decided that an accelerated thermal vacuum life test should be conducted. The test apparatus and results are described and the rationale is presented for reducing a multiyear life test on oil lubricated bearings to less than a year.

Phinney, D. D.; Pollard, C. L.; Hinricks, J. T.

1988-01-01

114

Right Angle  

NSDL National Science Digital Library

This article gives teachers background information on right angles. It provides geometric and practical examples, a paper folding construction method, and some history of the usage of the term 'right.'

Paul Goldenberg

2011-06-09

115

Investigating the Pinhole Camera and Camera Obscura  

NSDL National Science Digital Library

In this activity, students explore the nature of light, including the fact that it travels in straight lines, by building and using two visual tools. The first is a simple pinhole camera--a box with a pinhole opening. The second is a camera obscura--a tool

John Eichinger

2009-05-30

116

MISR radiometric camera-by-camera Cloud Mask (MIRCCM_V1)  

NASA Technical Reports Server (NTRS)

The MISR instrument consists of nine pushbroom cameras which measure radiance in four spectral bands. Global coverage is achieved in nine days. The cameras are arranged with one camera pointing toward the nadir, four cameras pointing forward and four cameras pointing aftward. It takes 7 minutes for all nine cameras to view the same surface location. The view angles relative to the surface reference ellipsoid are 0, 26.1, 45.6, 60.0, and 70.5 degrees. The spectral band shapes are nominally Gaussian, centered at 443, 555, 670, and 865 nm. The RCCM is derived from the radiance values, and is calculated independently for each camera. Therefore, the amount of apparent cloudiness will vary with view angle, with the oblique view angles generally being more cloudy than the near-nadir ones. Since the RCCM is calculated primarily from the radiance values, it does not work well over snow and ice and will usually confuse clear snow/ice with cloud. It works best over clear-sky ocean, but quality over other surface types is also quite good. The RCCM product also contains a glint mask for each camera, and this mask is set to true whenever the scattering angles indicate that glint could be possible. This glint mask is not masked out over land; users must do this step themselves. [Location=GLOBAL] [Temporal_Coverage: Start_Date=2000-02-24; Stop_Date=] [Spatial_Coverage: Southernmost_Latitude=-90; Northernmost_Latitude=90; Westernmost_Longitude=-180; Easternmost_Longitude=180].

Diner, David J. (Principal Investigator)

117

9. COMPLETED ROLLING CAMERA CAR ON RAILROAD TRACK AND BRIDGE ...  

Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

9. COMPLETED ROLLING CAMERA CAR ON RAILROAD TRACK AND BRIDGE LOOKING WEST, APRIL 26, 1948. (ORIGINAL PHOTOGRAPH IN POSSESSION OF DAVE WILLIS, SAN DIEGO, CALIFORNIA.) - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

118

Prediction of Viking lander camera image quality  

NASA Technical Reports Server (NTRS)

Formulations are presented that permit prediction of image quality as a function of camera performance, surface radiance properties, and lighting and viewing geometry. Predictions made for a wide range of surface radiance properties reveal that image quality depends strongly on proper camera dynamic range command and on favorable lighting and viewing geometry. Proper camera dynamic range commands depend mostly on the surface albedo that will be encountered. Favorable lighting and viewing geometries depend mostly on lander orientation with respect to the diurnal sun path over the landing site, and tend to be independent of surface albedo and illumination scattering function. Side lighting with low sun elevation angles (10 to 30 deg) is generally favorable for imaging spatial details and slopes, whereas high sun elevation angles are favorable for measuring spectral reflectances.

Huck, F. O.; Burcher, E. E.; Jobson, D. J.; Wall, S. D.

1976-01-01

119

Development of the collimator response of gas slit camera of MAXI  

NASA Astrophysics Data System (ADS)

Monitor of All-sky X-ray Image (MAXI) is an X-ray all-sky scanner that will be attached to the Exposed Facility of the Japanese Experiment Module "Kibo" on the International Space Station (ISS). MAXI will be launched by the Space Shuttle or the Japanese H-IIA Transfer Vehicle (HTV) in 2008. MAXI carries two types of X-ray cameras: the Solid-state Slit Camera (SSC) for the 0.5-10 keV band and the Gas Slit Camera (GSC) for the 2-30 keV band. Both have long narrow fields of view (FOV) made by a slit and orthogonally arranged collimator plates (slats). The FOV will sweep almost the whole sky once every 96 minutes by utilizing the orbital motion of the ISS, so the light curve of an X-ray point source becomes triangular in shape during one transit. In this paper, we present the actual triangular response of the GSC collimator, obtained by our calibration. In practice the responses are deformed by gaps between the slats, the leaning angle of the slats, and the effective width of the slats. We are measuring these quantities by shooting X-ray beams into the detector behind the collimator. We summarize the calibration and present the first compilation of the data used to make the GSC collimator response, which will be useful for public users.

Morii, Mikio; Matsuoka, Masaru; Ueno, Shiro; Tomida, Hiroshi; Katayama, Haruyoshi; Kawasaki, Kazuyoshi; Yokota, Takao; Kuramata, Naoyuki; Mihara, Tatehiro; Kohama, Mitsuhiro; Isobe, Naoki; Nakajima, Motoki; Tsunemi, Hiroshi; Miyata, Emi; Yoshida, Atsumasa; Yamaoka, Kazutaka; Tsuchiya, Yuichiro; Miyakawa, Takehiro; Kawai, Nobuyuki; Kataoka, Jun; Tanaka, Satoshi; Negoro, Hitoshi

2006-06-01

120

Making Ceramic Cameras  

ERIC Educational Resources Information Center

This article describes how to make a clay camera. This idea of creating functional cameras from clay allows students to experience ceramics, photography, and painting all in one unit. (Contains 1 resource and 3 online resources.)

Squibb, Matt

2009-01-01

121

Vacuum Camera Cooler  

NASA Technical Reports Server (NTRS)

Acquiring cheap, moving video was impossible in a vacuum environment, due to camera overheating. This overheating is brought on by the lack of cooling media in vacuum. A water-jacketed camera cooler enclosure machined and assembled from copper plate and tube has been developed. The camera cooler (see figure) is cup-shaped and cooled by circulating water or nitrogen gas through copper tubing. The camera, a store-bought "spy type," is not designed to work in a vacuum. With some modifications the unit can be thermally connected when mounted in the cup portion of the camera cooler. The thermal conductivity is provided by copper tape between parts of the camera and the cooled enclosure. During initial testing of the demonstration unit, the camera cooler kept the CPU (central processing unit) of this video camera at operating temperature. This development allowed video recording of an in-progress test, within a vacuum environment.

Laugen, Geoffrey A.

2011-01-01

122

Constrained space camera assembly  

DOEpatents

A constrained space camera assembly which is intended to be lowered through a hole into a tank, a borehole or another cavity. The assembly includes a generally cylindrical chamber comprising a head and a body and a wiring-carrying conduit extending from the chamber. Means are included in the chamber for rotating the body about the head without breaking an airtight seal formed therebetween. The assembly may be pressurized and accompanied with a pressure sensing means for sensing if a breach has occurred in the assembly. In one embodiment, two cameras, separated from their respective lenses, are installed on a mounting apparatus disposed in the chamber. The mounting apparatus includes means allowing both longitudinal and lateral movement of the cameras. Moving the cameras longitudinally focuses the cameras, and moving the cameras laterally away from one another effectively converges the cameras so that close objects can be viewed. The assembly further includes means for moving lenses of different magnification forward of the cameras.

Heckendorn, Frank M. (Aiken, SC); Anderson, Erin K. (Augusta, GA); Robinson, Casandra W. (Trenton, SC); Haynes, Harriet B. (Aiken, SC)

1999-01-01

123

Contrail study with ground-based cameras  

NASA Astrophysics Data System (ADS)

Photogrammetric methods and analysis results for contrails observed with wide-angle cameras are described. Four cameras of two different types (view angle < 90° or whole-sky imager) at the ground at various positions are used to track contrails and to derive their altitude, width, and horizontal speed. Camera models for both types are described to derive the observation angles for given image coordinates and their inverse. The models are calibrated with sightings of the Sun, the Moon and a few bright stars. The methods are applied and tested in a case study. Four persistent contrails crossing each other together with a short-lived one are observed with the cameras. Vertical and horizontal positions of the contrails are determined from the camera images to an accuracy of better than 200 m and horizontal speed to 0.2 m s-1. With this information, the aircraft causing the contrails are identified by comparison to traffic waypoint data. The observations are compared with synthetic camera pictures of contrails simulated with the contrail prediction model CoCiP, a Lagrangian model using air traffic movement data and numerical weather prediction (NWP) data as input. The results provide tests for the NWP and contrail models. The cameras show spreading and thickening contrails suggesting ice-supersaturation in the ambient air. The ice-supersaturated layer is found thicker and more humid in this case than predicted by the NWP model used. The simulated and observed contrail positions agree up to differences caused by uncertain wind data. The contrail widths, which depend on wake vortex spreading, ambient shear and turbulence, were partly wider than simulated.
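The altitude retrieval can be illustrated with a much-simplified flat-Earth ray intersection between two ground stations; the paper's actual camera models and calibration are more involved, and the station geometry below is invented for the example.

```python
import numpy as np

def ray(az_deg, el_deg):
    """Unit direction vector (east, north, up) from azimuth/elevation in degrees."""
    az, el = np.radians(az_deg), np.radians(el_deg)
    return np.array([np.sin(az) * np.cos(el),
                     np.cos(az) * np.cos(el),
                     np.sin(el)])

def triangulate(p1, az1, el1, p2, az2, el2):
    """Closest point between two viewing rays; a flat-Earth simplification of the
    photogrammetric intersection used to estimate contrail position and altitude."""
    d1, d2 = ray(az1, el1), ray(az2, el2)
    # Solve for the ray parameters minimizing |(p1 + t1 d1) - (p2 + t2 d2)|
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    w = p1 - p2
    t1 = (b * (d2 @ w) - c * (d1 @ w)) / (a * c - b * b)
    t2 = (a * (d2 @ w) - b * (d1 @ w)) / (a * c - b * b)
    return 0.5 * ((p1 + t1 * d1) + (p2 + t2 * d2))   # midpoint of closest approach

# Example: two stations 30 km apart sighting the same contrail segment
p = triangulate(np.array([0.0, 0.0, 0.0]), 45.0, 26.0,
                np.array([30000.0, 0.0, 0.0]), 315.0, 24.0)
print(f"estimated altitude: {p[2] / 1000:.1f} km")
```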

Schumann, U.; Hempel, R.; Flentje, H.; Garhammer, M.; Graf, K.; Kox, S.; Lösslein, H.; Mayer, B.

2013-08-01

124

Contrail study with ground-based cameras  

NASA Astrophysics Data System (ADS)

Photogrammetric methods and analysis results for contrails observed with wide-angle cameras are described. Four cameras of two different types (view angle < 90° or whole-sky imager) at the ground at various positions are used to track contrails and to derive their altitude, width, and horizontal speed. Camera models for both types are described to derive the observation angles for given image coordinates and their inverse. The models are calibrated with sightings of the Sun, the Moon and a few bright stars. The methods are applied and tested in a case study. Four persistent contrails crossing each other, together with a short-lived one, are observed with the cameras. Vertical and horizontal positions of the contrails are determined from the camera images to an accuracy of better than 230 m and horizontal speed to 0.2 m s-1. With this information, the aircraft causing the contrails are identified by comparison to traffic waypoint data. The observations are compared with synthetic camera pictures of contrails simulated with the contrail prediction model CoCiP, a Lagrangian model using air traffic movement data and numerical weather prediction (NWP) data as input. The results provide tests for the NWP and contrail models. The cameras show spreading and thickening contrails, suggesting ice-supersaturation in the ambient air. The ice-supersaturated layer is found thicker and more humid in this case than predicted by the NWP model used. The simulated and observed contrail positions agree up to differences caused by uncertain wind data. The contrail widths, which depend on wake vortex spreading, ambient shear and turbulence, were partly wider than simulated.

Schumann, U.; Hempel, R.; Flentje, H.; Garhammer, M.; Graf, K.; Kox, S.; Lösslein, H.; Mayer, B.

2013-12-01

125

Estimating Angles  

NSDL National Science Digital Library

This Flash game for one or two players gives students practice in estimating the size of angles. A circle and a radius pointing in a random direction are given. The student activates a second sweeping radius, which can move in either direction, and tries to stop it at the specified measure. Three difficulty levels control the range of angle measures. Points are awarded based on closeness of the estimate. The Teachers' Notes page includes suggestions for implementation, discussion questions, ideas for extension and support.

126

Camera for landing applications  

Microsoft Academic Search

This paper describes the Enhanced Video System (EVS) camera, built by OPGAL as subcontractor of Kollsman Inc. The EVS contains a Head up Display built by Honeywell, a special design camera for landing applications, and the external window installed on the plane together with the electronic control box built by Kollsman. The special design camera for landing applications is the

Ernest Grimberg

2001-01-01

127

cameras are watching you  

E-print Network

Software expands the small field of view that traditional pan-tilt-zoom security cameras offer; when the viewspaces of all the security cameras in an area overlap, it can determine the geo… (new surveillance camera being developed by Ohio …)

Davis, James W.

128

Digital Pinhole Camera  

ERIC Educational Resources Information Center

In this article we describe how the classic pinhole camera demonstration can be adapted for use with digital cameras. Students can easily explore the effects of the size of the pinhole and its distance from the sensor on exposure time, magnification, and image quality. Instructions for constructing a digital pinhole camera and our method for…

Lancor, Rachael; Lancor, Brian

2014-01-01

129

Omnifocus video camera  

NASA Astrophysics Data System (ADS)

The omnifocus video camera takes videos in which objects at different distances are all in focus in a single video display. The omnifocus video camera consists of an array of color video cameras combined with a unique distance mapping camera called the Divcam. The color video cameras are all aimed at the same scene, but each is focused at a different distance. The Divcam provides real-time distance information for every pixel in the scene. A pixel selection utility uses the distance information to select individual pixels from the multiple video outputs focused at different distances, in order to generate the final single video display that is everywhere in focus. This paper presents the principle of operation, design considerations, detailed construction, and overall performance of the omnifocus video camera. The major emphasis of the paper is the proof of concept, but the prototype has been developed enough to demonstrate the superiority of this video camera over a conventional video camera. The resolution of the prototype is high, capturing even fine details such as fingerprints in the image. Just as the movie camera was a significant advance over the still camera, the omnifocus video camera represents a significant advance over all-focus cameras for still images.
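A minimal sketch of the pixel-selection step, assuming a registered focus stack and a per-pixel distance map; the array layout and focus distances are illustrative, not taken from the paper.

```python
import numpy as np

def omnifocus_composite(focus_stack, focus_distances, distance_map):
    """Build an all-in-focus frame from a registered focus stack.

    focus_stack     : (N, H, W, 3) frames, each focused at focus_distances[i]
    focus_distances : (N,) distance at which each camera is focused (same units
                      as distance_map)
    distance_map    : (H, W) per-pixel distance from the ranging camera (the
                      Divcam in the paper); here just an arbitrary depth image
    """
    focus_distances = np.asarray(focus_distances, dtype=float)
    # For every pixel, index of the camera whose focus distance is nearest
    # to the measured distance of the scene point imaged there.
    best = np.abs(distance_map[..., None] - focus_distances).argmin(axis=-1)  # (H, W)
    h_idx, w_idx = np.indices(distance_map.shape)
    return focus_stack[best, h_idx, w_idx]           # (H, W, 3) composited frame
```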

Iizuka, Keigo

2011-04-01

130

800 x 800 charge-coupled device /CCD/ camera for the Galileo Jupiter Orbiter mission  

NASA Technical Reports Server (NTRS)

During January 1982 the NASA space transportation system will launch a Galileo spacecraft composed of an orbiting bus and an atmospheric entry probe to arrive at the planet Jupiter in July 1985. A prime element of the orbiter's scientific instrument payload will be a new generation slow-scan planetary imaging system based on a newly developed 800 x 800 charge-coupled device (CCD) image sensor. Following Jupiter orbit insertion, the single, narrow-angle, CCD camera, designated the Solid State Imaging (SSI) Subsystem, will operate for 20 months as the orbiter makes repeated encounters with Jupiter and its Galilean Satellites. During this period the SSI will acquire 40,000 images of Jupiter's atmosphere and the surfaces of the Galilean Satellites. This paper describes the SSI, its operational modes, and science objectives.

Clary, M. C.; Klaasen, K. P.; Snyder, L. M.; Wang, P. K.

1979-01-01

131

Ultra-fast framing camera tube  

DOEpatents

An electronic framing camera tube features focal plane image dissection and synchronized restoration of the dissected electron line images to form two-dimensional framed images. Ultra-fast framing is performed by first streaking a two-dimensional electron image across a narrow slit, thereby dissecting the two-dimensional electron image into sequential electron line images. The dissected electron line images are then restored into a framed image by a restorer deflector operated synchronously with the dissector deflector. The number of framed images on the tube's viewing screen is equal to the number of dissecting slits in the tube. The distinguishing features of this ultra-fast framing camera tube are the focal plane dissecting slits, and the synchronously-operated restorer deflector which restores the dissected electron line images into a two-dimensional framed image. The framing camera tube can produce image frames having high spatial resolution of optical events in the sub-100 picosecond range.

Kalibjian, Ralph (1051 Batavia Ave., Livermore, CA 94550)

1981-01-01

132

Single-Camera Panoramic-Imaging Systems  

NASA Technical Reports Server (NTRS)

Panoramic detection systems (PDSs) are developmental video monitoring and image-data processing systems that, as their name indicates, acquire panoramic views. More specifically, a PDS acquires images from an approximately cylindrical field of view that surrounds an observation platform. The main subsystems and components of a basic PDS are a charge-coupled- device (CCD) video camera and lens, transfer optics, a panoramic imaging optic, a mounting cylinder, and an image-data-processing computer. The panoramic imaging optic is what makes it possible for the single video camera to image the complete cylindrical field of view; in order to image the same scene without the benefit of the panoramic imaging optic, it would be necessary to use multiple conventional video cameras, which have relatively narrow fields of view.

Lindner, Jeffrey L.; Gilbert, John

2007-01-01

133

Roll Angle Estimation for Motorcycles  

E-print Network

… estimates of the roll angle for the IMU-based Kalman filter method (IMU) and the video-based method (Video) … Comparing video and inertial sensor approaches … Mounting … modules to motorcycles requires a known camera pose. Roll angle estimation is an important task …

Schlipsing, Marc

134

NYC Surveillance Camera Project  

NSDL National Science Digital Library

These two sites focus on the increasing numbers of surveillance cameras in New York City. The first provides a .pdf-formatted map of the more than 2,300 camera locations throughout New York as well as text listings broken down by community. The information was compiled by volunteers from the New York Civil Liberties Union (NYCLU). In addition to information on camera locations, in the news section of the site, users will find links to related Websites, FAQs, and sites related to taxi cameras and traffic cameras. Both of these sites are unabashedly anti-surveillance technology and will be appreciated by New Yorkers concerned with civil liberties issues.

1998-01-01

135

Tower Camera Handbook  

SciTech Connect

The tower camera in Barrow provides hourly images of the ground surrounding the tower. These images may be used to determine fractional snow cover as winter arrives, for comparison with the albedo that can be calculated from downward-looking radiometers, as well as some indication of present weather. Similarly, during springtime, the camera images show the changes in the ground albedo as the snow melts. The tower images are saved at hourly intervals. In addition, two other cameras, the skydeck camera in Barrow and the piling camera in Atqasuk, show the current conditions at those sites.

Moudry, D

2005-01-01

136

Traffic camera system development  

NASA Astrophysics Data System (ADS)

The intelligent transportation system has generated a strong need for the development of intelligent camera systems to meet the requirements of sophisticated applications, such as electronic toll collection (ETC), traffic violation detection and automatic parking lot control. In order to achieve the highest levels of accuracy in detection, these cameras must have high speed electronic shutters, high resolution, high frame rate, and communication capabilities. A progressive scan interline transfer CCD camera, with its high speed electronic shutter and resolution capabilities, provides the basic functions to meet the requirements of a traffic camera system. Unlike most industrial video imaging applications, traffic cameras must deal with harsh environmental conditions and an extremely wide range of light. Optical character recognition is a critical function of a modern traffic camera system, with detection and accuracy heavily dependent on the camera function. In order to operate under demanding conditions, communication and functional optimization is implemented to control cameras from a roadside computer. The camera operates with a shutter speed faster than 1/2000 sec. to capture highway traffic both day and night. Consequently camera gain, pedestal level, shutter speed and gamma functions are controlled by a look-up table containing various parameters based on environmental conditions, particularly lighting. Lighting conditions are studied carefully, to focus only on the critical license plate surface. A unique light sensor permits accurate reading under a variety of conditions, such as a sunny day, evening, twilight, storms, etc. These camera systems are being deployed successfully in major ETC projects throughout the world.
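The look-up-table control described above can be sketched as a simple light-level table; the thresholds and settings below are placeholders for illustration, not the values used in the deployed ETC systems.

```python
# Hypothetical lighting-condition look-up table: (min_lux, shutter_s, gain_db, pedestal, gamma)
EXPOSURE_LUT = [
    (50000, 1 / 8000, 0.0, 16, 0.45),   # bright sun
    (5000,  1 / 4000, 3.0, 16, 0.45),   # overcast
    (500,   1 / 2000, 9.0, 24, 0.55),   # twilight / storm
    (0,     1 / 2000, 18.0, 32, 0.70),  # night (shutter floor keeps plates sharp)
]

def select_settings(measured_lux):
    """Pick camera settings for the current light level, never slower than 1/2000 s."""
    for min_lux, shutter, gain, pedestal, gamma in EXPOSURE_LUT:
        if measured_lux >= min_lux:
            return {"shutter_s": shutter, "gain_db": gain,
                    "pedestal": pedestal, "gamma": gamma}
```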

Hori, Toshi

1997-04-01

137

Parallel Tracking and Mapping on a camera phone  

Microsoft Academic Search

Camera phones are a promising platform for hand-held augmented reality. As their computational resources grow, they are becoming increasingly suitable for visual tracking tasks. At the same time, they still offer considerable challenges: their cameras offer a narrow field of view not best suited to robust tracking; images are often received at less than 15 Hz; long exposure times result in significant

Georg Klein; David W. Murray

2009-01-01

138

Improved Tracking of Targets by Cameras on a Mars Rover  

NASA Technical Reports Server (NTRS)

A paper describes a method devised to increase the robustness and accuracy of tracking of targets by means of three stereoscopic pairs of video cameras on a Mars-rover-type exploratory robotic vehicle. Two of the camera pairs are mounted on a mast that can be adjusted in pan and tilt; the third camera pair is mounted on the main vehicle body. Elements of the method include a mast calibration, a camera-pointing algorithm, and a purely geometric technique for handing off tracking between different camera pairs at critical distances as the rover approaches a target of interest. The mast calibration is an extension of camera calibration in which the camera images of calibration targets at known positions are collected at various pan and tilt angles. In the camera-pointing algorithm, pan and tilt angles are computed by a closed-form, non-iterative solution of inverse kinematics of the mast combined with mathematical models of the cameras. The purely geometric camera-handoff technique involves the use of stereoscopic views of a target of interest in conjunction with the mast calibration.
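For an idealized two-axis mast with a vertical pan axis and a horizontal tilt axis through the mast head, the closed-form pointing solution reduces to two arctangents; the sketch below is this simplification, not the paper's full inverse-kinematics and camera-model solution.

```python
import numpy as np

def pan_tilt_to_target(target_xyz, mast_head_xyz):
    """Closed-form pan/tilt (radians) aiming an idealized two-axis mast camera
    at a 3-D target expressed in the rover frame.

    Simplifying assumptions: the pan axis is vertical and the tilt axis is
    horizontal, both passing through the mast head, with no camera offset.
    """
    dx, dy, dz = np.asarray(target_xyz, float) - np.asarray(mast_head_xyz, float)
    pan = np.arctan2(dy, dx)
    tilt = np.arctan2(dz, np.hypot(dx, dy))
    return pan, tilt

# Example: target 5 m ahead, 1 m to the left, 0.8 m below the mast head
pan, tilt = pan_tilt_to_target([5.0, 1.0, -0.8], [0.0, 0.0, 0.0])
print(np.degrees(pan), np.degrees(tilt))   # roughly 11.3 deg pan, -8.9 deg tilt
```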

Kim, Won; Ansar, Adnan; Steele, Robert

2007-01-01

139

Effects of incidence angle on observations of equilibrium crater diameter  

NASA Astrophysics Data System (ADS)

Determining the equilibrium crater diameter for a crater population is important in lunar regolith depth estimates as the equilibrium diameter represents the steady-state between the formation of new craters and the removal of older craters [1]. [2] hypothesized that the number of craters identified in an image is dependent on the incidence angle and showed that for three different young mare regions, fewer craters are visible at lower incidence angles, affecting reliable estimates of the equilibrium diameter of the counted crater population. [3] disputed this hypothesis and the presence of an equilibrium crater population in the data from [2]. Testing the hypothesis from [2], we chose four Apollo Metric images of the same area with different incidence angles to examine the effects of resolution on apparent equilibrium diameter estimates. We selected a 100 km2 area centered at 27.3°N, 18.2°W in Mare Imbrium east of Lambert crater with data at 87°, 82°, 71°, and 50° incidence angles, and scan resolutions of 6.6 to 7.6 m/pixel. To compare the craters visible at different illuminations, we resampled the images to 10 m/pixel and employed three individuals to count craters. The cumulative histograms for the four Apollo Metric frames exhibit the effects of different incidence angles on reliably counting craters. Current results show that the crater counts for the 82° incidence angle image are the most consistent between different observers, finding a production function slope of -4.1 and an apparent equilibrium diameter of 200 m. Deviation from the small crater trends (equilibrium population?) and the production function slope observed at 82° incidence is found at the higher (87°) and lower (71°, 50°) incidence angles. We attribute some of this deviation to the effects of incidence angle on crater detection; at crater diameters >~300 m, we find similar production functions, an observation consistent with our identification of these large craters in all four illuminations. However, the small crater trends vary significantly among observations at different illuminations. An important question is whether the small crater slope and rollover we observe are representative of the equilibrium crater population or whether these observations are due to resolution limits of the images, a too-small count area, or shadow effects (e.g., loss of small craters in the shadows of larger craters). To test if the observed rollover in the cumulative histograms is due to resolution effects or to the observation of the equilibrium crater population, we will use substantially higher resolution images. Images from the Lunar Reconnaissance Orbiter Camera Narrow Angle Camera (resolution increasing during the nominal mission from ~1.5 to ~0.5 m/pixel) at incidence angles ranging from 50° to 87°, focusing on higher incidences (70° to 87°), will be used to maximize the identification of small craters. [1] L. A. Soderblom (1970) JGR, 75, 2655. [2] B. B. Wilcox et al. (2005) Meteoritics & Plan. Sci., 40, 695. [3] V. R. Oberbeck (2008) Meteoritics & Plan. Sci., 43, 815.
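A cumulative size-frequency distribution and its log-log slope, the quantities discussed above, can be computed along the following lines; the synthetic crater diameters and the fitting range are invented for the example.

```python
import numpy as np

def cumulative_sfd(diameters_m, area_km2):
    """Cumulative crater size-frequency distribution N(>=D) per km^2."""
    d = np.sort(np.asarray(diameters_m, float))
    n_cum = np.arange(len(d), 0, -1) / area_km2      # craters >= each diameter
    return d, n_cum

def production_slope(d, n_cum, d_min, d_max):
    """Log-log slope of N(>=D) over [d_min, d_max]; steep values are typical of
    production functions, while a shallower segment at small diameters is the
    signature of an equilibrium population."""
    sel = (d >= d_min) & (d <= d_max)
    slope, _ = np.polyfit(np.log10(d[sel]), np.log10(n_cum[sel]), 1)
    return slope

# Synthetic example over a 100 km^2 count area (Pareto-like diameters, slope -3)
rng = np.random.default_rng(0)
diam = 10.0 * (1.0 - rng.random(2000)) ** (-1.0 / 3.0)
d, n = cumulative_sfd(diam, area_km2=100.0)
print(production_slope(d, n, d_min=20.0, d_max=300.0))    # expected near -3
```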

Ostrach, L. R.; Denevi, B. W.; Hastings, A.; Koeber, S.; Robinson, M. S.; Thomas, P. C.; Tran, T. N.

2009-12-01

140

Microchannel plate streak camera  

DOEpatents

An improved streak camera in which a microchannel plate electron multiplier is used in place of or in combination with the photocathode used in prior streak cameras is disclosed. The improved streak camera is far more sensitive to photons (UV to gamma-rays) than the conventional x-ray streak camera which uses a photocathode. The improved streak camera offers gamma-ray detection with high temporal resolution. It also offers low-energy x-ray detection without attenuation inside the cathode. Using the microchannel plate in the improved camera has resulted in a time resolution of about 150 ps, and has provided a sensitivity sufficient for 1,000 keV x-rays. 3 figs.

Wang, C.L.

1989-03-21

141

Microchannel plate streak camera  

DOEpatents

An improved streak camera in which a microchannel plate electron multiplier is used in place of or in combination with the photocathode used in prior streak cameras. The improved streak camera is far more sensitive to photons (UV to gamma-rays) than the conventional x-ray streak camera which uses a photocathode. The improved streak camera offers gamma-ray detection with high temporal resolution. It also offers low-energy x-ray detection without attenuation inside the cathode. Using the microchannel plate in the improved camera has resulted in a time resolution of about 150 ps, and has provided a sensitivity sufficient for 1000 keV x-rays.

Wang, Ching L. (Livermore, CA)

1989-01-01

142

Microchannel plate streak camera  

DOEpatents

An improved streak camera in which a microchannel plate electron multiplier is used in place of or in combination with the photocathode used in prior streak cameras. The improved streak camera is far more sensitive to photons (uv to gamma-rays) than the conventional x-ray streak camera which uses a photocathode. The improved streak camera offers gamma-ray detection with high temporal resolution. It also offers low-energy x-ray detection without attenuation inside the cathode. Using the microchannel plate in the improved camera has resulted in a time resolution of about 150 ps, and has provided a sensitivity sufficient for 1000 keV x-rays.

Wang, C.L.

1984-09-28

143

FPA camera standardisation  

NASA Astrophysics Data System (ADS)

The temperature standardisation of an infrared camera is generally done with an internal black body. However, some cameras do not have such correction and some particular effects like Narcissus or other internal contributions disturb the measurements. The determination of the different contributions of the thermosignal given by the camera allows us to propose a procedure in order to obtain an absolute temperature with a precision of one degree.

Horny, N.

2003-04-01

144

Analytical multicollimator camera calibration  

USGS Publications Warehouse

Calibration with the U.S. Geological Survey multicollimator determines the calibrated focal length, the point of symmetry, the radial distortion referred to the point of symmetry, and the asymmetric characteristics of the camera lens. For this project, two cameras were calibrated, a Zeiss RMK A 15/23 and a Wild RC 8. Four test exposures were made with each camera. Results are tabulated for each exposure and averaged for each set. Copies of the standard USGS calibration reports are included. © 1978.

Tayman, W.P.

1978-01-01

145

New developments to improve SO2 cameras  

NASA Astrophysics Data System (ADS)

The SO2 camera is a remote sensing instrument that measures the two-dimensional distribution of SO2 (column densities) in volcanic plumes using scattered solar radiation as a light source. From these data SO2 fluxes can be derived. The high time resolution of the order of 1 Hz allows correlating SO2 flux measurements with other traditional volcanological measurement techniques, e.g., seismology. In recent years the application of SO2 cameras has increased; however, there is still potential to improve the instrumentation. First of all, the influence of aerosols and ash in the volcanic plume can lead to large errors in the calculated SO2 flux if not accounted for. We present two different concepts to deal with the influence of ash and aerosols. The first approach uses a co-axial DOAS system that was added to a two-filter SO2 camera. The camera uses Filter A (peak transmission centred around 315 nm) to measure the optical density of SO2 and Filter B (centred around 330 nm) to correct for the influence of ash and aerosol. The DOAS system simultaneously performs spectroscopic measurements in a small area of the camera's field of view and gives additional information to correct for these effects. Comparing the optical densities for the two filters with the SO2 column density from the DOAS allows not only a much more precise calibration but also conclusions to be drawn about the influence of ash and aerosol scattering. Measurement examples from Popocatépetl, Mexico in 2011 are shown and interpreted. Another approach combines the SO2 camera measurement principle with the extremely narrow and periodic transmission of a Fabry-Pérot interferometer. The narrow transmission window allows individual SO2 absorption bands (or series of bands) to be selected as a substitute for Filter A, so measurements are more selective to SO2. Instead of Filter B, as in classical SO2 cameras, the correction for aerosol can be performed by shifting the transmission window of the Fabry-Pérot interferometer towards the SO2 absorption cross section minima. A correction of ash and aerosol influences with this technique can decrease deviation from the true column by more than 60%, since the wavelength difference between the two measurement channels is much smaller than in classical SO2 cameras. While the implementation of this approach for a 2D camera encompasses many challenges, it offers the possibility of building a relatively simple and robust scanning instrument for volcanic SO2 distributions. A second problem of the SO2 camera technique is the relatively high price, which prevents its use in many volcano observatories in developing countries. Most SO2 cameras use CCDs that were originally designed for astronomical purposes. The large pixel size and low noise of these detectors compensate for the low intensity of solar radiation in the UV and the low quantum efficiency of the detector in this spectral range. However, the detectors used cost several thousand US dollars. We present results from test measurements using a consumer DSLR camera as the detector of an SO2 camera. Since the camera is not sensitive in the UV, the incoming radiation is first imaged onto a screen covered with a suitable fluorescent dye that converts the UV radiation to visible light.
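The two-filter evaluation can be summarized in a few lines: apparent absorbances from plume and background images, an aerosol correction by differencing, and a single calibration factor (e.g. from the co-located DOAS). This is a simplified sketch with hypothetical variable names, not the authors' actual processing chain.

```python
import numpy as np

def apparent_absorbance(plume, background):
    """tau = -ln(I_plume / I_background), computed per pixel."""
    return -np.log(plume / background)

def so2_column_density(plume_a, bg_a, plume_b, bg_b, calib_molec_cm2_per_tau):
    """Two-filter SO2 camera evaluation (simplified).

    Filter A (~315 nm) sees SO2 absorption plus aerosol/ash extinction;
    Filter B (~330 nm) sees mostly aerosol/ash. Differencing the apparent
    absorbances removes the broadband aerosol term, and a calibration factor
    (here a single number, e.g. derived from the co-located DOAS) converts the
    corrected absorbance to an SO2 column density.
    """
    tau_a = apparent_absorbance(plume_a, bg_a)
    tau_b = apparent_absorbance(plume_b, bg_b)
    return calib_molec_cm2_per_tau * (tau_a - tau_b)
```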

Luebcke, P.; Bobrowski, N.; Hoermann, C.; Kern, C.; Klein, A.; Kuhn, J.; Vogel, L.; Platt, U.

2012-12-01

146

Miniaturized thermal snapshot camera  

NASA Astrophysics Data System (ADS)

This paper reports on the development of a new class of thermal cameras. Known as the FLAsh STabilized (FLAST) thermal imaging camera system, these cameras are the first able to capture snapshot thermal images. Results from testing of the prototype unit are presented, along with the status of the design of a more efficient, miniaturized version for production. The camera is highly programmable for image capture method, shot sequence, and shot quantity. To achieve the ability to operate in a snapshot mode, the FLAST camera is designed to function without the need for cooling or other thermal regulation. In addition, the camera can operate over extended periods without the need for re-calibration. Thus, the camera does not require a shutter, chopper or user-inserted imager blocking system. The camera is capable of operating for weeks using standard AA batteries. The initial camera configuration provides an image resolution of 320 x 240 and is able to turn on and capture an image within approximately 1/4 sec. The FLAST camera operates autonomously to collect, catalog and store over 500 images. Any interface and relay system capable of video-formatted input will be able to serve as the image download transmission system.

Hornback, William B.; Payson, Ellwood; Linsacum, Deron L.; Ward, Kenneth; Kennedy, John; Myers, Leo; Cuadra, Dean; Li, Mark

2003-01-01

147

Ringfield lithographic camera  

DOEpatents

A projection lithography camera is presented with a wide ringfield optimized so as to make efficient use of extreme ultraviolet radiation from a large area radiation source (e.g., D_source ≈ 0.5 mm). The camera comprises four aspheric mirrors optically arranged on a common axis of symmetry with an increased etendue for the camera system. The camera includes an aperture stop that is accessible through a plurality of partial aperture stops to synthesize the theoretical aperture stop. Radiation from a mask is focused to form a reduced image on a wafer, relative to the mask, by reflection from the four aspheric mirrors.

Sweatt, William C. (Albuquerque, NM)

1998-01-01

148

LSST Camera Optics Design  

SciTech Connect

The Large Synoptic Survey Telescope (LSST) uses a novel three-mirror telescope design feeding a camera system that includes a set of broad-band filters and three refractive corrector lenses to produce a flat field at the focal plane with a wide field of view. Optical design of the camera lenses and filters is integrated with the optical design of the telescope mirrors to optimize performance. We discuss the rationale for the LSST camera optics design, describe the methodology for fabricating, coating, mounting and testing the lenses and filters, and present the results of detailed analyses demonstrating that the camera optics will meet their performance goals.

Riot, V J; Olivier, S; Bauman, B; Pratuch, S; Seppala, L; Gilmore, D; Ku, J; Nordby, M; Foss, M; Antilogus, P; Morgado, N

2012-05-24

149

Narrow band 3 × 3 Mueller polarimetric endoscopy  

PubMed Central

Mueller matrix polarimetric imaging has shown potential in tissue diagnosis but is challenging to implement endoscopically. In this work, a narrow band 3 × 3 Mueller matrix polarimetric endoscope was designed by rotating the endoscope to generate 0°, 45° and 90° linearly polarized illumination and positioning a rotating filter wheel in front of the camera containing three polarisers to permit polarization state analysis for backscattered light. The system was validated with a rotating linear polarizer and a diffuse reflection target. Initial measurements of 3 × 3 Mueller matrices on a rat are demonstrated, followed by matrix decomposition into the depolarization and retardance matrices for further analysis. Our work shows the feasibility of implementing polarimetric imaging in a rigid endoscope conveniently and economically in order to reveal diagnostic information. PMID:24298405
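With three linear generator states (0°, 45°, 90°) and three analyzer states, the 3 × 3 (linear) Mueller matrix follows from the nine measured intensities by inverting the generator and analyzer matrices. The sketch below shows that algebra under ideal-polarizer assumptions; the instrument's actual normalization and calibration may differ.

```python
import numpy as np

def linear_stokes(theta_deg):
    """Reduced (3-element) Stokes vector of an ideal linear polarizer at theta."""
    t = np.radians(theta_deg)
    return np.array([1.0, np.cos(2 * t), np.sin(2 * t)])

def mueller_3x3(intensities, gen_angles=(0, 45, 90), ana_angles=(0, 45, 90)):
    """Recover the 3x3 (linear) Mueller matrix from a 3x3 intensity matrix.

    intensities[j, i] is the detected intensity with generator angle
    gen_angles[i] and analyzer angle ana_angles[j]. With W holding the
    generator Stokes vectors as columns and A the analyzer vectors as rows,
    I = A M W, so M = A^-1 I W^-1 (up to an overall gain factor).
    """
    W = np.column_stack([linear_stokes(a) for a in gen_angles])
    A = np.vstack([linear_stokes(a) for a in ana_angles])
    return np.linalg.inv(A) @ np.asarray(intensities, float) @ np.linalg.inv(W)
```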

Qi, Ji; Ye, Menglong; Singh, Mohan; Clancy, Neil T.; Elson, Daniel S.

2013-01-01

150

Constrained space camera assembly  

DOEpatents

A constrained space camera assembly which is intended to be lowered through a hole into a tank, a borehole or another cavity is disclosed. The assembly includes a generally cylindrical chamber comprising a head and a body and a wiring-carrying conduit extending from the chamber. Means are included in the chamber for rotating the body about the head without breaking an airtight seal formed therebetween. The assembly may be pressurized and accompanied with a pressure sensing means for sensing if a breach has occurred in the assembly. In one embodiment, two cameras, separated from their respective lenses, are installed on a mounting apparatus disposed in the chamber. The mounting apparatus includes means allowing both longitudinal and lateral movement of the cameras. Moving the cameras longitudinally focuses the cameras, and moving the cameras laterally away from one another effectively converges the cameras so that close objects can be viewed. The assembly further includes means for moving lenses of different magnification forward of the cameras. 17 figs.

Heckendorn, F.M.; Anderson, E.K.; Robinson, C.W.; Haynes, H.B.

1999-05-11

151

Security camera video authentication  

Microsoft Academic Search

The ability to authenticate images captured by a security camera, and localise any tampered areas, will increase the value of these images as evidence in a court of law. This paper outlines the challenges in security camera video authentication, and discusses the reasons why fingerprinting, a robust type of digital signature, provides a solution preferable to semi-fragile watermarking. A fingerprint

D. K. Roberts

2002-01-01

152

Camera Operator and Videographer  

ERIC Educational Resources Information Center

Television, video, and motion picture camera operators produce images that tell a story, inform or entertain an audience, or record an event. They use various cameras to shoot a wide range of material, including television series, news and sporting events, music videos, motion pictures, documentaries, and training sessions. Those who film or…

Moore, Pam

2007-01-01

153

CCD Luminescence Camera  

NASA Technical Reports Server (NTRS)

New diagnostic tool used to understand performance and failures of microelectronic devices. Microscope integrated to low-noise charge-coupled-device (CCD) camera to produce new instrument for analyzing performance and failures of microelectronics devices that emit infrared light during operation. CCD camera also used to identify very clearly parts that have failed where luminescence typically found.

Janesick, James R.; Elliott, Tom

1987-01-01

154

DIGITAL CAMERA User's Manual  

E-print Network

DIGITAL CAMERA User's Manual. Trademark information: Microsoft, Windows and Windows Vista … In the event of malfunction: should you notice smoke or an unusual smell coming from the camera or AC adapter, take it to a Nikon-authorized service representative for inspection. Do not disassemble; touching the internal parts …

Saldin, Dilano

155

What is the Space of Spectral Sensitivity Functions for Digital Color Cameras?  

E-print Network

Methods to recover camera spectral sensitivities using regular reflective color targets (e.g., color …). Camera spectral sensitivities are often measured with a monochromator that generates narrow-band light … simplify the recovery of camera spectral sensitivities by using specialized targets.

Gu, Jinwei

156

Dry imaging cameras.  

PubMed

Dry imaging cameras are important hard copy devices in radiology. Using a dry imaging camera, multiformat images of digital modalities in radiology are created from a sealed unit of unexposed films. The functioning of a modern dry camera involves a blend of concurrent processes in areas of diverse sciences like computers, mechanics, thermal, optics, electricity and radiography. Broadly, hard copy devices are classified as laser and non-laser based technology. When compared with the working knowledge and technical awareness of different modalities in radiology, the understanding of a dry imaging camera is often superficial and neglected. To fill this void, this article outlines the key features of a modern dry camera and its important issues that impact radiology workflow. PMID:21799589

Indrajit, Ik; Alam, Aftab; Sahni, Hirdesh; Bhatia, Mukul; Sahu, Samaresh

2011-04-01

157

Dry imaging cameras  

PubMed Central

Dry imaging cameras are important hard copy devices in radiology. Using a dry imaging camera, multiformat images of digital modalities in radiology are created from a sealed unit of unexposed films. The functioning of a modern dry camera involves a blend of concurrent processes in areas of diverse sciences like computers, mechanics, thermal, optics, electricity and radiography. Broadly, hard copy devices are classified as laser and non-laser based technology. When compared with the working knowledge and technical awareness of different modalities in radiology, the understanding of a dry imaging camera is often superficial and neglected. To fill this void, this article outlines the key features of a modern dry camera and its important issues that impact radiology workflow. PMID:21799589

Indrajit, IK; Alam, Aftab; Sahni, Hirdesh; Bhatia, Mukul; Sahu, Samaresh

2011-01-01

158

Seasonal and vertical changes in leaf angle distribution for selected deciduous broadleaf tree species common to Europe  

NASA Astrophysics Data System (ADS)

Leaf inclination angle distribution is a key parameter in determining the transmission and reflection of radiation by vegetation canopies. It has been previously observed that leaf inclination angle might change gradually from more vertical in the upper canopy and in high light habitats to more horizontal in the lower canopy and in low light habitats [1]. Despite its importance, relatively few measurements on actual leaf angle distributions have been reported for different tree species. Even smaller number of studies have dealt with the possible seasonal changes in leaf angle distribution [2]. In this study the variation of leaf inclination angle distributions was examined both temporally throughout the growing season and vertically at different heights of trees. We report on leaf inclination angle distributions for five deciduous broadleaf species found commonly in several parts of Europe: grey alder (Alnus incana), Silver birch (Betula pendula Roth), chestnut (Castanea), Norway maple (Acer platanoides), and aspen (Populus tremula). The angles were measured using the leveled camera method [3], with the data collected at several separate heights and four times during the period of May-September 2013. The results generally indicate the greatest change in leaf inclination angles for spring, with the changes usually being the most pronounced at the top of the canopy. It should also be noted, however, that whereas the temporal variation proved to be rather consistent for different species, the vertical variation differed more between species. The leveled camera method was additionally tested in terms of sensitivity to different users. Ten people were asked to measure the leaf angles for four different species. The results indicate the method is quite robust in providing coinciding distributions irrespective of the user and level of previous experience with the method. However, certain caution must be exercised when measuring long narrow leaves. References [1] G.G. McMillen, and J.H. McClendon, "Leaf angle: an adaptive feature of sun and shade leaves," Botanical Gazette, vol. 140, pp. 437-442, 1979. [2] J. Pisek, O. Sonnentag, A.D. Richardson, and M. Mõttus, "Is the spherical leaf inclination angle distribution a valid assumption for temperate and boreal broadleaf tree species?" Agricultural and Forest Meteorology, vol. 169, pp. 186-194, 2013. [3] Y. Ryu, O. Sonnentag, T. Nilson, R. Vargas, H. Kobayashi, R. Wenk, and D. Baldocchi, "How to quantify tree leaf area index in a heterogenous savanna ecosystem: a multi-instrument and multimodel approach," Agricultural and Forest Meteorology, vol. 150, pp. 63-76, 2010.
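Measurements from the leveled camera method are typically summarized as an inclination-angle histogram and a mean angle (a spherical leaf angle distribution has a mean near 57.3°). The sketch below does this for a hypothetical set of digitised angles; the bin width and the sample values are arbitrary.

```python
import numpy as np

def leaf_angle_summary(angles_deg, bin_width=10):
    """Histogram (relative frequency) and mean of measured leaf inclination angles,
    where 0 deg is horizontal and 90 deg is vertical. Comparing the mean against
    ~57.3 deg indicates whether a canopy leans planophile or erectophile."""
    angles = np.asarray(angles_deg, float)
    bins = np.arange(0, 90 + bin_width, bin_width)
    counts, _ = np.histogram(angles, bins=bins)
    return counts / counts.sum(), bins, angles.mean()

# Hypothetical angles digitised from leveled-camera photographs
freq, bins, mean_angle = leaf_angle_summary([12, 25, 33, 41, 48, 52, 60, 67, 20, 35])
print(f"mean inclination: {mean_angle:.1f} deg")
```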

Raabe, Kairi; Pisek, Jan; Sonnentag, Oliver; Annuk, Kalju

2014-05-01

159

Status of the Los Alamos Anger camera  

SciTech Connect

Results of preliminary tests of the neutron Anger camera being developed at Los Alamos are presented. This detector uses a unique encoding scheme involving parallel processing of multiple receptive fields. Design goals have not yet been met, but the results are very encouraging and improvements in the test procedures are expected to show that the detector will be ready for use on a small-angle scattering instrument next year. 3 refs., 4 figs.

Seeger, P.A.; Nutter, M.J.

1985-01-01

160

Measurement of Dicke Narrowing in Electromagnetically Induced Transparency  

E-print Network

Dicke narrowing is a phenomenon that dramatically reduces the Doppler width of spectral lines, due to frequent velocity-changing collisions. A similar phenomenon occurs for electromagnetically induced transparency (EIT) resonances, and facilitates ultra-narrow spectral features in room-temperature vapor. We directly measure the Dicke-like narrowing by studying EIT line shapes as a function of the angle between the pump and the probe beams. The measurements are in good agreement with an analytic theory with no fit parameters. The results show that Dicke narrowing can substantially increase the tolerance of hot-vapor EIT to angular deviations. We demonstrate the importance of this effect for applications such as imaging and spatial solitons using a single-shot imaging experiment, and discuss the implications for the feasibility of storing images in atomic vapor.

M. Shuker; O. Firstenberg; R. Pugatch; A. Ben-Kish; A. Ron; N. Davidson

2007-03-13

161

Educational Applications for Digital Cameras.  

ERIC Educational Resources Information Center

Discusses uses of digital cameras in education. Highlights include advantages and disadvantages, digital photography assignments and activities, camera features and operation, applications for digital images, accessory equipment, and comparisons between digital cameras and other digitizers. (AEF)

Cavanaugh, Terence; Cavanaugh, Catherine

1997-01-01

162

The Dawn Framing Camera  

NASA Astrophysics Data System (ADS)

The Framing Camera (FC) is the German contribution to the Dawn mission. The camera will map 4 Vesta and 1 Ceres through a clear filter and 7 band-pass filters covering the wavelengths from the visible to the near-IR. The camera will allow the determination of the physical parameters of the asteroids, the reconstruction of their global shape as well as local topography and surface geomorphology, and provide information on composition via surface reflectance characteristics. The camera will also serve for orbit navigation. The resolution of the Framing Camera will be up to 12 m per pixel in low altitude mapping orbit at Vesta (62 m per pixel at Ceres), at an angular resolution of 93.7 μrad px-1. The instrument uses a reclosable front door to protect the optical system and a filter-wheel mechanism to select the band-pass for observation. The detector data is read out and processed by a data processing unit. A power converter unit supplies all required power rails for operation and thermal maintenance. For redundancy reasons, two identical cameras were provided, both located side by side on the + Z-deck of the spacecraft. Each camera has a mass of 5.5 kg.

Sierks, H.; Keller, H. U.; Jaumann, R.; Michalik, H.; Behnke, T.; Bubenhagen, F.; Büttner, I.; Carsenty, U.; Christensen, U.; Enge, R.; Fiethe, B.; Gutiérrez Marqués, P.; Hartwig, H.; Krüger, H.; Kühne, W.; Maue, T.; Mottola, S.; Nathues, A.; Reiche, K.-U.; Richards, M. L.; Roatsch, T.; Schröder, S. E.; Szemerey, I.; Tschentscher, M.

2011-12-01

163

What are the benefits of having multiple camera angles?  

Atmospheric Science Data Center

... can be interpreted (with appropriate models) to document the properties of the target, just as the more familiar spectral differences are ... or minimize the effect of sun glint over the ocean and other water surfaces, thereby enabling observations even when traditional sensors are ...

2014-12-08

164

Calibration of Action Cameras for Photogrammetric Purposes  

PubMed Central

The use of action cameras for photogrammetry purposes is not widespread due to the fact that until recently the images provided by the sensors, using either still or video capture mode, were not big enough to perform and provide the appropriate analysis with the necessary photogrammetric accuracy. However, several manufacturers have recently produced and released new lightweight devices which are: (a) easy to handle, (b) capable of performing under extreme conditions and more importantly (c) able to provide both still images and video sequences of high resolution. In order to be able to use the sensor of action cameras we must apply a careful and reliable self-calibration prior to the use of any photogrammetric procedure, a relatively difficult scenario because of the short focal length of the camera and its wide angle lens that is used to obtain the maximum possible resolution of images. Special software, using functions of the OpenCV library, has been created to perform both the calibration and the production of undistorted scenes for each one of the still and video image capturing modes of a novel action camera, the GoPro Hero 3 camera that can provide still images up to 12 Mp and video up to 8 Mp resolution. PMID:25237898
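The abstract mentions calibration software built on OpenCV; a minimal chessboard-based calibration along those lines is sketched below. The board size, image paths and the use of the standard pinhole distortion model (rather than cv2.fisheye) are assumptions for illustration, not the authors' actual setup.

```python
import glob
import cv2
import numpy as np

# Placeholder chessboard geometry and image paths -- not the authors' configuration.
BOARD = (9, 6)          # inner corners per row and column
SQUARE = 0.025          # square size in metres

objp = np.zeros((BOARD[0] * BOARD[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:BOARD[0], 0:BOARD[1]].T.reshape(-1, 2) * SQUARE

obj_points, img_points = [], []
for path in glob.glob("gopro_calib/*.jpg"):
    gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(gray, BOARD)
    if found:
        obj_points.append(objp)
        img_points.append(corners)

assert obj_points, "no chessboard corners found in the calibration images"

# Standard pinhole + radial/tangential model; for a very wide GoPro lens the
# cv2.fisheye module may be the better choice.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print("RMS reprojection error:", rms)

undistorted = cv2.undistort(cv2.imread(path), K, dist)
cv2.imwrite("undistorted.jpg", undistorted)
```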

Balletti, Caterina; Guerra, Francesco; Tsioukas, Vassilios; Vernier, Paolo

2014-01-01

165

Calibration of action cameras for photogrammetric purposes.  

PubMed

The use of action cameras for photogrammetry purposes is not widespread due to the fact that until recently the images provided by the sensors, using either still or video capture mode, were not big enough to perform and provide the appropriate analysis with the necessary photogrammetric accuracy. However, several manufacturers have recently produced and released new lightweight devices which are: (a) easy to handle, (b) capable of performing under extreme conditions and more importantly (c) able to provide both still images and video sequences of high resolution. In order to be able to use the sensor of action cameras we must apply a careful and reliable self-calibration prior to the use of any photogrammetric procedure, a relatively difficult scenario because of the short focal length of the camera and its wide angle lens that is used to obtain the maximum possible resolution of images. Special software, using functions of the OpenCV library, has been created to perform both the calibration and the production of undistorted scenes for each one of the still and video image capturing modes of a novel action camera, the GoPro Hero 3 camera that can provide still images up to 12 Mp and video up to 8 Mp resolution. PMID:25237898

Balletti, Caterina; Guerra, Francesco; Tsioukas, Vassilios; Vernier, Paolo

2014-01-01

166

Camera for landing applications  

NASA Astrophysics Data System (ADS)

This paper describes the Enhanced Video System (EVS) camera, built by OPGAL as subcontractor of Kollsman Inc. The EVS contains a Head up Display built by Honeywell, a special design camera for landing applications, and the external window installed on the plane together with the electronic control box built by Kollsman. The special design camera for landing applications is the subject of this paper. The entire system was installed on a Gulfstream V plane and passed the FAA proof of concept during August and September 2000.

Grimberg, Ernest

2001-08-01

167

Ringfield lithographic camera  

DOEpatents

A projection lithography camera is presented with a wide ringfield optimized so as to make efficient use of extreme ultraviolet radiation from a large area radiation source (e.g., D_source ≈ 0.5 mm). The camera comprises four aspheric mirrors optically arranged on a common axis of symmetry. The camera includes an aperture stop that is accessible through a plurality of partial aperture stops to synthesize the theoretical aperture stop. Radiation from a mask is focused to form a reduced image on a wafer, relative to the mask, by reflection from the four aspheric mirrors. 11 figs.

Sweatt, W.C.

1998-09-08

168

Comparing Cosmic Cameras  

NSDL National Science Digital Library

Learners will take images with a camera and then compare them to learn about focal length (and its effects on field of view), resolution, and ultimately how cameras take close-up pictures of far-away objects. Finally, they will apply this knowledge to the images of comet Tempel 1 taken by two different spacecraft with three different cameras, in this case Deep Impact and those expected/obtained from Stardust-NExT. This lesson could easily be adapted for use with other NASA missions.

169

Night Vision Camera  

NASA Technical Reports Server (NTRS)

PixelVision, Inc. developed the Night Video NV652 Back-illuminated CCD Camera, based on the expertise of a former Jet Propulsion Laboratory employee and a former employee of Scientific Imaging Technologies, Inc. The camera operates without an image intensifier, using back-illuminated and thinned CCD technology to achieve extremely low light level imaging performance. The advantages of PixelVision's system over conventional cameras include greater resolution and better target identification under low light conditions, lower cost and a longer lifetime. It is used commercially for research and aviation.

1996-01-01

170

Angle-Independent Doppler  

E-print Network

… = -R′ / cos θ (1) … In applications such as police radars or muzzle velocity radars, a priori … the target heading and the range vector are not known a priori, a Doppler velocity radar must estimate them … knowledge of θ is obtained by presetting a narrow-beam radar at a predetermined angle relative to a known …

Levanon, Nadav

171

Characterization of Narrow Band Filters for Infrared Astronomy: The Brγ and H2 Filters  

E-print Network

Characterization of narrow band filters used in infrared astronomy: the Brγ and H2 filters (L. Vanzi et al.). The study mainly quantifies the effect of temperature and tilt angle. Keywords: Infrared, Narrow Band Filters, Imaging. Abbreviations: IR -- Infrared; NIR -- Near Infrared.

Testi, Leonardo

172

Triangle Geometry: Angles  

NSDL National Science Digital Library

This interactive math site teaches students about angles and triangles. There are interactive activities for measuring angles, exploring types of angles, and adding angles. By using a Java applet and pictures, a proof of the Pythagorean Theorem is demonstrated.

Math Cove

2007-12-12

173

Calibration Procedures in Mid Format Camera Setups  

NASA Astrophysics Data System (ADS)

A growing number of mid-format cameras are used for aerial surveying projects. To achieve a reliable and geometrically precise result in the photogrammetric workflow, awareness of the sensitive parts of the setup is important. The use of direct referencing systems (GPS/IMU), the mounting on a stabilizing camera platform, and the specific characteristics of mid-format cameras make a professional setup with various calibration and misalignment operations necessary. An important part is a proper camera calibration. Using aerial images over a well designed test field with 3D structures and/or different flight altitudes enables the determination of calibration values in the Bingo software, and it will be demonstrated how such a calibration can be performed. The direct referencing device must be mounted to the camera in a solid and reliable way. Besides the mechanical work, especially in mounting the camera beside the IMU, two lever arms have to be measured with mm accuracy: the lever arm from the GPS antenna to the IMU's calibrated centre, and the lever arm from the IMU centre to the camera projection centre. The measurement with a total station is not a difficult task, but the definition of the correct centres and the need for applying rotation matrices can cause serious accuracy problems. A benefit of small and medium format cameras is that smaller aircraft can be used; in that case a gyro-based stabilized platform is recommended. This means that the IMU must be mounted beside the camera on the stabilizer. The advantage is that the IMU can then be used to control the platform; the problem is that the IMU-to-GPS-antenna lever arm is floating, so an additional data stream, the movement of the stabilizer, has to be handled in order to correct the floating lever arm distances. If the post-processing of the GPS/IMU data, taking the floating levers into account, delivers the expected result, the lever arms between IMU and camera can be applied. However, there remains a misalignment (boresight angle) that must be evaluated in a photogrammetric process using advanced tools, e.g. in Bingo. Once all these parameters have been determined, the system is capable of handling projects without, or with only a few, ground control points. But what effect does the photogrammetric process have when the achieved direct orientation values are applied directly, compared with an aerial triangulation based on proper tie-point matching? The paper shows the steps to be done by potential users and gives a quality estimation of the importance and influence of the various calibration and adjustment steps.
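To illustrate why the rotation matrices matter for the lever arms discussed above, here is a minimal sketch of applying a body-to-navigation rotation to a measured IMU-to-camera lever arm so that a GPS/IMU position is shifted to the camera projection centre. The frame convention and all numeric values are illustrative assumptions, not values from the paper.

    # Minimal sketch: shifting a GPS/IMU position to the camera projection centre
    # using a lever arm measured in the body (IMU) frame. Angles and offsets are
    # made-up illustrative values, not data from the paper.
    import numpy as np

    def rotation_body_to_nav(roll, pitch, yaw):
        """Body-to-navigation rotation from roll/pitch/yaw in radians (one common convention)."""
        cr, sr = np.cos(roll), np.sin(roll)
        cp, sp = np.cos(pitch), np.sin(pitch)
        cy, sy = np.cos(yaw), np.sin(yaw)
        Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
        Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
        Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
        return Rz @ Ry @ Rx

    # Lever arm from the IMU reference centre to the camera projection centre,
    # measured in the body frame with mm-level accuracy (assumed values, metres).
    lever_imu_to_cam = np.array([0.152, -0.031, 0.087])

    # Attitude of the platform at exposure time (assumed values, radians).
    R_b2n = rotation_body_to_nav(roll=0.01, pitch=-0.02, yaw=1.57)

    # IMU position in a local navigation frame (assumed values, metres).
    p_imu = np.array([1000.0, 2000.0, 500.0])

    # Camera projection centre = IMU position + rotated lever arm.
    p_camera = p_imu + R_b2n @ lever_imu_to_cam
    print(p_camera)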

Pivnicka, F.; Kemper, G.; Geissler, S.

2012-07-01

174

Divergent-ray projection method for measuring the flapping angle, lag angle, and torsional angle of a bumblebee wing  

NASA Astrophysics Data System (ADS)

A divergent-ray projection (DRP) method was developed for measuring the flapping angle, lag angle, and torsional angle of a bumblebee wing during beating motion. This new method can measure the spatial coordinates of an insect wing by digitizing the images that are projected by two divergent laser rays from different directions. The advantage of the DRP method is its ability to measure those three angles simultaneously using only one high-speed camera. The resolution of the DRP method can be changed easily by adjusting system parameters to meet the needs of different types of objects. The measurement results for these angles of a bumblebee wing prove the effectiveness of the DRP method in studying the flight performance of insects.

Zeng, Lijiang; Matsumoto, Hirokazu; Kawachi, Keiji

1996-11-01

175

The Complementary Pinhole Camera.  

ERIC Educational Resources Information Center

Presents an experiment based on the principles of rectilinear motion of light operating in a pinhole camera that projects the image of an illuminated object through a small hole in a sheet to an image screen. (MDH)

Bissonnette, D.; And Others

1991-01-01

176

LSST camera control system  

NASA Astrophysics Data System (ADS)

The LSST Camera Control System (CCS) will manage the activities of the various camera subsystems and coordinate those activities with the LSST Observatory Control System (OCS). The CCS comprises a set of modules (nominally implemented in software) which are each responsible for managing one camera subsystem. Generally, a control module will be a long lived "server" process running on an embedded computer in the subsystem. Multiple control modules may run on a single computer or a module may be implemented in "firmware" on a subsystem. In any case control modules must exchange messages and status data with a master control module (MCM). The main features of this approach are: (1) control is distributed to the local subsystem level; (2) the systems follow a "Master/Slave" strategy; (3) coordination will be achieved by the exchange of messages through the interfaces between the CCS and its subsystems. The interface between the camera data acquisition system and its downstream clients is also presented.
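The abstract describes the master/slave architecture only at a high level; purely as an illustration of the command-and-status pattern it outlines, a toy sketch might look like the following. The class and message names are invented for illustration and are not the actual CCS interfaces, which the abstract does not specify.

    # Toy illustration of the master/slave pattern described above: a master control
    # module sends commands to subsystem modules and collects their status.
    from dataclasses import dataclass, field

    @dataclass
    class SubsystemModule:
        """Stands in for a long-lived 'server' process managing one camera subsystem."""
        name: str
        status: dict = field(default_factory=lambda: {"state": "IDLE"})

        def handle_command(self, command: str) -> dict:
            # A real module would drive hardware here; we just update local state.
            if command == "START":
                self.status["state"] = "RUNNING"
            elif command == "STOP":
                self.status["state"] = "IDLE"
            return {"module": self.name, **self.status}

    class MasterControlModule:
        """Coordinates subsystem modules and exposes aggregate status (e.g. to an OCS)."""
        def __init__(self, modules):
            self.modules = {m.name: m for m in modules}

        def broadcast(self, command: str):
            return [m.handle_command(command) for m in self.modules.values()]

    mcm = MasterControlModule([SubsystemModule("shutter"), SubsystemModule("filter"),
                               SubsystemModule("cryostat")])
    print(mcm.broadcast("START"))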

Marshall, Stuart; Thaler, Jon; Schalk, Terry; Huffer, Michael

2006-06-01

177

Cardboard Box Camera Obscura  

NSDL National Science Digital Library

In this activity, learners construct a device that projects images onto a surface, so they can trace landscapes and other sights. The cardboard box camera captures the images using a mirror, holder, and lens, which work together like a simple camera. Use this activity to introduce learners to concepts related to optics, light, lenses, and mirrors. Note: this activity requires the use of a drill and saw, not included in the cost of materials.

2014-09-12

178

Sensing driver awareness by combining fisheye camera and Kinect  

NASA Astrophysics Data System (ADS)

In this paper, we propose a Driver's Awareness Catching System to sense the driver's awareness. The system consists of a fisheye camera and a Kinect. The Kinect, mounted inside the vehicle, is used to recognize and locate the 3D face of the driver. The fisheye camera, mounted outside the vehicle, is used to monitor the road. The relative pose between the two cameras is calibrated via a state-of-the-art method for calibrating cameras with non-overlapping fields of view. The camera system works in this way: first, the Kinect SDK released by Microsoft is used to track the driver's face and capture the eye location together with the sight direction. Secondly, the eye location and the sight direction are transformed into the coordinate system of the fisheye camera. Thirdly, the corresponding field of view is extracted from the fisheye image. As there is a small displacement between the driver's eyes and the optical center of the fisheye camera, a view angle deviation is introduced. Finally, we performed a systematic analysis of the error distribution by numerical simulation and proved the feasibility of our camera system. We also built the camera system and achieved the desired effect in a real-world experiment.
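To make the geometry concrete, a minimal sketch of the kind of rigid transform described above, mapping an eye position from the Kinect frame into the fisheye camera frame and turning it into a viewing direction, could look as follows. The rotation, translation, and eye coordinates are placeholders, not calibration results from the paper.

    # Sketch: transform a 3D eye position from the Kinect coordinate frame into the
    # fisheye camera frame, then compute a unit sight direction in that frame.
    # All numeric values are placeholders, not results from the paper.
    import numpy as np

    # Extrinsics: fisheye <- Kinect (placeholder calibration)
    R = np.eye(3)                       # rotation from Kinect frame to fisheye frame
    t = np.array([0.10, -0.50, 1.20])   # translation in metres

    # Eye location and gaze target reported by the face tracker, in the Kinect frame
    eye_kinect = np.array([0.05, 0.02, 0.60])
    gaze_target_kinect = np.array([0.05, 0.10, 5.00])

    # Map both points into the fisheye frame
    eye_fisheye = R @ eye_kinect + t
    target_fisheye = R @ gaze_target_kinect + t

    # Sight direction as a unit vector in the fisheye frame; the corresponding
    # region of the fisheye image would then be extracted around this direction.
    direction = target_fisheye - eye_fisheye
    direction /= np.linalg.norm(direction)
    print("sight direction in fisheye frame:", direction)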

Wuhe, Z.; Lei, Z.; Ning, D.

2014-11-01

179

Spacecraft camera image registration  

NASA Technical Reports Server (NTRS)

A system for achieving spacecraft camera (1, 2) image registration comprises a portion external to the spacecraft and an image motion compensation system (IMCS) portion onboard the spacecraft. Within the IMCS, a computer (38) calculates an image registration compensation signal (60) which is sent to the scan control loops (84, 88, 94, 98) of the onboard cameras (1, 2). At the location external to the spacecraft, the long-term orbital and attitude perturbations on the spacecraft are modeled. Coefficients (K, A) from this model are periodically sent to the onboard computer (38) by means of a command unit (39). The coefficients (K, A) take into account observations of stars and landmarks made by the spacecraft cameras (1, 2) themselves. The computer (38) takes as inputs the updated coefficients (K, A) plus synchronization information indicating the mirror position (AZ, EL) of each of the spacecraft cameras (1, 2), operating mode, and starting and stopping status of the scan lines generated by these cameras (1, 2), and generates in response thereto the image registration compensation signal (60). The sources of periodic thermal errors on the spacecraft are discussed. The system is checked by calculating measurement residuals, the difference between the landmark and star locations predicted at the external location and the landmark and star locations as measured by the spacecraft cameras (1, 2).

Kamel, Ahmed A. (Inventor); Graul, Donald W. (Inventor); Chan, Fred N. T. (Inventor); Gamble, Donald W. (Inventor)

1987-01-01

180

Deployable Wireless Camera Penetrators  

NASA Technical Reports Server (NTRS)

A lightweight, low-power camera dart has been designed and tested for context imaging of sampling sites and ground surveys from an aerobot or an orbiting spacecraft in a microgravity environment. The camera penetrators also can be used to image any line-of-sight surface, such as cliff walls, that is difficult to access. Tethered cameras to inspect the surfaces of planetary bodies use both power and signal transmission lines to operate. A tether adds the possibility of inadvertently anchoring the aerobot, and requires some form of station-keeping capability of the aerobot if extended examination time is required. The new camera penetrators are deployed without a tether, weigh less than 30 grams, and are disposable. They are designed to drop from any altitude with the boost in transmitting power currently demonstrated at approximately 100-m line-of-sight. The penetrators also can be deployed to monitor lander or rover operations from a distance, and can be used for surface surveys or for context information gathering from a touch-and-go sampling site. Thanks to wireless operation, the complexity of the sampling or survey mechanisms may be reduced. The penetrators may be battery powered for short-duration missions, or have solar panels for longer or intermittent duration missions. The imaging device is embedded in the penetrator, which is dropped or projected at the surface of a study site at 90 to the surface. Mirrors can be used in the design to image the ground or the horizon. Some of the camera features were tested using commercial "nanny" or "spy" camera components with the charge-coupled device (CCD) looking at a direction parallel to the ground. Figure 1 shows components of one camera that weighs less than 8 g and occupies a volume of 11 cm3. This camera could transmit a standard television signal, including sound, up to 100 m. Figure 2 shows the CAD models of a version of the penetrator. A low-volume array of such penetrator cameras could be deployed from an aerobot or a spacecraft onto a comet or asteroid. A system of 20 of these penetrators could be designed and built in a 1- to 2-kg mass envelope. Possible future modifications of the camera penetrators, such as the addition of a chemical spray device, would allow the study of simple chemical reactions of reagents sprayed at the landing site and looking at the color changes. Zoom lenses also could be added for future use.

Badescu, Mircea; Jones, Jack; Sherrit, Stewart; Wu, Jiunn Jeng

2008-01-01

181

Autonomous pedestrian localization technique using CMOS camera sensors  

NASA Astrophysics Data System (ADS)

We present a pedestrian localization technique that does not need infrastructure. The proposed angle-only measurement method needs specially manufactured shoes. Each shoe has two CMOS cameras and two markers, such as LEDs, attached on the inward side. The line-of-sight (LOS) angles towards the two markers on the forward shoe are measured using the two cameras on the rear shoe. Our simulation results show that a pedestrian walking in a shopping mall wearing this device can be accurately guided to the front of a destination store located 100 m away, if the floor plan of the mall is available.
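As a rough illustration of the angle-only idea, the sketch below triangulates a forward-shoe marker from two line-of-sight angles measured by the two cameras on the rear shoe, a known baseline apart. The baseline and angles are invented 2D example values, not the paper's geometry.

    # Toy 2D triangulation of a forward-shoe marker from two line-of-sight (LOS)
    # angles measured by two cameras a known baseline apart on the rear shoe.
    import numpy as np

    baseline = 0.08                     # distance between the two cameras (m, assumed)
    cam_left = np.array([0.0, 0.0])
    cam_right = np.array([baseline, 0.0])

    # LOS angles to the marker, measured from the x-axis (radians, assumed)
    theta_left = np.deg2rad(70.0)
    theta_right = np.deg2rad(110.0)

    # Each camera defines a ray p = cam + s * [cos(theta), sin(theta)];
    # solve the 2x2 linear system for the two ray parameters.
    d_left = np.array([np.cos(theta_left), np.sin(theta_left)])
    d_right = np.array([np.cos(theta_right), np.sin(theta_right)])
    A = np.column_stack([d_left, -d_right])
    s = np.linalg.solve(A, cam_right - cam_left)

    marker = cam_left + s[0] * d_left
    print("estimated marker position (m):", marker)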

Chun, Chanwoo

2014-09-01

182

Lightweight, Compact, Long Range Camera Design  

NASA Astrophysics Data System (ADS)

The model 700 camera is the latest in a 30-year series of LOROP cameras developed by McDonnell Douglas Astronautics Company (MDAC) and their predecessor companies. The design achieves minimum size and weight and is optimized for low-contrast performance. The optical system includes a 66-inch focal length, f/5.6, apochromatic lens and three folding mirrors imaging on a 4.5-inch square format. A three-axis active stabilization system provides the capability for long exposure time and, hence, fine grain films can be used. The optical path forms a figure "4" behind the lens. In front of the lens is a 45° pointing mirror. This folded configuration contributed greatly to the lightweight and compact design. This sequential autocycle frame camera has three modes of operation with one, two, and three step positions to provide a choice of swath widths within the range of lateral coverage. The magazine/shutter assembly rotates in relationship with the pointing mirror and aircraft drift angle to maintain film format alignment with the flight path. The entire camera is angular rate stabilized in roll, pitch, and yaw. It also employs a lightweight, electro-magnetically damped, low-natural-frequency spring suspension for passive isolation from aircraft vibration inputs. The combined film transport and forward motion compensation (FMC) mechanism, which is operated by a single motor, is contained in a magazine that can, depending on accessibility which is installation dependent, be changed in flight. The design also stresses thermal control, focus control, structural stiffness, and maintainability. The camera is operated from a remote control panel. This paper describes the leading particulars and features of the camera as related to weight and configuration.

Shafer, Donald V.

1983-08-01

183

Satellite camera image navigation  

NASA Technical Reports Server (NTRS)

Pixels within a satellite camera (1, 2) image are precisely located in terms of latitude and longitude on a celestial body, such as the earth, being imaged. A computer (60) on the earth generates models (40, 50) of the satellite's orbit and attitude, respectively. The orbit model (40) is generated from measurements of stars and landmarks taken by the camera (1, 2), and by range data. The orbit model (40) is an expression of the satellite's latitude and longitude at the subsatellite point, and of the altitude of the satellite, as a function of time, using as coefficients (K) the six Keplerian elements at epoch. The attitude model (50) is based upon star measurements taken by each camera (1, 2). The attitude model (50) is a set of expressions for the deviations in a set of mutually orthogonal reference optical axes (x, y, z) as a function of time, for each camera (1, 2). Measured data is fit into the models (40, 50) using a walking least squares fit algorithm. A transformation computer (66 ) transforms pixel coordinates as telemetered by the camera (1, 2) into earth latitude and longitude coordinates, using the orbit and attitude models (40, 50).

Kamel, Ahmed A. (Inventor); Graul, Donald W. (Inventor); Savides, John (Inventor); Hanson, Charles W. (Inventor)

1987-01-01

184

The Dark Energy Camera  

E-print Network

The Dark Energy Camera is a new imager with a 2.2-degree diameter field of view mounted at the prime focus of the Victor M. Blanco 4-meter telescope on Cerro Tololo near La Serena, Chile. The camera was designed and constructed by the Dark Energy Survey Collaboration, and meets or exceeds the stringent requirements designed for the wide-field and supernova surveys for which the collaboration uses it. The camera consists of a five element optical corrector, seven filters, a shutter with a 60 cm aperture, and a CCD focal plane of 250 micron thick fully-depleted CCDs cooled inside a vacuum Dewar. The 570 Mpixel focal plane comprises 62 2kx4k CCDs for imaging and 12 2kx2k CCDs for guiding and focus. The CCDs have 15 microns x15 microns pixels with a plate scale of 0.263 arc sec per pixel. A hexapod system provides state-of-the-art focus and alignment capability. The camera is read out in 20 seconds with 6-9 electrons readout noise. This paper provides a technical description of the camera's engineering, construct...
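The pixel size and plate scale quoted above can be cross-checked with the small-angle relation plate_scale ≈ 206265 · pixel_pitch / focal_length; the short check below inverts it to get the implied effective focal length. The 206265 arcsec-per-radian factor is standard; the rest uses only numbers from the abstract.

    # Cross-check of the DECam numbers quoted above: a 15 micron pixel subtending
    # 0.263 arcsec implies an effective focal length of roughly 11.8 m.
    ARCSEC_PER_RAD = 206265.0

    pixel_pitch_m = 15e-6        # 15 micron pixels (from the abstract)
    plate_scale_arcsec = 0.263   # arcsec per pixel (from the abstract)

    focal_length_m = ARCSEC_PER_RAD * pixel_pitch_m / plate_scale_arcsec
    print(f"implied effective focal length: {focal_length_m:.2f} m")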

Flaugher, B; Honscheid, K; Abbott, T M C; Alvarez, O; Angstadt, R; Annis, J T; Antonik, M; Ballester, O; Beaufore, L; Bernstein, G M; Bernstein, R A; Bigelow, B; Bonati, M; Boprie, D; Brooks, D; Buckley-Geer, E J; Campa, J; Cardiel-Sas, L; Castander, F J; Castilla, J; Cease, H; Cela-Ruiz, J M; Chappa, S; Chi, E; Cooper, C; da Costa, L N; Dede, E; Derylo, G; DePoy, D L; de Vicente, J; Doel, P; Drlica-Wagner, A; Eiting, J; Elliott, A E; Emes, J; Estrada, J; Neto, A Fausti; Finley, D A; Flores, R; Frieman, J; Gerdes, D; Gladders, M D; Gregory, B; Gutierrez, G R; Hao, J; Holland, S E; Holm, S; Huffman, D; Jackson, C; James, D J; Jonas, M; Karcher, A; Karliner, I; Kent, S; Kessler, R; Kozlovsky, M; Kron, R G; Kubik, D; Kuehn, K; Kuhlmann, S; Kuk, K; Lahav, O; Lathrop, A; Lee, J; Levi, M E; Lewis, P; Li, T S; Mandrichenko, I; Marshall, J L; Martinez, G; Merritt, K W; Miquel, R; Munoz, F; Neilsen, E H; Nichol, R C; Nord, B; Ogando, R; Olsen, J; Palio, N; Patton, K; Peoples, J; Plazas, A A; Rauch, J; Reil, K; Rheault, J -P; Roe, N A; Rogers, H; Roodman, A; Sanchez, E; Scarpine, V; Schindler, R H; Schmidt, R; Schmitt, R; Schubnell, M; Schultz, K; Schurter, P; Scott, L; Serrano, S; Shaw, T M; Smith, R C; Soares-Santos, M; Stefanik, A; Stuermer, W; Suchyta, E; Sypniewski, A; Tarle, G; Thaler, J; Tighe, R; Tran, C; Tucker, D; Walker, A R; Wang, G; Watson, M; Weaverdyck, C; Wester, W; Woods, R; Yanny, B

2015-01-01

185

Solid state television camera  

NASA Technical Reports Server (NTRS)

The design, fabrication, and tests of a solid state television camera using a new charge-coupled imaging device are reported. An RCA charge-coupled device arranged in a 512 by 320 format and directly compatible with EIA format standards was the sensor selected. This is a three-phase, sealed surface-channel array that has 163,840 sensor elements, which employs a vertical frame transfer system for image readout. Included are test results of the complete camera system, circuit description and changes to such circuits as a result of integration and test, maintenance and operation section, recommendations to improve the camera system, and a complete set of electrical and mechanical drawing sketches.

1976-01-01

186

Artificial human vision camera  

NASA Astrophysics Data System (ADS)

In this paper we present a real-time vision system modeling the human vision system. Our purpose is to draw inspiration from the bio-mechanics of human vision to improve robotic capabilities for tasks such as object detection and tracking. This work first describes the bio-mechanical discrepancies between human vision and classic cameras, and the retinal processing stage that takes place in the eye, before the optic nerve. The second part describes our implementation of these principles in a 3-camera optical, mechanical and software model of the human eyes and an associated bio-inspired attention model.

Goudou, J.-F.; Maggio, S.; Fagno, M.

2014-10-01

187

Viewing angle controllable liquid crystal display under optical compensation  

NASA Astrophysics Data System (ADS)

We investigated a viewing angle control of a liquid crystal display (LCD) under optical compensation for the enhancement of viewing angle characteristics in a wide viewing angle mode. The viewing angle controllable (VAC) LCD was operated in the configuration of three terminal electrodes consisting of a fringe-field-switching electrode at a bottom substrate and a common electrode at a top substrate. Using Poincaré sphere analysis, the optical compensation for the VAC LCD was designed so that the viewing angle characteristics were much improved in the wide viewing mode while they were degraded in the narrow viewing mode.

Jo, Soo In; Lee, Seul Gee; Lee, You-Jin; Kim, Jae-Hoon; Yu, Chang-Jae

2011-09-01

188

Optical Structure and Colors of Faint Compact Narrow Emission-Line Galaxies  

Microsoft Academic Search

We present the results of Hubble Space Telescope Wide Field Planetary Camera imaging in V606 and I814 of five compact narrow emission-line galaxies (CNELGs) and four slightly more extended faint blue galaxies (FBGs) with 20.3 ≤ B ≤ 22.4 and redshifts z ~ 0.22-0.66. Half-light radii are measured to span from 0.

Rafael Guzman; Anna Jangren; David C. Koo; Matthew A. Bershady; Luc Simard

1998-01-01

189

Quality criterion for digital still camera  

NASA Astrophysics Data System (ADS)

The main quality requirements for a digital still camera are color capturing accuracy, low noise level, and quantum efficiency. Different consumers assign different priorities to the listed parameters, and camera designers need clearly formulated methods for their evaluation. While there are procedures for estimating noise level and quantum efficiency, there are no effective means for estimating color capturing accuracy. The criterion introduced in this paper fills this gap. The Luther-Ives condition for a correct color reproduction system became known at the beginning of the last century. However, since no detector system satisfies the Luther-Ives condition, there are always stimuli that are distinctly different for an observer but which the detectors are unable to distinguish. To estimate the conformity of a detector set with the Luther-Ives condition and calculate a measure of the discrepancy, the angle between the detector sensor sensitivities and Cohen's Fundamental Color Space may be used. In this paper, the divergence angle is calculated for some typical CCD sensors, and it is demonstrated how this angle might be reduced with a corrective filter. In addition, it is shown that with a specific corrective filter Foveon sensors turn into a detector system with good Luther-Ives condition compliance.
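The divergence-angle idea can be illustrated generically: given a basis for a target subspace (standing in for Cohen's Fundamental Color Space) and a sampled detector sensitivity curve, the angle between the curve and its orthogonal projection onto the subspace measures how much of the sensitivity lies outside that space. The basis and sensitivity below are random stand-ins, not real CCD or colorimetric data.

    # Generic sketch: angle between a detector sensitivity vector and a subspace,
    # computed from the orthogonal projection onto that subspace.
    import numpy as np

    rng = np.random.default_rng(0)
    wavelengths = 81                         # e.g. 380-780 nm sampled every 5 nm

    # Orthonormal basis of a 3-dimensional target subspace (placeholder for the
    # fundamental color space).
    B, _ = np.linalg.qr(rng.normal(size=(wavelengths, 3)))

    # Sampled spectral sensitivity of one detector channel (placeholder).
    s = rng.random(wavelengths)

    # Orthogonal projection of s onto the subspace spanned by B.
    s_proj = B @ (B.T @ s)

    # Angle between s and the subspace: 0 means s lies entirely inside it.
    cos_angle = np.linalg.norm(s_proj) / np.linalg.norm(s)
    angle_deg = np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))
    print(f"divergence angle: {angle_deg:.2f} degrees")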

Bezryadin, Sergey

2007-02-01

190

Image Sensors Enhance Camera Technologies  

NASA Technical Reports Server (NTRS)

In the 1990s, a Jet Propulsion Laboratory team led by Eric Fossum researched ways of improving complementary metal-oxide semiconductor (CMOS) image sensors in order to miniaturize cameras on spacecraft while maintaining scientific image quality. Fossum's team founded a company to commercialize the resulting CMOS active pixel sensor. Now called the Aptina Imaging Corporation, based in San Jose, California, the company has shipped over 1 billion sensors for use in applications such as digital cameras, camera phones, Web cameras, and automotive cameras. Today, one of every three cell phone cameras on the planet features Aptina's sensor technology.

2010-01-01

191

Make a Pinhole Camera  

ERIC Educational Resources Information Center

On Earth, using ordinary visible light, one can create a single image of light recorded over time. Of course a movie or video is light recorded over time, but it is a series of instantaneous snapshots, rather than light and time both recorded on the same medium. A pinhole camera, which is simple to make out of ordinary materials and using ordinary…

Fisher, Diane K.; Novati, Alexander

2009-01-01

192

The LSST Camera Overview  

SciTech Connect

The LSST camera is a wide-field optical (0.35-1um) imager designed to provide a 3.5 degree FOV with better than 0.2 arcsecond sampling. The detector format will be a circular mosaic providing approximately 3.2 Gigapixels per image. The camera includes a filter mechanism and shuttering capability. It is positioned in the middle of the telescope, where cross-sectional area is constrained by optical vignetting and heat dissipation must be controlled to limit thermal gradients in the optical beam. The fast, f/1.2 beam will require tight tolerances on the focal plane mechanical assembly. The focal plane array operates at a temperature of approximately -100 C to achieve desired detector performance. The focal plane array is contained within an evacuated cryostat, which incorporates detector front-end electronics and thermal control. The cryostat lens serves as an entrance window and vacuum seal for the cryostat. Similarly, the camera body lens serves as an entrance window and gas seal for the camera housing, which is filled with a suitable gas to provide the operating environment for the shutter and filter change mechanisms. The filter carousel can accommodate 5 filters, each 75 cm in diameter, for rapid exchange without external intervention.

Gilmore, Kirk; Kahn, Steven A.; Nordby, Martin; Burke, David; O'Connor, Paul; Oliver, John; Radeka, Veljko; Schalk, Terry; Schindler, Rafe; /SLAC

2007-01-10

193

Face Fixing Cameras  

E-print Network

by releasing a camera with a "beauty re-touch" function that gives you the ability to improve your look in photographs. The function allows you to alter the shape of your face, widen your eyes, whiten your teeth, apply makeup and change your skin tone; in short...

Boyd, David

2011-06-22

194

Caustics of Catadioptric Cameras  

Microsoft Academic Search

Conventional vision systems and algorithms assume the camera to have a single viewpoint. However, sensors need not always maintain a single viewpoint. For instance, an incorrectly aligned system could cause non-single viewpoints. Also, systems could be designed to specifically deviate from a single viewpoint to trade off image characteristics such as resolution and field of view. In these cases, the

Rahul Swaminathan; Michael D. Grossberg; Shree K. Nayar

2001-01-01

195

Secure Digital Camera  

Microsoft Academic Search

In this paper, we propose a biometric solution to solve some of the significant problems associated with use of digital camera images as evidence in a court of law. We present a lossless watermarking solution to the problems associated with digital image integrity and the relationship to its chain of custody. The integrity of digital images as evidence rests on

Paul Blythe; Jessica Fridrich

196

Photogrammetric camera calibration  

USGS Publications Warehouse

Section 2 (Calibration) of the document "Recommended Procedures for Calibrating Photogrammetric Cameras and Related Optical Tests" from the International Archives of Photogrammetry, Vol. XIII, Part 4, is reviewed in the light of recent practical work, and suggestions for changes are made. These suggestions are intended as a basis for a further discussion. ?? 1984.

Tayman, W.P.; Ziemann, H.

1984-01-01

197

The LSST Camera System  

NASA Astrophysics Data System (ADS)

The LSST camera provides a 3.2 Gigapixel focal plane array, tiled by 189 4Kx4K CCD science sensors with 10 um pixels. This pixel count is a direct consequence of sampling the 9.6 deg^2 field-of-view (0.64 m diameter) with 0.2 arcsec pixels (Nyquist sampling in the best expected seeing of 0.4 arcsec). The sensors are deep depleted, back-illuminated devices with a highly segmented architecture that enables the entire array to be read in 2 seconds. The detectors are grouped into 3x3 rafts, each containing its own dedicated front-end and back-end electronics boards. The rafts are mounted on a silicon carbide grid inside a vacuum cryostat with an intricate thermal control system; the cryostat's entrance window is the third of the three refractive lenses in the camera. The other two lenses are mounted in an optics structure at the front of the camera body, which also contains a mechanical shutter and a carousel assembly that holds five large optical filters (ugrizy). A sixth optical filter will also be fabricated and can replace any of the others via procedures accomplished during daylight hours. This poster will illustrate the current mechanical design of the camera, FEA and thermal analysis of the cryostat, and an overview of the data acquisition system and the performance characteristics of the filters.
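The quoted figures are mutually consistent, as the short check below shows: 189 sensors of 4K x 4K pixels give about 3.2 Gigapixels, and sampling 9.6 deg^2 at 0.2 arcsec per pixel requires a comparable count. Only numbers from the abstract are used.

    # Consistency check of the LSST camera figures quoted above.
    sensors = 189
    pixels_per_sensor = 4096 * 4096
    total_pixels = sensors * pixels_per_sensor
    print(f"189 x 4K x 4K = {total_pixels/1e9:.2f} Gpix")                  # ~3.17 Gpix

    fov_deg2 = 9.6
    pixel_arcsec = 0.2
    pixels_needed = fov_deg2 * (3600.0 / pixel_arcsec) ** 2
    print(f"9.6 deg^2 at 0.2 arcsec/pix ~ {pixels_needed/1e9:.2f} Gpix")   # ~3.1 Gpix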

Gilmore, D. Kirk; Kahn, S.; Fouts, K.; LSST Camera Team

2009-01-01

198

Stereo from uncalibrated cameras  

Microsoft Academic Search

The problem of computing placement of points in 3-D space, given two uncalibrated perspective views, is considered. The main theorem shows that the placement of the points is determined only up to an arbitrary projective transformation of 3-space. Given additional ground control points, however, the location of the points and the camera parameters may be determined. The method is linear

Richard Hartley; Rajiv Gupta; Tom Chang

1992-01-01

199

Behind the Camera.  

ERIC Educational Resources Information Center

Intended for the beginning filmmaker, this book presents basic information on major aspects of shooting a film. It covers characteristics of various cameras, films, lenses, and lighting equipment and tells how to use them. The importance of a shooting script is stressed. The mechanics of sound systems, editing, and titles, animations, and special…

Kuhns, William; Giardino, Thomas F.

200

Jack & the Video Camera  

ERIC Educational Resources Information Center

This article narrates how the use of video camera has transformed the life of Jack Williams, a 10-year-old boy from Colorado Springs, Colorado, who has autism. The way autism affected Jack was unique. For the first nine years of his life, Jack remained in his world, alone. Functionally non-verbal and with motor skill problems that affected his…

Charlan, Nathan

2010-01-01

201

Cameras Everywhere  

E-print Network

… years, police brutality has been caught and punished when, by luck, someone happens to record it … reported, the police can see who left the crime scene, follow their recorded movements from camera to camera … the current cost of crime, both to the victims, and to the community that pays for police investigations …

Hehner, Eric C.R.

202

Communities, Cameras, and Conservation  

ERIC Educational Resources Information Center

Communities, Cameras, and Conservation (CCC) is the most exciting and valuable program the author has seen in her 30 years of teaching field science courses. In this citizen science project, students and community volunteers collect data on mountain lions ("Puma concolor") at four natural areas and public parks along the Front Range of Colorado.…

Patterson, Barbara

2012-01-01

203

Early Experience & Multisensory Perceptual Narrowing  

PubMed Central

Perceptual narrowing is a reflection of early experience and contributes in key ways to perceptual and cognitive development. In general, findings have shown that unisensory perceptual sensitivity in early infancy is broadly tuned such that young infants respond to, and discriminate, native as well as non-native sensory inputs, whereas older infants only respond to native inputs. Recently, my colleagues and I discovered that perceptual narrowing occurs at the multisensory processing level as well. The present article reviews this new evidence and puts it in the larger context of multisensory perceptual development and the role that perceptual experience plays in it. Together, the evidence on unisensory and multisensory narrowing shows that early experience shapes the emergence of perceptual specialization and expertise. PMID:24435505

Lewkowicz, David J.

2014-01-01

204

Multi-camera calibration based on openCV and multi-view registration  

Microsoft Academic Search

For multi-camera calibration systems, a method combining OpenCV-based calibration with multi-view registration is proposed. First, a Zhang calibration plate (an 8x8 chessboard pattern) is imaged by a number of cameras (three industrial-grade CCDs) in 9 groups of images taken from different angles, and OpenCV is used to calibrate the camera parameters quickly. Secondly, based on the
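The registration step is not spelled out in the abstract; as one plausible illustration of combining per-camera OpenCV calibration with pairwise extrinsic registration, cv2.stereoCalibrate can recover the rotation and translation between two already-calibrated cameras from shared chessboard views. The function and variable names below, and the assumption that matched corner lists already exist from a prior cv2.calibrateCamera step, are mine rather than the paper's.

    # Hypothetical sketch: register two already-calibrated cameras by estimating the
    # rotation R and translation T between them from shared chessboard observations.
    import cv2

    def register_pair(obj_points, corners_cam1, corners_cam2, K1, d1, K2, d2, image_size):
        """Return (R, T): pose of camera 2 relative to camera 1 (intrinsics held fixed)."""
        flags = cv2.CALIB_FIX_INTRINSIC
        criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 100, 1e-5)
        rms, _, _, _, _, R, T, E, F = cv2.stereoCalibrate(
            obj_points, corners_cam1, corners_cam2,
            K1, d1, K2, d2, image_size,
            criteria=criteria, flags=flags)
        print("stereo RMS reprojection error:", rms)
        return R, T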

Xiao-Ming Deng; Xiong Wan; Zhi-Min Zhang; Bi-Yan Leng; Ning-Ning Lou; Shuai He

2010-01-01

205

Example of Weathering And Sun Angle  

NASA Technical Reports Server (NTRS)

The letter 'B' or perhaps the figure '8' appears to have been etched into the Mars rock at the left edge of this picture taken yesterday by the Viking 1 Lander. It is believed to be an illusion caused by weathering processes and the angle of the sun as it illuminated the scene for the spacecraft camera. The object at lower left is the housing containing the surface sampler scoop.

1976-01-01

206

Roll Angle Estimation for Motorcycles: Comparing Video and Inertial Sensor Approaches  

E-print Network

Roll Angle Estimation for Motorcycles: Comparing Video and Inertial Sensor Approaches Marc such modules to motorcycles, the camera pose has to be taken into account, as e. g. large roll angles produce significant variations in the recorded images. Therefore, roll angle estimation is an important task

Schlipsing, Marc

207

Automated Camera Array Fine Calibration  

NASA Technical Reports Server (NTRS)

Using aerial imagery, the JPL FineCalibration (JPL FineCal) software automatically tunes a set of existing CAHVOR camera models for an array of cameras. The software finds matching features in the overlap region between images from adjacent cameras, and uses these features to refine the camera models. It is not necessary to take special imagery of a known target and no surveying is required. JPL FineCal was developed for use with an aerial, persistent surveillance platform.

Clouse, Daniel; Padgett, Curtis; Ansar, Adnan; Cheng, Yang

2008-01-01

208

Streak camera receiver definition study  

NASA Technical Reports Server (NTRS)

Detailed streak camera definition studies were made as a first step toward full flight qualification of a dual channel picosecond resolution streak camera receiver for the Geoscience Laser Altimeter and Ranging System (GLRS). The streak camera receiver requirements are discussed as they pertain specifically to the GLRS system, and estimates of the characteristics of the streak camera are given, based upon existing and near-term technological capabilities. Important problem areas are highlighted, and possible corresponding solutions are discussed.

Johnson, C. B.; Hunkler, L. T., Sr.; Letzring, S. A.; Jaanimagi, P.

1990-01-01

209

Camera Trajectory fromWide Baseline Images  

NASA Astrophysics Data System (ADS)

Camera trajectory estimation, which is closely related to the structure from motion computation, is one of the fundamental tasks in computer vision. Reliable camera trajectory estimation plays an important role in 3D reconstruction, self localization, and object recognition. There are essential issues for a reliable camera trajectory estimation, for instance, choice of the camera and its geometric projection model, camera calibration, image feature detection and description, and robust 3D structure computation. Most approaches rely on classical perspective cameras because of the simplicity of their projection models and ease of their calibration. However, classical perspective cameras offer only a limited field of view, and thus occlusions and sharp camera turns may cause consecutive frames to look completely different when the baseline becomes longer. This makes the image feature matching very difficult (or impossible) and the camera trajectory estimation fails under such conditions. These problems can be avoided if omnidirectional cameras, e.g. a fish-eye lens convertor, are used. The hardware which we are using in practice is a combination of a Nikon FC-E9 mounted via a mechanical adaptor onto a Kyocera Finecam M410R digital camera. The Nikon FC-E9 is a megapixel omnidirectional add-on convertor with a 180° view angle which provides images of photographic quality. The Kyocera Finecam M410R delivers 2272×1704 images at 3 frames per second. The resulting combination yields a circular view of diameter 1600 pixels in the image. Since consecutive frames of the omnidirectional camera often share a common region in 3D space, the image feature matching is often feasible. On the other hand, the calibration of these cameras is non-trivial and is crucial for the accuracy of the resulting 3D reconstruction. We calibrate omnidirectional cameras off-line using the state-of-the-art technique and Mičušík's two-parameter model, which links the radius of the image point r to the angle θ of its corresponding ray w.r.t. the optical axis as θ = ar / (1 + br²). After a successful calibration, we know the correspondence of the image points to the 3D optical rays in the coordinate system of the camera. The following steps aim at finding the transformation between the camera and the world coordinate systems, i.e. the pose of the camera in the 3D world, using 2D image matches. For computing 3D structure, we construct a set of tentative matches by detecting different affine covariant feature regions including MSER, Harris Affine, and Hessian Affine in the acquired images. These features are an alternative to the popular SIFT features and work comparably in our situation. Parameters of the detectors are chosen to limit the number of regions to 1-2 thousand per image. The detected regions are assigned local affine frames (LAF) and transformed into standard positions w.r.t. their LAFs. Discrete Cosine Descriptors are computed for each region in the standard position. Finally, mutual distances of all regions in one image and all regions in the other image are computed as the Euclidean distances of their descriptors, and tentative matches are constructed by selecting the mutually closest pairs. As opposed to methods using short baseline images, simpler image features which are not affine covariant cannot be used because the viewpoint can change a lot between consecutive frames. 
Furthermore, feature matching has to be performed on the whole frame because no assumptions on the proximity of the consecutive projections can be made for wide baseline images. This makes the feature detection, description, and matching much more time-consuming than it is for short baseline images and limits the usage to low frame rate sequences when operating in real time. Robust 3D structure can be computed by RANSAC, which searches for the largest subset of the set of tentative matches which is, within a predefined threshold ε, consistent with an epipolar geometry. We use ordered sampling as suggested in to draw 5-tuples from the list of tentative matches ordered ascending
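To make the calibration model concrete, the sketch below applies the two-parameter mapping θ = ar / (1 + br²), as reconstructed from the abstract, to convert an image point's radial distance into the angle of its optical ray and then into a unit ray direction. The parameter values and the distortion centre are placeholders, not the calibrated values for the FC-E9 setup.

    # Sketch of the two-parameter omnidirectional model mentioned above:
    # theta = a * r / (1 + b * r**2) maps the radial image distance r (pixels from
    # the distortion centre) to the angle theta between the optical axis and the ray.
    import numpy as np

    a, b = 2.1e-3, 1.0e-7                # placeholder calibration parameters
    center = np.array([800.0, 800.0])    # assumed distortion centre (pixels)

    def pixel_to_ray(u, v):
        """Back-project an image point (u, v) to a unit 3D ray in the camera frame."""
        d = np.array([u, v]) - center
        r = np.linalg.norm(d)
        theta = a * r / (1.0 + b * r**2)           # angle w.r.t. the optical axis
        phi = np.arctan2(d[1], d[0])               # azimuth around the axis
        return np.array([np.sin(theta) * np.cos(phi),
                         np.sin(theta) * np.sin(phi),
                         np.cos(theta)])

    print(pixel_to_ray(1200.0, 900.0))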

Havlena, M.; Torii, A.; Pajdla, T.

2008-09-01

210

Wide-angle imaging system with fiberoptic components providing angle-dependent virtual material stops  

NASA Technical Reports Server (NTRS)

A strip imaging wide angle optical system is provided. The optical system is provided with a 'virtual' material stop to avoid aberrational effects inherent in wide angle optical systems. The optical system includes a spherical mirror section for receiving light from a 180 deg strip or arc of a target image. Light received by the spherical mirror section is reflected to a frustoconical mirror section for subsequent rereflection to a row of optical fibers. Each optical fiber transmits a portion of the received light to a detector. The optical system exploits the narrow cone of acceptance associated with optical fibers to substantially eliminate vignetting effects inherent in wide angle systems. Further, the optical system exploits the narrow cone of acceptance of the optical fibers to substantially limit spherical aberration. The optical system is ideally suited for any application wherein a 180 deg strip image need be detected, and is particularly well adapted for use in hostile environments such as in planetary exploration.

Vaughan, Arthur H. (inventor)

1993-01-01

211

Event detection intelligent camera development  

Microsoft Academic Search

A new camera system ‘event detection intelligent camera’ (EDICAM) is being developed for the video diagnostics of W-7X stellarator, which consists of 10 distinct and standalone measurement channels each holding a camera. Different operation modes will be implemented for continuous and for triggered readout as well. Hardware level trigger signals will be generated from real time image processing algorithms optimized

A. Szappanos; G. Kocsis; A. Molnár; J. Sárkozi; S. Zoletnik

2008-01-01

212

Algorithms for coplanar camera calibration  

Microsoft Academic Search

Coplanar camera calibration is the process of determining the extrinsic and intrinsic camera parameters from a given set of image and world points, when the world points lie on a two-dimensional plane. Noncoplanar calibration, on the other hand, involves world points that do not lie on a plane. While optimal solutions for both the camera-calibration procedures can

Chanchal Chatterjee; Vwani P. Roychowdhury

2000-01-01

213

Smart Cameras as Embedded Systems  

Microsoft Academic Search

Recent technological advances are enabling a new generation of smart cameras that represent a quantum leap in sophistication. While today's digital cameras capture images, smart cameras capture high-level descriptions of the scene and analyze what they see. These devices could support a wide variety of applications including human and animal detection, surveillance, motion analysis, and facial identification. Video processing has

Wayne Wolf; Burak Ozer; Lv Tiehan

2002-01-01

214

NSTX Tangential Divertor Camera  

SciTech Connect

Strong magnetic field shear around the divertor x-point is numerically predicted to lead to strong spatial asymmetries in turbulence driven particle fluxes. To visualize the turbulence and associated impurity line emission near the lower x-point region, a new tangential observation port has been recently installed on NSTX. A reentrant sapphire window with a moveable in-vessel mirror images the divertor region from the center stack out to R 80 cm and views the x-point for most plasma configurations. A coherent fiber optic bundle transmits the image through a remotely selected filter to a fast camera, for example a 40500 frames/sec Photron CCD camera. A gas puffer located in the lower inboard divertor will localize the turbulence in the region near the x-point. Edge fluid and turbulent codes UEDGE and BOUT will be used to interpret impurity and deuterium emission fluctuation measurements in the divertor.

A.L. Roquemore; Ted Biewer; D. Johnson; S.J. Zweben; Nobuhiro Nishino; V.A. Soukhanovskii

2004-07-16

215

Combustion pinhole camera system  

DOEpatents

A pinhole camera system utilizing a sealed optical-purge assembly which provides optical access into a coal combustor or other energy conversion reactors. The camera system basically consists of a focused-purge pinhole optical port assembly, a conventional TV vidicon receiver, an external, variable density light filter which is coupled electronically to the vidicon automatic gain control (agc). The key component of this system is the focused-purge pinhole optical port assembly which utilizes a purging inert gas to keep debris from entering the port and a lens arrangement which transfers the pinhole to the outside of the port assembly. One additional feature of the port assembly is that it is not flush with the interior of the combustor.

Witte, Arvel B. (Rolling Hills, CA)

1984-02-21

216

Combustion pinhole camera system  

DOEpatents

A pinhole camera system is described utilizing a sealed optical-purge assembly which provides optical access into a coal combustor or other energy conversion reactors. The camera system basically consists of a focused-purge pinhole optical port assembly, a conventional TV vidicon receiver, an external, variable density light filter which is coupled electronically to the vidicon automatic gain control (agc). The key component of this system is the focused-purge pinhole optical port assembly which utilizes a purging inert gas to keep debris from entering the port and a lens arrangement which transfers the pinhole to the outside of the port assembly. One additional feature of the port assembly is that it is not flush with the interior of the combustor. 2 figs.

Witte, A.B.

1984-02-21

217

The Dark Energy Camera  

NASA Astrophysics Data System (ADS)

The DES Collaboration has completed construction of the Dark Energy Camera (DECam), a 3 square degree, 570 Megapixel CCD camera which is now mounted at the prime focus of the Blanco 4-meter telescope at the Cerro Tololo Inter-American Observatory. DECam comprises 74 250-micron-thick fully depleted CCDs: 62 2k x 4k CCDs for imaging and 12 2k x 2k CCDs for guiding and focus. It includes a filter set of u, g, r, i, z, and Y, a hexapod for focus and lateral alignment, as well as thermal management of the cage temperature. DECam will be used to perform the Dark Energy Survey with 30% of the telescope time over a 5 year period. During the remainder of the time, and after the survey, DECam will be available as a community instrument. An overview of the DECam design, construction and initial on-sky performance information will be presented.

Flaugher, Brenna; DES Collaboration

2013-01-01

218

The NEAT Camera Project  

NASA Technical Reports Server (NTRS)

The NEAT (Near Earth Asteroid Tracking) camera system consists of a camera head with a 6.3 cm square 4096 x 4096 pixel CCD, fast electronics, and a Sun Sparc 20 data and control computer with dual CPUs, 256 Mbytes of memory, and 36 Gbytes of hard disk. The system was designed for optimum use with an Air Force GEODSS (Ground-based Electro-Optical Deep Space Surveillance) telescope. The GEODSS telescopes have 1 m f/2.15 objectives of the Ritchey-Chretian type, designed originally for satellite tracking. Installation of NEAT began July 25 at the Air Force Facility on Haleakala, a 3000 m peak on Maui in Hawaii.

Newburn, Ray L., Jr.

1995-01-01

219

Mobile Phones Digital Cameras  

E-print Network


Suslick, Kenneth S.

220

Gamma ray camera  

DOEpatents

A gamma ray camera is disclosed for detecting rays emanating from a radiation source such as an isotope. The gamma ray camera includes a sensor array formed of a visible light crystal for converting incident gamma rays to a plurality of corresponding visible light photons, and a photosensor array responsive to the visible light photons in order to form an electronic image of the radiation therefrom. The photosensor array is adapted to record an integrated amount of charge proportional to the incident gamma rays closest to it, and includes a transparent metallic layer, photodiode consisting of a p-i-n structure formed on one side of the transparent metallic layer, and comprising an upper p-type layer, an intermediate layer and a lower n-type layer. In the preferred mode, the scintillator crystal is composed essentially of a cesium iodide (CsI) crystal preferably doped with a predetermined amount impurity, and the p-type upper intermediate layers and said n-type layer are essentially composed of hydrogenated amorphous silicon (a-Si:H). The gamma ray camera further includes a collimator interposed between the radiation source and the sensor array, and a readout circuit formed on one side of the photosensor array. 6 figs.

Perez-Mendez, V.

1997-01-21

221

Gamma ray camera  

DOEpatents

A gamma ray camera for detecting rays emanating from a radiation source such as an isotope. The gamma ray camera includes a sensor array formed of a visible light crystal for converting incident gamma rays to a plurality of corresponding visible light photons, and a photosensor array responsive to the visible light photons in order to form an electronic image of the radiation therefrom. The photosensor array is adapted to record an integrated amount of charge proportional to the incident gamma rays closest to it, and includes a transparent metallic layer, photodiode consisting of a p-i-n structure formed on one side of the transparent metallic layer, and comprising an upper p-type layer, an intermediate layer and a lower n-type layer. In the preferred mode, the scintillator crystal is composed essentially of a cesium iodide (CsI) crystal preferably doped with a predetermined amount impurity, and the p-type upper intermediate layers and said n-type layer are essentially composed of hydrogenated amorphous silicon (a-Si:H). The gamma ray camera further includes a collimator interposed between the radiation source and the sensor array, and a readout circuit formed on one side of the photosensor array.

Perez-Mendez, Victor (Berkeley, CA)

1997-01-01

222

Hemispherical Laue camera  

DOEpatents

A hemispherical Laue camera comprises a crystal sample mount for positioning a sample to be analyzed at the center of sphere of a hemispherical, X-radiation sensitive film cassette, a collimator, a stationary or rotating sample mount and a set of standard spherical projection spheres. X-radiation generated from an external source is directed through the collimator to impinge onto the single crystal sample on the stationary mount. The diffracted beam is recorded on the hemispherical X-radiation sensitive film mounted inside the hemispherical film cassette in either transmission or back-reflection geometry. The distances travelled by X-radiation diffracted from the crystal to the hemispherical film are the same for all crystal planes which satisfy Bragg's Law. The recorded diffraction spots or Laue spots on the film thereby preserve both the symmetry information of the crystal structure and the relative intensities which are directly related to the relative structure factors of the crystal orientations. The diffraction pattern on the exposed film is compared with the known diffraction pattern on one of the standard spherical projection spheres for a specific crystal structure to determine the orientation of the crystal sample. By replacing the stationary sample support with a rotating sample mount, the hemispherical Laue camera can be used for crystal structure determination in a manner previously provided in conventional Debye-Scherrer cameras.

Li, James C. M. (Pittsford, NY); Chu, Sungnee G. (Rochester, NY)

1980-01-01

223

Orbiter Camera Payload System  

NASA Technical Reports Server (NTRS)

Components for an orbiting camera payload system (OCPS) include the large format camera (LFC), a gas supply assembly, and ground test, handling, and calibration hardware. The LFC, a high resolution large format photogrammetric camera for use in the cargo bay of the space transport system, is also adaptable to use on an RB-57 aircraft or on a free flyer satellite. Carrying 4000 feet of film, the LFC is usable over the visible to near IR, at V/h rates of from 11 to 41 milliradians per second, overlap of 10, 60, 70 or 80 percent and exposure times of from 4 to 32 milliseconds. With a 12 inch focal length it produces a 9 by 18 inch format (long dimension in line of flight) with full format low contrast resolution of 88 lines per millimeter (AWAR), full format distortion of less than 14 microns and a complement of 45 Reseau marks and 12 fiducial marks. Weight of the OCPS as supplied, fully loaded is 944 pounds and power dissipation is 273 watts average when in operation, 95 watts in standby. The LFC contains an internal exposure sensor, or will respond to external command. It is able to photograph starfields for inflight calibration upon command.

1980-01-01

224

Characterization of Previously Unidentified Lunar Pyroclastic Deposits using Lunar Reconnaissance Orbiter Camera (LROC) Data  

NASA Astrophysics Data System (ADS)

We used a Lunar Reconnaissance Orbiter Cameras (LROC) global monochrome Wide Angle Camera (WAC) mosaic to conduct a survey of the Moon to search for previously unidentified pyroclastic deposits. Promising locations were examined in detail using LROC multispectral WAC mosaics, high-resolution LROC Narrow Angle Camera (NAC) images, and Clementine multispectral (ultraviolet-visible, or UVVIS) data. Out of 47 potential deposits chosen for closer examination, 13 were selected as probable newly identified pyroclastic deposits. Potential pyroclastic deposits were generally found in settings similar to previously identified deposits, including areas within or near mare deposits adjacent to highlands, within floor-fractured craters, and along fissures in mare deposits. A significant new finding is the discovery of localized pyroclastic deposits within floor-fractured craters Anderson E & F on the lunar farside, isolated from other known similar deposits. These appear to be Alphonsus-type deposits erupted from a series of small vents aligned with the fracture system, suggesting their origin in volatile-rich vulcanian-style eruptions containing relatively small amounts of juvenile material. The presence of such volcanic features on the lunar farside outside of the major basins such as Moscoviense, Orientale, and South Pole - Aitken indicates that magma ascent and eruption have occurred even in the central farside highlands, despite the thicker farside crust. However, this is the only such occurrence that we have located, and it appears to represent an endpoint in the continuum of eruption styles where the eruption was just barely able to reach the surface but could not transport enough magma to the surface to form an effusive deposit. Many of the 47 potential locations screened were eliminated from consideration based on inconclusive evidence regarding their mode of emplacement. Additional optical imaging, or analyses of other data sets such as radar, imaging spectroscopy, or thermal inertia, could result in identification of additional pyroclastic deposits, especially lighter-toned deposits. However, our search also confirms that most major regional and localized pyroclastic deposits have likely been identified on the Moon down to ~100 m/pix resolution, and that additional newly identified pyroclastic deposits are likely to be either isolated small deposits or additional portions of discontinuous, patchy deposits. Based on the locations where we identified previously unidentified pyroclastic deposits, the greatest potential for identification of additional pyroclastic deposits is likely to be in regions with other volcanic constructs associated with mare deposits, highland locations along the margins of maria, and smaller floor-fractured craters that have not yet been thoroughly imaged at higher resolution, particularly on the farside (such as Anderson E & F).

Gustafson, O.; Bell, J. F.; Gaddis, L. R.; Hawke, B. R.; Giguere, T.

2011-12-01

225

Optical performance analysis of plenoptic camera systems  

NASA Astrophysics Data System (ADS)

Adding an array of microlenses in front of the sensor transforms the capabilities of a conventional camera to capture both spatial and angular information within a single shot. This plenoptic camera is capable of obtaining depth information and providing it for a multitude of applications, e.g. artificial re-focusing of photographs. Without the need of active illumination it represents a compact and fast optical 3D acquisition technique with reduced effort in system alignment. Since the extent of the aperture limits the range of detected angles, the observed parallax is reduced compared to common stereo imaging systems, which results in a decreased depth resolution. Besides, the gain of angular information implies a degraded spatial resolution. This trade-off requires a careful choice of the optical system parameters. We present a comprehensive assessment of possible degrees of freedom in the design of plenoptic systems. Utilizing a custom-built simulation tool, the optical performance is quantified with respect to particular starting conditions. Furthermore, a plenoptic camera prototype is demonstrated in order to verify the predicted optical characteristics.

Langguth, Christin; Oberdörster, Alexander; Brückner, Andreas; Wippermann, Frank; Bräuer, Andreas

2014-09-01

226

Issues and solutions in surveillance camera placement  

Microsoft Academic Search

Cameras are becoming a common tool for automated vision purposes due to their low cost. Many surveillance and inspection systems include cameras as their sensor of choice. How useful these camera systems are depends heavily on the positioning of the cameras. This is especially true if the cameras are to be used in automated systems as a beneficial camera ...

Duc Fehr; Loren Fiore; Nikolaos Papanikolopoulos

2009-01-01

227

Universal ICT Picosecond Camera  

NASA Astrophysics Data System (ADS)

The paper reports on the design of an ICT camera operating in the mode of linear or three-frame image scan. The camera incorporates two tubes: time-analyzing ICT PIM-107 [1] with cathode S-11, and brightness amplifier PMU-2V (gain about 10^4) for the image shaped by the first tube. The camera is designed on the basis of streak camera AGAT-SF3 [2] with almost the same power sources, but substantially modified pulse electronics. Schematically, the design of tube PIM-107 is depicted in the figure. The tube consists of cermet housing 1 and photocathode 2, which is made in a separate vacuum volume and introduced into the housing by means of a manipulator. In the immediate vicinity of the photocathode, an accelerating electrode made of a fine-structure grid is located. An electrostatic lens formed by focusing electrode 4 and anode diaphragm 5 produces a beam of electrons with a "remote crossover". The authors have suggested this term for an electron beam whose crossover is 40 to 60 mm away from the anode diaphragm plane, which guarantees high sensitivity of scan plates 6 with respect to multiaperture framing diaphragm 7. Beyond every diaphragm aperture, a pair of deflecting plates 8 is found, shielded from compensation plates 10 by diaphragm 9. The electronic image produced by the photocathode is focused on luminescent screen 11. The tube is controlled with the help of two saw-tooth voltages applied in antiphase across plates 6 and 10. Plates 6 serve for sweeping the electron beam over the surface of diaphragm 7. The beam either passes toward the screen or is blocked by the diaphragm walls. In this manner, three frames are obtained, the number corresponding to that of the diaphragm apertures. Plates 10 serve for compensation of the image streak sweep on the screen. To avoid overlapping of frames, plates 8 receive static potentials responsible for shifting frames on the screen. By changing the potentials applied to plates 8, one can control the spacing between frames and partially or fully overlap the frames. This sort of control is independent of the frame repetition frequency and frame duration, and only determines frame positioning on the screen. Since diaphragm 7 is located in the area of the crossover and the electron trajectories cross in the crossover, the frame is not decomposed into separate elements during its formation. The image is transferred onto the screen practically within the entire frame duration, increasing the aperture ratio of the tube as compared to that in Ref. 3.

Lebedev, Vitaly B.; Syrtzev, V. N.; Tolmachyov, A. M.; Feldman, Gregory G.; Chernyshov, N. A.

1989-06-01

228

6.RP Security Camera  

NSDL National Science Digital Library

This is a task from the Illustrative Mathematics website that is one part of a complete illustration of the standard to which it is aligned. Each task has at least one solution and some commentary that addresses important aspects of the task and its potential use. Here are the first few lines of the commentary for this task: A shop owner wants to prevent shoplifting. He decides to install a security camera on the ceiling of his shop. Below is a picture of the shop floor pla...

2012-05-01

229

Neutron Imaging Camera  

NASA Technical Reports Server (NTRS)

The Neutron Imaging Camera (NIC) is based on the Three-dimensional Track Imager (3-DTI) technology developed at GSFC for gamma-ray astrophysics applications. The 3-DTI, a large-volume time-projection chamber, provides accurate, approximately 0.4 mm resolution, 3-D tracking of charged particles. The incident direction of fast neutrons, En > 0.5 MeV, is reconstructed from the momenta and energies of the proton and triton fragments resulting from ³He(n,p)³H interactions in the 3-DTI volume. The performance of the NIC from laboratory and accelerator tests is presented.

Hunter, Stanley; deNolfo, G. A.; Barbier, L. M.; Link, J. T.; Son, S.; Floyd, S. R.; Guardala, N.; Skopec, M.; Stark, B.

2008-01-01

230

Mars Science Laboratory Engineering Cameras  

NASA Technical Reports Server (NTRS)

NASA's Mars Science Laboratory (MSL) Rover, which launched to Mars in 2011, is equipped with a set of 12 engineering cameras. These cameras are build-to-print copies of the Mars Exploration Rover (MER) cameras, which were sent to Mars in 2003. The engineering cameras weigh less than 300 grams each and use less than 3 W of power. Images returned from the engineering cameras are used to navigate the rover on the Martian surface, deploy the rover robotic arm, and ingest samples into the rover sample processing system. The navigation cameras (Navcams) are mounted to a pan/tilt mast and have a 45-degree square field of view (FOV) with a pixel scale of 0.82 mrad/pixel. The hazard avoidance cameras (Hazcams) are body-mounted to the rover chassis in the front and rear of the vehicle and have a 124-degree square FOV with a pixel scale of 2.1 mrad/pixel. All of the cameras utilize a frame-transfer CCD (charge-coupled device) with a 1024x1024 imaging region and red/near IR bandpass filters centered at 650 nm. The MSL engineering cameras are grouped into two sets of six: one set of cameras is connected to rover computer A and the other set is connected to rover computer B. The MSL rover carries 8 Hazcams and 4 Navcams.
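As a quick editorial consistency check, the stated pixel scales and the 1024-pixel detector width roughly reproduce the quoted fields of view; the small excess for the Navcam is expected because a single pixel scale only approximates a wide-angle field.

import math

# Cross-check of the stated camera geometry: detector width in pixels
# times the per-pixel angular scale should roughly reproduce the FOV.
detector_px = 1024

navcam_scale_mrad = 0.82   # mrad/pixel (from the abstract)
hazcam_scale_mrad = 2.1    # mrad/pixel (from the abstract)

navcam_fov_deg = math.degrees(detector_px * navcam_scale_mrad / 1000.0)
hazcam_fov_deg = math.degrees(detector_px * hazcam_scale_mrad / 1000.0)

print(f"Navcam: ~{navcam_fov_deg:.0f} deg (stated 45 deg square FOV)")
print(f"Hazcam: ~{hazcam_fov_deg:.0f} deg (stated 124 deg square FOV)")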

Maki, Justin N.; Thiessen, David L.; Pourangi, Ali M.; Kobzeff, Peter A.; Lee, Steven W.; Dingizian, Arsham; Schwochert, Mark A.

2012-01-01

231

Small angle spectrometers: Summary  

SciTech Connect

Aspects of experiments at small angles at the Superconducting Super Collider are considered. Topics summarized include a small angle spectrometer, a high contingency spectrometer, dipole and toroid spectrometers, and magnet choices. (LEW)

Courant, E.; Foley, K.J.; Schlein, P.E.; Rosner, J.; Slaughter, J.; Bromberg, C.; Jones, L.; Garren, A.; Groom, D.; Johnson, D.E.

1986-01-01

232

Angles of Reflection  

NSDL National Science Digital Library

This interactive simulation shows what happens to light when it hits a mirror. The simulation allows the user to change the angle of the incoming or incident light wave and see the corresponding reflected angle.

Michael W. Davidson

2006-06-15

233

Reflectance characteristics of the Viking lander camera reference test charts  

NASA Technical Reports Server (NTRS)

Reference test charts provide radiometric, colorimetric, and spatial resolution references for the Viking lander cameras on Mars. Reflectance measurements of these references are described, including the absolute bidirectional reflectance of the radiometric references and the relative spectral reflectance of both radiometric and colorimetric references. Results show that the bidirectional reflectance of the radiometric references is Lambertian to within ±7% for incidence angles between 20 deg and 60 deg, and that their spectral reflectance is constant with wavelength to within ±5% over the spectral range of the cameras. Estimated accuracy of the measurements is ±0.05 in relative spectral reflectance.

Wall, S. D.; Burcher, E. E.; Jabson, D. J.

1975-01-01

234

Bundle Adjustment for Multi-Camera Systems with Points at Infinity  

NASA Astrophysics Data System (ADS)

We present a novel approach for a rigorous bundle adjustment for omnidirectional and multi-view cameras, which enables an efficient maximum-likelihood estimation with image and scene points at infinity. Multi-camera systems are used to increase the resolution, to combine cameras with different spectral sensitivities (Z/I DMC, Vexcel Ultracam) or - like omnidirectional cameras - to augment the effective aperture angle (Blom Pictometry, Rollei Panoscan Mark III). Additionally, multi-camera systems gain in importance for the acquisition of complex 3D structures. For stabilizing camera orientations - especially rotations - one should generally include points at the horizon in the bundle adjustment over long periods of time, which classical bundle adjustment programs are not capable of. We use a minimal representation of homogeneous coordinates for image and scene points. Instead of eliminating the scale factor of the homogeneous vectors by Euclidean normalization, we normalize the homogeneous coordinates spherically. This way we can use images of omnidirectional cameras with a single viewpoint, such as fisheye cameras, and scene points which are far away or at infinity. We demonstrate the feasibility and the potential of our approach on real data taken with a single camera, the stereo camera FinePix Real 3D W3 from Fujifilm and the multi-camera system Ladybug 3 from Point Grey.
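A minimal sketch of the spherical normalization idea, added for illustration and not taken from the authors' implementation: instead of dividing a homogeneous vector by its last coordinate, it is scaled to unit length, so points at infinity remain representable.

import numpy as np

# Spherical versus Euclidean normalization of homogeneous coordinates.

def normalize_euclidean(x):
    # Fails (divides by ~0) for points at infinity, where x[-1] == 0.
    return x / x[-1]

def normalize_spherical(x):
    # Well defined for every nonzero homogeneous vector.
    return x / np.linalg.norm(x)

finite_point = np.array([2.0, 1.0, 4.0, 1.0])        # homogeneous 3D point
point_at_infinity = np.array([1.0, 0.5, 0.0, 0.0])   # direction (horizon point)

print(normalize_euclidean(finite_point))      # works for finite points only
print(normalize_spherical(finite_point))
print(normalize_spherical(point_at_infinity)) # still a valid unit vector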

Schneider, J.; Schindler, F.; Läbe, T.; Förstner, W.

2012-07-01

235

Wide and narrow dual image guidance system for ground vehicle on fast focusing and stereo matching operation  

NASA Astrophysics Data System (ADS)

We have developed a wide and narrow dual-image guidance system for ground vehicles based on fast focusing and stereo matching. Fast focusing captures distance information about the outside world. Stereo matching on the two focused wide images finds characteristic positions, which are then used to obtain fine distance information through fast focusing on the narrow images from the long-focal-length camera. Our fast focusing algorithm works precisely on differential images such as the Daubechies wavelet-transformed high-pass image, the Roberts image, the Prewitt image, the Sobel image, and the Laplacian image. After stereo matching on the focused wide images, the two cameras perform the narrow-image focusing operation. This procedure establishes reliable detection of the object and yields fine image information about it. The pointing of the long-focal-length narrow-image camera uses the pixel address information from stereo matching together with a two-axis gimbal of fine resolution. We carried out experiments on object detection by stereo matching and on ranging fine distances by narrow-image focusing. The experiments demonstrate appropriate detection and fine pointing for narrow-image focusing, meeting the guidance requirements of the ground vehicle.
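The differential images listed above are commonly reduced to a scalar focus measure; the sketch below uses the variance of a discrete Laplacian on synthetic data as a generic stand-in for such a measure, not the authors' algorithm.

import numpy as np

# Generic focus measure: the variance of a discrete Laplacian is large
# for sharply focused images and small for blurred ones.

def laplacian_variance(img):
    # 4-neighbour discrete Laplacian computed with array slicing.
    lap = (img[1:-1, :-2] + img[1:-1, 2:] +
           img[:-2, 1:-1] + img[2:, 1:-1] - 4.0 * img[1:-1, 1:-1])
    return lap.var()

def best_focus(frames):
    # Pick the frame (focus position) with the highest focus measure.
    scores = [laplacian_variance(f) for f in frames]
    return int(np.argmax(scores)), scores

rng = np.random.default_rng(0)
sharp = rng.random((64, 64))                          # high-frequency content
blurred = 0.25 * (sharp + np.roll(sharp, 1, 0) +
                  np.roll(sharp, 1, 1) + np.roll(sharp, 1, (0, 1)))
idx, scores = best_focus([blurred, sharp])
print("sharpest frame index:", idx)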

Akiyama, Akira; Kobayashi, Nobuaki; Mutoh, Eiichiro; Kumagai, Hideo; Ishii, Hiromitsu

2011-10-01

236

MEMS digital camera  

NASA Astrophysics Data System (ADS)

MEMS technology uses photolithography and etching of silicon wafers to enable mechanical structures with less than 1 µm tolerance, important for the miniaturization of imaging systems. In this paper, we present the first silicon MEMS digital auto-focus camera for use in cell phones with a focus range of 10 cm to infinity. At the heart of the new silicon MEMS digital camera, a simple and low-cost electromagnetic actuator impels a silicon MEMS motion control stage on which a lens is mounted. The silicon stage ensures precise alignment of the lens with respect to the imager, and enables precision motion of the lens over a range of 300 µm with < 5 µm hysteresis and < 2 µm repeatability. Settling time is < 15 ms for 200 µm step, and < 5 ms for 20 µm step enabling AF within 0.36 sec at 30 fps. The precise motion allows COTS optics to maintain MTF > 0.8 at 20 cy/mm up to 80% field over the full range of motion. Accelerated lifetime testing has shown that the alignment and precision of motion is maintained after 8,000 g shocks, thermal cycling from -40 °C to 85 °C, and operation over 20 million cycles.

Gutierrez, R. C.; Tang, T. K.; Calvet, R.; Fossum, E. R.

2007-02-01

237

Angles All Around  

NSDL National Science Digital Library

Standard: Identify and measure right, obtuse, and acute angles. This is a two day activity. OBJECTIVE: We have learned about five different types of angles: right, acute, obtuse, straight, and reflex. We have also learned how to use a protractor to measure angles. With this lesson, you will practice what ...

Mrs. Bennett

2011-12-14

238

Digital synchroballistic schlieren camera for high-speed photography of bullets and rocket sleds  

NASA Astrophysics Data System (ADS)

A high-speed digital streak camera designed for simultaneous high-resolution color photography and focusing schlieren imaging is described. The camera uses a computer-controlled galvanometer scanner to achieve synchroballistic imaging through a narrow slit. Full color 20 megapixel images of a rocket sled moving at 480 m/s and of projectiles fired at around 400 m/s were captured, with high-resolution schlieren imaging in the latter cases, using conventional photographic flash illumination. The streak camera can achieve a line rate for streak imaging of up to 2.4 million lines/s.
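Synchroballistic (slit-scan) imaging requires the line rate to match the apparent image velocity at the slit. The sketch below illustrates that matching; the object speed is taken from the abstract, while the magnification and pixel pitch are hypothetical values chosen only for illustration.

# Matching the streak line rate to image motion for synchroballistic
# imaging. Object speed is from the abstract; magnification and pixel
# pitch are hypothetical values chosen only to illustrate the relation.

object_speed = 480.0        # m/s (rocket sled, from the abstract)
magnification = 1.0 / 100.0 # image/object scale (assumed)
pixel_pitch = 5.0e-6        # metres per line on the sensor (assumed)

image_speed = object_speed * magnification          # m/s at the slit
required_line_rate = image_speed / pixel_pitch      # lines per second

print(f"required line rate: {required_line_rate:.2e} lines/s")
# ~9.6e5 lines/s here, within the camera's stated 2.4 million lines/s maximum.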

Buckner, Benjamin D.; L'Esperance, Drew

2013-08-01

239

In-flight calibration of the Dawn Framing Camera II: Flat fields and stray light correction  

NASA Astrophysics Data System (ADS)

The NASA Dawn spacecraft acquired thousands of images of asteroid Vesta during its year-long orbital tour, and is now on its way to asteroid Ceres. A method for calibrating images acquired by the onboard Framing Camera was described by Schröder et al. (Schröder et al. [2013]. Icarus 226, 1304). However, their method is only valid for point sources. In this paper we extend the calibration to images of extended sources like Vesta. For this, we devise a first-order correction for in-field stray light, which is known to plague images taken through the narrow-band filters, and revise the flat fields that were acquired in an integrating sphere before launch. We used calibrated images of the Vesta surface to construct simple photometric models for all filters, which allow us to study how the spectrum changes with increasing phase angle (phase reddening). In combination with these models, our calibration method can be used to create near-seamless mosaics that are radiometrically accurate to a few percent. Such mosaics are provided in JVesta, the Vesta version of the JMARS geographic information system.
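Phase reddening is often summarized as the rate of change of spectral slope with phase angle. The sketch below fits a straight line to invented data to illustrate that summary; it is not the Dawn Framing Camera pipeline.

import numpy as np

# Illustrative quantification of phase reddening: fit a straight line to
# spectral slope versus phase angle. The data points below are invented.

phase_deg = np.array([10., 20., 30., 40., 50., 60.])
# Spectral slope between two filters (arbitrary units); made-up values
# that increase with phase angle, i.e. the spectrum "reddens".
spectral_slope = np.array([0.10, 0.12, 0.15, 0.17, 0.20, 0.22])

reddening_rate, slope_at_zero_phase = np.polyfit(phase_deg, spectral_slope, 1)
print(f"reddening rate: {reddening_rate:.4f} per degree of phase angle")
print(f"extrapolated slope at 0 deg phase: {slope_at_zero_phase:.3f}")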

Schröder, S. E.; Mottola, S.; Matz, K.-D.; Roatsch, T.

2014-05-01

240

Transmission electron microscope CCD camera  

DOEpatents

In order to improve the performance of a CCD camera on a high voltage electron microscope, an electron decelerator is inserted between the microscope column and the CCD. This arrangement optimizes the interaction of the electron beam with the scintillator of the CCD camera while retaining optimization of the microscope optics and of the interaction of the beam with the specimen. Changing the electron beam energy between the specimen and camera allows both to be optimized.

Downing, Kenneth H. (Lafayette, CA)

1999-01-01

241

The All Sky Camera Network  

NSDL National Science Digital Library

In 2001, the All Sky Camera Network came to life as an outreach program to connect the Denver Museum of Nature and Science (DMNS) exhibit Space Odyssey with Colorado schools. The network is comprised of cameras placed strategically at schools throughout Colorado to capture fireballs--rare events that produce meteorites. Students involved in the network participate in an authentic, inquiry-based experience by tracking meteor events. This article discusses the past, present, and future of the All Sky Camera Network.

Andy Caldwell

2005-02-01

242

The LSST camera instrument model  

NASA Astrophysics Data System (ADS)

The design of the Large Synoptic Survey Telescope (LSST) requires a camera system of unprecedented size and complexity. Achieving the science goals of the LSST project, through design, fabrication, integration, and operation, requires a thorough understanding of the camera performance. Essential to this effort is camera instrument modeling, which defines the effects of a large number of potential mechanical, optical, electronic, or sensor variations; these effects can only be captured with sophisticated modeling that incorporates all of the crucial parameters. This paper presents the ongoing development of LSST camera instrument modeling and details the parametric issues and attendant analysis involved with this modeling.

Gilmore, D. Kirk K.; Kahn, Steven; Hascall, Pat; Ku, John; O'Connor, Paul; Rasmussen, Andrew; Riot, Vincent; Singal, Jack

2012-09-01

243

Particle friction angles in steep mountain channels  

NASA Astrophysics Data System (ADS)

Sediment transport rates in steep mountain channels are typically an order of magnitude lower than predicted by models developed for lowland rivers. One hypothesis for this observation is that particles are more stable in mountain channels due to particle-particle interlocking or bridging across the channel width. This hypothesis has yet to be tested, however, because we lack direct measurements of particle friction angles in steep mountain channels. Here we address this data gap by directly measuring the minimum force required to dislodge sediment (pebbles to boulders) and the sediment weight in mountain channels using a handheld force gauge. At eight sites in California, with reach-averaged bed angles ranging from 0.5° to 23° and channel widths ranging from 2 m to 16 m, we show that friction angles in natural streams average 68° and are 16° larger than those typically measured in laboratory experiments, which is likely due to particle interlocking and burial. Results also show that larger grains are disproportionately more stable than predicted by existing models and that grains organized into steps are twice as stable as grains outside of steps. However, the mean particle friction angle does not vary systematically with bed slope. These results do not support systematic increases in friction angle in steeper and narrower channels to explain the observed low sediment transport rates in mountain channels. Instead, the spatial pattern and grain-size dependence of particle friction angles may indirectly lower transport rates in steep, narrow channels by stabilizing large clasts and channel-spanning steps, which act as momentum sinks due to form drag.
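The measurement principle, a force-gauge reading compared with the particle weight, can be illustrated with a simplified force balance; the sketch below assumes the dislodging force acts parallel to the bed and uses invented numbers, so it is an illustration of the idea rather than the study's data reduction.

import math

# Simplified estimate of a particle friction angle from a force-gauge
# measurement. Assumes the dislodging force F is applied parallel to the
# bed in the downslope direction; motion starts when driving forces equal
# the frictional resistance:  F + W*sin(theta) = W*cos(theta)*tan(phi).
# All numbers are invented for illustration.

def friction_angle_deg(F_newton, W_newton, bed_angle_deg):
    t = math.radians(bed_angle_deg)
    tan_phi = (F_newton + W_newton * math.sin(t)) / (W_newton * math.cos(t))
    return math.degrees(math.atan(tan_phi))

W = 40.0      # particle weight in newtons (assumed, roughly a 4 kg cobble)
F = 85.0      # minimum dislodging force in newtons (assumed)
theta = 10.0  # reach-averaged bed angle in degrees (assumed)

print(f"estimated friction angle: {friction_angle_deg(F, W, theta):.1f} deg")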

Prancevic, Jeff P.; Lamb, Michael P.

2015-02-01

244

A Motionless Camera  

NASA Technical Reports Server (NTRS)

Omniview, a motionless, noiseless, exceptionally versatile camera was developed for NASA as a receiving device for guiding space robots. The system can see in one direction and provide as many as four views simultaneously. Developed by Omniview, Inc. (formerly TRI) under a NASA Small Business Innovation Research (SBIR) grant, the system's image transformation electronics produce a real-time image from anywhere within a hemispherical field. Lens distortion is removed, and a corrected "flat" view appears on a monitor. Key elements are a high resolution charge coupled device (CCD), image correction circuitry and a microcomputer for image processing. The system can be adapted to existing installations. Applications include security and surveillance, teleconferencing, imaging, virtual reality, broadcast video and military operations. Omniview technology is now called IPIX. The company was founded in 1986 as TeleRobotics International, became Omniview in 1995, and changed its name to Interactive Pictures Corporation in 1997.

1994-01-01

245

Neutron Imaging Camera  

NASA Technical Reports Server (NTRS)

We describe the Neutron Imaging Camera (NIC) being developed for DTRA applications by NASA/GSFC and NSWC/Carderock. The NIC is based on the Three-dimensional Track Imager (3-DTI) technology developed at GSFC for gamma-ray astrophysics applications. The 3-DTI, a large-volume time-projection chamber, provides accurate, approximately 0.4 mm resolution, 3-D tracking of charged particles. The incident direction of fast neutrons, E_N > 0.5 MeV, is reconstructed from the momenta and energies of the proton and triton fragments resulting from ³He(n,p)³H interactions in the 3-DTI volume. We present angular and energy resolution performance of the NIC derived from accelerator tests.

Hunter, Stanley D.; DeNolfo, Georgia; Floyd, Sam; Krizmanic, John; Link, Jason; Son, Seunghee; Guardala, Noel; Skopec, Marlene; Stark, Robert

2008-01-01

246

LSST Camera Electronics  

NASA Astrophysics Data System (ADS)

The 3.2 Gpixel LSST camera will be read out by means of 189 highly segmented 4K x 4K CCDs. A total of 3024 video channels will be processed by a modular, in-cryostat electronics package based on two custom multichannel analog ASICs now in development. Performance goals of 5 electrons noise, 0.01% electronic crosstalk, and 80 mW power dissipation per channel are targeted. The focal plane is organized as a set of 12K x 12K sub-mosaics ("rafts") with front end electronics housed in an enclosure falling within the footprint of the CCDs making up the raft. The assembly of CCDs, baseplate, electronics boards, and cooling components constitutes a self-contained and testable 144 Mpix imager ("raft tower"), and 21 identical raft towers make up the LSST science focal plane. Electronic, mechanical, and thermal prototypes are now undergoing testing and results will be presented at the meeting.
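The segmentation figures quoted above are mutually consistent, as the short check below shows; the 16-segment-per-CCD layout is inferred from the channel count rather than stated explicitly.

# Consistency check of the LSST focal-plane segmentation figures quoted
# above. The 16-segments-per-CCD layout is inferred from the channel
# count, not stated explicitly in the abstract.

n_ccds = 189
n_channels = 3024
ccd_px = 4096 * 4096          # nominal "4K x 4K" device

channels_per_ccd = n_channels // n_ccds          # 16 video channels per CCD
px_per_channel = ccd_px / channels_per_ccd       # ~1.05 Mpix per segment
total_px = n_ccds * ccd_px                       # ~3.2 Gpix camera

print(f"channels per CCD: {channels_per_ccd}")
print(f"pixels per video channel: {px_per_channel/1e6:.2f} Mpix")
print(f"total focal plane: {total_px/1e9:.2f} Gpix")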

Newcomer, F. Mitchell; Bailey, S.; Britton, C. L.; Felt, N.; Geary, J.; Hashimi, K.; Lebbolo, H.; Ning, Z.; O'Connor, P.; Oliver, J.; Radeka, V.; Sefri, R.; Tocut, V.; Van Berg, R.

2009-01-01

247

What convention is used for the illumination and view angles?  

Atmospheric Science Data Center

... from the direction of travel to local north. For both the Sun and cameras, azimuth describes the direction in which the photons are ... light, and near 180 degrees for backward scattered light. Sun and View angles are available in the MISR Geometric Parameters (MIB2GEOP) ...

2014-12-08

248

Testing of the Apollo 15 Metric Camera System.  

NASA Technical Reports Server (NTRS)

Description of tests conducted (1) to assess the quality of Apollo 15 Metric Camera System data and (2) to develop production procedures for total block reduction. Three strips of metric photography over the Hadley Rille area were selected for the tests. These photographs were utilized in a series of evaluation tests culminating in an orbitally constrained block triangulation solution. Results show that film deformations up to 25 and 5 microns are present in the mapping and stellar materials, respectively. Stellar reductions can provide mapping camera orientations with an accuracy that is consistent with the accuracies of other parameters in the triangulation solutions. Pointing accuracies of 4 to 10 microns can be expected for the mapping camera materials, depending on variations in resolution caused by changing sun angle conditions.

Helmering, R. J.; Alspaugh, D. H.

1972-01-01

249

The "All Sky Camera Network"  

ERIC Educational Resources Information Center

In 2001, the "All Sky Camera Network" came to life as an outreach program to connect the Denver Museum of Nature and Science (DMNS) exhibit "Space Odyssey" with Colorado schools. The network is comprised of cameras placed strategically at schools throughout Colorado to capture fireballs--rare events that produce meteorites. Meteorites have great…

Caldwell, Andy

2005-01-01

250

An Educational PET Camera Model  

ERIC Educational Resources Information Center

Positron emission tomography (PET) cameras are now in widespread use in hospitals. A model of a PET camera has been installed in Stockholm House of Science and is used to explain the principles of PET to school pupils as described here.

Johansson, K. E.; Nilsson, Ch.; Tegner, P. E.

2006-01-01

251

Camera artifacts in IUE spectra  

NASA Technical Reports Server (NTRS)

This study of emission line mimicking features in the IUE cameras has produced an atlas of artifacts in high-dispersion images with an accompanying table of prominent artifacts, and a table of prominent artifacts in the raw images along with a medium image of the sky background for each IUE camera.

Bruegman, O. W.; Crenshaw, D. M.

1994-01-01

252

The Eye of the Camera  

Microsoft Academic Search

This study addresses the effects of security cameras on prosocial behavior. Results from previous studies indicate that the presence of others can trigger helping behavior, arising from the need for approval of others. Extending these findings, the authors propose that security cameras can likewise trigger such approval-seeking behaviors by implying the presence of a watchful eye. Because people vary in

Rompay van Thomas J. L; Dorette J. Vonk; Marieke L. Fransen

2009-01-01

253

Airborne ballistic camera tracking systems  

NASA Technical Reports Server (NTRS)

An operational airborne ballistic camera tracking system was tested for operational and data reduction feasibility. The acquisition and data processing requirements of the system are discussed. Suggestions for future improvements are also noted. A description of the data reduction mathematics is outlined. Results from a successful reentry test mission are tabulated. The test mission indicated that airborne ballistic camera tracking systems are feasible.

Redish, W. L.

1976-01-01

254

Math Applications with Digital Cameras  

NSDL National Science Digital Library

Digital cameras are excellent tools for enhancing the math classroom. They may also help, under the current math reform movement, to move away from isolated problems in a drill-and-practice format to one more rooted in authentic experiences and problem solving. Check this Web site for lesson ideas that combine digital cameras and math.

Cavanaugh, Terrance

255

Mars Exploration Rover engineering cameras  

USGS Publications Warehouse

NASA's Mars Exploration Rover (MER) Mission will place a total of 20 cameras (10 per rover) onto the surface of Mars in early 2004. Fourteen of the 20 cameras are designated as engineering cameras and will support the operation of the vehicles on the Martian surface. Images returned from the engineering cameras will also be of significant importance to the scientific community for investigative studies of rock and soil morphology. The Navigation cameras (Navcams, two per rover) are a mast-mounted stereo pair each with a 45° square field of view (FOV) and an angular resolution of 0.82 milliradians per pixel (mrad/pixel). The Hazard Avoidance cameras (Hazcams, four per rover) are a body-mounted, front- and rear-facing set of stereo pairs, each with a 124° square FOV and an angular resolution of 2.1 mrad/pixel. The Descent camera (one per rover), mounted to the lander, has a 45° square FOV and will return images with spatial resolutions of ~4 m/pixel. All of the engineering cameras utilize broadband visible filters and 1024 x 1024 pixel detectors. Copyright 2003 by the American Geophysical Union.

Maki, J.N.; Bell, J.F., III; Herkenhoff, K.E.; Squyres, S.W.; Kiely, A.; Klimesh, M.; Schwochert, M.; Litwin, T.; Willson, R.; Johnson, Aaron H.; Maimone, M.; Baumgartner, E.; Collins, A.; Wadsworth, M.; Elliot, S.T.; Dingizian, A.; Brown, D.; Hagerott, E.C.; Scherr, L.; Deen, R.; Alexander, D.; Lorre, J.

2003-01-01

256

Digital Cameras for Student Use.  

ERIC Educational Resources Information Center

Describes the features, equipment and operations of digital cameras and compares three different digital cameras for use in education. Price, technology requirements, features, transfer software, and accessories for the Kodak DC25, Olympus D-200L and Casio QV-100 are presented in a comparison table. (AEF)

Simpson, Carol

1997-01-01

257

SEOS frame camera applications study  

NASA Technical Reports Server (NTRS)

A research and development satellite is discussed which will provide opportunities for observation of transient phenomena that fall within the fixed viewing circle of the spacecraft. Possible applications of frame cameras for SEOS are evaluated. The computed lens characteristics for each camera are listed.

1974-01-01

258

Advisory Surveillance Cameras Page 1 of 2  

E-print Network

Advisory - Surveillance Cameras, May 2008: ADVISORY -- USE OF CAMERAS/VIDEO SURVEILLANCE. ... how will the tape be produced and how will it be secured, who will have access to the tape? 7. At what will the camera ... to ensure the cameras' presence doesn't create a false sense of security.

Liebling, Michael

259

Surveillance Camera Scheduling: A Virtual Vision Approach  

E-print Network

... a surveillance system, comprising wide field-of-view (FOV) passive cameras and pan/tilt/zoom (PTZ) active cameras ... cameras generate synthetic video feeds that emulate those generated by real surveillance cameras.

Terzopoulos, Demetri

260

Saturn's hydrogen aurora: Wide field and planetary camera 2 imaging from the Hubble Space Telescope  

Microsoft Academic Search

Wide field and planetary camera 2/Hubble Space Telescope (WFPC2/HST) images of Saturn's far ultraviolet aurora reveal emissions confined to a narrow band of latitudes near Saturn's north and south poles. The aurorae are most prominent in the morning sector with patterns that appear fixed in local time. The geographic distribution and vertical extent of the auroral emissions seen in these ...

John T. Trauger; John T. Clarke; Gilda E. Ballester; Robin W. Evans; Christopher J. Burrows; David Crisp; John S. Gallagher; Richard E. Griffiths; J. Jeff Hester; John G. Hoessel; Jon A. Holtzman; John E. Krist; Jeremy R. Mould; Raghvendra Sahai; Paul A. Scowen; Karl R. Stapelfeldt; Alan M. Watson

1998-01-01

261

Performance of new low-cost 1/3" security cameras for meteor surveillance  

NASA Astrophysics Data System (ADS)

It has been almost 5 years since the CAMS (Cameras for All-sky Meteor Surveillance) system specifications were designed for video meteor surveillance. CAMS has been based on a relatively expensive black-and-white Watec WAT-902H2 Ultimate camera, which uses a 1/2" sensor. In this paper, we investigate the ability of new, lower cost color cameras based on smaller 1/3" sensors to perform adequately for CAMS. We did not expect them to equal or outperform the sensitivity of the Watec 1/2" camera for the same field of view, but the goal was to see if they could perform within the tolerances of the sensitivity requirements for the CAMS project. Their lower cost brings deployment of meteor surveillance cameras within reach of amateur astronomers and makes it possible to deploy many more cameras to increase yield. The lens focal length is matched to the elevation angle of the camera to maintain an image scale and spatial resolution close to that of the standard CAMS camera and lens combination, crucial for obtaining sufficiently accurate orbital elements. An all-sky array based on 16 such cameras, to be operated from a single computer, was built and the performance of individual cameras was tested.
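The focal-length matching mentioned above can be illustrated with a simple range calculation: the distance to the meteor layer grows as camera elevation drops, so a longer focal length is needed to hold the spatial resolution at that layer constant. The meteor-layer height, pixel pitch, and target resolution below are assumed values, not CAMS specifications.

import math

# Illustrative focal-length matching: resolution at the meteor layer is
# pixel_pitch * range / focal_length, so focal_length scales with range.

meteor_height_km = 90.0      # typical meteor ablation altitude (assumed)
pixel_pitch_um = 6.0         # sensor pixel pitch (assumed)
target_res_m = 40.0          # desired resolution at the meteor layer (assumed)

for elevation_deg in (90, 60, 45, 30):
    range_km = meteor_height_km / math.sin(math.radians(elevation_deg))
    focal_mm = pixel_pitch_um * 1e-6 * range_km * 1e3 / target_res_m * 1e3
    print(f"elevation {elevation_deg:2d} deg: range {range_km:5.1f} km, "
          f"focal length ~{focal_mm:.1f} mm")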

Samuels, Dave; Wray, James; Gural, Peter S.; Jenniskens, Peter

2014-02-01

262

IMAX camera (12-IML-1)  

NASA Technical Reports Server (NTRS)

The IMAX camera system is used to record on-orbit activities of interest to the public. Because of the extremely high resolution of the IMAX camera, projector, and audio systems, the audience is afforded a motion picture experience unlike any other. IMAX and OMNIMAX motion picture systems were designed to create motion picture images of superior quality and audience impact. The IMAX camera is a 65 mm, single lens, reflex viewing design with a 15 perforation per frame horizontal pull across. The frame size is 2.06 x 2.77 inches. Film travels through the camera at a rate of 336 feet per minute when the camera is running at the standard 24 frames/sec.
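The quoted film transport speed follows from the frame rate and pull-across if the standard 65 mm perforation pitch (about 0.187 inch, an assumed value not given in the abstract) is used.

# Consistency check of the quoted IMAX film transport speed. The 65 mm
# perforation pitch of ~0.187 inch is an assumed standard value.

perf_pitch_in = 0.187        # assumed 65 mm perforation pitch (inches)
perfs_per_frame = 15         # from the abstract (horizontal pull-across)
frame_rate = 24              # frames per second (from the abstract)

inches_per_second = perf_pitch_in * perfs_per_frame * frame_rate
feet_per_minute = inches_per_second / 12.0 * 60.0
print(f"film speed: ~{feet_per_minute:.0f} ft/min (abstract quotes 336 ft/min)")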

1992-01-01

263

Coherent infrared imaging camera (CIRIC)  

SciTech Connect

New developments in 2-D, wide-bandwidth HgCdTe (MCT) and GaAs quantum-well infrared photodetectors (QWIP) coupled with Monolithic Microwave Integrated Circuit (MMIC) technology are now making focal plane array coherent infrared (IR) cameras viable. Unlike conventional IR cameras which provide only thermal data about a scene or target, a coherent camera based on optical heterodyne interferometry will also provide spectral and range information. Each pixel of the camera, consisting of a single photo-sensitive heterodyne mixer followed by an intermediate frequency amplifier and illuminated by a separate local oscillator beam, constitutes a complete optical heterodyne receiver. Applications of coherent IR cameras are numerous and include target surveillance, range detection, chemical plume evolution, monitoring stack plume emissions, and wind shear detection.

Hutchinson, D.P.; Simpson, M.L.; Bennett, C.A.; Richards, R.K.; Emery, M.S.; Crutcher, R.I.; Sitter, D.N. Jr.; Wachter, E.A.; Huston, M.A.

1995-07-01

264

Tacoma Narrows Bridge: Extreme History  

NSDL National Science Digital Library

Stretching across the southern portion of Puget Sound, the elegant Tacoma Narrows bridge is considered one of the finest suspension bridges in the United States. The current bridge is the second on the site, as it was constructed in 1950 to serve as a replacement to the famous "Galloping Gertie" bridge, which collapsed in a windstorm in the fall of 1940. Currently, the Washington State Department of Transportation is building a bridge to replace the existing structure, and it is anticipated that it will be completed in 2007. This site offers a host of materials on all three structures, including ample information on the construction of the bridges and their aesthetic appeal. Along with these materials, the site also provides a glossary of related terms, Weird Facts, and some information about the dog "Tubby", who perished when "Galloping Gertie" collapsed on that fateful fall day back in 1940.

265

Real-time object detection for a panning\\/tilting surveillance camera using an embedded DSP platform  

Microsoft Academic Search

Object detection for a non-static (panning and tilting) camera has been the centre of interest for various applications. In this paper, we present a novel real-time object detection approach that can efficiently handle the problem of a moving camera that performs panning and tilting motions to incorporate a wide field of view for more security and a wider angle of ...

M. L. Lemmou; N. Dahnoun

2010-01-01

266

Observation of Planetary Motion Using a Digital Camera  

ERIC Educational Resources Information Center

A digital SLR camera with a standard lens (50 mm focal length, f/1.4) on a fixed tripod is used to obtain photographs of the sky which contain stars up to 8ᵐ apparent magnitude. The angle of view is large enough to ensure visual identification of the photograph with a large sky region in a stellar map. The resolution is sufficient to…

Meyn, Jan-Peter

2008-01-01

267

Mars Orbiter Camera Views the 'Face on Mars' - Best View from Viking  

NASA Technical Reports Server (NTRS)

Shortly after midnight Sunday morning (5 April 1998 12:39 AM PST), the Mars Orbiter Camera (MOC) on the Mars Global Surveyor (MGS) spacecraft successfully acquired a high resolution image of the 'Face on Mars' feature in the Cydonia region. The image was transmitted to Earth on Sunday, and retrieved from the mission computer data base Monday morning (6 April 1998). The image was processed at the Malin Space Science Systems (MSSS) facility 9:15 AM and the raw image immediately transferred to the Jet Propulsion Laboratory (JPL) for release to the Internet. The images shown here were subsequently processed at MSSS.

The picture was acquired 375 seconds after the spacecraft's 220th close approach to Mars. At that time, the 'Face', located at approximately 40.8° N, 9.6° W, was 275 miles (444 km) from the spacecraft. The 'morning' sun was 25° above the horizon. The picture has a resolution of 14.1 feet (4.3 meters) per pixel, making it ten times higher resolution than the best previous image of the feature, which was taken by the Viking Mission in the mid-1970's. The full image covers an area 2.7 miles (4.4 km) wide and 25.7 miles (41.5 km) long.

This Viking Orbiter image is one of the best Viking pictures of the area Cydonia where the 'Face' is located. Marked on the image are the 'footprint' of the high resolution (narrow angle) Mars Orbiter Camera image and the area seen in enlarged views (dashed box). See PIA01440-1442 for these images in raw and processed form.

Malin Space Science Systems and the California Institute of Technology built the MOC using spare hardware from the Mars Observer mission. MSSS operates the camera from its facilities in San Diego, CA. The Jet Propulsion Laboratory's Mars Surveyor Operations Project operates the Mars Global Surveyor spacecraft with its industrial partner, Lockheed Martin Astronautics, from facilities in Pasadena, CA and Denver, CO.

1998-01-01

268

Overview of Neutrino Mixing Models and Their Mixing Angle Predictions  

SciTech Connect

An overview of neutrino-mixing models is presented with emphasis on the types of horizontal flavor and vertical family symmetries that have been invoked. Distributions for the mixing angles of many models are displayed. Ways to differentiate among the models and to narrow the list of viable models are discussed.

Albright, Carl H.

2009-11-01

269

21 CFR 892.1110 - Positron camera.  

Code of Federal Regulations, 2010 CFR

...2010-04-01 2010-04-01 false Positron camera. 892.1110 Section 892.1110 Food...Diagnostic Devices § 892.1110 Positron camera. (a) Identification. A positron camera is a device intended to image the...

2010-04-01

270

21 CFR 886.1120 - Ophthalmic camera.  

Code of Federal Regulations, 2010 CFR

...2010-04-01 2010-04-01 false Ophthalmic camera. 886.1120 Section 886.1120 Food...Diagnostic Devices § 886.1120 Ophthalmic camera. (a) Identification. An ophthalmic camera is an AC-powered device intended to...

2010-04-01

271

Smart Camera Networks in Virtual Reality  

E-print Network

... and active simulated video surveillance cameras that provides extensive coverage of a large virtual public ... a simulated network of smart cameras performs persistent visual surveillance of individual pedestrians ... Keywords: observation; sensor networks; smart cameras; virtual reality; visual surveillance.

Qureshi, Faisal Z.

272

Camera handoff and placement for automated tracking systems with multiple omnidirectional cameras  

E-print Network

Keywords: camera placement; camera handoff; multi-object multi-camera tracking; automated surveillance systems. In a multi-camera surveillance system, both camera handoff and placement play an important ... tracking, and stereo vision of omnidirectional cameras. With the increased scale of practical surveillance ...

Abidi, Mongi A.

273

Interline transfer CCD camera  

SciTech Connect

An interline CCD sensing device for use in a camera system includes an imaging area sensitive to impinging light, for generating charges corresponding to the intensity of the impinging light. Sixteen independent registers R1 - R16 sequentially receive the interline data from the imaging area, corresponding to the generated charges. Sixteen output amplifiers S1 - S16 and sixteen ports P1 - P16 sequentially transfer the interline data, one pixel at a time, in order to supply a desired image transfer speed. The imaging area is segmented into sixteen independent imaging segments A1 - A16, each of which corresponds to one register, one output amplifier, and one output port. Each one of the imaging segments A1 - A16 includes an array of rows and columns of pixels. Each pixel includes a photogate area, an interline CCD channel area, and an anti-blooming area. The anti-blooming area is, in turn, divided into an anti-blooming barrier and an anti-blooming drain.

Prokop, M.S.; McCurnin, T.W.; Stump, C.J.; Stradling, G.L.

1993-12-31

274

LSST Camera Electronics  

NASA Astrophysics Data System (ADS)

The 3.2 Gpixel LSST camera will be read out by means of 189 highly segmented 4K x 4K CCDs. A total of 3024 video channels will be processed by a modular, in-cryostat electronics package based on two custom multichannel analog ASICs now in development. Performance goals of 5 electrons noise, 0.01% electronic crosstalk, and 80 mW power dissipation per channel are targeted. The focal plane is organized as a set of 12K x 12K sub-mosaics ("rafts") with front end electronics housed in an enclosure falling within the footprint of the CCDs making up the raft. CCD surfaces within a raft are required to be coplanar to within 6.5 microns. The assembly of CCDs, baseplate, electronics boards, and cooling components constitutes a self-contained and testable 144 Mpix imager ("raft tower"), and 21 identical raft towers make up the LSST science focal plane. Electronic, mechanical, and thermal prototypes are now undergoing testing and results will be presented at the meeting.

Van Berg, Richard; O'Connor, P.; Oliver, J.; Geary, J.; Radeka, V.

2007-12-01

275

Focal Plane Metrology for the LSST Camera  

SciTech Connect

Meeting the science goals for the Large Synoptic Survey Telescope (LSST) translates into a demanding set of imaging performance requirements for the optical system over a wide (3.5°) field of view. In turn, meeting those imaging requirements necessitates maintaining precise control of the focal plane surface (10 µm P-V) over the entire field of view (640 mm diameter) at the operating temperature (T ≈ -100 °C) and over the operational elevation angle range. We briefly describe the hierarchical design approach for the LSST Camera focal plane and the baseline design for assembling the flat focal plane at room temperature. Preliminary results of gravity load and thermal distortion calculations are provided, and early metrological verification of candidate materials under cold thermal conditions are presented. A detailed, generalized method for stitching together sparse metrology data originating from differential, non-contact metrological data acquisition spanning multiple (non-continuous) sensor surfaces making up the focal plane, is described and demonstrated. Finally, we describe some in situ alignment verification alternatives, some of which may be integrated into the camera's focal plane.
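The stitching problem described above can be illustrated with a toy version: differential measurements give relative heights between neighbouring sensors, and per-sensor piston offsets are recovered by least squares with one sensor held fixed. This is a simplified sketch of the general idea, not the paper's generalized method.

import numpy as np

# Toy stitching of sparse differential metrology: recover a piston
# (height) offset for each sensor from pairwise differential
# measurements between neighbouring sensors, by linear least squares.

true_offsets = np.array([0.0, 2.0, -1.5, 0.7])      # microns (synthetic)
pairs = [(0, 1), (1, 2), (2, 3), (0, 3)]            # overlapping neighbours

rng = np.random.default_rng(1)
# Each differential measurement observes z_j - z_i plus noise.
d = np.array([true_offsets[j] - true_offsets[i] for i, j in pairs])
d += rng.normal(0.0, 0.05, size=d.size)

A = np.zeros((len(pairs) + 1, true_offsets.size))
b = np.zeros(len(pairs) + 1)
for row, (i, j) in enumerate(pairs):
    A[row, i], A[row, j] = -1.0, 1.0
    b[row] = d[row]
A[-1, 0], b[-1] = 1.0, 0.0        # gauge constraint: sensor 0 defines zero

z_hat, *_ = np.linalg.lstsq(A, b, rcond=None)
print("recovered offsets (microns):", np.round(z_hat, 3))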

A Rasmussen, Andrew P.; Hale, Layton; Kim, Peter; Lee, Eric; Perl, Martin; Schindler, Rafe; Takacs, Peter; Thurston, Timothy; /SLAC

2007-01-10

276

X-ray Pinhole Camera Measurements  

SciTech Connect

The development of the rod pinch diode [1] has led to high-resolution radiography for dynamic events such as explosive tests. Rod pinch diodes use a small diameter anode rod, which extends through the aperture of a cathode plate. Electrons borne off the aperture surface can self-insulate and pinch onto the tip of the rod, creating an intense, small x-ray source (Primary Pinch). This source has been utilized as the main diagnostic on numerous experiments that include high-value, single-shot events. In such applications there is an emphasis on machine reliability, x-ray reproducibility, and x-ray quality [2]. In tests with the baseline rod pinch diode, we have observed that an additional pinch (Secondary Pinch) occurs at the interface near the anode rod and the rod holder. This suggests that stray electrons exist that are not associated with the Primary Pinch. In this paper we present measurements on both pinches using an x-ray pinhole camera. The camera is placed downstream of the Primary Pinch at an angle of 60° with respect to the diode centerline. This diagnostic will be employed to diagnose x-ray reproducibility and quality. In addition, we will investigate the performance of hybrid diodes relating to the formation of the Primary and Secondary Pinches.

Nelson, D. S. [NSTec; Berninger, M. J. [NSTec; Flores, P. A. [NSTec; Good, D. E. [NSTec; Henderson, D. J. [NSTec; Hogge, K. W. [NSTec; Huber, S. R. [NSTec; Lutz, S. S. [NSTec; Mitchell, S. E. [NSTec; Howe, R. A. [NSTec; Mitton, C. V. [NSTec; Molina, I. [NSTec; Bozman, D. R. [SNL; Cordova, S. R. [SNL; Mitchell, D. R. [SNL; Oliver, B. V. [SNL; Ormond, E. C. [SNL

2013-07-01

277

Design of motion compensation mechanism of satellite remote sensing camera  

NASA Astrophysics Data System (ADS)

With the development of aerospace remote sensing technology, the ground resolution of remote sensing cameras improves continuously. Since there is relative motion between the camera and the ground target while pictures are taken, the target image recorded on the recording medium is smeared and blurred. In order to improve the imaging quality and resolution of the camera, the image motion has to be compensated. To reduce the effect of image motion on the image quality of a space camera and improve its resolution, a method of image motion compensation for space cameras is studied. First, the cause of the drift angle and the principle of its adjustment are analyzed, and the composition and transmission principle of the image motion compensation mechanism are introduced. Second, the system adopts an 80C31 as the drift-angle controller, a stepping motor as the actuator, and an absolute photoelectric encoder as the drift-angle measuring element. The control mathematical model of the image motion compensation mechanism is then deduced, achieving closed-loop control of the drift-angle position. Finally, the transmission precision of the mechanism is analyzed. The actual precision of the image motion compensation mechanism was measured experimentally and compared with the theoretical analysis. There are two major contributions in this paper. First, traditional image motion compensation mechanisms are bulky and heavy, which does not fit the trend toward miniaturized, lightweight space cameras; however, simply reducing the volume and mass of a mechanism degrades its precision and stiffness. To address this problem, this paper designs an image motion compensation mechanism that combines small size and light weight with high precision and stiffness, making it applicable to small, high-resolution optical cameras. Second, traditional mechanism control requires correcting, fitting, and iterating the control formula to obtain an optimal control model. In this paper a high-precision control formula is derived directly, so that high-precision control is achieved without fitting, simplifying the establishment of the control model. The adjustment range of the image motion compensation mechanism is designed as -5° to +5°. Choosing -5°, -4°, -3°, -2°, -1°, 0°, +1°, +2°, +3°, +4°, and +5° as the expected values of the imaginary drift angle, ten groups of measured drift-angle adjustment data were obtained. The test results show that the precision of the drift-angle control system is within 1'. This meets the system requirement that the control precision be less than 3', and achieves high-precision image motion compensation.
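The closed-loop drift-angle control described above can be sketched schematically: read the absolute encoder, compare with the commanded drift angle, and step the motor to null the error. The step size, tolerance, and simulated hardware below are assumed stand-ins, not the actual 80C31-based electronics.

# Schematic closed-loop drift-angle correction. All hardware interfaces
# are replaced by a simple simulation; step size and tolerance are assumed.

STEP_DEG = 0.01            # mechanism rotation per motor step (assumed)
TOLERANCE_DEG = 1.0 / 60   # stop within 1 arcminute (assumed target)

class MechanismSim:
    """Stand-in for the stepper-driven mechanism plus absolute encoder."""
    def __init__(self, angle_deg=0.0):
        self.angle_deg = angle_deg
    def read_encoder(self):
        return self.angle_deg
    def step(self, n_steps):
        self.angle_deg += n_steps * STEP_DEG

def correct_drift_angle(mech, target_deg, max_iter=20):
    for _ in range(max_iter):
        error = target_deg - mech.read_encoder()
        if abs(error) <= TOLERANCE_DEG:
            break
        mech.step(round(error / STEP_DEG))   # command the stepper
    return mech.read_encoder()

mech = MechanismSim(angle_deg=-3.2)
print(f"settled at {correct_drift_angle(mech, target_deg=1.5):+.4f} deg")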

Gu, Song; Yan, Yong; Xu, Kai; Jin, Guang

2011-08-01

278

A Three-Line Stereo Camera Concept for Planetary Exploration  

NASA Technical Reports Server (NTRS)

This paper presents a low-weight stereo camera concept for planetary exploration. The camera uses three CCD lines within the image plane of one single objective. Some of the main features of the camera include: focal length 90 mm, FOV 18.5 deg, IFOV 78 µrad, convergence angles ±10 deg, radiometric dynamics 14 bit, weight 2 kg, and power consumption 12.5 W. From an orbit altitude of 250 km the ground pixel size is 20 m x 20 m and the swath width is 82 km. The CCD line data is buffered in the camera internal mass memory of 1 Gbit. After performing radiometric correction and application-dependent preprocessing the data is compressed and ready for downlink. Due to the aggressive application of advanced technologies in the area of microelectronics and innovative optics, the low mass and power budgets of 2 kg and 12.5 Watts are achieved, while still maintaining high performance. The design of the proposed light-weight camera is also general purpose enough to be applicable to other planetary missions such as the exploration of Mars, Mercury, and the Moon. Moreover, it is an example of excellent international collaboration on advanced technology concepts developed at DLR, Germany, and NASA's Jet Propulsion Laboratory, USA.
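As a quick editorial check, the quoted ground pixel size and swath width follow from the stated IFOV, field of view, and 250 km orbit altitude.

import math

# Check that the quoted ground pixel size and swath width follow from
# the stated IFOV, FOV, and orbit altitude.

altitude_m = 250e3
ifov_rad = 78e-6          # 78 microradians per pixel
fov_deg = 18.5

ground_pixel_m = altitude_m * ifov_rad                        # ~19.5 m
swath_km = 2 * altitude_m * math.tan(math.radians(fov_deg / 2)) / 1e3

print(f"ground pixel: ~{ground_pixel_m:.1f} m (abstract: 20 m x 20 m)")
print(f"swath width:  ~{swath_km:.0f} km (abstract: 82 km)")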

Sandau, Rainer; Hilbert, Stefan; Venus, Holger; Walter, Ingo; Fang, Wai-Chi; Alkalai, Leon

1997-01-01

279

What's My Angle?  

NSDL National Science Digital Library

This interactive module offers learners the opportunity to check their knowledge of angle measure and estimation, and the use of a protractor. There are ten activities that vary the tasks and the degree of precision. The site is designed for whiteboard demonstration as well, and it includes a tutorial on angle types and protractor use.

2011-01-01

280

Reading Angles in Maps  

ERIC Educational Resources Information Center

Preschool children can navigate by simple geometric maps of the environment, but the nature of the geometric relations they use in map reading remains unclear. Here, children were tested specifically on their sensitivity to angle. Forty-eight children (age 47:15-53:30 months) were presented with fragments of geometric maps, in which angle sections…

Izard, Véronique; O'Donnell, Evan; Spelke, Elizabeth S.

2014-01-01

281

Polygon Angle Applet  

NSDL National Science Digital Library

This interactive Java applet supports the investigation of the relationship between the number of vertices of a polygon and its interior angle sum. Learners choose and locate the vertices, the angle measures are displayed, and then the student can drag the measures into a circle to see them summed relative to 360 degrees.

Nicholas Exner

2000-05-31

282

Person re-identification over camera networks using multi-task distance metric learning.  

PubMed

Person reidentification in a camera network is a valuable yet challenging problem to solve. Existing methods learn a common Mahalanobis distance metric by using the data collected from different cameras and then exploit the learned metric for identifying people in the images. However, the cameras in a camera network have different settings and the recorded images are seriously affected by variability in illumination conditions, camera viewing angles, and background clutter. Using a common metric to conduct person reidentification tasks on different camera pairs overlooks the differences in camera settings; at the same time, it is very time-consuming to label people manually in images from surveillance videos. For example, in most existing person reidentification data sets, only one image of a person is collected from each of only two cameras; therefore, directly learning a unique Mahalanobis distance metric for each camera pair is susceptible to over-fitting because of insufficient labeled data. In this paper, we reformulate person reidentification in a camera network as a multitask distance metric learning problem. The proposed method designs multiple Mahalanobis distance metrics to cope with the complicated conditions that exist in typical camera networks. We address the fact that these Mahalanobis distance metrics are different but related, and learn them by adding joint regularization to alleviate over-fitting. Furthermore, by extending this formulation, we present a novel multitask maximally collapsing metric learning (MtMCML) model for person reidentification in a camera network. Experimental results demonstrate that formulating person reidentification over camera networks as a multitask distance metric learning problem can improve performance, and our proposed MtMCML works substantially better than other current state-of-the-art person reidentification methods. PMID:24956368
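The core structure of the formulation, per-camera-pair Mahalanobis metrics tied together by a joint regularizer, can be sketched as below. This is a schematic of the idea only, with invented data, not the MtMCML training procedure.

import numpy as np

# Schematic multi-task Mahalanobis metrics: each camera pair p uses a
# metric M_p = M0 + D_p, where M0 is shared and the pair-specific parts
# D_p are kept small by a joint regularizer.

def mahalanobis_sq(x, y, M):
    diff = x - y
    return float(diff @ M @ diff)

dim = 4
rng = np.random.default_rng(0)
M0 = np.eye(dim)                                  # shared metric
D = {p: 0.1 * np.eye(dim) * rng.random() for p in ["cam1-cam2", "cam1-cam3"]}
M = {p: M0 + Dp for p, Dp in D.items()}           # per-pair metrics

x, y = rng.random(dim), rng.random(dim)
for p, Mp in M.items():
    print(p, "distance^2:", round(mahalanobis_sq(x, y, Mp), 4))

# Joint regularizer penalizing deviation of each pair metric from M0.
reg = sum(np.linalg.norm(Mp - M0, "fro") ** 2 for Mp in M.values())
print("joint regularizer:", round(reg, 4))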

Ma, Lianyang; Yang, Xiaokang; Tao, Dacheng

2014-08-01

283

Polarizing aperture stereoscopic cinema camera  

NASA Astrophysics Data System (ADS)

The art of stereoscopic cinematography has been held back because of the lack of a convenient way to reduce the stereo camera lenses' interaxial to less than the distance between the eyes. This article describes a unified stereoscopic camera and lens design that allows for varying the interaxial separation to small values using a unique electro-optical polarizing aperture design for imaging left and right perspective views onto a large single digital sensor, the size of the standard 35 mm frame, with the means to select left and right image information. Even with the added stereoscopic capability, the appearance of existing camera bodies will be unaltered.

Lipton, Lenny

2012-07-01

284

BLAST Autonomous Daytime Star Cameras  

E-print Network

We have developed two redundant daytime star cameras to provide the fine pointing solution for the balloon-borne submillimeter telescope, BLAST. The cameras are capable of providing a reconstructed pointing solution with an absolute accuracy ... under daytime float conditions. Each camera combines a 1 megapixel CCD with a 200 mm f/2 lens to image a 2 degree x 2.5 degree field of the sky. The instruments are autonomous. An internal computer controls the temperature, adjusts the focus, and determines a real-time pointing solution at 1 Hz. The mechanical details and flight performance of these instruments are presented.

Marie Rex; Edward Chapin; Mark J. Devlin; Joshua Gundersen; Jeff Klein; Enzo Pascale; Donald Wiebe

2006-05-01

285

BLAST Autonomous Daytime Star Cameras  

E-print Network

We have developed two redundant daytime star cameras to provide the fine pointing solution for the balloon-borne submillimeter telescope, BLAST. The cameras are capable of providing a reconstructed pointing solution with an absolute accuracy ... under daytime float conditions. Each camera combines a 1 megapixel CCD with a 200 mm f/2 lens to image a 2 degree x 2.5 degree field of the sky. The instruments are autonomous. An internal computer controls the temperature, adjusts the focus, and determines a real-time pointing solution at 1 Hz. The mechanical details and flight performance of these instruments are presented.

Rex, M; Devlin, M J; Gundersen, J; Klein, J; Pascale, E; Wiebe, D; Rex, Marie; Chapin, Edward; Devlin, Mark J.; Gundersen, Joshua; Klein, Jeff; Pascale, Enzo; Wiebe, Donald

2006-01-01

286

Special Angle Pairs Discovery Activity  

NSDL National Science Digital Library

This lesson uses a discovery approach to identify the special angles formed when a set of parallel lines is cut by a transversal. During this lesson students identify the angle pair and the relationship between the angles. Students use this relationship and special angle pairs to make conjectures about which angle pairs are considered special angles.

Barbara Henry

2012-04-16

287

Critical Heat Flux In Inclined Rectangular Narrow Long Channel  

SciTech Connect

In the TMI-2 accident, the lower part of the reactor pressure vessel had been overheated and then rather rapidly cooled down, as was later identified in a vessel investigation project. This pointed to the possible feasibility of gap cooling. For this reason, several investigations were performed to determine the critical heat flux (CHF) from the standpoint of in-vessel retention. The experiments are conducted to investigate the general boiling phenomena and the triggering mechanism for the CHF in a narrow gap, using a 5 x 105 mm² crevice-type heater assembly and de-mineralized water. The test parameters include a gap size of 5 mm and surface orientation angles from the downward-facing position (180°) to the vertical position (90°). The orientation angle affects the bubble layer and the escape of bubbles from the narrow gap. The CHF is lower than that in a shorter channel, compared with previous experiments having a heated length of 35 mm in the copper test section.

J. L. Rempe; S. W. Noh; Y. H. Kim; K. Y. Suh; F.B.Cheung; S. B. Kim

2005-05-01

288

Viewing-angle controllable liquid crystal display using a fringe and vertical-field driven hybrid aligned nematic liquid crystal  

Microsoft Academic Search

A viewing-angle controllable liquid crystal display (LCD) that does not use an additional panel or pixel division is proposed. In the device, a hybrid-aligned nematic liquid crystal with negative dielectric anisotropy is used. The device shows a narrow viewing angle when it is driven only by a fringe electric field; however, it exhibits a wide viewing angle if driven by vertical- as well ...

Eun Jeong; Young Jin Lim; Mi Hyung Chin; Jin Ho Kim; Seung Hee Lee; Seung Hoon Ji; Gi-Dong Lee; Kyoung Ho Park; Hyun Chul Choi; Byung Chul Ahn

2008-01-01

289

Properties of Creeping Discharge in Narrow Gap between Dielectric Plates in PFAE Oil  

NASA Astrophysics Data System (ADS)

Using lightning impulse voltages of ±1.2/50 µs and ±1.2/1000 µs with ±140 kVpeak in maximum, the behaviors of creeping streamers progressing in a narrow gap between two solid dielectric plates immersed in palm fatty acid ester (PFAE) oil were investigated. The discharge shapes and the streamer lengths were observed using a still camera equipped with a night viewer, and the progression steps of the streamer were observed using a high-speed image converter camera. The effects of the two interfaces between solid dielectrics, the back side electrode (BSE) and the wave tail of the impulse voltages were examined on the growth of positive and negative streamers, the flashover voltage and the streamer velocity. The streamer length is extended by the presence of the BSE and by a long wave tail of the impulse voltage. This results in a reduction of the flashover voltage. It is noteworthy that the negative streamer progressing in a narrow gap grows longer than the positive streamer under identical applied voltage. This polarity effect on the streamer length is the inverse of that at an oil/pressboard interface without a narrow gap. Streamers of both polarities also slow down because of the narrow gap with two solid interfaces. These results on the creeping discharge have been compared with those in commercial mineral oil.

Usui, Takuro; Hanaoka, Ryoichi; Takata, Shinzo; Kanamaru, Yasunori; Koide, Hidenobu; Nakagami, Yoshitake

290

Flexural waves on narrow plates.  

PubMed

Flexural wave speeds on beams or plates depend upon the bending stiffnesses which differ by the well-known factor (1 - ν²). A quantitative analysis of a plate of finite lateral width displays the plate-to-beam transition, and permits asymptotic analysis that shows the leading order dependence on the width. Orthotropic plates are analyzed using both the Kirchhoff and Kirchhoff-Rayleigh theories, and isotropic plates are considered for Mindlin's theory with and without rotational inertia. A frequency-dependent Young's modulus for beams or strips of finite width is suggested, although the form of the correction to the modulus is not unique and depends on the theory used. The sign of the correction for the Kirchhoff theory is opposite to that for the Mindlin theory. These results indicate that the different plate and beam theories can produce quite distinct behavior. This divergence in predictions is further illustrated by comparison of the speeds for antisymmetric flexural, or torsional, modes on narrow plates. The four classical theories predict limiting wave speeds as the plate width vanishes, but the values are different in each case. The deviations can be understood in terms of torsional waves and how each theory succeeds, or fails, in approximating the effect of torsion. Dispersion equations are also derived, some for the first time, for the flexural edge wave in each of the four "engineering" theories. PMID:12765383
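For reference, a minimal sketch of the standard Kirchhoff relations behind the (1 - ν²) factor mentioned above (added here for orientation, not taken from the paper):

```latex
% Euler-Bernoulli beam vs. Kirchhoff plate bending stiffness (per unit width),
% thickness h, Young's modulus E, Poisson's ratio \nu:
D_{\mathrm{beam}} = \frac{E h^{3}}{12}, \qquad
D_{\mathrm{plate}} = \frac{E h^{3}}{12\,(1 - \nu^{2})}
% hence the (1 - \nu^{2}) factor between plate and beam flexural stiffness; in the
% thin-plate limit the flexural phase speed is c_f = (D\,\omega^{2}/\rho h)^{1/4}.
```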

Norris, Andrew N

2003-05-01

291

A lexicon for Camera Obscura  

E-print Network

The camera obscura has allowed artists, scientists, and philosophers to view the world as a flat image. Two - dimensional renditions of visual reality seem to be more manageable and easier to grasp than reality itself. A ...

Rosinsky, Robert David

1984-01-01

292

An Inexpensive Digital Infrared Camera  

ERIC Educational Resources Information Center

Details are given for the conversion of an inexpensive webcam to a camera specifically sensitive to the near infrared (700-1000 nm). Some experiments and practical applications are suggested and illustrated. (Contains 9 figures.)

Mills, Allan

2012-01-01

293

National Park Service Web Cameras  

NSDL National Science Digital Library

The National Park Service (NPS) operates digital cameras at many parks in the lower 48 states, Alaska, and Hawaii to help educate the public on air quality issues. These cameras often show the effects of air pollution, especially visibility impairment. Because the cameras are typically located near air quality monitoring sites, their web pages display other information along with the photo, such as current levels of ozone, particulate matter, or sulfur dioxide, visual range, and weather conditions. The digital photos are usually updated every 15 minutes, while air quality data values are revised hourly. Charts of the last ten days of hourly weather, ozone, particulate matter, or sulfur dioxide data are also available. The cameras are accessible by clicking on an interactive map.

294

Omnidirectional narrow bandpass filters based on one-dimensional superconductor-dielectric photonic crystal heterostructures  

NASA Astrophysics Data System (ADS)

By using the transfer matrix method, narrow passbands of a TE wave from one-dimensional superconductor-dielectric photonic crystal heterostructures are presented. Various superconductors within the two-fluid model are considered. Results show that by selecting proper widths for the superconductor and dielectric layers and proper materials, a single narrow passband in the visible region can be obtained. The behavior of these passbands versus the temperature of the superconductors, the external magnetic field and the incident angle is considered. We have shown that it is possible to obtain omnidirectional passbands by examining the temperature, the dilation factor of the half part of a heterostructure and the other parameters of the heterostructures. These tunable narrow passbands may be useful in the design of narrow band filters or multichannel filters.
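A minimal sketch of the transfer matrix method referred to above, shown for an ordinary dielectric quarter-wave stack at normal incidence; the superconductor layers of the paper (whose index would follow from the two-fluid model) and the oblique-incidence TE case are not implemented, and all stack parameters below are invented for illustration:

```python
import numpy as np

def stack_transmittance(wavelength, layers, n_in=1.0, n_out=1.0):
    """Transmittance of a 1D multilayer at normal incidence.

    layers: list of (refractive_index, thickness) pairs; indices may be complex.
    Uses the standard characteristic-matrix (transfer-matrix) formalism.
    """
    M = np.eye(2, dtype=complex)
    for n, d in layers:
        delta = 2.0 * np.pi * n * d / wavelength   # phase thickness of the layer
        eta = n                                    # optical admittance (free-space units)
        M = M @ np.array([[np.cos(delta), 1j * np.sin(delta) / eta],
                          [1j * eta * np.sin(delta), np.cos(delta)]])
    B, C = M @ np.array([1.0, n_out])
    denom = n_in * B + C
    return 4.0 * n_in * np.real(n_out) / abs(denom) ** 2

# Hypothetical (H L)^5 quarter-wave stack designed for 600 nm, scanned across
# the visible to show the stop band and the pass bands around it.
lam0 = 600e-9
nH, nL = 2.3, 1.45
stack = [(nH, lam0 / (4 * nH)), (nL, lam0 / (4 * nL))] * 5
for lam in np.linspace(400e-9, 800e-9, 9):
    print(f"{lam*1e9:5.0f} nm  T = {stack_transmittance(lam, stack):.3f}")
```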

Barvestani, Jamal

2015-01-01

295

Solid State Television Camera (CID)  

NASA Technical Reports Server (NTRS)

The design, development and test are described of a charge injection device (CID) camera using a 244x248 element array. A number of video signal processing functions are included which maximize the output video dynamic range while retaining the inherently good resolution response of the CID. Some of the unique features of the camera are: low light level performance, high S/N ratio, antiblooming, geometric distortion, sequential scanning and AGC.

Steele, D. W.; Green, W. T.

1976-01-01

296

SCface - surveillance cameras face database  

Microsoft Academic Search

In this paper we describe a database of static images of human faces. Images were taken in uncontrolled indoor environment using five video surveillance cameras of various qualities. Database contains 4,160 static images (in visible and infrared spectrum) of 130 subjects. Images from different quality cameras should mimic real-world conditions and enable robust face recognition algorithms testing, emphasizing different law

Mislav Grgic; Kresimir Delac; Sonja Grgic

2011-01-01

297

Vibration detection and calibration method used to remote sensing optical camera  

NASA Astrophysics Data System (ADS)

In order to obtain sharp remote sensing images, the image stabilization technology of space camera and the remote sensing image restoration technology are usually used now. Vibration detection is the key to realize these technologies: an image stabilization system needs the displacement vector derived from vibration detection to drive the compensation mechanism; and the remote sensing image restoration technology needs the vibration displacement vector to construct the point spread function (PSF). Vibration detection not only can be used to improve image quality of panchromatic camera, infrared cameras and other optical camera, also is motion compensation basis of satellite radar equipment. In this paper we have constructed a vibration measuring method based on Fiber optic gyro (FOG). FOG is a device sensitive to angular velocity or angular displacement. High-precision FOG can be used to measure the jitter angle of the optic axis of a space camera fixed on satellite platform. According to the measured data, the vibration displacement vector of the imaging plane can be calculated. Consequently the vibration data provide a basis for image stabilization of space camera and restoration of remote sensing images. We simulated the vibration of a space camera by using a piezoelectric ceramic deflection platform, and calibrated vibration measurement by using laser beam and a high-speed linear array camera. We compared the feedback output of the deflection platform, the FOG measured data and the calibrated data of the linear array camera, and obtained a calibration accuracy better than 1.5 ?rad.
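A minimal sketch of the angle-to-displacement step described above, assuming the small-angle relation shift ≈ f·θ between the measured jitter angle and the motion of the image plane; the focal length, sampling rate and jitter values are hypothetical, not taken from the paper:

```python
import numpy as np

def jitter_to_image_shift(angular_rate, dt, focal_length):
    """Convert a FOG angular-rate record (rad/s) into image-plane displacement.

    Integrates the rate to a jitter angle, then applies shift = f * theta,
    which holds for the microradian-level jitter discussed above.
    """
    theta = np.cumsum(angular_rate) * dt          # rad, relative to the first sample
    return focal_length * theta                   # same length unit as focal_length

# Hypothetical numbers: 2 m focal length, 1 kHz FOG sampling, and a 20 Hz
# sinusoidal jitter of 5 microradian amplitude.
fs, f = 1000.0, 2.0
t = np.arange(0, 0.5, 1 / fs)
rate = 5e-6 * 2 * np.pi * 20 * np.cos(2 * np.pi * 20 * t)   # d/dt of 5e-6*sin(...)
shift = jitter_to_image_shift(rate, 1 / fs, f)
print(f"peak image-plane shift ~ {shift.max()*1e6:.2f} micrometres")
```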

Li, Qi; Dong, Wende; Xu, Zhihai; Feng, Huajun

2013-09-01

298

Riser angle control apparatus and method  

SciTech Connect

A riser angle control apparatus and method is described for use with a floating vessel having well drilling equipment, including an elongated riser having upper elements connected to the floating vessel by vessel connection means, and first flexible coupling means located adjacent the vessel connection means, the riser extending down from the vessel connection means to a point adjacent the ocean floor. The improvement described here consists of riser angle control apparatus comprising second flexible coupling means operatively engaged with a portion of the upper elements of the riser below the first flexible coupling means.

Dension, E.B.

1986-03-18

299

Laser angle sensor development  

NASA Technical Reports Server (NTRS)

Electrical and optical parameters were developed for a two axis (pitch/roll) laser angle sensor. The laser source and detector were mounted in the plenum above the model. Two axis optical distortion measurements of flow characteristics in a 0.3 transonic cryogenic tunnel were made with a shearing interferometer. The measurement results provide a basis for estimating the optical parameters of the laser angle sensor. Experimental and analytical information was generated on model windows to cover the reflector. A two axis breadboard was assembled to evaluate different measurement concepts. The measurement results were used to develop a preliminary design of a laser angle sensor. Schematics and expected performance specifications are included.

Pond, C. R.; Texeira, P. D.

1980-01-01

300

Narrow gap electronegative capacitive discharges  

NASA Astrophysics Data System (ADS)

Narrow gap electronegative (EN) capacitive discharges are widely used in industry and have unique features not found in conventional discharges. In this paper, plasma parameters are determined over a range of decreasing gap length L from values for which an electropositive (EP) edge exists (2-region case) to smaller L-values for which the EN region connects directly to the sheath (1-region case). Parametric studies are performed at applied voltage Vrf=500 V for pressures of 10, 25, 50, and 100 mTorr, and additionally at 50 mTorr for 1000 and 2000 V. Numerical results are given for a parallel plate oxygen discharge using a planar 1D3v (1 spatial dimension, 3 velocity components) particle-in-cell (PIC) code. New interesting phenomena are found for the case in which an EP edge does not exist. This 1-region case has not previously been investigated in detail, either numerically or analytically. In particular, attachment in the sheaths is important, and the central electron density ne0 is depressed below the density nesh at the sheath edge. The sheath oscillations also extend into the EN core, creating an edge region lying within the sheath and not characterized by the standard diffusion in an EN plasma. An analytical model is developed using minimal inputs from the PIC results, and compared to the PIC results for a base case at Vrf=500 V and 50 mTorr, showing good agreement. Selected comparisons are made at the other voltages and pressures. A self-consistent model is also developed and compared to the PIC results, giving reasonable agreement.

Kawamura, E.; Lieberman, M. A.; Lichtenberg, A. J.

2013-10-01

301

Narrow gap electronegative capacitive discharges  

SciTech Connect

Narrow gap electronegative (EN) capacitive discharges are widely used in industry and have unique features not found in conventional discharges. In this paper, plasma parameters are determined over a range of decreasing gap length L from values for which an electropositive (EP) edge exists (2-region case) to smaller L-values for which the EN region connects directly to the sheath (1-region case). Parametric studies are performed at applied voltage Vrf=500 V for pressures of 10, 25, 50, and 100 mTorr, and additionally at 50 mTorr for 1000 and 2000 V. Numerical results are given for a parallel plate oxygen discharge using a planar 1D3v (1 spatial dimension, 3 velocity components) particle-in-cell (PIC) code. New interesting phenomena are found for the case in which an EP edge does not exist. This 1-region case has not previously been investigated in detail, either numerically or analytically. In particular, attachment in the sheaths is important, and the central electron density ne0 is depressed below the density nesh at the sheath edge. The sheath oscillations also extend into the EN core, creating an edge region lying within the sheath and not characterized by the standard diffusion in an EN plasma. An analytical model is developed using minimal inputs from the PIC results, and compared to the PIC results for a base case at Vrf=500 V and 50 mTorr, showing good agreement. Selected comparisons are made at the other voltages and pressures. A self-consistent model is also developed and compared to the PIC results, giving reasonable agreement.

Kawamura, E.; Lieberman, M. A.; Lichtenberg, A. J. [Department of Electrical Engineering and Computer Sciences, University of California, Berkeley, California 94720 (United States)]

2013-10-15

302

Versatility of the CFR algorithm for limited angle reconstruction  

SciTech Connect

The constrained Fourier reconstruction (CFR) algorithm and the iterative reconstruction-reprojection (IRR) algorithm are evaluated based on their accuracy for three types of limited angle reconstruction problems. The CFR algorithm performs better for problems such as X-ray CT imaging of a nuclear reactor core with one large data gap due to structural blocking of the source and detector pair. For gated heart imaging by X-ray CT, and for radioisotope distribution imaging by PET or SPECT using a polygonal array of gamma cameras with insensitive gaps between camera boundaries, the IRR algorithm has a slight advantage over the CFR algorithm but the difference is not significant.

Fujieda, I.; Heiskanen, K.; Perez-Mendez, V. (Lawrence Berkeley Lab., CA (USA))

1990-04-01

303

Infants Experience Perceptual Narrowing for Nonprimate Faces  

ERIC Educational Resources Information Center

Perceptual narrowing--a phenomenon in which perception is broad from birth, but narrows as a function of experience--has previously been tested with primate faces. In the first 6 months of life, infants can discriminate among individual human and monkey faces. Though the ability to discriminate monkey faces is lost after about 9 months, infants…

Simpson, Elizabeth A.; Varga, Krisztina; Frick, Janet E.; Fragaszy, Dorothy

2011-01-01

304

On Narrowing, Refutation Proofs and Constraints  

E-print Network

such that E ⊨ sσ = tσ. Narrowing was originally devised as an efficient E-unification procedure. A rewrite rule can be applied to narrow a term s into sσ[rσ]_p, denoted s ⇝ sσ[rσ]_p, if σ is the mgu of s|_p and the rule's left-hand side. Taking into account the substitutions of previous steps, one easily shows that for each (irreducible) solution σ every rewrite proof of goal(sσ, tσ)

Nieuwenhuis, Robert

305

Wetting and Contact Angle  

NSDL National Science Digital Library

Students are presented with the concepts of wetting and contact angle. They are also introduced to the distinction between hydrophobic and hydrophilic surfaces. Students observe how different surfaces are used to maintain visibility under different conditions.
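For reference, the standard Young equation underlying the contact-angle concept in this activity (added here as a sketch, not part of the lesson materials):

```latex
% Young's equation: equilibrium contact angle \theta_c of a sessile drop on a flat
% solid, in terms of solid-vapor, solid-liquid and liquid-vapor interfacial tensions.
\cos\theta_c = \frac{\gamma_{SV} - \gamma_{SL}}{\gamma_{LV}}
% \theta_c < 90^\circ: hydrophilic (wetting) surface;  \theta_c > 90^\circ: hydrophobic.
```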

NSF CAREER Award and RET Program, Mechanical Engineering and Material Science,

306

Multi-Angle View of the Canary Islands  

NASA Technical Reports Server (NTRS)

A multi-angle view of the Canary Islands in a dust storm, 29 February 2000. At left is a true-color image taken by the Multi-angle Imaging SpectroRadiometer (MISR) instrument on NASA's Terra satellite. This image was captured by the MISR camera looking at a 70.5-degree angle to the surface, ahead of the spacecraft. The middle image was taken by the MISR downward-looking (nadir) camera, and the right image is from the aftward 70.5-degree camera. The images are reproduced using the same radiometric scale, so variations in brightness, color, and contrast represent true variations in surface and atmospheric reflectance with angle. Windblown dust from the Sahara Desert is apparent in all three images, and is much brighter in the oblique views. This illustrates how MISR's oblique imaging capability makes the instrument a sensitive detector of dust and other particles in the atmosphere. Data for all channels are presented in a Space Oblique Mercator map projection to facilitate their co-registration. The images are about 400 km (250 miles) wide, with a spatial resolution of about 1.1 kilometers (1,200 yards). North is toward the top. MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is a division of the California Institute of Technology.

2000-01-01

307

The European Photon Imaging Camera on XMM-Newton: The MOS cameras  

Microsoft Academic Search

The EPIC focal plane imaging spectrometers on XMM-Newton use CCDs to record the images and spectra of celestial X-ray sources focused by the three X-ray mirrors. There is one camera at the focus of each mirror; two of the cameras contain seven MOS CCDs, while the third uses twelve PN CCDs, defining a circular field of view of 30' diameter

M. J. L. Turner; A. Abbey; M. Arnaud; M. Balasini; M. Barbera; E. Belsole; P. J. Bennie; J. P. Bernard; G. F. Bignami; M. Boer; U. Briel; I. Butler; C. Cara; C. Chabaud; R. Cole; A. Collura; M. Conte; A. Cros; M. Denby; P. Dhez; G. Di Coco; J. Dowson; P. Ferrando; S. Ghizzardi; F. Gianotti; C. V. Goodall; L. Gretton; R. G. Griffiths; O. Hainaut; J. F. Hochedez; A. D. Holland; E. Jourdain; E. Kendziorra; A. Lagostina; R. Laine; N. La Palombara; M. Lortholary; D. Lumb; P. Marty; S. Molendi; C. Pigot; E. Poindron; K. A. Pounds; J. N. Reeves; C. Reppin; R. Rothenflug; P. Salvetat; J. L. Sauvageot; D. Schmitt; S. Sembay; A. D. T. Short; J. Spragg; J. Stephen; L. Strüder; A. Tiengo; M. Trifoglio; J. Trümper; S. Vercellone; L. Vigroux; G. Villa; M. J. Ward; S. Whitehead; E. Zonca

2001-01-01

308

Narrow band gap amorphous silicon semiconductors  

DOEpatents

Disclosed is a narrow band gap amorphous silicon semiconductor comprising an alloy of amorphous silicon and a band gap narrowing element selected from the group consisting of Sn, Ge, and Pb, with an electron donor dopant selected from the group consisting of P, As, Sb, Bi and N. The process for producing the narrow band gap amorphous silicon semiconductor comprises the steps of forming an alloy comprising amorphous silicon and at least one of the aforesaid band gap narrowing elements in amount sufficient to narrow the band gap of the silicon semiconductor alloy below that of amorphous silicon, and also utilizing sufficient amounts of the aforesaid electron donor dopant to maintain the amorphous silicon alloy as an n-type semiconductor.

Madan, A.; Mahan, A.H.

1985-01-10

309

Autoconfiguration of a Dynamic Nonoverlapping Camera Network  

Microsoft Academic Search

In order to monitor sufficiently large areas of interest for surveillance or any event detection, we need to look beyond stationary cameras and employ an automatically configurable network of nonoverlapping cameras. These cameras need not have an overlapping field of view and should be allowed to move freely in space. Moreover, features like zooming in/out, readily available in security cameras

Imran N. Junejo; Xiaochun Cao; Hassan Foroosh

2007-01-01

310

Prospective case series on trabecular-iris angle status after an acute episode of phacomorphic angle closure  

PubMed Central

AIM To investigate the trabecular-iris angle with ultrasound biomicroscopy (UBM) post cataract extraction after an acute attack of phacomorphic angle closure. METHODS This prospective study involved 10 cases of phacomorphic angle closure that underwent cataract extraction and intraocular lens insertion after intraocular pressure (IOP) lowering. Apart from visual acuity and IOP, the trabecular-iris angle was measured by gonioscopy and UBM at 3 months post attack. RESULTS In 10 consecutive cases of acute phacomorphic angle closure from December 2009 to December 2010, gonioscopic findings showed peripheral anterior synechiae (PAS) ? 90° in 30% of phacomorphic patients and a mean Shaffer grading of (3.1±1.0). UBM showed a mean angle of (37.1°±4.5°) in the phacomorphic eye, with the temporal quadrant being the most open, and (37.1°±8.0°) in the contralateral uninvolved eye. The mean time from consultation to cataract extraction was (1.4±0.7) days and the mean total duration of phacomorphic angle closure was (3.6±2.8) days, but there was no correlation with the degree of angle closure on UBM (Spearman correlation P=0.7). The presenting mean IOP was (50.5±7.4) mmHg and the mean IOP at 3 months was (10.5±3.4) mmHg, but there were no correlations with the degree of angle closure (Spearman correlation P=0.9). CONCLUSION An open trabecular-iris angle and normal IOP can be achieved after an acute attack of phacomorphic angle closure if cataract extraction is performed within 1-2 days after IOP control. Gonioscopic findings were in agreement with UBM, which provided a more specific and objective angle measurement. The superior angle is relatively more narrowed compared to the other quadrants. All contralateral eyes in this series had open angles. PMID:23549291

Lee, Jacky; Lai, Jimmy; Yick, Doris; Yuen, Can

2013-01-01

311

'Magic Angle Precession'  

SciTech Connect

An advanced and exact geometric description of nonlinear precession dynamics modeling very accurately natural and artificial couplings showing Lorentz symmetry is derived. In the linear description it is usually ignored that the geometric phase of relativistic motion couples back to the orbital motion providing for a non-linear recursive precession dynamics. The high coupling strength in the nonlinear case is found to be a gravitomagnetic charge proportional to the precession angle and angular velocity generated by geometric phases, which are induced by high-speed relativistic rotations and are relevant not only to propulsion technologies but also to basic interactions. In the quantum range some magic precession angles indicating strong coupling in a phase-locked chaotic system are identified, emerging from a discrete time dynamical system known as the cosine map showing bifurcations at special precession angles relevant to heavy nuclei stability. The 'Magic Angle Precession' (MAP) dynamics can be simulated and visualized by cones rolling in or on each other, where the apex and precession angles are indexed by spin, charge or precession quantum numbers, and corresponding magic angles. The most extreme relativistic warping and twisting effect is given by the Dirac spinor half spin constellation with 'Hyperdiamond' MAP, which resembles quark confinement.

Binder, Bernd [Quanics.com, Germany, 88679 Salem, P.O. Box 1247 (United States)], E-mail: binder@quanics.com

2008-01-21

312

Photometric Calibration of Consumer Video Cameras  

NASA Technical Reports Server (NTRS)

Equipment and techniques have been developed to implement a method of photometric calibration of consumer video cameras for imaging of objects that are sufficiently narrow or sufficiently distant to be optically equivalent to point or line sources. Heretofore, it has been difficult to calibrate consumer video cameras, especially in cases of image saturation, because they exhibit nonlinear responses with dynamic ranges much smaller than those of scientific-grade video cameras. The present method not only takes this difficulty in stride but also makes it possible to extend effective dynamic ranges to several powers of ten beyond saturation levels. The method will likely be primarily useful in astronomical photometry. There are also potential commercial applications in medical and industrial imaging of point or line sources in the presence of saturation. This development was prompted by the need to measure brightnesses of debris in amateur video images of the breakup of the Space Shuttle Columbia. The purpose of these measurements is to use the brightness values to estimate relative masses of debris objects. In most of the images, the brightness of the main body of Columbia was found to exceed the dynamic ranges of the cameras. A similar problem arose a few years ago in the analysis of video images of Leonid meteors. The present method is a refined version of the calibration method developed to solve the Leonid calibration problem. In this method, one performs an end-to-end calibration of the entire imaging system, including not only the imaging optics and imaging photodetector array but also analog tape recording and playback equipment (if used) and any frame grabber or other analog-to-digital converter (if used). To automatically incorporate the effects of nonlinearity and any other distortions into the calibration, the calibration images are processed in precisely the same manner as are the images of meteors, space-shuttle debris, or other objects that one seeks to analyze. The light source used to generate the calibration images is an artificial variable star comprising a Newtonian collimator illuminated by a light source modulated by a rotating variable neutral-density filter. This source acts as a point source, the brightness of which varies at a known rate. A video camera to be calibrated is aimed at this source. Fixed neutral-density filters are inserted in or removed from the light path as needed to make the video image of the source appear to fluctuate between dark and saturated bright. The resulting video-image data are analyzed by use of custom software that determines the integrated signal in each video frame and determines the system response curve (measured output signal versus input brightness). These determinations constitute the calibration, which is thereafter used in automatic, frame-by-frame processing of the data from the video images to be analyzed.
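A minimal sketch of the response-curve step described above: build the measured-output-versus-input-brightness curve from calibration frames of the variable source, then invert it for unknown objects. The function names and the synthetic saturating response are assumptions for illustration, not the authors' code:

```python
import numpy as np

def build_response_curve(known_brightness, integrated_signal):
    """Fit an end-to-end response curve (integrated frame signal vs. input brightness)
    from calibration frames and return a function mapping a measured integrated
    signal back to an estimated brightness."""
    order = np.argsort(integrated_signal)
    sig = np.asarray(integrated_signal, float)[order]
    bri = np.asarray(known_brightness, float)[order]

    def signal_to_brightness(measured_signal):
        # Monotonic interpolation of the inverse response; ends are clamped.
        return np.interp(measured_signal, sig, bri)

    return signal_to_brightness

# Hypothetical calibration run (values invented): a nonlinear, saturating
# camera response measured against the known source brightness.
brightness = np.linspace(1.0, 100.0, 50)                 # arbitrary units
signal = 4000.0 * (1.0 - np.exp(-brightness / 30.0))     # saturating response
cal = build_response_curve(brightness, signal)
print(cal(2000.0))   # estimated brightness of an object giving this frame signal
```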

Suggs, Robert; Swift, Wesley, Jr.

2007-01-01

313

Surveillance camera scheduling: a virtual vision approach  

Microsoft Academic Search

We present a surveillance system, comprising wide field-of-view (FOV) passive cameras and pan/tilt/zoom (PTZ) active cameras, which automatically captures and labels high-resolution videos of pedestrians as they move through a designated area. A wide-FOV stationary camera can track multiple pedestrians, while any PTZ active camera can capture high-quality videos of a single pedestrian at a time. We propose a multi-camera

Faisal Z. Qureshi; Demetri Terzopoulos

2005-01-01

314

Surveillance camera scheduling: a virtual vision approach  

Microsoft Academic Search

We present a surveillance system, comprising wide field-of-view (FOV) passive cameras and pan/tilt/zoom (PTZ) active cameras, which automatically captures and labels high-resolution videos of pedestrians as they move through a designated area. A wide-FOV stationary camera can track multiple pedestrians, while any PTZ active camera can capture high-quality videos of a single pedestrian at a time. We propose a multi-camera

Faisal Z. Qureshi; Demetri Terzopoulos

2006-01-01

315

Secure Digital Camera  

E-print Network

Slide presentation introducing the Secure Digital Camera (SDC), one of the author's digital integrated circuit solutions for Digital Rights Management (DRM).

Mohanty, Saraju P.

316

Secure Digital Camera  

E-print Network

Slide presentation introducing the Secure Digital Camera (SDC), one of the author's digital integrated circuit solutions for Digital Rights Management (DRM).

Mohanty, Saraju P.

317

IMAGE-BASED PAN-TILT CAMERA CONTROL IN A MULTI-CAMERA SURVEILLANCE ENVIRONMENT  

E-print Network

In automated surveillance systems with multiple cameras, the system must be able to position the cameras. In a surveillance environment with multiple cameras monitoring a scene, the first task is to position the cameras

Davis, Larry

318

The Clementine longwave infrared camera  

SciTech Connect

The Clementine mission provided the first ever complete, systematic surface mapping of the moon from the ultra-violet to the near-infrared regions. More than 1.7 million images of the moon, earth and space were returned from this mission. The longwave-infrared (LWIR) camera supplemented the UV/Visible and near-infrared mapping cameras providing limited strip coverage of the moon, giving insight to the thermal properties of the soils. This camera provided approximately 100 m spatial resolution at 400 km periselene, and a 7 km across-track swath. This 2.1 kg camera using a 128 x 128 Mercury-Cadmium-Telluride (MCT) FPA viewed thermal emission of the lunar surface and lunar horizon in the 8.0 to 9.5 µm wavelength region. A description of this light-weight, low power LWIR camera along with a summary of lessons learned is presented. Design goals and preliminary on-orbit performance estimates are addressed in terms of meeting the mission's primary objective for flight qualifying the sensors for future Department of Defense flights.

Priest, R.E.; Lewis, I.T.; Sewall, N.R.; Park, H.S.; Shannon, M.J.; Ledebuhr, A.G.; Pleasance, L.D. [Lawrence Livermore National Lab., CA (United States); Massie, M.A. [Pacific Advanced Technology, Solvang, CA (United States); Metschuleit, K. [Amber/A Raytheon Co., Goleta, CA (United States)

1995-04-01

319

WIDE-FIELD ASTRONOMICAL MULTISCALE CAMERAS  

SciTech Connect

In order to produce sufficiently low aberrations with a large aperture, telescopes have a limited field of view. Because of this narrow field, large areas of the sky at a given time are unobserved. We propose several telescopes based on monocentric reflective, catadioptric, and refractive objectives that may be scaled to wide fields of view and achieve 1.1 arcsecond resolution, which in most locations is the practical seeing limit of the atmosphere. The reflective and Schmidt catadioptric objectives have relatively simple configurations and enable large fields to be captured at the expense of the obscuration of the mirror by secondary optics, a defect that may be managed by image plane design. The refractive telescope design does not have an obscuration but the objective has substantial bulk. The refractive design is a 38 gigapixel camera which consists of a single monocentric objective and 4272 microcameras. Monocentric multiscale telescopes, with their wide fields of view, may observe phenomena that might otherwise be unnoticed, such as supernovae, glint from orbital space debris, and near-earth objects.

Marks, Daniel L.; Brady, David J., E-mail: dbrady@ee.duke.edu [Department of Electrical and Computer Engineering and Fitzpatrick Institute for Photonics, Box 90291, Duke University, Durham, NC 27708 (United States)

2013-05-15

320

Exploring the Moon at High-Resolution: First Results From the Lunar Reconnaissance Orbiter Camera (LROC)  

NASA Astrophysics Data System (ADS)

The Lunar Reconnaissance Orbiter (LRO) spacecraft was launched on an Atlas V 401 rocket from the Cape Canaveral Air Force Station Launch Complex 41 on June 18, 2009. After spending four days in Earth-Moon transit, the spacecraft entered a three month commissioning phase in an elliptical 30×200 km orbit. On September 15, 2009, LRO began its planned one-year nominal mapping mission in a quasi-circular 50 km orbit. A multi-year extended mission in a fixed 30×200 km orbit is optional. The Lunar Reconnaissance Orbiter Camera (LROC) consists of a Wide Angle Camera (WAC) and two Narrow Angle Cameras (NACs). The WAC is a 7-color push-frame camera, which images the Moon at 100 and 400 m/pixel in the visible and UV, respectively, while the two NACs are monochrome narrow-angle linescan imagers with 0.5 m/pixel spatial resolution. LROC was specifically designed to address two of the primary LRO mission requirements and six other key science objectives, including 1) assessment of meter- and smaller-scale features in order to select safe sites for potential lunar landings near polar resources and elsewhere on the Moon; 2) acquire multi-temporal synoptic 100 m/pixel images of the poles during every orbit to unambiguously identify regions of permanent shadow and permanent or near permanent illumination; 3) meter-scale mapping of regions with permanent or near-permanent illumination of polar massifs; 4) repeat observations of potential landing sites and other regions to derive high resolution topography; 5) global multispectral observations in seven wavelengths to characterize lunar resources, particularly ilmenite; 6) a global 100-m/pixel basemap with incidence angles (60°-80°) favorable for morphological interpretations; 7) sub-meter imaging of a variety of geologic units to characterize their physical properties, the variability of the regolith, and other key science questions; 8) meter-scale coverage overlapping with Apollo-era panoramic images (1-2 m/pixel) to document the number of small impacts since 1971-1972. LROC allows us to determine the recent impact rate of bolides in the size range of 0.5 to 10 meters, which is currently not well known. Determining the impact rate at these sizes enables engineering remediation measures for future surface operations and interplanetary travel. The WAC has imaged nearly the entire Moon in seven wavelengths. A preliminary global WAC stereo-based topographic model is in preparation [1] and global color processing is underway [2]. As the mission progresses repeat global coverage will be obtained as lighting conditions change providing a robust photometric dataset. The NACs are revealing a wealth of morphologic features at the meter scale providing the engineering and science constraints needed to support future lunar exploration. All of the Apollo landing sites have been imaged, as well as the majority of robotic landing and impact sites. Through the use of off-nadir slews a collection of stereo pairs is being acquired that enable 5-m scale topographic mapping [3-7]. Impact morphologies (terraces, impact melt, rays, etc) are preserved in exquisite detail at all Copernican craters and are enabling new studies of impact mechanics and crater size-frequency distribution measurements [8-12]. Other topical studies including, for example, lunar pyroclastics, domes, and tectonics are underway [e.g., 10-17]. The first PDS data release of LROC data will be in March 2010, and will include all images from the commissioning phase and the first 3 months of the mapping phase.
[1] Scholten et al. (2010) 41st LPSC, #2111; [2] Denevi et al. (2010a) 41st LPSC, #2263; [3] Beyer et al. (2010) 41st LPSC, #2678; [4] Archinal et al. (2010) 41st LPSC, #2609; [5] Mattson et al. (2010) 41st LPSC, #1871; [6] Tran et al. (2010) 41st LPSC, #2515; [7] Oberst et al. (2010) 41st LPSC, #2051; [8] Bray et al. (2010) 41st LPSC, #2371; [9] Denevi et al. (2010b) 41st LPSC, #2582; [10] Hiesinger et al. (2010a) 41st LPSC, #2278; [11] Hiesinger et al. (2010b) 41st LPSC, #2304; [12] van der Bogert et al. (2010) 41st LPSC, #2165;

Robinson, Mark; Hiesinger, Harald; McEwen, Alfred; Jolliff, Brad; Thomas, Peter C.; Turtle, Elizabeth; Eliason, Eric; Malin, Mike; Ravine, A.; Bowman-Cisneros, Ernest

321

Development of filter exchangeable 3CCD camera for multispectral imaging acquisition  

NASA Astrophysics Data System (ADS)

There are a lot of methods to acquire multispectral images. A dynamic band selective and area-scan multispectral camera has not been developed yet. This research focused on development of a filter exchangeable 3CCD camera which is modified from a conventional 3CCD camera. The camera consists of an F-mounted lens, an image splitter without dichroic coating, three bandpass filters, three image sensors, a filter exchangeable frame and an electric circuit for parallel image signal processing. In addition, firmware and application software have been developed. Remarkable improvements compared to a conventional 3CCD camera are its redesigned image splitter and filter exchangeable frame. Computer simulation is required to visualize the pathway of rays inside the prism when redesigning the image splitter. The dimensions of the splitter are then determined by computer simulation, which has options of BK7 glass and non-dichroic coating. These properties have been considered to obtain full wavelength rays on all film planes. The image splitter is verified by two line lasers with narrow waveband. The filter exchangeable frame is designed to allow swapping bandpass filters without changing the displacement of the image sensors on the film plane. The developed 3CCD camera is evaluated for detection of scab and bruise on Fuji apples. As a result, the filter exchangeable 3CCD camera could give meaningful functionality for various multispectral applications which need to exchange bandpass filters.

Lee, Hoyoung; Park, Soo Hyun; Kim, Moon S.; Noh, Sang Ha

2012-05-01

322

Perceptual Color Characterization of Cameras  

PubMed Central

Color camera characterization, mapping outputs from the camera sensors to an independent color space, such as XYZ, is an important step in the camera processing pipeline. Until now, this procedure has been primarily solved by using a 3 × 3 matrix obtained via a least-squares optimization. In this paper, we propose to use the spherical sampling method, recently published by Finlayson et al., to perform a perceptual color characterization. In particular, we search for the 3 × 3 matrix that minimizes three different perceptual errors, one pixel based and two spatially based. For the pixel-based case, we minimize the CIE ΔE error, while for the spatial-based case, we minimize both the S-CIELAB error and the CID error measure. Our results demonstrate an improvement of approximately 3% for the ΔE error, 7% for the S-CIELAB error and 13% for the CID error measures. PMID:25490586
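A minimal sketch of the baseline 3 × 3 least-squares characterization that the paper improves upon (the spherical-sampling, perceptual-error search itself is not reproduced here); the function name and synthetic data are illustrative only:

```python
import numpy as np

def fit_color_matrix(rgb, xyz):
    """Least-squares 3x3 matrix mapping camera RGB to CIE XYZ.

    rgb, xyz: (N, 3) arrays of corresponding patch measurements.
    Returns M such that xyz_est = M @ rgb_vector.
    """
    M, *_ = np.linalg.lstsq(np.asarray(rgb), np.asarray(xyz), rcond=None)
    return M.T

# Hypothetical usage with a 24-patch chart (values would come from measurements):
rng = np.random.default_rng(0)
rgb = rng.uniform(0, 1, (24, 3))
true_M = np.array([[0.6, 0.3, 0.1], [0.2, 0.7, 0.1], [0.0, 0.1, 0.9]])
xyz = rgb @ true_M.T
M = fit_color_matrix(rgb, xyz)
print(np.allclose(M, true_M))   # True for this noiseless synthetic example
```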

Vazquez-Corral, Javier; Connah, David; Bertalmío, Marcelo

2014-01-01

323

The GISMO-2 Bolometer Camera  

NASA Technical Reports Server (NTRS)

We present the concept for the GISMO-2 bolometer camera, which we are building for background-limited operation at the IRAM 30 m telescope on Pico Veleta, Spain. GISMO-2 will operate simultaneously in the 1 mm and 2 mm atmospheric windows. The 1 mm channel uses a 32 x 40 TES-based Backshort Under Grid (BUG) bolometer array, the 2 mm channel operates with a 16 x 16 BUG array. The camera utilizes almost the entire field of view provided by the telescope. The optical design of GISMO-2 was strongly influenced by our experience with the GISMO 2 mm bolometer camera, which is successfully operating at the 30 m telescope. GISMO is accessible to the astronomical community through the regular IRAM call for proposals.

Staguhn, Johannes G.; Benford, Dominic J.; Fixsen, Dale J.; Hilton, Gene; Irwin, Kent D.; Jhabvala, Christine A.; Kovacs, Attila; Leclercq, Samuel; Maher, Stephen F.; Miller, Timothy M.; Moseley, Samuel H.; Sharp, Elemer H.; Wollack, Edward J.

2012-01-01

324

Using Inscribed Angles and Polygons  

NSDL National Science Digital Library

This unit will teach you about inscribed angles, intercepted arcs, their measures, inscribed polygons, and their associated theorems. OK, time for notes! Define Inscribed Angles, using the following website (Only define the inscribed angle from this site): Inscribed Angle Definition Using this new idea, you can use the following activity to figure out the formula for the measure of an inscribed angle: Inscribed Angle Formula Discovery The whole lesson depends upon this definition. Define Intercepted Arc, Inscribed polygons, ...
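For reference, the inscribed angle relationship the activities above lead to (standard statement, added here as a sketch):

```latex
% Inscribed angle theorem: an inscribed angle measures half of its intercepted arc,
% equivalently half of the central angle subtending the same arc (O is the center).
m\angle ABC \;=\; \tfrac{1}{2}\, m\overset{\frown}{AC} \;=\; \tfrac{1}{2}\, m\angle AOC
```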

Mrs. Neubert

2011-03-10

325

Characterization of a PET Camera Optimized for ProstateImaging  

SciTech Connect

We present the characterization of a positron emission tomograph for prostate imaging that centers a patient between a pair of external curved detector banks (ellipse: 45 cm minor, 70 cm major axis). The distance between detector banks adjusts to allow patient access and to position the detectors as closely as possible for maximum sensitivity with patients of various sizes. Each bank is composed of two axial rows of 20 HR+ block detectors for a total of 80 detectors in the camera. The individual detectors are angled in the transaxial plane to point towards the prostate to reduce resolution degradation in that region. The detectors are read out by modified HRRT data acquisition electronics. Compared to a standard whole-body PET camera, our dedicated-prostate camera has the same sensitivity and resolution, less background (less randoms and lower scatter fraction) and a lower cost. We have completed construction of the camera. Characterization data and reconstructed images of several phantoms are shown. Sensitivity of a point source in the center is 946 cps/µCi. Spatial resolution is 4 mm FWHM in the central region.

Huber, Jennifer S.; Choong, Woon-Seng; Moses, William W.; Qi,Jinyi; Hu, Jicun; Wang, G.C.; Wilson, David; Oh, Sang; Huesman, RonaldH.; Derenzo, Stephen E.

2005-11-11

326

Analysis of Reference Sources for the Characterization and Calibration of Infrared Cameras  

NASA Astrophysics Data System (ADS)

This paper gives an analysis of the radiometric properties of different types of reference sources applied for the characterization and calibration of infrared cameras. For the absolute radiance measurement with an infrared camera, a metrological characterization and calibration of the instrument are essential. Similar to the calibration of radiation thermometers, this calibration is generally performed with reference sources of known radiance. As infrared cameras are optically and electronically more complex than radiation thermometers, which are equipped with a single element detector, the applied reference sources have to be carefully characterized and limitations in their performance have to be considered. Each pixel of the image measured with an infrared camera should depict correctly the desired physical quantity value of the projected object area. This should be achieved for all relevant conditions of observation, e.g., at different distances or at different incident angles. The performance of cavity radiators and plate radiators is analyzed based on ray-tracing calculations and spatially and angularly resolved radiance measurements with radiation thermometers and cameras. Relevant components of a calibration facility for infrared cameras at PTB are presented with their specifications. A first analysis of the relevant characteristics of the applied infrared calibration sources and infrared cameras is presented as the essential basic information for the realization of the calibration of infrared cameras.

Gutschwager, B.; Taubert, D.; Hollandt, J.

2015-03-01

327

Narrowing of mid-infrared laser pulses by two-photon absorption in semiconductor etalons  

NASA Astrophysics Data System (ADS)

We present a theoretical analysis of how the nonlinearity of the free carriers produced by two-photon absorption in an InSb etalon could be used to shorten 10.6 µm laser pulses. The etalon can be tuned from an initial low-intensity minimum of transmittance to a maximum and back to a minimum by the laser pulse, as a consequence of free carrier generation. Simulations show that, for an adequate choice of parameters, pulse narrowing by a factor of 30 can be achieved. The influence of peak intensity, etalon thickness and incidence angle on pulse narrowing is investigated, as well as the spectral reshaping of the transmitted pulses.

Richard, Isabelle; Piché, Michel

1998-07-01

328

The IAC's Near Infrared Camera  

NASA Astrophysics Data System (ADS)

Here we report the main characteristics of the recently acquired near infrared camera. It is based on a 256x256 HgCdTe Nicmos-3 array, sensitive from 1 µm to 2.5 µm. The pixel size is 40 µm x 40 µm, adequate for the spatial and spectral scales at the different telescopes operating at the observatories of La Palma and Tenerife. The cooling system operates with LN2 with a hold time longer than 20 hours. The camera will be used for photometric, spectroscopic and spectropolarimetric observations.

Collados, M.; Joven, E.; Fuentes, F. J.; Diaz, J. J.; Gonzalez Escalera, V.; Bonet, J. A.; Vazquez, M.

329

Using Narrow Band Photometry to Detect Young Brown Dwarfs in IC348  

E-print Network

We report the discovery of a population of young brown dwarf candidates in the open star cluster IC348 and the development of a new spectroscopic classification technique using narrow band photometry. Observations were made using FLITECAM, the First Light Camera for SOFIA, at the 3-m Shane Telescope at Lick Observatory. FLITECAM is a new 1-5 micron camera with an 8 arcmin field of view. Custom narrow band filters were developed to detect absorption features of water vapor (at 1.495 microns) and methane (at 1.66 microns) characteristic of brown dwarfs. These filters enable spectral classification of stars and brown dwarfs without spectroscopy. FLITECAM's narrow and broadband photometry was verified by examining the color-color and color-magnitude characteristics of stars whose spectral type and reddening was known from previous surveys. Using our narrow band filter photometry method, it was possible to identify an object measured with a signal-to-noise ratio of 20 or better to within +/-3 spectral class subtypes for late-type stars. With this technique, very deep images of the central region of IC348 (H ~ 20.0) have identified 18 sources as possible L or T dwarf candidates. Out of these 18, we expect that between 3 - 6 of these objects are statistically likely to be background stars, with the remainder being true low-mass members of the cluster. If confirmed as cluster members then these are very low-mass objects (~5 Mjupiter). We also describe how two additional narrow band filters can improve the contrast between M, L, and T dwarfs as well as provide a means to determine the reddening of an individual object.

A. K. Mainzer; Ian S. McLean

2003-06-30

330

What's the Angle?  

NSDL National Science Digital Library

This classroom activity helps students understand how the angle of the Sun affects temperatures around the globe. After experimenting with a heat lamp and thermometers at differing angles, students apply what they learned to explain temperature variations on Earth. The printable six-page handout includes a series of inquiry-based questions to get students thinking about what they already know about temperature patterns, detailed experiment directions and a worksheet that helps students use the experiment results to gain a deeper understanding of seasonal temperature changes and why Antarctica is always so cold.
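For reference, the standard relation behind the heat-lamp experiment described above (added as a sketch, not part of the handout):

```latex
% Insolation on a horizontal surface falls off with the solar zenith angle \theta_z,
% because the same beam of sunlight is spread over a larger area at low sun angles:
I \;=\; I_0 \cos\theta_z
% \theta_z \approx 0^\circ with the Sun overhead (strong heating);
% \theta_z \to 90^\circ near the poles or at sunset (weak heating).
```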

331

New data on separation and position angle of selected binaries  

NASA Astrophysics Data System (ADS)

We report on a sample of data acquired in May 2012 at the 31-inch NURO telescope at Anderson Mesa near Flagstaff, Arizona, pertaining to the separation and position angle of selected binary stars. A CCD camera coupled to the NURO telescope allows for a simple and straightforward procedure for obtaining the images of the binaries. Analysis of the images is straightforward, and both direct and software-based methods yield the separation and position angle of the binary stars. The data obtained are suitable for insertion in the Washington Double Star Catalog of the US Naval Observatory.
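A minimal sketch of how separation and position angle can be derived from measured pixel coordinates, assuming a known plate scale and detector orientation; this is illustrative only and is not the authors' reduction procedure:

```python
import math

def separation_and_position_angle(x1, y1, x2, y2, plate_scale, pa_of_plus_y=0.0):
    """Separation (arcsec) and position angle (deg, north through east) of a binary
    from the pixel coordinates of the primary (x1, y1) and secondary (x2, y2).

    plate_scale: arcsec per pixel; pa_of_plus_y: sky position angle of the +y
    detector axis, assumed known from the camera orientation calibration.
    """
    dx, dy = x2 - x1, y2 - y1
    sep = plate_scale * math.hypot(dx, dy)
    # With +y toward north and +x toward east, PA is measured from north toward east.
    pa = (math.degrees(math.atan2(dx, dy)) + pa_of_plus_y) % 360.0
    return sep, pa

# Hypothetical example: 0.5"/pixel camera, secondary 10 px east and 10 px north.
print(separation_and_position_angle(100, 100, 110, 110, 0.5))  # (~7.07", ~45 deg)
```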

Muller, Rafael J.; Lopez, Andy J.; Torres, Brian S.; Mendoza, Lizyan; Vergara, Nelson; Cersosimo, Juan; Martinez, Luis

2015-01-01

332

Velocity and Angle of Spatter in Fine Laser Processing  

NASA Astrophysics Data System (ADS)

Scattering velocity and angle of spatter were investigated by recognizing and tracking spatter in high-speed video observations. The spatter movement was observed by two high-speed video cameras and analyzed by a two-direction tracking method, in which the 3D tracking lines of spatter were constructed in the forward and backward frames and the actual trajectory of individual spatter was obtained by averaging those tracking lines. The analysis made clear that the initial velocity of spatter was less than the speed of sound and mainly ranged from 75 m/s to 200 m/s. The velocity of spatter showed a slight correlation with the angle.
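A minimal sketch of recovering speed and ejection angle from a tracked particle's 3D positions; a straight-line least-squares fit stands in for the forward/backward tracking-line averaging described above, and all numbers are hypothetical:

```python
import numpy as np

def spatter_velocity_and_angle(positions, frame_rate):
    """Estimate speed (m/s) and ejection angle (deg above the surface plane) of a
    tracked spatter particle from its 3D positions in consecutive frames.

    positions: (N, 3) array in metres, ordered in time, z normal to the workpiece.
    """
    t = np.arange(len(positions)) / frame_rate
    # Linear fit of x(t), y(t), z(t): the slope is the mean velocity component.
    vel = np.array([np.polyfit(t, positions[:, k], 1)[0] for k in range(3)])
    speed = np.linalg.norm(vel)
    angle = np.degrees(np.arcsin(vel[2] / speed))
    return speed, angle

# Hypothetical track: 100 kHz video, particle moving at ~150 m/s, 30 deg upward.
fps = 100_000
v = 150.0 * np.array([np.cos(np.radians(30)), 0.0, np.sin(np.radians(30))])
track = np.arange(6)[:, None] / fps * v
print(spatter_velocity_and_angle(track, fps))   # ~ (150.0, 30.0)
```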

Okamoto, Yasuhiro; Yamamoto, Hibiki; Okada, Akira; Shirasaya, Kento; Kolehmainen, Jari Tapani

333

Reassessing Narrow Rings at Uranus and Neptune  

NASA Astrophysics Data System (ADS)

We outline questions surrounding narrow rings, their radial confinement, global modes, and arcs. The rings of Chariklo may prove a helpful analogue. Additional observations by telescopes and spacecraft, with continued modeling, may lead to new answers.

Tiscareno, M. S.

2014-07-01

334

NARROW-BAND VOLUME RENDERING FOR FREEHAND 3D ULTRASOUND  

E-print Network

A. H. Gee, R. W. Prager, G. M. Treece; Cambridge CB2 1PZ, England. Narrow-band volume rendering for freehand 3D ultrasound, for specifying these target volumes. The basic idea is to extend the familiar planar and non

Drummond, Tom

335

Hot Wax Sweeps Debris From Narrow Passages  

NASA Technical Reports Server (NTRS)

Safe and effective technique for removal of debris and contaminants from narrow passages involves entrainment of undesired material in thermoplastic casting material. Semisolid wax slightly below melting temperature pushed along passage by pressurized nitrogen to remove debris. Devised to clean out fuel passages in main combustion chamber of Space Shuttle main engine. Also applied to narrow, intricate passages in internal-combustion-engine blocks, carburetors, injection molds, and other complicated parts.

Ricklefs, Steven K.

1990-01-01

336

G-APDs in Cherenkov astronomy: The FACT camera  

NASA Astrophysics Data System (ADS)

Geiger-mode avalanche photodiodes (G-APD, SiPM) are a much discussed alternative to photomultiplier tubes in Cherenkov astronomy. The First G-APD Cherenkov Telescope (FACT) collaboration builds a camera based on a hexagonal array of 1440 G-APDs and has now finalized its construction phase. A light-collecting solid PMMA cone is glued to each G-APD to eliminate dead space between the G-APDs by increasing the active area, and to restrict the light collection angle of the sensor to the reflector area in order to reduce the amount of background light. The processing of the signals is integrated in the camera and includes the digitization using the domino ring sampling chip DRS4.

Krähenbühl, T.; Anderhub, H.; Backes, M.; Biland, A.; Boller, A.; Braun, I.; Bretz, T.; Commichau, V.; Djambazov, L.; Dorner, D.; Farnier, C.; Gendotti, A.; Grimm, O.; von Gunten, H.; Hildebrand, D.; Horisberger, U.; Huber, B.; Kim, K.-S.; Köhne, J.-H.; Krumm, B.; Lee, M.; Lenain, J.-P.; Lorenz, E.; Lustermann, W.; Lyard, E.; Mannheim, K.; Meharga, M.; Neise, D.; Nessi-Tedaldi, F.; Overkemping, A.-K.; Pauss, F.; Renker, D.; Rhode, W.; Ribordy, M.; Rohlfs, R.; Röser, U.; Stucki, J.-P.; Schneider, J.; Thaele, J.; Tibolla, O.; Viertel, G.; Vogler, P.; Walter, R.; Warda, K.; Weitzel, Q.

2012-12-01

337

Contact Angle Measurements Using a Simplified Experimental Setup  

ERIC Educational Resources Information Center

A basic and affordable experimental apparatus is described that measures the static contact angle of a liquid drop in contact with a solid. The image of the drop is made with a simple digital camera by taking a picture that is magnified by an optical lens. The profile of the drop is then processed with ImageJ free software. The ImageJ contact…

Lamour, Guillaume; Hamraoui, Ahmed; Buvailo, Andrii; Xing, Yangjun; Keuleyan, Sean; Prakash, Vivek; Eftekhari-Bafrooei, Ali; Borguet, Eric

2010-01-01

338

A Fringe-Field Driven Hybrid Aligned Nematic Liquid Crystal Display for Narrow Viewing Angle Display  

Microsoft Academic Search

Liquid crystal displays (LCDs) that exhibit a high image quality at the normal direction only were investigated. For this purpose, a hybrid aligned nematic liquid crystal (LC) cell driven by a fringe electric field was chosen, and its electro-optic characteristics with an optimal cell structure were evaluated. The device showed a high light efficiency of 90%, a low driving voltage

Je Woo Ryu; Young Jin Lim; Youn Hak Jeong; Kyungmi Kim; Gi-Dong Lee; Seung Hee Lee

2007-01-01

339

High speed multiwire photon camera  

NASA Technical Reports Server (NTRS)

An improved multiwire proportional counter camera having particular utility in the field of clinical nuclear medicine imaging. The detector utilizes direct coupled, low impedance, high speed delay lines, the segments of which are capacitor-inductor networks. A pile-up rejection test is provided to reject confused events otherwise caused by multiple ionization events occurring during the readout window.

Lacy, Jeffrey L. (Inventor)

1989-01-01

340

High speed multiwire photon camera  

NASA Technical Reports Server (NTRS)

An improved multiwire proportional counter camera having particular utility in the field of clinical nuclear medicine imaging. The detector utilizes direct coupled, low impedance, high speed delay lines, the segments of which are capacitor-inductor networks. A pile-up rejection test is provided to reject confused events otherwise caused by multiple ionization events occurring during the readout window.

Lacy, Jeffrey L. (Inventor)

1991-01-01

341

OSIRIS camera barrel optomechanical design  

NASA Astrophysics Data System (ADS)

A Camera Barrel, located in the OSIRIS imager/spectrograph for the Gran Telescopio Canarias (GTC), is described in this article. The barrel design has been developed by the Institute for Astronomy of the University of Mexico (IA-UNAM), in collaboration with the Institute for Astrophysics of Canarias (IAC), Spain. The barrel is being manufactured by the Engineering Center for Industrial Development (CIDESI) at Queretaro, Mexico. The Camera Barrel includes a set of eight lenses (three doublets and two singlets), with their respective supports and cells, as well as two subsystems: the Focusing Unit, which is a mechanism that modifies the first doublet relative position; and the Passive Displacement Unit (PDU), which uses the third doublet as a thermal compensator to maintain the camera focal length and image quality when the ambient temperature changes. This article includes a brief description of the scientific instrument; describes the design criteria related to performance justification; and summarizes the specifications related to misalignment errors and generated stresses. The Camera Barrel components are described, and analytical calculations, FEA simulations and error budgets are also included.

Farah, Alejandro; Tejada, Carlos; Gonzalez, Jesus; Cobos, Francisco J.; Sanchez, Beatriz; Fuentes, Javier; Ruiz, Elfego

2004-09-01

342

Gamma-ray camera flyby  

SciTech Connect

Animation based on an actual classroom demonstration of the prototype CCI-2 gamma-ray camera's ability to image a hidden radioactive source, a cesium-137 line source, in three dimensions. For more information see http://newscenter.lbl.gov/feature-stories/2010/06/02/applied-nuclear-physics/.

None

2010-01-01

343

New Rotating Prism Drum Camera  

NASA Astrophysics Data System (ADS)

The car industry is using high speed photography for crash testing and other dynamic testing of components. For this purpose high speed cameras with frame rates between 400 and 1000 frames per second are being used, whereas the time period to be measured is in the range of about 100 - 250 ms.

Hochtl, W.; Wittwer, W.

1983-03-01

344

Measuring Distances Using Digital Cameras  

ERIC Educational Resources Information Center

This paper presents a generic method of calculating accurate horizontal and vertical object distances from digital images taken with any digital camera and lens combination, where the object plane is parallel to the image plane or tilted in the vertical plane. This method was developed for a project investigating the size, density and spatial…
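
The sketch below illustrates the parallel-plane case described above using the basic pinhole (similar-triangles) relation; it is not the paper's own derivation, and the focal length, pixel pitch, and object size in the example are illustrative assumptions only.

    def distance_to_object(real_height_m, image_height_px, focal_length_mm, pixel_pitch_mm):
        """Estimate camera-to-object distance when the object plane is parallel
        to the image plane (similar-triangles / pinhole relation)."""
        image_height_mm = image_height_px * pixel_pitch_mm          # object size on the sensor
        return real_height_m * focal_length_mm / image_height_mm    # distance in metres

    # Example (hypothetical numbers): a 1.8 m reference object spanning 600 px
    # with a 50 mm lens and 5 micron pixels is roughly 30 m away.
    print(distance_to_object(1.8, 600, 50.0, 0.005))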

Kendal, Dave

2007-01-01

345

Camera assisted multimodal user interaction  

NASA Astrophysics Data System (ADS)

Since more processing power and new sensing and display technologies are already available in mobile devices, there has been increased interest in building systems to communicate via different modalities such as speech, gesture, expression, and touch. In context-identification-based user interfaces, these independent modalities are combined to create new ways for users to interact with hand-helds. While these are unlikely to completely replace traditional interfaces, they will considerably enrich and improve the user experience and task performance. We demonstrate a set of novel user interface concepts that rely on built-in multiple sensors of modern mobile devices for recognizing the context and sequences of actions. In particular, we use the camera to detect whether the user is watching the device, for instance, to make the decision to turn on the display backlight. In our approach the motion sensors are first employed for detecting the handling of the device. Then, based on ambient illumination information provided by a light sensor, the cameras are turned on. The frontal camera is used for face detection, while the back camera provides supplemental contextual information. The subsequent applications triggered by the context can be, for example, image capturing, or bar code reading.

Hannuksela, Jari; Silvén, Olli; Ronkainen, Sami; Alenius, Sakari; Vehviläinen, Markku

2010-01-01

346

Camera lens adapter magnifies image  

NASA Technical Reports Server (NTRS)

A Polaroid Land camera with an illuminated 7-power magnifier adapted to the lens photographs weld flaws. The flaws are located by inspection with a 10-power magnifying glass and then photographed with this device, thus providing immediate pictorial data for use in remedial procedures.

Moffitt, F. L.

1967-01-01

347

Exact ligand cone angles.  

PubMed

Many properties of transition-metal complexes depend on the steric bulk of bound ligands, usually quantified by the Tolman (θ) and solid (Θ) cone angles, which have proven utility but suffer from various limitations and coarse approximations. Here, we present an improved, mathematically rigorous method to determine an exact cone angle (θ°) by solving for the most acute right circular cone that contains the entire ligand. The procedure is applicable to any ligand, planar or nonplanar, monodentate or polydentate, bound to any metal center in any environment, and it is ideal for analyzing structures from quantum chemical computations as well as X-ray crystallography experiments. Exact cone angles were evaluated for a wide array of phosphine and amine ligands bound to palladium, nickel, or platinum by optimizing structures using B3LYP/6-31G* density functional theory with effective core potentials for the transition metals. The mean absolute deviations of the standard θ and Θ parameters from the exact cone angles were 15-25°, mostly caused by distortions from the assumed idealized structures. PMID:23408559
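
To make the containment idea concrete, here is a minimal sketch of a cone angle computed about a fixed axis: each atom demands a half-angle equal to its angular offset from the axis plus arcsin(r_vdW/d). The published exact method additionally optimizes the axis direction to find the most acute containing cone; fixing the axis (e.g. along the metal-donor bond) is a simplification made only for illustration.

    import numpy as np

    def cone_angle_deg(metal, atom_positions, vdw_radii, axis):
        """Cone angle (degrees) about a FIXED axis that contains every atom sphere.
        metal, atom_positions: Cartesian coordinates; vdw_radii: van der Waals radii."""
        axis = np.asarray(axis, dtype=float)
        axis /= np.linalg.norm(axis)
        half_angles = []
        for pos, r in zip(np.asarray(atom_positions, dtype=float), vdw_radii):
            v = pos - metal
            d = np.linalg.norm(v)
            offset = np.arccos(np.clip(np.dot(v, axis) / d, -1.0, 1.0))
            half_angles.append(offset + np.arcsin(min(r / d, 1.0)))
        return np.degrees(2.0 * max(half_angles))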

Bilbrey, Jenna A; Kazez, Arianna H; Locklin, Jason; Allen, Wesley D

2013-05-30

348

Casting and Angling.  

ERIC Educational Resources Information Center

The self-contained packet contains background information, lesson plans, 15 transparency and student handout masters, drills and games, 2 objective examinations, and references for teaching a 15-day unit on casting and angling to junior high and senior high school students, either as part of a regular physical education program or as a club…

Little, Mildred J.; Bunting, Camille

349

Taxicab Angles and Trigonometry  

Microsoft Academic Search

A natural analogue to angles and trigonometry is developed in taxicab geometry. This structure is then analyzed to see which, if any, congruent triangle relations hold. A nice application involving the use of parallax to determine the exact (taxicab) distance to an object is also discussed.

Kevin Thompson; Tevian Dray

2011-01-01

350

open-angle glaucoma  

E-print Network

Purpose: To investigate whether associations with the nitric oxide synthase gene (NOS3) variants and risk of primary open-angle glaucoma (POAG) depend on female reproductive factors. Methods: Two functional and two tagging single nucleotide polymorphisms (SNPs; T-786C: rs2070744, Glu298Asp: rs

Jae Hee Kang; Janey L. Wiggs; Jonathan Haines; Wael Abdrabou; Louis R. Pasquale

351

Dynamic contact angle analysis of silicone hydrogel contact lenses.  

PubMed

Contact angle measurements are used to infer the clinical wetting characteristics of contact lenses. Such characterization has become more commonplace since the introduction of silicone hydrogel contact lens materials, which have been associated with reduced in vivo wetting due to the inclusion of siloxane-containing components. Using consistent methodology and a single investigator, advancing and receding contact angles were measured for 11 commercially available silicone hydrogel contact lens types with a dynamic captive bubble technique employing customized, fully automated image analysis. Advancing contact angles were found to range between 20° and 72° with the lenses falling into six statistically discrete groupings. Receding contact angles fell within a narrower range, between 17° and 22°, with the lenses segregated into three groups. The relationship between these laboratory measurements and the clinical performance of the lenses requires further investigation. PMID:20219845

Read, Michael Leonard; Morgan, Philip Bruce; Kelly, Jeremiah Michael; Maldonado-Codina, Carole

2011-07-01

352

Preliminary Mapping of Permanently Shadowed and Sunlit Regions Using the Lunar Reconnaissance Orbiter Camera (LROC)  

NASA Astrophysics Data System (ADS)

The spin axis of the Moon is tilted by only 1.5° (compared with the Earth's 23.5°), leaving some areas near the poles in permanent shadow while other nearby regions remain sunlit for a majority of the year. Theory, radar data, neutron measurements, and Lunar CRater Observation and Sensing Satellite (LCROSS) observations suggest that volatiles may be present in the cold traps created inside these permanently shadowed regions. Meanwhile, areas of near-permanent illumination are prime locations for future lunar outposts due to benign thermal conditions and near-constant solar power. The Lunar Reconnaissance Orbiter (LRO) has two imaging systems that provide medium and high resolution views of the poles. During almost every orbit the LROC Wide Angle Camera (WAC) acquires images at 100 m/pixel of the polar region (80° to 90° north and south latitude). In addition, the LROC Narrow Angle Camera (NAC) targets selected regions of interest at 0.7 to 1.5 m/pixel [Robinson et al., 2010]. During the first 11 months of the nominal mission, LROC acquired almost 6,000 WAC images and over 7,300 NAC images of the polar region (i.e., within 2° of the pole). By analyzing this time series of WAC and NAC images, regions of permanent shadow and permanent, or near-permanent, illumination can be quantified. The LROC Team is producing several reduced data products that graphically illustrate the illumination conditions of the polar regions. Illumination movie sequences are being produced that show how the lighting conditions change over a calendar year. Each frame of the movie sequence is a polar stereographic projected WAC image showing the lighting conditions at that moment. With the WAC's wide field of view (~100 km at an altitude of 50 km), each frame has repeat coverage between 88° and 90° at each pole. The same WAC images are also being used to develop multi-temporal illumination maps that show the percent each 100 m × 100 m area is illuminated over a period of time. These maps are derived by stacking all the WAC frames, selecting a threshold to determine if the surface is illuminated, and summing the resulting binary images. In addition, mosaics of NAC images are also being produced for regions of interest at a scale of 0.7 to 1.5 m/pixel. The mosaics produced so far have revealed small illuminated surfaces on the tens of meters scale that were previously thought to be shadowed during that time. The LROC dataset of the polar regions complements previous illumination analyses of Clementine images [Bussey et al., 1999], Kaguya topography [Bussey et al., 2010], and the current efforts underway by the Lunar Orbiter Laser Altimeter (LOLA) Team [Mazarico et al., 2010], and provides an important new dataset for science and exploration. References: Bussey et al. (1999), Illumination conditions at the lunar south pole, Geophysical Research Letters, 26(9), 1187-1190. Bussey et al. (2010), Illumination conditions of the south pole of the Moon derived from Kaguya topography, Icarus, 208, 558-564. Mazarico et al. (2010), Illumination of the lunar poles from the Lunar Orbiter Laser Altimeter (LOLA) Topography Data, paper presented at 41st LPSC, Houston, TX. Robinson et al. (2010), Lunar Reconnaissance Orbiter Camera (LROC) Instrument Overview, Space Sci Rev, 150, 81-124.
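
The multi-temporal map construction described above (stack, threshold, sum) reduces to a few array operations. The sketch below assumes the frames are already co-registered, map-projected arrays on a common grid; the threshold is a placeholder, not an LROC pipeline parameter.

    import numpy as np

    def illumination_fraction(frames, threshold):
        """frames: (N, rows, cols) stack of co-registered polar images.
        Returns, per pixel, the fraction of frames in which it was sunlit
        (0 = always shadowed, 1 = always illuminated)."""
        sunlit = (np.asarray(frames) > threshold).astype(np.float32)   # binary illumination masks
        return sunlit.sum(axis=0) / sunlit.shape[0]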

Speyerer, E.; Koeber, S.; Robinson, M. S.

2010-12-01

353

SMART CAMERA NETWORKS IN VIRTUAL REALITY Faisal Qureshi  

E-print Network

demonstrate a smart camera network comprising static and active simulated video surveillance cameras-- Virtual Vision, Computer Vision, Persistent Surveillance, Smart Cameras, Camera Networks, Multi-Camera, etc. Multi-camera systems, or camera networks, are a critical component of any video surveillance

Qureshi, Faisal Z.

354

A Different Angle on Perspective  

ERIC Educational Resources Information Center

When a plane figure is photographed from different viewpoints, lengths and angles appear distorted. Hence it is often assumed that lengths, angles, protractors, and compasses have no place in projective geometry. Here we describe a sense in which certain angles are preserved by projective transformations. These angles can be constructed with…

Frantz, Marc

2012-01-01

355

A mobile gamma camera system for 3D acute myocardial perfusion imaging  

Microsoft Academic Search

A mobile tomographic gamma camera has been developed to enable three-dimensional perfusion studies in the emergency room and in the intensive care environment. The system, Cardioatom, is based on the limited view angle method Ectomography and comprises a modern detector head equipped with a rotating slant hole collimator and specially developed hardware and software for data acquisition, processing and display.

M. Persson; D. Bone; L.-A. Brodin; S. Dale; C. Lindstrom; T. Ribbe; H. Elmqvist

1997-01-01

356

Face Shape Reconstruction from Image Sequence Taken with Monocular Camera using Shape Database  

Microsoft Academic Search

We propose a method for reconstructing 3D face shape from a camera, which captures the object face from various viewing angles. In this method, we do not directly reconstruct the shape, but estimate a small number of parameters which represent the face shape. The parameter space is constructed with principal component analysis of database of a large number of face

Hideo Saito; Yosuke Ito; Masaaki Mochimaru

2007-01-01

357

Note: Simple hysteresis parameter inspector for camera module with liquid lens.  

PubMed

A method to inspect the hysteresis parameter is presented in this article. The hysteresis of the whole camera module with a liquid lens can be measured, rather than that of a single lens alone. Because the variation in focal length influences image quality, we propose using the sharpness of images captured from the camera module for hysteresis evaluation. Experiments reveal that the profile of sharpness hysteresis corresponds to the characteristic of the contact angle of the liquid lens. Therefore, it can be inferred that the hysteresis of the camera module is induced by the contact angle of the liquid lens. An inspection process takes only 20 s to complete. Thus, compared with other instruments, this inspection method is more suitable for integration into mass production lines for online quality assurance. PMID:20515182
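
A plausible reading of the procedure is sketched below: sweep the lens drive voltage up and then down, score each captured frame with a sharpness metric, and compare the two branches. The variance-of-Laplacian score and the grab_frame() capture hook are assumptions for illustration, not the note's actual metric or hardware interface.

    import cv2
    import numpy as np

    def sharpness(gray_image):
        # Variance of the Laplacian: a common focus/sharpness proxy.
        return cv2.Laplacian(gray_image, cv2.CV_64F).var()

    def sharpness_hysteresis(voltages, grab_frame):
        """grab_frame(v) is a hypothetical hook that sets the liquid-lens drive
        voltage to v and returns a grayscale frame from the camera module."""
        up = np.array([sharpness(grab_frame(v)) for v in voltages])
        down = np.array([sharpness(grab_frame(v)) for v in reversed(voltages)])[::-1]
        return up, down   # separation between the branches indicates hysteresis vs. voltage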

Chen, Po-Jui; Liao, Tai-Shan; Hwang, Chi-Hung

2010-05-01

358

Interference-induced angle-independent acoustical transparency  

NASA Astrophysics Data System (ADS)

It is revealed that the Fano-like interference leads to the extraordinary acoustic transmission through a slab metamaterial of thickness much smaller than the wavelength, with each unit cell consisting of a Helmholtz resonator and a narrow subwavelength slit. More importantly, both the theoretical analysis and experimental measurement show that the angle-independent acoustical transparency can be realized by grafting a Helmholtz resonator and a quarter-wave resonator to the wall of a narrow subwavelength slit in each unit cell of a slit array. The observed phenomenon results from the interferences between the waves propagating in the slit, those re-radiated by the Helmholtz resonator, and those re-radiated by the quarter-wave resonator. The proposed design may find its applications in designing angle-independent acoustical filters and controlling the phase of the transmitted waves.

Qi, Lehua; Yu, Gaokun; Wang, Xinlong; Wang, Guibo; Wang, Ning

2014-12-01

359

Find Angle Measures in Polygons  

NSDL National Science Digital Library

This lesson will introduce you to polygons and their angle measures. Focus on the interior angles and exterior angles and their properties. First, let's discuss diagonals. What is a diagonal in a polygon? Play with and take notes on the following web site: Diagonals of a Polygon. Now you are ready to learn the Polygon Interior Angles Theorem. It involves finding the measure of all of the angles inside a polygon, no matter how big or little ...
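
For reference, the Polygon Interior Angles Theorem mentioned above can be stated in two lines: the interior angles of an n-sided polygon sum to (n - 2) × 180°, and the exterior angles of any convex polygon sum to 360°. A small worked example:

    def interior_angle_sum(n_sides):
        return (n_sides - 2) * 180          # degrees

    def regular_polygon_angles(n_sides):
        interior = interior_angle_sum(n_sides) / n_sides
        exterior = 360 / n_sides            # interior + exterior = 180 at each vertex
        return interior, exterior

    print(interior_angle_sum(6))            # hexagon: 720
    print(regular_polygon_angles(6))        # (120.0, 60.0)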

Mrs. Neubert

2011-02-09

360

21 CFR 892.1110 - Positron camera.  

Code of Federal Regulations, 2013 CFR

...Positron camera. (a) Identification. A positron camera is a device intended to image the distribution of positron-emitting radionuclides in the body. This generic type of device may include signal analysis and display equipment,...

2013-04-01

361

21 CFR 892.1110 - Positron camera.  

Code of Federal Regulations, 2011 CFR

...Positron camera. (a) Identification. A positron camera is a device intended to image the distribution of positron-emitting radionuclides in the body. This generic type of device may include signal analysis and display equipment,...

2011-04-01

362

21 CFR 892.1110 - Positron camera.  

Code of Federal Regulations, 2012 CFR

...Positron camera. (a) Identification. A positron camera is a device intended to image the distribution of positron-emitting radionuclides in the body. This generic type of device may include signal analysis and display equipment,...

2012-04-01

363

21 CFR 892.1110 - Positron camera.  

Code of Federal Regulations, 2014 CFR

...Positron camera. (a) Identification. A positron camera is a device intended to image the distribution of positron-emitting radionuclides in the body. This generic type of device may include signal analysis and display equipment,...

2014-04-01

364

General linear cameras : theory and applications  

E-print Network

I present a General Linear Camera (GLC) model that unifies many previous camera models into a single representation. The GLC model describes all perspective (pinhole), orthographic, and many multiperspective (including ...

Yu, Jingyi, 1978-

2005-01-01

365

Silicone Contamination Camera Developed for Shuttle Payloads  

NASA Technical Reports Server (NTRS)

On many shuttle missions, silicone contamination from unknown sources from within or external to the shuttle payload bay has been a chronic problem plaguing experiment payloads. There is currently a wide range of silicone usage on the shuttle. Silicones are used to coat the shuttle tiles to enhance their ability to shed rain, and over 100 kg of RTV 560 silicone is used to seal white tiles to the shuttle surfaces. Silicones are also used in electronic components, potting compounds, and thermal control blankets. Efforts to date to identify and eliminate the sources of silicone contamination have not been highly successful and have created much controversy. To identify the sources of silicone contamination on the space shuttle, the NASA Lewis Research Center developed a contamination camera. This specially designed pinhole camera utilizes low-Earth-orbit atomic oxygen to develop a picture that identifies sources of silicone contamination on shuttle-launched payloads. The volatile silicone species travel through the aperture of the pinhole camera, and since volatile silicone species lose their hydrocarbon functionalities under atomic oxygen attack, the silicone adheres to the substrate as SiO_x. This glassy deposit should be spatially arranged in the image of the sources of silicone contamination. To view the contamination image, one can use ultrasensitive thickness measurement techniques, such as scanning variable-angle ellipsometry, to map the surface topography of the camera's substrate. The demonstration of a functional contamination camera would resolve the controversial debate concerning the amount and location of contamination sources, would allow corrective actions to be taken, and would demonstrate a useful tool for contamination documentation on future shuttle payloads, with near negligible effect on cost and weight.

1996-01-01

366

Image quality testing of assembled IR camera modules  

NASA Astrophysics Data System (ADS)

Infrared (IR) camera modules for the LWIR (8-12 μm) that combine IR imaging optics with microbolometer focal plane array (FPA) sensors with readout electronics are becoming more and more a mass market product. At the same time, steady improvements in sensor resolution in the higher priced markets raise the requirement for imaging performance of objectives and the proper alignment between objective and FPA. This puts pressure on camera manufacturers and system integrators to assess the image quality of finished camera modules in a cost-efficient and automated way for quality control or during end-of-line testing. In this paper we present recent development work done in the field of image quality testing of IR camera modules. This technology provides a wealth of additional information in contrast to the more traditional test methods like minimum resolvable temperature difference (MRTD) which give only a subjective overall test result. Parameters that can be measured are image quality via the modulation transfer function (MTF) for broadband or with various bandpass filters on- and off-axis and optical parameters like e.g. effective focal length (EFL) and distortion. If the camera module allows for refocusing the optics, additional parameters like best focus plane, image plane tilt, auto-focus quality, chief ray angle etc. can be characterized. Additionally, the homogeneity and response of the sensor with the optics can be characterized in order to calculate the appropriate tables for non-uniformity correction (NUC). The technology can also be used to control active alignment methods during mechanical assembly of optics to high resolution sensors. Other important points that are discussed are the flexibility of the technology to test IR modules with different form factors, electrical interfaces and last but not least the suitability for fully automated measurements in mass production.

Winters, Daniel; Erichsen, Patrik

2013-10-01

367

Method for shaping and aiming narrow beams. [sonar mapping and target identification  

NASA Technical Reports Server (NTRS)

A sonar method and apparatus is described which utilizes a linear frequency chirp in a transmitter/receiver having a correlator to synthesize a narrow beamwidth pattern from otherwise broadbeam transducers when there is relative velocity between the transmitter/receiver and the target. The chirp is produced in a generator with bandwidth, B, and time, T, chosen so that the time-bandwidth product, TB, increases for a narrower angle. A replica of the chirp produced in the generator is time delayed and Doppler shifted for use as a reference in the receiver for correlation of received chirps from targets. This reference is Doppler shifted to select targets preferentially, thereby not only synthesizing a narrow beam but also aiming the beam in azimuth and elevation.
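
A minimal sketch of the correlation step: a linear FM chirp of bandwidth B and duration T is correlated against a Doppler-shifted replica, so only echoes whose Doppler matches the chosen reference correlate strongly, which is the basis of the synthesized narrow, steerable beam. The sample rate, carrier, and Doppler factor are illustrative values, and the simple frequency-scaling Doppler model is an assumption, not the patent's formulation.

    import numpy as np
    from scipy.signal import chirp, correlate

    fs = 100_000                              # sample rate in Hz (assumed)
    T, B, f0 = 0.05, 20_000, 40_000           # chirp duration (s), bandwidth (Hz), start frequency (Hz)
    t = np.arange(0, T, 1.0 / fs)

    tx = chirp(t, f0=f0, t1=T, f1=f0 + B, method='linear')                      # transmitted chirp
    doppler = 1.001                                                             # factor for the selected target velocity
    ref = chirp(t, f0=f0 * doppler, t1=T, f1=(f0 + B) * doppler, method='linear')

    def matched_output(received):
        """Correlate an echo against the Doppler-shifted replica."""
        return correlate(received, ref, mode='same')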

Heyser, R. C. (inventor)

1981-01-01

368

Optimising Camera Traps for Monitoring Small Mammals  

PubMed Central

Practical techniques are required to monitor invasive animals, which are often cryptic and occur at low density. Camera traps have potential for this purpose, but may have problems detecting and identifying small species. A further challenge is how to standardise the size of each camera's field of view so capture rates are comparable between different places and times. We investigated the optimal specifications for a low-cost camera trap for small mammals. The factors tested were 1) trigger speed, 2) passive infrared vs. microwave sensor, 3) white vs. infrared flash, and 4) still photographs vs. video. We also tested a new approach to standardise each camera's field of view. We compared the success rates of four camera trap designs in detecting and taking recognisable photographs of captive stoats (Mustela erminea), feral cats (Felis catus) and hedgehogs (Erinaceus europaeus). Trigger speeds of 0.2–2.1 s captured photographs of all three target species unless the animal was running at high speed. The camera with a microwave sensor was prone to false triggers, and often failed to trigger when an animal moved in front of it. A white flash produced photographs that were more readily identified to species than those obtained under infrared light. However, a white flash may be more likely to frighten target animals, potentially affecting detection probabilities. Video footage achieved similar success rates to still cameras but required more processing time and computer memory. Placing two camera traps side by side achieved a higher success rate than using a single camera. Camera traps show considerable promise for monitoring invasive mammal control operations. Further research should address how best to standardise the size of each camera's field of view, maximise the probability that an animal encountering a camera trap will be detected, and eliminate visible or audible cues emitted by camera traps. PMID:23840790

Glen, Alistair S.; Cockburn, Stuart; Nichols, Margaret; Ekanayake, Jagath; Warburton, Bruce

2013-01-01

369

An analysis of IAEA surveillance camera reliability  

SciTech Connect

Reliability estimates for both individual cameras and camera systems assuming a constant hazard function (exponential failure model) are presented. A comparison of "time-to-failure" and "frames-to-failure" estimates is shown, with the conclusion that "frames-to-failure" is the more useful metric in describing camera performance. The role that individual camera "time-to-repair" plays in overall system reliability is discussed.
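
Under the constant-hazard assumption named above, camera survival follows the exponential law R(x) = exp(-λx), where the exposure x can be counted in hours or in frames. A minimal sketch with placeholder numbers, not IAEA data:

    import math

    def reliability(exposure, failure_rate):
        """Probability of surviving the given exposure under a constant hazard."""
        return math.exp(-failure_rate * exposure)

    # Example: a hazard of 1 failure per 200,000 frames, over a 50,000-frame period.
    print(reliability(50_000, 1 / 200_000))   # ~0.78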

Hooper, R.; Kerr, E.

1984-07-01

370

Laser angle sensor  

NASA Technical Reports Server (NTRS)

A laser angle measurement system was designed and fabricated for NASA Langley Research Center. The instrument is a fringe counting interferometer that monitors the pitch attitude of a model in a wind tunnel. A laser source and detector are mounted above the model. Interference fringes are generated by a small passive element on the model. The fringe count is accumulated and displayed by a processor in the wind tunnel control room. This report includes optical and electrical schematics, system maintenance and operation procedures.

Pond, C. R.; Texeira, P. D.

1985-01-01

371

Small Angle Neutron Scattering  

SciTech Connect

Small Angle Neutron Scattering (SANS) probes structural details at the nanometer scale in a non-destructive way. This article gives an introduction to scientists who have no prior small-angle scattering knowledge, but who seek a technique that allows elucidating structural information in challenging situations that thwart approaches by other methods. SANS is applicable to a wide variety of materials including metals and alloys, ceramics, concrete, glasses, polymers, composites and biological materials. Isotope and magnetic interactions provide unique methods for labeling and contrast variation to highlight specific structural features of interest. In situ studies of a material's responses to temperature, pressure, shear, magnetic and electric fields, etc., are feasible as a result of the high penetrating power of neutrons. SANS provides statistical information on significant structural features averaged over the probed sample volume, and one can use SANS to quantify with high precision the structural details that are observed, for example, in electron microscopy. Neutron scattering is non-destructive; there is no need to cut specimens into thin sections, and neutrons penetrate deeply, providing information on the bulk material, free from surface effects. The basic principles of a SANS experiment are fairly simple, but the measurement, analysis and interpretation of small angle scattering data involves theoretical concepts that are unique to the technique and that are not widely known. This article includes a concise description of the basics, as well as practical know-how that is essential for a successful SANS experiment.

Urban, Volker S [ORNL

2012-01-01

372

A New Concept of Security Camera Monitoring  

Microsoft Academic Search

We present a novel framework for encoding images obtained by a security monitoring camera while protecting the privacy of moving objects in the images. We are motivated by the fact that although security monitoring cameras can deter crimes, they may infringe the privacy of the people and objects that are recorded by the cameras. Moving objects, whose privacy should

Kenichi YABUTA; Hitoshi KITAZAWA; Toshihisa TANAKA

373

Initial laboratory evaluation of color video cameras  

Microsoft Academic Search

Sandia National Laboratories has considerable experience with monochrome video cameras used in alarm assessment video systems. Most of these systems, used for perimeter protection, were designed to classify rather than identify an intruder. Monochrome cameras are adequate for that application and were selected over color cameras because of their greater sensitivity and resolution. There is a growing interest in the

P. L. Terry

1991-01-01

374

An auto-focusing CCD camera mount  

NASA Astrophysics Data System (ADS)

The traditional methods of focusing a CCD camera are either time consuming, difficult or, more importantly, indecisive. This paper describes a device designed to allow the observer to be confident that the camera will always be properly focused by sensing a selected star image and automatically adjusting the camera's focal position.

Arbour, R. W.

1994-08-01

375

CAMERA MOTION STYLE TRANSFER Christian Kurz1  

E-print Network

CAMERA MOTION STYLE TRANSFER Christian Kurz1 Tobias Ritschel2 Elmar Eisemann2 Thorsten Thormählen1 ... is a database of videos filmed by physical cameras. These videos are analyzed with a camera-motion estimation algorithm (structure-from-motion) and labeled manually with a specific style. By considering spectral

376

Matched collimators for pixellated gamma camera  

Microsoft Academic Search

Recent years have witnessed the development of small field of view (FOV) nuclear medicine cameras for applications requiring either high spatial resolution and/or compact camera size. Dedicated gamma detectors with high intrinsic spatial resolutions have been built; however, collimation remains perhaps the key determinant of performance for single gamma imaging. In current pixellated gamma cameras, parallel-hole collimators are used with

Priya Raghunathan; Patricia J. Goodale; Jill Klinger; Michael Appleby; James Atkinson; Mark B. Williams

2005-01-01

377

Towards Compressive Camera Networks Kaushik Mitra1  

E-print Network

The scale and scope of multi-camera networks has advanced significantly; camera networks are now found not only in surveillance and access control applications but also in motion ...

Mellor-Crummey, John

378

A Prototype of Autonomous Intelligent Surveillance Cameras  

Microsoft Academic Search

This paper presents an architecture and an FPGA-based prototype of an autonomous intelligent video surveillance camera. The camera takes advantage of the high resolution of CMOS image sensors and enables instant automatic pan, tilt and zoom adjustment based upon motion activity. It performs automated scene analysis and provides immediate response to suspicious events by optimizing camera capturing parameters. The video

Wanqing Li; Igor Kharitonenko; Serge Lichman; Chaminda Weerasinghe

2006-01-01

379

Travelling route of mobile surveillance camera  

Microsoft Academic Search

A video surveillance system is becoming more and more important for the investigation and deterrence of crimes, and the number of cameras installed in public spaces is increasing. However, many cameras are required to observe a wide and complex area when they are installed at fixed positions. In order to efficiently observe a wide and complex area at lower cost, mobile robots have

Yoichi TOMIOKA; Atsushi TAKARA; Hitoshi KITAZAWA

2010-01-01

380

Dynamically Visual Learning with Sparsely Distributed Cameras  

Microsoft Academic Search

We propose a dynamic visual learning method, which aims to identify people by using sparsely distributed multiple surveillance cameras. In learning methods for personal identification, many images captured by densely distributed multiple cameras are generally required. However, a practical setup of surveillance cameras does not have such a dense distribution. In the proposed method, densely distributed multiple images are virtually captured by

Hidenori Tanaka; Itaru Kitahara; Hideo Saito; Hiroshi Murase; Kiyoshi Kogure; Norihiro Hagita

381

Distributed Calibration of Smart Cameras John Jannotti  

E-print Network

-grained to allow fusion between overlapping camera views. This paper introduces Lighthouse, a distributed calibration. Lighthouse finds matches between cameras, even between distant cameras, without centralizing observations. Lighthouse also contributes several advancements in the cooperative creation of GHTs, including boot

Jannotti, John

382

16 CFR 501.1 - Camera film.  

Code of Federal Regulations, 2010 CFR

...1 2010-01-01 2010-01-01 false Camera film. 501.1 Section 501.1 Commercial Practices...AND PROHIBITIONS UNDER PART 500 § 501.1 Camera film. Camera film packaged and labeled for retail sale is exempt...

2010-01-01

383

16 CFR 501.1 - Camera film.  

Code of Federal Regulations, 2014 CFR

...1 2014-01-01 2014-01-01 false Camera film. 501.1 Section 501.1 Commercial Practices...AND PROHIBITIONS UNDER PART 500 § 501.1 Camera film. Camera film packaged and labeled for retail sale is exempt...

2014-01-01

384

16 CFR 501.1 - Camera film.  

Code of Federal Regulations, 2012 CFR

...1 2012-01-01 2012-01-01 false Camera film. 501.1 Section 501.1 Commercial Practices...AND PROHIBITIONS UNDER PART 500 § 501.1 Camera film. Camera film packaged and labeled for retail sale is exempt...

2012-01-01

385

16 CFR 501.1 - Camera film.  

Code of Federal Regulations, 2013 CFR

...1 2013-01-01 2013-01-01 false Camera film. 501.1 Section 501.1 Commercial Practices...AND PROHIBITIONS UNDER PART 500 § 501.1 Camera film. Camera film packaged and labeled for retail sale is exempt...

2013-01-01

386

16 CFR 501.1 - Camera film.  

Code of Federal Regulations, 2011 CFR

...1 2011-01-01 2011-01-01 false Camera film. 501.1 Section 501.1 Commercial Practices...AND PROHIBITIONS UNDER PART 500 § 501.1 Camera film. Camera film packaged and labeled for retail sale is exempt...

2011-01-01

387

Bio-inspired hemispherical compound eye camera  

NASA Astrophysics Data System (ADS)

Compound eyes in arthropods demonstrate distinct imaging characteristics from human eyes, with wide angle field of view, low aberrations, high acuity to motion and infinite depth of field. Artificial imaging systems with similar geometries and properties are of great interest for many applications. However, the challenges in building such systems with hemispherical, compound apposition layouts cannot be met through established planar sensor technologies and conventional optics. We present our recent progress in combining optics, materials, mechanics and integration schemes to build fully functional artificial compound eye cameras. Nearly full hemispherical shapes (about 160 degrees) with densely packed artificial ommatidia were realized. The number of ommatidia (180) is comparable to those of the eyes of fire ants and bark beetles. The devices combine elastomeric compound optical elements with deformable arrays of thin silicon photodetectors, which were fabricated in the planar geometries and then integrated and elastically transformed to hemispherical shapes. Imaging results and quantitative ray-tracing-based simulations illustrate key features of operation. These general strategies seem to be applicable to other compound eye devices, such as those inspired by moths and lacewings (refracting superposition eyes), lobster and shrimp (reflecting superposition eyes), and houseflies (neural superposition eyes).

Xiao, Jianliang; Song, Young Min; Xie, Yizhu; Malyarchuk, Viktor; Jung, Inhwa; Choi, Ki-Joong; Liu, Zhuangjian; Park, Hyunsung; Lu, Chaofeng; Kim, Rak-Hwan; Li, Rui; Crozier, Kenneth B.; Huang, Yonggang; Rogers, John A.

2014-03-01

388

Cooling the dark energy camera instrument  

SciTech Connect

DECam, the camera for the Dark Energy Survey (DES), is undergoing general design and component testing. For an overview see DePoy, et al. in these proceedings. For a description of the imager, see Cease, et al. in these proceedings. The CCD instrument will be mounted at the prime focus of the CTIO Blanco 4m telescope. The instrument temperature will be 173K with a heat load of 113W. In similar applications, cooling CCD instruments at the prime focus has been accomplished by three general methods. Liquid nitrogen reservoirs have been constructed to operate in any orientation, pulse tube cryocoolers have been used when tilt angles are limited, and Joule-Thompson or Stirling cryocoolers have been used with smaller heat loads. Gifford-MacMahon cooling has been used at the Cassegrain but not at the prime focus. For DES, the combined requirements of high heat load, temperature stability, low vibration, operation in any orientation, liquid nitrogen cost and limited space available led to the design of a pumped, closed loop, circulating nitrogen system. At zenith the instrument will be twelve meters above the pump/cryocooler station. This cooling system is expected to have a 10,000 hour maintenance interval. This paper will describe the engineering basis, including the thermal model, unbalanced forces, cooldown time, and the single- and two-phase flow model.

Schmitt, R.L.; Cease, H.; /Fermilab; DePoy, D.; /Ohio State U.; Diehl, H.T.; Estrada, J.; Flaugher, B.; /Fermilab; Kuhlmann, S.; /Ohio State U.; Onal, Birce; Stefanik, A.; /Fermilab

2008-06-01

389

Boiling Visualization and Critical Heat Flux Phenomena In Narrow Rectangular Gap  

SciTech Connect

An experimental study was performed to investigate the pool boiling critical heat flux (CHF) on one-dimensional inclined rectangular channels with narrow gaps by changing the orientation of a copper test heater assembly. In a pool of saturated water at atmospheric pressure, the test parameters include gap sizes of 1, 2, 5, and 10 mm, and surface orientation angles from the downward-facing position (180 degrees) to the vertical position (90 degrees).

J. J. Kim; Y. H. Kim; S. J. Kim; S. W. Noh; K. Y. Suh; J. Rempe; F. B. Cheung; S. B. Kim

2004-12-01

390

LDEF yaw and pitch angle estimates  

NASA Technical Reports Server (NTRS)

Quantification of the LDEF yaw and pitch misorientations is crucial to the knowledge of atomic oxygen exposure of samples placed on LDEF. Video camera documentation of the LDEF spacecraft prior to grapple attachment, atomic oxygen shadows on experiment trays and longerons, and a pinhole atomic oxygen camera placed on LDEF provided sources of documentation of the yaw and pitch misorientation. Based on uncertainty-weighted averaging of data, the LDEF yaw offset was found to be 8.1 plus or minus 0.6 degrees, allowing higher atomic oxygen exposure of row 12 than initially anticipated. The LDEF pitch angle offset was found to be 0.8 plus or minus 0.4 degrees, such that the space end was tipped forward toward the direction of travel. The resulting consequences of the yaw and pitch misorientation of LDEF on the atomic oxygen fluence is a factor of 2.16 increase for samples located on row 12, and a factor of 1.18 increase for samples located on the space end compared to that which would be expected for perfect orientation.
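
The "uncertainty-weighted averaging" referred to above is presumably the standard inverse-variance weighting, sketched here; the individual measurement values in the example are hypothetical, not the LDEF data.

    import numpy as np

    def weighted_mean(values, sigmas):
        """Inverse-variance weighted mean and its 1-sigma uncertainty."""
        w = 1.0 / np.asarray(sigmas, dtype=float) ** 2
        mean = np.sum(w * np.asarray(values, dtype=float)) / np.sum(w)
        return mean, np.sqrt(1.0 / np.sum(w))

    # e.g. three yaw estimates (degrees) from shadows, video, and the pinhole camera
    print(weighted_mean([8.3, 7.8, 8.0], [0.9, 1.2, 1.0]))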

Banks, Bruce A.; Gebauer, Linda

1992-01-01

391

Pinhole Camera For Viewing Electron Beam Materials Processing  

NASA Astrophysics Data System (ADS)

A very rugged, compact (4x4x10 inches), gas purged "PINHOLE CAMERA" has been developed for viewing electron beam materials processing (e.g. melting or vaporizing metal). The video image is computer processed, providing dimensional and temperature measurements of objects within the field of view, using an IBM PC. The "pinhole camera" concept is similar to a TRW optics system for viewing into a coal combustor through a 2 mm hole. Gas is purged through the hole to repel particulates from optical surfaces. In our system light from the molten metal passes through the 2 mm hole "PINHOLE", reflects off an aluminum coated glass substrate and passes through a window into a vacuum tight container holding the camera and optics at atmospheric pressure. The mirror filters out X rays which pass through the Al layer and are absorbed in the glass mirror substrate. Since metallic coatings are usually reflective, the image quality is not severely degraded by small amounts of vapor that overcome the gas purge to reach the mirror. Coating thicknesses of up to 2 microns can be tolerated. The mirror is the only element needing occasional servicing. We used a telescope eyepiece as a convenient optical design, but with the traditional optical path reversed. The eyepiece images a scene through a small entrance aperture onto an image plane where a CCD camera is placed. Since the iris of the eyepiece is fixed and the scene intensity varies, it was necessary to employ a variable neutral density filter for brightness control. Devices used for this purpose include a PLZT light valve from Motorola, mechanically rotated linear polarizer sheets, and nematic liquid crystal light valves. These were placed after the mirror and entrance aperture but before the lens to operate as a voltage variable neutral density filter. The molten metal surface temperature being viewed varies from 4000 to 1200 degrees Kelvin. The resultant intensity change (at 488 nm with 10 nm bandwidth) is seven orders of magnitude. This surface intensity variation is reduced in contrast if the observation wavelength is a narrow band as far toward the red as high intensity blooming will allow an observable picture. A three-eyepiece camera allows an image plane where photo gray glass functions as a neutral density filter only over the high intensity portion of the image, thus reducing blooming. This system is enclosed in a water-cooled housing which can dissipate 15 watts/cm2, keeping the camera below 40 degrees Celsius. Single frames of video output are acquired for feature enhancement and location by a Data Translation DT2803 image processing board housed in an IBM PC.

Rushford, M. C.; Kuzmenko, P. J.

1986-10-01

392

Combustion pinhole-camera system  

DOEpatents

A pinhole camera system is described utilizing a sealed optical-purge assembly which provides optical access into a coal combustor or other energy conversion reactors. The camera system basically consists of a focused-purge pinhole optical port assembly, a conventional TV vidicon receiver, an external, variable density light filter which is coupled electronically to the vidicon automatic gain control (agc). The key component of this system is the focused-purge pinhole optical port assembly which utilizes a purging inert gas to keep debris from entering the port and a lens arrangement which transfers the pinhole to the outside of the port assembly. One additional feature of the port assembly is that it is not flush with the interior of the combustor.

Witte, A.B.

1982-05-19

393

Video Analysis in PTZ Camera Networks From master-slave to cooperative smart cameras  

E-print Network

In modern surveillance systems, one of the major challenges in multi-camera tracking is the consistency needed. The introduction of Pan-Tilt-Zoom (PTZ) cameras brought new capabilities to surveillance networks.

394

Narrow-band ELF events observed from South Pole Station  

NASA Astrophysics Data System (ADS)

Extremely Low Frequency (ELF) waves are typically in the range of 3 Hz - 3 kHz and can play a role in acceleration and pitch-angle scattering of energetic particles in the radiation belts. Observations of a not uncommon, but not well studied ELF phenomenon are presented with ground-based data from South Pole Station. The narrow-band waves last approximately one or two minutes maintaining bandwidth over the course of the event, begin around 100 Hz, decrease to about 70 Hz, and typically show a higher frequency harmonic. The waves have only been documented at four locations - Heacock, 1974 (Alaska); Sentman and Ehring, 1994 (California); Wang et al, 2005 and Wang et al, 2011 (Taiwan); and Kim et al, 2006 (South Pole). The waves observed at the South Pole are not detected when the Sun drops below a 10 degree elevation angle, which is not true for the other locations. We extend the study of Kim et al, 2006, and explore possible generation mechanisms including sunlit ionosphere and ion cyclotron wave modes, as well as correspondence with energetic particle precipitation.

Heavisides, J.; Weaver, C.; Lessard, M.; Weatherwax, A. T.

2012-12-01

395

Advanced camera image data acquisition system for Pi-of-the-Sky  

NASA Astrophysics Data System (ADS)

The paper describes a new generation of high performance, remote control, CCD cameras designed for astronomical applications. A completely new camera PCB was designed, manufactured, tested and commissioned. The CCD chip was positioned in a different way than previously, resulting in better performance of the astronomical video data acquisition system. The camera was built using a low-noise, 4Mpixel CCD circuit by STA. The electronic circuit of the camera is highly parameterized and reconfigurable, as well as modular in comparison with the first-generation solution, due to the application of open software solutions and an FPGA circuit, the Altera Cyclone EP1C6. New algorithms were implemented in the FPGA chip. The following advanced electronic circuits were used in the camera system: the CY7C68013a microcontroller (8051 core) by Cypress, the AD9826 image processor by Analog Devices, the RTL8169s Gigabit Ethernet interface by Realtek, the AT45DB642 SDRAM memory by Atmel, and the ARM926EJ-S based AT91SAM9260 CPU microprocessor by ARM and Atmel. Software solutions for the camera, its remote control, and image data acquisition are based entirely on the open source platform. The following interfaces were used: the ISI image interface and the V4L2 API, the AMBA and AHB data buses, and the INDI protocol. The camera will be replicated in 20 pieces and is designed for continuous on-line, wide angle observations of the sky in the research program Pi-of-the-Sky.

Kwiatkowski, Maciej; Kasprowicz, Grzegorz; Pozniak, Krzysztof; Romaniuk, Ryszard; Wrochna, Grzegorz

2008-11-01

396

Simulation and assessment of stray light effects in infrared cameras using non-sequential ray tracing  

NASA Astrophysics Data System (ADS)

In this article, we have evaluated the stray light effects in infrared cameras operating in the 3-5 μm waveband having cooled focal plane array detectors. A non-sequential ray tracing method is used to model the multiple reflections from polished lens surfaces as well as diffused scattering from rough mechanical surfaces inside the camera housing. The flux signature on the detector surface due to the self-emission of in-band radiation from the lenses and camera housing is calculated. The flux signature due to sources beyond the nominal field-of-view of the camera is also calculated. Light splitting into multiple diffraction orders from diffractive lens surfaces is taken into account both in the transmission and the reflection paths. The wavelength and angle dependent reflection coefficient of the anti-reflection coatings on the polished lens surfaces is taken into account. The effect of local heating of camera mounts on the stray light signature on the detector surface is also evaluated. The results show that using the non-sequential ray tracing method, useful predictions can be made for the stray light signature in the infrared cameras. Such calculations can be used to identify areas in the camera housing which need to be temperature stabilized to reduce their detrimental stray-light signature.

Pravdivtsev, A. V.; Akram, M. N.

2013-09-01

397

Principal algorithms for the control of Kourovka Observatory SBG camera  

NASA Astrophysics Data System (ADS)

We report the algorithms used in the software of the upgraded SBG camera. Fast-moving satellites are observed in the "rotated" coordinate system where one of the axes points towards the pole of the object's orbit. The ephemeris for this coordinate system is computed based on the ephemeris for the equatorial coordinate system using special transition matrices. The parameters of the matrices are the coordinates of the orbital pole, which are found by averaging the vector products of the radius vectors of the consecutive positions of the satellite. The position angle of the image is computed as the difference between the hour angles of the orbital and celestial poles in the coordinate system, the pole of which coincides with the optical center of the frame. The speed of object tracking is computed via quadratic interpolation of the ephemeris in the "rotated" coordinate system.
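
The pole estimate described above (averaging the vector products of consecutive satellite positions) is easy to sketch; the snippet assumes the positions are unit direction vectors in the equatorial frame and is an illustration of the idea, not the SBG control software.

    import numpy as np

    def orbital_pole(positions):
        """positions: (N, 3) array of consecutive satellite direction vectors.
        Each cross product is normal to the instantaneous orbital plane; their
        normalized average approximates the orbital pole direction."""
        p = np.asarray(positions, dtype=float)
        crosses = np.cross(p[:-1], p[1:])
        pole = crosses.mean(axis=0)
        return pole / np.linalg.norm(pole)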

Glamazda, D. V.

2012-04-01

398

Vista IR Camera: Conceptual Design  

Microsoft Academic Search

This paper describes the conceptual design for a near infrared camera for the Visible and Infrared Survey Telescope for Astronomy (VISTA). VISTA is a 4m class survey telescope that is being designed to perform pre-planned, ground-based astronomical surveys of the Southern sky from ESO's Cerro Paranal Observatory in Chile. The IR Surveys will be carried out in the J, H

Ian Egan; Eli Atad-Ettedgui; Mark M. Casali; Simon C. Craig; Maureen A. Ellis; Peter R. Hastings; David M. Henry; Ken Laidlaw; John Murray; Malcom Stewart; Will Sutherland

2002-01-01

399

Graphic design of pinhole cameras  

NASA Technical Reports Server (NTRS)

The paper describes a graphic technique for the analysis and optimization of pinhole size and focal length. The technique is based on the use of the transfer function of optical elements described by Scott (1959) to construct the transfer function of a circular pinhole camera. This transfer function is the response of a component or system to a pattern of lines having a sinusoidally varying radiance at varying spatial frequencies. Some specific examples of graphic design are presented.

Edwards, H. B.; Chu, W. P.

1979-01-01

400

The Advanced Camera for Surveys  

NSDL National Science Digital Library

The Johns Hopkins University describes the Advanced Camera for Surveys (ACS), which was installed in the Hubble Space Telescope in 2002 to "detect light from the ultraviolet to the near infrared." Users can view a photo gallery of the filters, detectors, optical bench, astronomers, and other aspects of ACS optical and mechanical components. While some parts of the website are restricted, scientists can find abstracts and full-text scientific papers, explanations of calibration, the coronagraph and other instruments, and press releases.

401

ANIR : Atacama Near-Infrared Camera for the 1.0-m miniTAO Telescope  

E-print Network

We have developed a near-infrared camera called ANIR (Atacama Near-InfraRed camera) for the University of Tokyo Atacama Observatory 1.0m telescope (miniTAO) installed at the summit of Cerro Chajnantor (5640 m above sea level) in northern Chile. The camera provides a field of view of 5'.1 $\times$ 5'.1 with a spatial resolution of 0".298/pixel in the wavelength range of 0.95 to 2.4 $\mu$m. Taking advantage of the dry site, the camera is capable of hydrogen Paschen-$\alpha$ (Pa$\alpha$, $\lambda=$1.8751 $\mu$m in air) narrow-band imaging observations, at which wavelength ground-based observations have been quite difficult due to deep atmospheric absorption mainly from water vapor. We have been successfully obtaining Pa$\alpha$ images of Galactic objects and nearby galaxies since the first-light observation in 2009 with ANIR. The throughputs at the narrow-band filters ($N1875$, $N191$) including the atmospheric absorption show larger dispersion (~10%) than those at broad-band filters (a few %), indicating that ...

Konishi, Masahiro; Tateuchi, Ken; Takahashi, Hidenori; Kitagawa, Yutaro; Kato, Natsuko; Sako, Shigeyuki; Uchimoto, Yuka K; Toshikawa, Koji; Ohsawa, Ryou; Yamamuro, Tomoyasu; Asano, Kentaro; Ita, Yoshifusa; Kamizuka, Takafumi; Komugi, Shinya; Koshida, Shintaro; Manabe, Sho; Matsunaga, Noriyuki; Minezaki, Takeo; Morokuma, Tomoki; Nakashima, Asami; Takagi, Toshinobu; Tanabé, Toshihiko; Uchiyama, Mizuho; Aoki, Tsutomu; Doi, Mamoru; Handa, Toshihiro; Kato, Daisuke; Kawara, Kimiaki; Kohno, Kotaro; Miyata, Takashi; Nakamura, Tomohiko; Okada, Kazushi; Soyano, Takao; Tamura, Yoichi; Tanaka, Masuo; Tarusawa, Ken'ichi; Yoshii, Yuzuru

2015-01-01

402

The Spacelab Wide Angle Telescope (SWAT)  

NASA Technical Reports Server (NTRS)

A fast wide angle telescope that will be capable of imaging to the darker sky limit and in the ultraviolet wavelength region available above the atmosphere is described. The telescope (SWAT) has a resolution comparable to that of the large ground-based Schmidt telescope and a field of at least five degrees. A number of astrophysically important investigations can only be accomplished with such a telescope, e.g., detection of hidden, hot objects like hot white dwarfs and subdwarfs in stellar binary systems, and energetic regions in globular clusters and galaxy nuclei. It permits unique studies of the UV-morphology of extended objects and allows discovery of very faint extensions, halos, jets, and filaments in galaxies. It can contribute to the investigation of dust in the Milky Way and in other galaxies and, with an objective prism, spectra of very faint objects can be obtained. The SWAT will localize objects for further study with the narrow-field Space Telescope.

West, R. M.; Gull, T. R.; Henize, K. G.; Bertola, F.

1979-01-01

403

Measuring Non-spherical Airborne Dust with Space-based MISR Multi-angle Imaging.  

NASA Astrophysics Data System (ADS)

Some of the world's largest dust plumes emanate from Northern Eurasian deserts and are expected to increasingly affect Asian ergonomics. Together with field experiments, satellite observations of dust outbreaks, placed into the context of large-scale dust transport modeling, can help understand the impact of mineral dust aerosols on past and present climate and climate predictions in North and Central Asia. Multi-angle instruments such as the Multi-angle Imaging SpectroRadiometer (MISR) provide independent constraints on aerosol properties based on sensitivity to the shape of the scattering phase function. We present an analysis of the Multi-angle Imaging SpectroRadiometer (MISR) Standard Aerosol Retrieval algorithm, updated with new non-spherical dust models (Version 16 and higher). We compare the MISR products with coincident AERONET surface sun-photometer observations taken during the passage of dust fronts. Our analysis shows that during such events MISR retrieves Angstrom exponents characteristic of large particles, having little spectral variation in extinction over the MISR wavelength range (442, 550, 672 and 866 nm channels), as expected. Also, the retrieved fraction of non-spherical particles is very high. This quantity is not retrieved by satellite instruments having only nadir-viewing cameras. We assess whether MISR aerosol optical thickness (AOT) acquired at about 10:30 AM local time can be used to represent daily mean AOT in dust climate forcing studies, by comparing MISR-retrieved aerosol optical thickness (AOT) with AERONET daily-mean values. We also compare the effect of particle shape on MISR and MODIS dust retrievals, using co-located MISR, MODIS, and AERONET AOTs and Angstrom exponents. In most cases obtained for this study, MODIS had no retrievals due to sun-glint when MISR's narrower swath observed AERONET sites on islands surrounded by dark water. For the few coincident MISR-MODIS-AERONET dark-water, dusty-condition retrievals we obtained, the MISR retrievals were in better agreement with AERONET than those from MODIS. Over bright desert sites, MODIS AOTs at visible wavelengths were systematically higher than those of AERONET and MISR. MISR-derived aerosol type mixtures for these cases included non-spherical dust components with high frequency in retrievals over dark water, and slightly lower frequency over land. The frequency with which non-spherical dust models were selected by the algorithm also decreased in dusty regions affected by pollution. Both MISR and MODIS retrievals have a high fail rate over optically thick dust plumes.
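
For readers unfamiliar with the quantity, the Angstrom exponent relating AOT at two wavelengths is alpha = -ln(tau1/tau2)/ln(lambda1/lambda2); values near zero (weak spectral variation) are characteristic of coarse dust. The AOT values in the example below are illustrative, not MISR retrievals.

    import math

    def angstrom_exponent(tau1, lam1, tau2, lam2):
        """Angstrom exponent from aerosol optical thickness at two wavelengths."""
        return -math.log(tau1 / tau2) / math.log(lam1 / lam2)

    # e.g. AOT 0.42 at 442 nm and 0.40 at 866 nm gives alpha ~ 0.07 (dust-like)
    print(angstrom_exponent(0.42, 442e-9, 0.40, 866e-9))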

Kalashnikova, O. V.; Diner, D. J.; Abdou, W.; Kahn, R.; Gaitley, B. J.; Gasso, S.

2004-12-01

404

39 CFR 3001.31a - In camera orders.  

Code of Federal Regulations, 2011 CFR

...in camera information. In camera documents and testimony...to a request, may make in camera documents and testimony available for inspection, copying, or use by... (d) Briefing of in camera information. In...

2011-07-01

405

39 CFR 3001.31a - In camera orders.  

Code of Federal Regulations, 2010 CFR

...in camera information. In camera documents and testimony...to a request, may make in camera documents and testimony available for inspection, copying, or use by... (d) Briefing of in camera information. In...

2010-07-01

406

39 CFR 3001.31a - In camera orders.  

Code of Federal Regulations, 2012 CFR

...in camera information. In camera documents and testimony...to a request, may make in camera documents and testimony available for inspection, copying, or use by... (d) Briefing of in camera information. In...

2012-07-01

407

3D camera tracking from disparity images  

NASA Astrophysics Data System (ADS)

In this paper, we propose a robust camera tracking method that uses disparity images computed from known parameters of a 3D camera and multiple epipolar constraints. We assume that the baselines between lenses in the 3D camera and the intrinsic parameters are known. The proposed method reduces camera motion uncertainty encountered during camera tracking. Specifically, we first obtain corresponding feature points between initial lenses using a normalized correlation method. In conjunction with matching features, we get disparity images. When the camera moves, the corresponding feature points, obtained from each lens of the 3D camera, are robustly tracked via the Kanade-Lucas-Tomasi (KLT) tracking algorithm. Secondly, relative pose parameters of each lens are calculated via Essential matrices. Essential matrices are computed from the Fundamental matrix calculated using the normalized 8-point algorithm with a RANSAC scheme. Then, we determine the scale factor of the translation matrix by d-motion. This is required because the camera motion obtained from the Essential matrix is up to scale. Finally, we optimize camera motion using multiple epipolar constraints between lenses and d-motion constraints computed from disparity images. The proposed method can be widely adopted in Augmented Reality (AR) applications, 3D reconstruction using a 3D camera, and surveillance systems which need not only depth information but also camera motion parameters in real time.
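
A compact stand-in for the pose-estimation step is sketched below using OpenCV's essential-matrix routines rather than the authors' explicit normalized 8-point pipeline; K is an assumed 3x3 intrinsic matrix and pts1/pts2 are matched pixel coordinates from the feature tracker. The translation returned is only up to scale, which is exactly why the paper recovers scale from the disparity images.

    import cv2
    import numpy as np

    def relative_pose(pts1, pts2, K):
        """Estimate relative rotation R and unit translation t between two views
        from matched points, using RANSAC on the essential matrix."""
        E, inlier_mask = cv2.findEssentialMat(pts1, pts2, K,
                                              method=cv2.RANSAC, prob=0.999, threshold=1.0)
        _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K)
        return R, t   # t has unit length; metric scale must come from disparity/depth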

Kim, Kiyoung; Woo, Woontack

2005-07-01

408

Angles and Area  

NSDL National Science Digital Library

In this activity (page 10 of the PDF), learners approximate the area of the uppermost cross section of an impact crater using a variety of square grids. They conclude which angle of impact results in the greatest area. There are two versions of this activity: Challenge, where students construct a launcher and create their own craters; and Non-Challenge, where students analyze pictures of craters. Includes a pre-lesson activity (p. 54). The Moon Math: Craters! guide follows a 5E approach, applying concepts of geometry, modeling, and data analysis to the NASA lunar spacecraft mission LCROSS.

NASA

2012-05-08

409

Shapes and Angles  

NSDL National Science Digital Library

In this activity (page 7 of the PDF), learners will identify the general two-dimensional geometric shape of the uppermost cross section of an impact crater. They will also draw connections between the general two-dimensional geometric shape of an impact crater and the projectile's angle of impact. There are two versions of this activity: Challenge, where students construct a launcher and create their own craters; and Non-Challenge, where students analyze pictures of craters. The Moon Math: Craters! guide follows a 5E approach, applying concepts of geometry, modeling, and data analysis to the NASA lunar spacecraft mission LCROSS.

NASA

2012-05-08

410

Optically trapped atomic resonant devices for narrow linewidth spectral imaging  

NASA Astrophysics Data System (ADS)

This thesis focuses on the development of atomic resonant devices for spectroscopic applications. The primary emphasis is on the imaging properties of optically thick atomic resonant fluorescent filters and their applications. In addition, this thesis presents a new concept for producing very narrow linewidth light from an atomic vapor lamp pumped by a nanosecond pulse system. This research was motivated by applications in missile warning systems, and presents an innovative approach to a wide-angle, ultra-narrow-linewidth imaging filter using a potassium vapor cell. The approach is to image onto and collect the fluorescent photons emitted from the surface of an optically thick potassium vapor cell, generating a 2 GHz pass-band imaging filter. This linewidth is narrow enough to fall within a Fraunhofer dark zone in the solar spectrum, thus making the detection solar blind. Experiments are conducted to measure the absorption line shape of the potassium resonant filter, the quantum efficiency of the fluorescent behavior, and the resolution of the fluorescent image. Fluorescent images with different spatial frequency components are analyzed by using a discrete Fourier transform, and the imaging capability of the fluorescent filter is described by its Modulation Transfer Function. For the detection of radiation that is spectrally broader than the linewidth of the potassium imaging filter, the fluorescent image is seen to be blurred by diffuse fluorescence from the slightly off-resonant photons. To correct this, an ultra-thin potassium imaging filter is developed and characterized. The imaging property of the ultra-thin potassium imaging cell is tested with a potassium-seeded flame, yielding an image resolution of ~20 lines per mm. The physics behind the atomic resonant fluorescent filter is radiation trapping. The diffusion process of the resonant photons trapped in the atomic vapor is theoretically described in this thesis. A Monte Carlo method is used to simulate the absorption and fluorescence. The optimum resolution of the fluorescent image is predicted by simulation. Radiation trapping is also shown to be useful for the generation of ultra-narrow linewidth light from an atomic vapor flash lamp. A 2 nanosecond, high voltage pulse is used to excite low pressure mercury vapor mixed with noble gases, producing high intensity emission at the mercury resonant line at 253.7 nm. With a nanosecond pumping time and high electrical current, the radiation intensity of the mercury discharge is increased significantly compared to a normal glow discharge lamp, while simultaneously suppressing the formation of an arc discharge. By avoiding the arc discharge, the discrete spectral lines of mercury were kept at narrow bandwidth. Due to radiation trapping, the emission linewidth from the nanosecond mercury lamp decreases with time and produces ultra-narrow linewidth emission 100 ns after the excitation; this linewidth is verified by absorption measurements through a low-pressure mercury absorption filter. The lamp is used along with mercury absorption filters for spectroscopic applications, including Filtered Rayleigh Scattering with different CO2 pressures and Raman scattering from methanol.
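
The radiation-trapping picture described above (a resonant photon is absorbed, re-emitted isotropically, and random-walks through the optically thick vapor before escaping) lends itself to a very small Monte Carlo illustration. The sketch below is a toy model with arbitrary units and a constant mean free path, not the simulation used in the thesis; the spread of the photon escape points stands in for the fluorescence-image blur.

```python
import numpy as np

rng = np.random.default_rng(0)

def trapped_photon_walk(mean_free_path=0.05, cell_thickness=1.0, n_photons=20000):
    """Toy Monte Carlo of radiation trapping in an optically thick vapor cell."""
    lateral = []
    for _ in range(n_photons):
        pos = np.zeros(3)                      # x, y, depth z (photon enters at z = 0)
        direction = np.array([0.0, 0.0, 1.0])
        while True:
            pos = pos + rng.exponential(mean_free_path) * direction
            if pos[2] < 0.0:                    # escaped back through the front face
                lateral.append(np.hypot(pos[0], pos[1]))
                break
            if pos[2] > cell_thickness:         # lost through the rear face (ignored)
                break
            # isotropic re-emission after absorption
            cos_t = rng.uniform(-1.0, 1.0)
            phi = rng.uniform(0.0, 2.0 * np.pi)
            sin_t = np.sqrt(1.0 - cos_t ** 2)
            direction = np.array([sin_t * np.cos(phi), sin_t * np.sin(phi), cos_t])
    return np.array(lateral)

spread = trapped_photon_walk()
print(f"spread of escape radii: {spread.std():.3f} (cell-thickness units)")
```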

Qian, Lipeng

411

Mini gamma camera, camera system and method of use  

DOEpatents

A gamma camera comprising essentially, and in order from the front outer or gamma-ray-impinging surface: 1) a collimator, 2) a scintillator layer, 3) a light guide, 4) an array of position sensitive, high resolution photomultiplier tubes, and 5) printed circuitry for receipt of the output of the photomultipliers. Also described is a system wherein the output supplied by the high resolution, position sensitive photomultiplier tubes is communicated to: a) a digitizer and b) a computer where it is processed using advanced image processing techniques and a specific algorithm to calculate the center of gravity of any abnormality observed during imaging, and c) optional image display and telecommunications ports.
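
The "center of gravity" computation referred to above is an intensity-weighted centroid of the imaged distribution. The snippet below is a minimal, hypothetical illustration of that calculation on a synthetic frame; it is not the patented algorithm.

```python
import numpy as np

def center_of_gravity(image):
    """Intensity-weighted centroid (center of gravity) of a 2-D image, in pixel units."""
    image = np.asarray(image, dtype=float)
    total = image.sum()
    rows, cols = np.indices(image.shape)
    return (rows * image).sum() / total, (cols * image).sum() / total

# Hypothetical 64x64 frame with a small bright "hot spot" near row 40, column 22.
frame = np.zeros((64, 64))
frame[39:42, 21:24] = 1.0
print(center_of_gravity(frame))   # ~ (40.0, 22.0)
```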

Majewski, Stanislaw (Grafton, VA); Weisenberger, Andrew G. (Grafton, VA); Wojcik, Randolph F. (Yorktown, VA)

2001-01-01

412

Sun and view angle effects on NDVI determination of land cover types in the Brazilian Amazon region with hyperspectral data  

Microsoft Academic Search

Solar zenith and view angle effects on the normalized difference vegetation index (NDVI) of land cover types in the Brazilian Amazon region were analysed. Airborne Hyperspectral Mapper (HYMAP) data were collected in 126 narrow bands (450–2500 nm) with a field of view (FOV) of ±30° from nadir. Data collection was performed initially in two flight lines with solar zenith angles of

F. J. Ponzoni; J. C. N. Epiphanio; B. F. T. Rudorff; A. R. Formaggio

2004-01-01

413

Creation of the π angle standard for the flat angle measurements  

NASA Astrophysics Data System (ADS)

Angle measurements are based mainly on multiangle prisms - polygons with autocollimators, rotary encoders of high accuracy, and circular scales as the standards of the flat angle. Traceability of angle measurements is based on the standard of the plane angle - a prism (polygon) calibrated to an appropriate accuracy. Some metrological institutions have established special test benches (comparators) equipped with circular scales or rotary encoders of high accuracy and polygons with autocollimators for angle calibration purposes. Nevertheless, the standard (etalon) of the plane angle - the polygon - has many restrictions for the transfer of the angle unit, the radian (rad), and other units of angle. It depends on the number of angles formed by the flat sides of the polygon, which is restricted by technological and metrological difficulties related to the production and accuracy determination of the polygon. A possibility to create a standard of the angle equal to π rad, or half the circle, or the full angle is proposed. It can be created by a circular scale with a rotation axis of very high accuracy and two precision reading instruments, usually photoelectric microscopes (PM), placed on opposite sides of the circular scale using special alignment steps. A great variety of angle units and values can be measured, and their traceability ensured, by applying a third PM on the scale. Calibration of the circular scale itself, and of other scales or rotary encoders as well, is possible using the proposed method with an implementation of π rad as the primary standard angle. The proposed method enables traceability of angle measurements to be assured at any laboratory having an appropriate environment and reading instruments of appropriate accuracy, together with a rotary table whose rotation axis is of high accuracy - the rotation trajectory (runout) being in the range of 0.05 µm. Brief information about the multipurpose angle measurement test bench that was developed is presented.

Giniotis, V.; Rybokas, M.

2010-07-01

414

From Reduction Machines To Narrowing Machines  

Microsoft Academic Search

Narrowing, the evaluation mechanism of functional logic languages, can be seen as a generalization of reduction, the evaluation mechanism of purely functional languages. The unidirectional pattern matching, which is used for parameter passing in functional languages, is simply replaced by the bidirectional unification known from logic programming languages. We show in this paper how to extend a reduction machine that has been designed for

Rita Loogen; RWTH Aachen

1991-01-01

415

Narrow Feshbach Dance of Two Trapped Atoms  

NASA Astrophysics Data System (ADS)

Near a narrow Feshbach resonance (with magnetic field width 10 mG or smaller) the ultra-cold atom interactions acquire an effective range that can be comparable to the average inter-particle distance. Although requiring a more accurate magnetic field control than their broad counterparts, the narrow Feshbach resonances can free cold atom physics from its straightjacket of the contact interaction paradigm. The finite-range effects can give rise to roton features in the phonon dispersion of dilute Bose-Einstein condensates (BEC's) and BEC's can support a ground state with modulated density patterns that breaks translational symmetry. We show that the finite range interaction is the consequence of the time-delay in atom-atom collisions. The narrow regime is also the parameter region in which the interacting atoms can spend a significant fraction of their time in the spin-rearranged (also called ``closed'') channel. To study the interaction physics we describe two atoms in a harmonic trap, interacting near a narrow resonance. We find the fraction of time that the atoms spend in the closed channel at fixed magnetic field and we study the time evolution of this system under conditions of a time-varying magnetic field.

Lopez Valdez, Nicolas; Timmermans, Eddy; Tsai, Shan-Wen

2012-06-01

416

Adverse effects of prohibiting narrow provider networks.  

PubMed

Many insurers participating in the new insurance exchanges are controlling costs by offering plans with narrow provider networks. Proposed regulations would promote network adequacy, but a pro-provider stance may not be inherently pro-consumer or even pro-patient. PMID:25119604

Howard, David H

2014-08-14

417

Progressive Retropalatal Narrowing Preceding Obstructive Apnea  

Microsoft Academic Search

Pharyngeal occlusion during obstructive apnea is thought to be an inspiratory-related event; however, occlusion also occurs in the absence of negative intrathoracic pressure. We hypothesized that inspiratory-related pharyngeal occlusion would be preceded by significant expiratory narrowing. Eight sleeping patients with obstructive apnea were studied. Pharyngeal caliber, airflow, and esophageal pressure (Pes) were simultaneously monitored during three to four consecutive

MARY J. MORRELL; YASEEN ARABI; BRIAN ZAHN; M. SAFWAN

1998-01-01

418

The effects of orientation angle, subcooling, heat flux, mass flux, and pressure on bubble growth and detachment in subcooled flow boiling  

E-print Network

The effects of orientation angle, subcooling, heat flux, mass flux, and pressure on bubble growth and detachment in subcooled flow boiling were studied using a high-speed video camera in conjunction with a two-phase flow ...

Sugrue, Rosemary M

2012-01-01

419

The Advanced Camera for the Hubble Space Telescope  

Microsoft Academic Search

The Advanced Camera for the Hubble Space Telescope has three cameras. The first, the Wide Field Camera, will be a high-throughput, wide field, 4096 X 4096 pixel CCD optical and I-band camera that is half-critically sampled at 500 nm. The second, the High Resolution Camera (HRC), is a 1024 X 1024 pixel CCD camera that is critically sampled at

G. D. Illingworth; Paul D. Feldman; David A. Golimowski; Zlatan Tsvetanov; Christopher J. Burrows; James H. Crocker; Pierre Y. Bely; George F. Hartig; Randy A. Kimble; Michael P. Lesser; Richard L. White; Tom Broadhurst; William B. Sparks; Robert A. Woodruff; Pamela Sullivan; Carolyn A. Krebs; Douglas B. Leviton; William Burmester; Sherri Fike; Rich Johnson; Robert B. Slusher; Paul Volmer

1997-01-01

420

Variable angle correlation spectroscopy  

SciTech Connect

In this dissertation, a novel nuclear magnetic resonance (NMR) technique, variable angle correlation spectroscopy (VACSY), is described and demonstrated with 13C nuclei in rapidly rotating samples. These experiments focus on one of the basic problems in solid state NMR: how to extract the wealth of information contained in the anisotropic component of the NMR signal while still maintaining spectral resolution. Analysis of the anisotropic spectral patterns from poly-crystalline systems reveals information concerning molecular structure and dynamics, yet in all but the simplest of systems, the overlap of spectral patterns from chemically distinct sites renders the spectral analysis difficult if not impossible. One solution to this problem is to perform multi-dimensional experiments where the high-resolution, isotropic spectrum in one dimension is correlated with the anisotropic spectral patterns in the other dimensions. The VACSY technique incorporates the angle between the spinner axis and the static magnetic field as an experimental parameter that may be incremented during the course of the experiment to help correlate the isotropic and anisotropic components of the spectrum. The two-dimensional version of the VACSY experiments is used to extract the chemical shift anisotropy tensor values from multi-site organic molecules, study molecular dynamics in the intermediate time regime, and examine the ordering properties of partially oriented samples. The VACSY technique is then extended to three-dimensional experiments to study slow molecular reorientations in a multi-site polymer system.

Lee, Y.K. [Univ. of California, Berkeley, CA (United States)]|[Lawrence Berkeley Lab., CA (United States). Chemical Biodynamics Div.

1994-05-01

421

Feasibility study for the application of the large format camera as a payload for the Orbiter program  

NASA Technical Reports Server (NTRS)

The large format camera (LFC) designed as a 30 cm focal length cartographic camera system that employs forward motion compensation in order to achieve the full image resolution provided by its 80 degree field angle lens is described. The feasibility of application of the current LFC design to deployment in the orbiter program as the Orbiter Camera Payload System was assessed and the changes that are necessary to meet such a requirement are discussed. Current design and any proposed design changes were evaluated relative to possible future deployment of the LFC on a free flyer vehicle or in a WB-57F. Preliminary mission interface requirements for the LFC are given.

1978-01-01

422

Angle-resolved scattering spectroscopy of explosives using an external cavity quantum cascade laser  

SciTech Connect

We investigate angle-resolved scattering from solid explosive residues on a car door for non-contact sensing geometries. Light from a mid-infrared external cavity quantum cascade laser, tuned between 7 and 8 microns, was scattered from the residues and detected both with a sensitive single-point detector and with a hyperspectral imaging camera. Spectral scattering phenomena were discussed and possibilities for hyperspectral imaging at large scattering angles were outlined.

Suter, Jonathan D.; Bernacki, Bruce E.; Phillips, Mark C.

2012-04-01

423

Triangles: Finding Interior Angle Measures  

NSDL National Science Digital Library

In this lesson plan, students will start with a hands-on activity and then experiment with a GeoGebra-based computer model to investigate and discover the Triangle Angle Sum Theorem. Then they will use the Triangle Angle Sum Theorem to write and solve equations and find missing angle measures in a variety of examples.

2012-11-25

424

Slitless grism spectroscopy with the HST Advanced Camera for Surveys  

E-print Network

The Advanced Camera for Surveys on-board HST is equipped with a set of one grism and three prisms for low-resolution, slitless spectroscopy in the range 1150 Ang. to 10500 Ang. The G800L grism provides optical spectroscopy between 5500 Ang. and 1 micron with a mean dispersion of 39 Ang./pix and 24 Ang./pix (in the first spectral order) when coupled with the Wide Field and the High Resolution Channels, respectively. Given the lack of any on-board calibration lamps for wavelength and narrow band flat-fielding, the G800L grism can only be calibrated using astronomical targets. In this paper, we describe the strategy used to calibrate the grism in orbit, with special attention to the treatment of the field dependence of the grism flat-field, wavelength solution and sensitivity in both Channels.

A. Pasquali; N. Pirzkal; S. Larsen; J. Walsh; M. Kuemmel

2005-10-14

425

Integrated calibration between digital camera and laser scanner from mobile mapping system for land vehicles  

NASA Astrophysics Data System (ADS)

The paper presents the concepts of lever arm and boresight angle, the design requirements for calibration sites, and an integrated calibration method for the boresight angles of a digital camera or laser scanner. Taking test data collected by Applanix's LandMark system as an example, the camera calibration method is introduced as the piling of three consecutive stereo images together with an OTF-Calibration method using ground control points. The laser boresight-angle calibration is proposed as a combined manual and automatic method using ground control points. Integrated calibration between the digital camera and laser scanner is introduced to improve the systematic precision of the two sensors. By analyzing the measurements between ground control points and their corresponding image points in the image sequences, it is concluded that object positions derived from the camera and images are within about 15 cm in relative error and 20 cm in absolute error. By comparing the differences between ground control points and their corresponding laser point clouds, the error is less than 20 cm. From the results of these experiments, the mobile mapping system is an efficient and reliable system for rapidly generating high-accuracy and high-density road spatial data.
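
The lever arm and boresight angle enter the georeferencing chain as a fixed rotation and offset between the sensor frame and the vehicle body frame, followed by the GNSS/INS navigation solution. The sketch below illustrates that chain of transformations with made-up numbers; the frame conventions and values are assumptions for illustration, not parameters of the LandMark system.

```python
import numpy as np

def scanner_to_mapping_frame(p_scanner, R_boresight, lever_arm, R_nav, t_nav):
    """Transform a laser-scanner point into the mapping frame.

    p_body  = R_boresight @ p_scanner + lever_arm   (sensor -> vehicle body)
    p_world = R_nav @ p_body + t_nav                (body -> mapping frame, from GNSS/INS)
    All rotations are 3x3 matrices; lever_arm and t_nav are 3-vectors.
    """
    p_body = R_boresight @ p_scanner + lever_arm
    return R_nav @ p_body + t_nav

# Example: a small hypothetical boresight misalignment about the z axis and a 1.2 m lever arm.
theta = np.deg2rad(0.5)
R_bs = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                 [np.sin(theta),  np.cos(theta), 0.0],
                 [0.0, 0.0, 1.0]])
print(scanner_to_mapping_frame(np.array([10.0, 0.0, 0.0]),
                               R_bs, np.array([0.0, 1.2, 0.0]),
                               np.eye(3), np.zeros(3)))
```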

Zhao, Guihua; Chen, Hong; Li, Xingquan; Zou, Xiaoliang

426

Fruit Detectability Analysis for Different Camera Positions in Sweet-Pepper †  

PubMed Central

For robotic harvesting of sweet-pepper fruits in greenhouses a sensor system is required to detect and localize the fruits on the plants. Due to the complex structure of the plant, most fruits are (partially) occluded when an image is taken from one viewpoint only. In this research the effect of multiple camera positions and viewing angles on fruit visibility and detectability was investigated. A recording device was built which allowed the camera to be placed at different azimuth and zenith angles and to be moved horizontally along the crop row. Fourteen camera positions were chosen and the fruit visibility in the recorded images was manually determined for each position. For images taken from one position only, with the criterion of maximum 50% occlusion per fruit, the fruit detectability (FD) was in no case higher than 69%. The best single positions were the front views and looking with a zenith angle of 60° upwards. The FD increased when a combination was made of multiple viewpoint positions. With a combination of the five favourite positions the maximum FD was 90%. PMID:24681670
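
Combining viewpoints as described above is essentially a coverage problem: a fruit counts as detectable if it is sufficiently visible from at least one of the selected positions. The toy sketch below uses a made-up visibility matrix and a greedy selection rule to show how combined detectability can be evaluated; it is not the analysis performed in the paper.

```python
import numpy as np

# Hypothetical visibility matrix: rows = fruits, columns = camera positions.
# True means the fruit is occluded by at most 50% from that position.
visible = np.array([
    [True,  False, False, True,  False],
    [False, True,  False, False, False],
    [False, False, True,  False, True ],
    [True,  True,  False, False, False],
    [False, False, False, False, False],   # never detectable in this toy example
])

def detectability(cols):
    """Fraction of fruits visible from at least one of the chosen positions."""
    return visible[:, cols].any(axis=1).mean()

def greedy_positions(k):
    """Greedily add the position that raises combined detectability the most."""
    chosen = []
    for _ in range(k):
        remaining = [c for c in range(visible.shape[1]) if c not in chosen]
        best = max(remaining, key=lambda c: detectability(chosen + [c]))
        chosen.append(best)
    return chosen, detectability(chosen)

print(greedy_positions(3))
```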

Hemming, Jochen; Ruizendaal, Jos; Hofstee, Jan Willem; van Henten, Eldert J.

2014-01-01

427

Second-Generation Multi-Angle Imaging Spectroradiometer  

NASA Technical Reports Server (NTRS)

A report discusses an early phase in the development of the MISR-2 C, a second, improved version of the Multi-angle Imaging SpectroRadiometer (MISR), which has been in orbit around the Earth aboard NASA's Terra spacecraft since 1999. Like the MISR, the MISR-2 would contain a pushbroom array of nine charge-coupled-device (CCD) cameras, one aimed at the nadir and the others aimed at different angles sideways from the nadir. The major improvements embodied in the MISR-2 would be the following: A new folded-reflective-optics design would render the MISR-2 only a third as massive as the MISR. Smaller filters and electronic circuits would enable a reduction in volume to a sixth of that of the MISR. The MISR-2 would generate images in two infrared spectral bands in addition to the blue, green, red, and near-infrared spectral bands of the MISR. Miniature polarization filters would be incorporated to add a polarization-sensing capability. Calibration would be performed nonintrusively by use of a gimbaled tenth camera. The main accomplishment thus far has been the construction of an extremely compact all-reflective-optics CCD camera to demonstrate feasibility.

Macenka, Steven; Hovland, Larry; Preston, Daniel; Zellers, Brian; Downing, Kevin

2004-01-01

428

Optimising camera traps for monitoring small mammals.  

PubMed

Practical techniques are required to monitor invasive animals, which are often cryptic and occur at low density. Camera traps have potential for this purpose, but may have problems detecting and identifying small species. A further challenge is how to standardise the size of each camera's field of view so capture rates are comparable between different places and times. We investigated the optimal specifications for a low-cost camera trap for small mammals. The factors tested were 1) trigger speed, 2) passive infrared vs. microwave sensor, 3) white vs. infrared flash, and 4) still photographs vs. video. We also tested a new approach to standardise each camera's field of view. We compared the success rates of four camera trap designs in detecting and taking recognisable photographs of captive stoats (Mustela erminea), feral cats (Felis catus) and hedgehogs (Erinaceus europaeus). Trigger speeds of 0.2-2.1 s captured photographs of all three target species unless the animal was running at high speed. The camera with a microwave sensor was prone to false triggers, and often failed to trigger when an animal moved in front of it. A white flash produced photographs that were more readily identified to species than those obtained under infrared light. However, a white flash may be more likely to frighten target animals, potentially affecting detection probabilities. Video footage achieved similar success rates to still cameras but required more processing time and computer memory. Placing two camera traps side by side achieved a higher success rate than using a single camera. Camera traps show considerable promise for monitoring invasive mammal control operations. Further research should address how best to standardise the size of each camera's field of view, maximise the probability that an animal encountering a camera trap will be detected, and eliminate visible or audible cues emitted by camera traps. PMID:23840790

Glen, Alistair S; Cockburn, Stuart; Nichols, Margaret; Ekanayake, Jagath; Warburton, Bruce

2013-01-01

429

Angled cavity photonic crystal laser diodes with tilted sidewalls for improving far-field patterns.  

PubMed

Angled laser diodes based on the longitudinal photonic band crystal (PBC) waveguide are first proposed and fabricated at a wavelength of 905 nm. Tilted sidewalls are utilized to reflect the light downward, thus enlarging the transverse mode size. In the experiment, continuous wave (CW) output power of 630 mW/facet is achieved, and stable and narrow divergence angles are obtained in the fast and slow axes, simultaneously. The transverse angle is reduced by 44% compared with that of the conventional broad area (BA) laser based on the same wafer, and the lateral angle is only 1.65° with on-axis main-lobe emission. This device shows a promising future for laser emission with ultra-narrow divergence and easy fabrication. PMID:24979001

Liu, L; Liu, Y; Qu, H W; Wang, Y F; Wang, H L; Feng, Z G; Zhang, Y J; Zheng, W H

2014-04-15

430

Heterodyne Interferometer Angle Metrology  

NASA Technical Reports Server (NTRS)

A compact, high-resolution angle measurement instrument has been developed that is based on a heterodyne interferometer. The common-path heterodyne interferometer metrology is used to measure displacements of a reflective target surface. In the interferometer setup, an optical mask is used to sample the measurement laser beam reflecting back from a target surface. Angular rotations, around two orthogonal axes in a plane perpendicular to the measurement-beam propagation direction, are determined simultaneously from the relative displacement measurement of the target surface. The device is used in a tracking telescope system where pitch and yaw measurements of a flat mirror were simultaneously performed with a sensitivity of 0.1 nrad per second and a measuring range of 0.15 mrad at a working distance on the order of a meter. The nonlinearity of the device is also measured to be less than one percent over the measurement range.
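
The conversion from differential displacement to angle used in this kind of instrument is the small-angle relation theta ~ delta_d / L, where delta_d is the difference between displacement readings at two laterally separated spots on the target and L is their separation. The snippet below simply evaluates that relation with made-up numbers for illustration; it is not the instrument's processing code.

```python
def tilt_angle(d1_nm, d2_nm, spot_separation_mm):
    """Small-angle tilt (radians) from two displacement readings taken at
    laterally separated points on the target mirror: theta ~ (d2 - d1) / L."""
    return (d2_nm - d1_nm) * 1e-9 / (spot_separation_mm * 1e-3)

# Hypothetical numbers: a 1 nm differential displacement over a 10 mm baseline
# corresponds to a 100 nrad rotation of the target mirror.
print(tilt_angle(0.0, 1.0, 10.0))   # 1e-07 rad = 100 nrad
```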

Hahn, Inseob; Weilert, Mark A.; Wang, Xu; Goullioud, Renaud

2010-01-01

431

A Second Generation Multi-Angle Imaging SpectroRadiometer (MISR-2)  

NASA Technical Reports Server (NTRS)

The Multi-angle Imaging SpectroRadiometer (MISR) has been in Earth orbit since December 1999 on NASA's Terra spacecraft. This instrument provides new ways of looking at the Earth's atmosphere, clouds, and surface for the purpose of understanding the Earth's ecology, environment, and climate. To facilitate the potential future continuation of MISR's multi-angle observations, a study was undertaken in 1999 and 2000 under the Instrument Incubator Program (IIP) of NASA Code Y's Earth Science Technology Office (ESTO) to investigate and demonstrate the feasibility of a successor to MISR that will have greatly reduced size and mass. The kernel of the program was the design, construction, and testing of a highly miniaturized camera, one of the nine that would probably be used on a future space borne MISR-like instrument. This demonstrated that the size and mass reduction of the optical system and camera electronics are possible and that filters can be assembled to meet the miniaturized packaging requirements. An innovative, reflective optics design was used, enabling the wavelength range to be extended into the shortwave infrared. This was the smallest all-reflective camera ever produced by the contractor. A study was undertaken to determine the feasibility of implementing nine (multi-angle) cameras within a single structure. This resulted in several possible configurations. It would also be possible to incorporate one of the cameras into an airborne instrument.

Bothwell, Graham; Diner, David J.; Pagano, Thomas S.; Duval, Valerie G.; Beregovski, Yuri; Hovland, Larry E.; Preston, Daniel J.

2001-01-01

432

A double-layer electrically controlled birefringence liquid-crystal display with a wide-viewing-angle cone  

Microsoft Academic Search

An electrically controlled birefringence (ECB) LCD (liquid crystal display) has the problem of a narrow viewing angle. To solve this problem, a double-layer ECB (D-ECB) LCD with a wide-viewing-angle cone under the application of voltage is proposed. In this device, each layer compensates for the variation of retardation as a function of viewing angle. This double-layer optical compensation method can be

H. Seki; Y. Itoh; T. Uchida; Y. Masuda

1991-01-01

433

HDTV camera using digital contour  

NASA Astrophysics Data System (ADS)

The authors have developed the HSC-100 solid-state high-definition TV (HDTV) camera. The camera promises a 6 dB S/N and +6 dB sensitivity far superior to conventional HDTV cameras due to its imaging device construction. It also improves picture quality through the use of a digital contour unit. To satisfy HDTV (SMPTE 240M) requirements, a photo-conductive layered semiconductor imaging device (PSID) with 2 pixels has been developed. An amorphous silicon (a-Si) layer is added to the CCD scanner in this device. The a-Si layer carries out photoelectric conversion, and then an interline-transfer CCD reads out the photo-induced electric charges. This configuration provides a pixel aperture ratio of 100%, thereby improving sensitivity compared with existing models. The layer structure also permits a wide dynamic range. A digital contour unit was developed to improve the contour corrector characteristics. S/N and frequency response are improved by introducing digital signal processing. A 56 dB S/N value is achieved with an 8-bit A/D converter. This S/N is about 10 dB better than that of conventional ultrasonic delay lines. In addition, digital processing improves frequency response and delay-time stability. A more natural contour correction characteristic has been attained with a contour correction signal derived from the luminance signal.
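
Contour (aperture) correction of the kind described above amounts to extracting a high-frequency detail signal from the luminance and adding it back with a controlled gain. The sketch below is a generic software analogue of that idea, not the HSC-100 circuit; the filter width and gain are illustrative choices.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def contour_correct(luminance, gain=0.6, sigma=1.5):
    """Toy digital contour corrector: derive a detail signal from the luminance
    by high-pass filtering and add it back with an adjustable gain.
    Expects a float image normalized to the range [0, 1]."""
    y = np.asarray(luminance, dtype=float)
    detail = y - gaussian_filter(y, sigma)   # high-frequency contour signal
    return np.clip(y + gain * detail, 0.0, 1.0)
```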

Sugiki, Tadashi; Nakao, Akira; Uchida, Tomoyuki

1992-08-01

434

The Dark Energy Camera (DECam)  

E-print Network

In this paper we describe the Dark Energy Camera (DECam), which will be the primary instrument used in the Dark Energy Survey. DECam will be a 3 sq. deg. mosaic camera mounted at the prime focus of the Blanco 4m telescope at the Cerro Tololo Inter-American Observatory (CTIO). It consists of a large mosaic CCD focal plane, a five element optical corrector, five filters (g,r,i,z,Y), a modern data acquisition and control system and the associated infrastructure for operation in the prime focus cage. The focal plane includes 62 2K x 4K CCD modules (0.27"/pixel) arranged in a hexagon inscribed within the roughly 2.2 degree diameter field of view and 12 smaller 2K x 2K CCDs for guiding, focus and alignment. The CCDs will be 250 micron thick fully-depleted CCDs that have been developed at the Lawrence Berkeley National Laboratory (LBNL). Production of the CCDs and fabrication of the optics, mechanical structure, mechanisms, and control system for DECam are underway; delivery of the instrument to CTIO is scheduled for 2010.

K. Honscheid; D. L. DePoy; for the DES Collaboration

2008-10-20

435

The Viking Mars lander camera  

NASA Technical Reports Server (NTRS)

The 7.3 kg cameras for the 1976 Viking Mars expedition feature an array of 12 silicon photodiodes, including six spectral bands for color and near-infrared imaging with an angular resolution of 0.12 deg and four focus steps for broadband imaging, with an improved angular resolution of 0.04 deg. The field of view in elevation ranges from 40 deg above to 60 deg below the horizon, and in azimuth ranges to 342.5 deg. The cameras are mounted 0.8 m apart to provide a stereo view of the area accessible to a surface sampler for biological and chemical investigations. The scanning rates are synchronized to the lander data transmission rates of 16000 bits per sec to the Viking orbiters as relay stations and 250 bits per sec directly to earth. However, image data can also be stored on a lander tape recorder. About 10 million bits of image data will be transmitted during most days of the 60-day-long mission planned for each lander.

Huck, F. O.; Taylor, G. R.; Mccall, H. F.; Patterson, W. R.

1975-01-01

436

Dynamic-angle spinning and double rotation of quadrupolar nuclei  

SciTech Connect

Nuclear magnetic resonance (NMR) spectroscopy of quadrupolar nuclei is complicated by the coupling of the electric quadrupole moment of the nucleus to local variations in the electric field. The quadrupolar interaction is a useful source of information about local molecular structure in solids, but it tends to broaden resonance lines causing crowding and overlap in NMR spectra. Magic-angle spinning, which is routinely used to produce high resolution spectra of spin-1/2 nuclei like carbon-13 and silicon-29, is incapable of fully narrowing resonances from quadrupolar nuclei when anisotropic second-order quadrupolar interactions are present. Two new sample-spinning techniques are introduced here that completely average the second-order quadrupolar coupling. Narrow resonance lines are obtained and individual resonances from distinct nuclear sites are identified. In dynamic-angle spinning (DAS) a rotor containing a powdered sample is reoriented between discrete angles with respect to high magnetic field. Evolution under anisotropic interactions at the different angles cancels, leaving only the isotropic evolution of the spin system. In the second technique, double rotation (DOR), a small rotor spins within a larger rotor so that the sample traces out a complicated trajectory in space. The relative orientation of the rotors and the orientation of the larger rotor within the magnetic field are selected to average both first- and second-order anisotropic broadening. The theory of quadrupolar interactions, coherent averaging theory, and motional narrowing by sample reorientation are reviewed with emphasis on the chemical shift anisotropy and second-order quadrupolar interactions experienced by half-odd integer spin quadrupolar nuclei. The DAS and DOR techniques are introduced and illustrated with application to common quadrupolar systems such as sodium-23 and oxygen-17 nuclei in solids.
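
For reference, the angle conditions behind these techniques can be written compactly. The block below states the standard magic-angle condition and the commonly quoted complementary DAS angle pair; it is textbook background added for context, not text from the abstract.

```latex
% Magic-angle spinning removes anisotropies that transform as the second
% Legendre polynomial:
P_2(\cos\theta) = \tfrac{1}{2}\left(3\cos^2\theta - 1\right) = 0
\quad\Rightarrow\quad \theta_m \approx 54.74^\circ .

% Second-order quadrupolar broadening also contains a P_4(\cos\theta) term, so no
% single angle nulls both. DAS uses two angles with equal dwell times such that
P_2(\cos\theta_1) + P_2(\cos\theta_2) = 0,
\qquad
P_4(\cos\theta_1) + P_4(\cos\theta_2) = 0,
% whose commonly quoted solution is \theta_1 \approx 37.38^\circ and
% \theta_2 \approx 79.19^\circ.
```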

Mueller, K.T. (Lawrence Berkeley Lab., CA (United States) California Univ., Berkeley, CA (United States). Dept. of Chemistry)

1991-07-01

437

High Speed Digital Camera Technology Review  

NASA Technical Reports Server (NTRS)

A High Speed Digital Camera Technology Review (HSD Review) is being conducted to evaluate the state-of-the-shelf in this rapidly progressing industry. Five HSD cameras supplied by four camera manufacturers participated in a Field Test during the Space Shuttle Discovery STS-128 launch. Each camera was also subjected to Bench Tests in the ASRC Imaging Development Laboratory. Evaluation of the data from the Field and Bench Tests is underway. Representatives from the imaging communities at NASA / KSC and the Optical Systems Group are participating as reviewers. A High Speed Digital Video Camera Draft Specification was updated to address Shuttle engineering imagery requirements based on findings from this HSD Review. This draft specification will serve as the template for a High Speed Digital Video Camera Specification to be developed for the wider OSG imaging community under OSG Task OS-33.

Clements, Sandra D.

2009-01-01

438

Advantages of improved timing accuracy in PET cameras using LSOscintillator  

SciTech Connect

PET scanners based on LSO have the potential for significantly better coincidence timing resolution than the 6 ns fwhm typically achieved with BGO. This study analyzes the performance enhancements made possible by improved timing as a function of the coincidence time resolution. If 500 ps fwhm coincidence timing resolution can be achieved in a complete PET camera, the following four benefits can be realized for whole-body FDG imaging: 1) The random event rate can be reduced by using a narrower coincidence timing window, increasing the peak NECR by ~50 percent. 2) Using time-of-flight in the reconstruction algorithm will reduce the noise variance by a factor of 5. 3) Emission and transmission data can be acquired simultaneously, reducing the total scan time. 4) Axial blurring can be reduced by using time-of-flight to determine the correct axial plane that each event originated from. While time-of-flight was extensively studied in the 1980's, practical factors limited its effectiveness at that time and little attention has been paid to timing in PET since then. As these potential improvements are substantial and the advent of LSO PET cameras gives us the means to obtain them without other sacrifices, efforts to improve PET timing should resume after their long dormancy.
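
Point 2 follows from the usual back-of-the-envelope time-of-flight argument: the timing resolution localizes each event to a segment of length c*dt/2 along the line of response, and the variance reduction scales roughly as the patient diameter divided by that segment length. The snippet below reproduces the arithmetic with an assumed 40 cm diameter, which is an illustrative value rather than a number taken from the abstract.

```python
# Rough time-of-flight gain estimate, gain ~ D / (c * dt / 2).
c_cm_per_s = 3.0e10
dt_s = 500e-12                      # 500 ps fwhm coincidence timing resolution
D_cm = 40.0                         # assumed patient diameter (illustrative)

dx_cm = c_cm_per_s * dt_s / 2.0     # TOF localization along the line of response
gain = D_cm / dx_cm
print(f"localization = {dx_cm:.1f} cm, variance-reduction factor ~ {gain:.1f}")
# -> localization = 7.5 cm, variance-reduction factor ~ 5.3 (the "factor of 5")
```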

Moses, William W.

2002-12-02

439

Multi-camera calibration based on openCV and multi-view registration  

NASA Astrophysics Data System (ADS)

For multi-camera calibration systems, a method based on OpenCV combined with a multi-view registration algorithm is proposed. First, a Zhang calibration plate (an 8x8 chessboard pattern) and a number of cameras (three industrial-grade CCDs) are used to shoot 9 groups of images from different angles, and OpenCV is used to quickly calibrate the intrinsic parameters of each camera. Secondly, based on the correspondences between the camera views, the computation of the rotation and translation matrices is formulated as a constrained optimization problem. According to the Kuhn-Tucker theorem and the properties of the derivative of a matrix-valued function, formulae for the rotation and translation matrices are derived using a singular value decomposition algorithm. Afterwards, an iterative method is used to obtain the full coordinate transformations of the pair-wise views, so that precise multi-view registration can be conveniently achieved and the relative positions (the cameras' extrinsic parameters) obtained. Experimental results show that the method is practical for multi-camera calibration.
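
The per-camera intrinsic calibration step described above can be illustrated with OpenCV's standard chessboard workflow. The sketch below is a generic example under assumed file names and square size, with an 8x8 board giving 7x7 inner corners; it is not the authors' code and it omits the multi-view registration stage.

```python
import glob
import cv2
import numpy as np

pattern = (7, 7)          # an 8x8 chessboard has a 7x7 grid of inner corners
square_size = 20.0        # mm, hypothetical

# Planar object points for one view of the board.
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square_size

obj_points, img_points, img_size = [], [], None
for fname in glob.glob("cam0_view*.png"):        # hypothetical file names for one camera
    gray = cv2.imread(fname, cv2.IMREAD_GRAYSCALE)
    img_size = gray.shape[::-1]
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        corners = cv2.cornerSubPix(gray, corners, (11, 11), (-1, -1),
                                   (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
        obj_points.append(objp)
        img_points.append(corners)

rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(obj_points, img_points, img_size, None, None)
print("reprojection RMS:", rms)
```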

Deng, Xiao-ming; Wan, Xiong; Zhang, Zhi-min; Leng, Bi-yan; Lou, Ning-ning; He, Shuai

2010-10-01

440

Vehicles' Motion Parameters Detection in Intersection Based on Videos of Low View Angle  

Microsoft Academic Search

Using Visual C++ and OpenCV as tools and processing video image sequences gathered by one camera at a low view angle, a suitable foreground detection method for the intersection was selected. By improving the blob tracking system provided by OpenCV, a program was developed to detect the motion parameters of vehicles in the intersection. The motion parameters were smoothed by a Kalman filter. It made the

Lu Guangquan; Liu Miaomiao; Xia Shaojun; Deng Cheng

2010-01-01

441

Mission report on the Orbiter Camera Payload System (OCPS) Large Format Camera (LFC) and Attitude Reference System (ARS)  

NASA Technical Reports Server (NTRS)

The Orbiter Camera Payload System (OCPS) is an integrated photographic system which is carried into earth orbit as a payload in the Space Transportation System (STS) Orbiter vehicle's cargo bay. The major component of the OCPS is a Large Format Camera (LFC), a precision wide-angle cartographic instrument that is capable of producing high resolution stereo photography of great geometric fidelity in multiple base-to-height (B/H) ratios. A secondary, supporting system to the LFC is the Attitude Reference System (ARS), which is a dual lens Stellar Camera Array (SCA) and camera support structure. The SCA is a 70-mm film system which is rigidly mounted to the LFC lens support structure and which, through the simultaneous acquisition of two star fields with each earth-viewing LFC frame, makes it possible to determine precisely the pointing of the LFC optical axis with reference to the earth nadir point. Other components complete the current OCPS configuration as a high precision cartographic data acquisition system. The primary design objective for the OCPS was to maximize system performance characteristics while maintaining a high level of reliability compatible with Shuttle launch conditions and the on-orbit environment. The full-up OCPS configuration was launched on a highly successful maiden voyage aboard the STS Orbiter vehicle Challenger on October 5, 1984, as a major payload aboard mission STS 41-G. This report documents the system design, the ground testing, the flight configuration, and an analysis of the results obtained during the Challenger mission STS 41-G.

Mollberg, Bernard H.; Schardt, Bruton B.

1988-01-01

442

Ultrathin cameras using annular folded optics  

Microsoft Academic Search

We present a reflective multiple-fold approach to visible imaging for high-resolution, large aperture cameras of significantly reduced thickness. This approach allows for reduced bulk and weight compared with large high-quality camera systems and improved resolution and light collection compared with miniature conventional cameras. An analysis of the properties of multiple-fold imagers is presented along with the design, fabrication, and testing

Eric J. Tremblay; Ronald A. Stack; Rick L. Morrison; Joseph E. Ford

2007-01-01

443

CINEMA: a system for procedural camera movements  

Microsoft Academic Search

This paper presents a general system for camera movement upon which a wide variety of higher-level methods and applications can be built. In addition to the basic commands for camera placement, a key attribute of the CINEMA system is the ability to inquire information directly about the 3D world through which the camera is moving. With this information high-level

Steven M. Drucker; Tinsley A. Galyean; David Zeltzer

1992-01-01

444

Multi-Camera Human Activity Monitoring  

Microsoft Academic Search

With the proliferation of security cameras, the approach taken to monitoring and placement of these cameras is critical. This paper presents original work in the area of multiple camera human activity monitoring. First, a system is presented that tracks pedestrians across a scene of interest and recognizes a set of human activities. Next, a framework is developed for the placement

Loren Fiore; Duc Fehr; Robert Bodor; Andrew Drenner; Guruprasad Somasundaram; Nikolaos Papanikolopoulos

2008-01-01

445

Stationary Camera Aims And Zooms Electronically  

NASA Technical Reports Server (NTRS)

Microprocessors select, correct, and orient portions of hemispherical field of view. Video camera pans, tilts, zooms, and provides rotations of images of objects of field of view, all without moving parts. Used for surveillance in areas where movement of camera conspicuous or constrained by obstructions. Also used for closeup tracking of multiple objects in field of view or to break image into sectors for simultaneous viewing, thereby replacing several cameras.

Zimmermann, Steven D.

1994-01-01

446

The Mars Science Laboratory Engineering Cameras  

NASA Astrophysics Data System (ADS)

NASA's Mars Science Laboratory (MSL) Rover is equipped with a set of 12 engineering cameras. These cameras are build-to-print copies of the Mars Exploration Rover cameras described in Maki et al. (J. Geophys. Res. 108(E12): 8071, 2003). Images returned from the engineering cameras will be used to navigate the rover on the Martian surface, deploy the rover robotic arm, and ingest samples into the rover sample processing system. The Navigation cameras (Navcams) are mounted to a pan/tilt mast and have a 45-degree square field of view (FOV) with a pixel scale of 0.82 mrad/pixel. The Hazard Avoidance Cameras (Hazcams) are body-mounted to the rover chassis in the front and rear of the vehicle and have a 124-degree square FOV with a pixel scale of 2.1 mrad/pixel. All of the cameras utilize a 1024×1024 pixel detector and red/near IR bandpass filters centered at 650 nm. The MSL engineering cameras are grouped into two sets of six: one set of cameras is connected to rover computer "A" and the other set is connected to rover computer "B". The Navcams and Front Hazcams each provide similar views from either computer. The Rear Hazcams provide different views from the two computers due to the different mounting locations of the "A" and "B" Rear Hazcams. This paper provides a brief description of the engineering camera properties, the locations of the cameras on the vehicle, and camera usage for surface operations.

Maki, J.; Thiessen, D.; Pourangi, A.; Kobzeff, P.; Litwin, T.; Scherr, L.; Elliott, S.; Dingizian, A.; Maimone, M.

2012-09-01

447

Multiplex imaging with multiple-pinhole cameras  

NASA Technical Reports Server (NTRS)

When making photographs in X rays or gamma rays with a multiple-pinhole camera, the individual images of an extended object such as the sun may be allowed to overlap. Then the situation is in many ways analogous to that in a multiplexing device such as a Fourier spectroscope. Some advantages and problems arising with such use of the camera are discussed, and expressions are derived to describe the relative efficacy of three exposure/postprocessing schemes using multiple-pinhole cameras.

Brown, C.

1974-01-01

448

Surveillance Camera Tracking of Geo positioned Objects  

Microsoft Academic Search

A system for tracking moving objects with a known GPS position using a set of PTZ cameras with limited areas of coverage is presented. The basic idea of the system and its possible applications are discussed. The proposed camera calibration technique is used to transform the GPS position to camera settings. The current position of the tracked object is predicted
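
Converting a geo-position into pan/tilt settings, as described above, reduces to simple geometry once the target is expressed in a local frame centred on the camera. The sketch below is a minimal illustration of that step for an East-North-Up frame; the GPS-to-ENU conversion, zoom control, and the paper's calibration model are all outside its scope, and the numbers are made up.

```python
import numpy as np

def pan_tilt_from_enu(east_m, north_m, up_m):
    """Pan (azimuth from north, clockwise) and tilt angles, in degrees, toward a
    target given in a local East-North-Up frame centred on the camera."""
    pan = np.degrees(np.arctan2(east_m, north_m)) % 360.0
    tilt = np.degrees(np.arctan2(up_m, np.hypot(east_m, north_m)))
    return pan, tilt

# Hypothetical target 50 m east, 20 m north, 3 m below the camera.
print(pan_tilt_from_enu(50.0, 20.0, -3.0))   # pan ~ 68.2 deg, tilt ~ -3.2 deg
```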

Piotr Dalka; Andrzej Ciarkowski; Piotr Szczuko; Grzegorz Szwoch; Andrzej Czyżewski

449

Narrow-band imaging: potential and limitations.  

PubMed

The purpose of introducing optical electronics into video endoscopes is to improve the accuracy of diagnosis through image processing and digital technology. Narrow-band imaging (NBI), one of the most recent techniques, involves the use of interference filters to illuminate the target in narrowed red, green and blue (R/G/B) bands of the spectrum. This results in different images at distinct levels of the mucosa and increases the contrast between the epithelial surface and the subjacent vascular network. NBI can be combined with magnifying endoscopy with an optical zoom. The aim of this new technique is to characterize the surface of the distinct types of gastrointestinal epithelia - e. g., intestinal metaplasia in Barrett's esophagus. The technique may also make it possible to demonstrate disorganization of the vascular pattern in inflammatory disorders of the gastrointestinal mucosa and in superficial neoplastic lesions in the esophagus, stomach, and large bowel. PMID:16429359

Kuznetsov, K; Lambert, R; Rey, J-F

2006-01-01

450

One-angle fluorescence tomography with in-and-out motion  

PubMed Central

The usual tomography is achieved by acquiring measurements around an object with multiple angles. The possibility of obtaining a fluorescence tomographic image from measurements at only one angle is explored. Instead of rotating around the object, the camera (or the objective lens) moves toward (or away from) the object and takes photographs while the camera’s focal plane passes through the object. The volume of stacked two-dimensional pictures forms a blurred three-dimensional image. The true image can be obtained by deconvolving the system’s point spread function. Simplified computer simulations are carried out to verify the feasibility of the proposed method. The computer simulations indicate that it is feasible to obtain a tomographic image by using the in-and-out motion to acquire data. PMID:25520544
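
The reconstruction step described above, deconvolving the stacked two-dimensional pictures with the system's point spread function, can be sketched with a plain FFT-based Wiener filter. The snippet below assumes the blurred volume and a matching, centred 3-D PSF are already available as NumPy arrays; the regularization constant is an arbitrary illustrative choice, not the scheme used in the paper.

```python
import numpy as np

def wiener_deconvolve(blurred, psf, noise_to_signal=1e-2):
    """FFT-based Wiener deconvolution of a 3-D focal-stack volume with the
    system's 3-D point spread function (both arrays must have the same shape,
    with the PSF centred in its array)."""
    H = np.fft.fftn(np.fft.ifftshift(psf))
    G = np.fft.fftn(blurred)
    W = np.conj(H) / (np.abs(H) ** 2 + noise_to_signal)
    return np.real(np.fft.ifftn(W * G))
```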

Zeng, Gengsheng L.

2014-01-01

451

Upgrading optical information of rotating mirror cameras  

NASA Astrophysics Data System (ADS)

To date, rotating mirror (RM) cameras still serve as indispensable imaging equipment for the diagnosis of microsecond transient processes due to their excellent characteristics. To upgrade the optical information capacity of these cameras, this paper presents a new optical acceleration principle that increases the framing frequency or scanning velocity; a new design theory, free of the principle errors found in the classical theories, which has been applied to the design of our simultaneous streak and framing rotating mirror camera with continuous access; and a new rotating mirror of novel structure, made of an aluminum alloy, which considerably reduces the lateral deformation of the RM and improves the performance of the camera.

Li, Jingzhen; Sun, Fengshan; Huang, Hongbin; Gong, Xiangdong; Chen, Hongyi; Lu, Xiaowei; Cai, Yi

2014-11-01

452

Advanced High-Definition Video Cameras  

NASA Technical Reports Server (NTRS)

A product line of high-definition color video cameras, now under development, offers a superior combination of desirable characteristics, including high frame rates, high resolutions, low power consumption, and compactness. Several of the cameras feature a 3,840 × 2,160-pixel format with progressive scanning at 30 frames per second. The power consumption of one of these cameras is about 25 W. The size of the camera, excluding the lens assembly, is 2 by 5 by 7 in. (about 5.1 by 12.7 by 17.8 cm). The aforementioned desirable characteristics are attained at relatively low cost, largely by utilizing digital processing in advanced field-programmable gate arrays (FPGAs) to perform all of the many functions (for example, color balance and contrast adjustments) of a professional color video camera. The processing is programmed in VHDL so that application-specific integrated circuits (ASICs) can be fabricated directly from the program. ["VHDL" signifies VHSIC Hardware Description Language, a computing language used by the United States Department of Defense for describing, designing, and simulating very-high-speed integrated circuits (VHSICs).] The image-sensor and FPGA clock frequencies in these cameras have generally been much higher than those used in video cameras designed and manufactured elsewhere. Frequently, the outputs of these cameras are converted to other video-camera formats by use of pre- and post-filters.

Glenn, William

2007-01-01

453

Optimum Projection Angle for Attaining Maximum Distance in a Rugby Place Kick  

PubMed Central

This study investigated the effect of projection angle on the distance attained in a rugby place kick. A male rugby player performed 49 maximum-effort kicks using projection angles of between 20 and 50°. The kicks were recorded by a video camera at 50 Hz and a 2D biomechanical analysis was conducted to obtain measures of the projection velocity and projection angle of the ball. The player's optimum projection angle was calculated by substituting a mathematical expression for the relationship between projection velocity and projection angle into the equations for the aerodynamic flight of a rugby ball. We found that the player's calculated optimum projection angle (30.6°, 95% confidence limits ± 1.9°) was in close agreement with his preferred projection angle (mean value 30.8°, 95% confidence limits ± 2.1°). The player's calculated optimum projection angle was also similar to projection angles previously reported for skilled rugby players. The optimum projection angle in a rugby place kick is considerably less than 45° because the projection velocity that a player can produce decreases substantially as projection angle is increased. Aerodynamic forces and the requirement to clear the crossbar have little effect on the optimum projection angle. Key Points: The optimum projection angle in a rugby place kick is about 30°. The optimum projection angle is considerably less than 45° because the projection velocity that a player can produce decreases substantially as projection angle is increased. Aerodynamic forces and the requirement to clear the crossbar have little effect on the optimum projection angle. PMID:24570626
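
The core trade-off described above, that the achievable ball speed falls as the projection angle rises, which pushes the optimum well below 45°, is easy to reproduce numerically. The sketch below uses a made-up linear speed-angle relation and a drag-free range formula, so the coefficients and the resulting optimum (near 31°) are illustrative only, not the study's fitted values.

```python
import numpy as np

g = 9.81

def projection_velocity(theta_deg, v0=27.0, slope=0.20):
    """Assumed linear decrease of ball speed (m/s) with projection angle (deg);
    the coefficients are illustrative, not the values fitted in the study."""
    return v0 - slope * theta_deg

def kick_range(theta_deg):
    """Range of a simple drag-free projectile; the paper used a full aerodynamic model."""
    v = projection_velocity(theta_deg)
    return v ** 2 * np.sin(np.radians(2.0 * theta_deg)) / g

angles = np.arange(15.0, 50.1, 0.1)
best = angles[np.argmax([kick_range(a) for a in angles])]
print(f"optimum projection angle ~ {best:.1f} degrees")   # ~31 deg with these coefficients
```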

Linthorne, Nicholas P.; Stokes, Thomas G.

2014-01-01

454

Employing Light Field Cameras in Surveillance: An Analysis of Light Field Cameras in a Surveillance Scenario  

NASA Astrophysics Data System (ADS)

Light field cameras are becoming a new trend in photography. They have characteristics that could overcome some problems present in surveillance videos, like refocusing and depth estimation. The main advantage is that the light field camera allows focus reconstruction, increasing the depth of field. However, these new cameras require different processing power and storage than conventional cameras. In this paper, it is shown how these cameras work and how they could be employed in surveillance. Images of a standard surveillance camera versus the plenoptic camera are compared and some issues about the design of the light field camera are discussed. Finally, we compare the performance of video compression standards applied to plenoptic image sequences. The peak signal-to-noise ratio (PSNR) results show that MPEG4 is the best choice.
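
PSNR, the figure of merit used in the comparison above, is a simple function of the mean squared error between an original frame and its decoded counterpart. The helper below is a generic implementation for 8-bit frames, included only to make the metric concrete; it is not tied to any particular codec or to the paper's test material.

```python
import numpy as np

def psnr_db(reference, reconstructed, peak=255.0):
    """Peak signal-to-noise ratio (dB) between an original frame and its
    compressed/decoded version, assuming 8-bit imagery."""
    mse = np.mean((np.asarray(reference, float) - np.asarray(reconstructed, float)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)
```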

Higa, Rogério Seiji; Iano, Yuzo; Leite, Ricardo Barroso; Chavez, Roger Fredy Larico; Arthur, Rangel

2014-03-01

455

Generalization of the Euler Angles  

NASA Technical Reports Server (NTRS)

It is shown that the Euler angles can be generalized to axes other than members of an orthonormal triad. As first shown by Davenport, the three generalized Euler axes, hereafter: Davenport axes, must still satisfy the constraint that the first two and the last two axes be mutually perpendicular if these axes are to define a universal set of attitude parameters. Expressions are given which relate the generalized Euler angles, hereafter: Davenport angles, to the 3-1-3 Euler angles of an associated direction-cosine matrix. The computation of the Davenport angles from the attitude matrix and their kinematic equation are presented. The present work offers a more direct development of the Davenport angles than Davenport's original publication and offers additional results.
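
For readers unfamiliar with the 3-1-3 (z-x-z) sequence mentioned above, one common attitude-matrix convention is written out below; this is standard textbook material added for context, and the sign conventions may differ from those used in the report.

```latex
% 3-1-3 Euler factorization of the attitude (direction-cosine) matrix:
A(\phi,\theta,\psi) \;=\; R_3(\psi)\, R_1(\theta)\, R_3(\phi),
\qquad
R_3(\alpha)=\begin{pmatrix}\cos\alpha & \sin\alpha & 0\\ -\sin\alpha & \cos\alpha & 0\\ 0 & 0 & 1\end{pmatrix},
\quad
R_1(\alpha)=\begin{pmatrix}1 & 0 & 0\\ 0 & \cos\alpha & \sin\alpha\\ 0 & -\sin\alpha & \cos\alpha\end{pmatrix}.
```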

Bauer, Frank H. (Technical Monitor); Shuster, Malcolm D.; Markley, F. Landis

2002-01-01

456

Illumination box and camera system  

DOEpatents

A hand portable, field-deployable thin-layer chromatography (TLC) unit and a hand portable, battery-operated unit for development, illumination, and data acquisition of the TLC plates contain many miniaturized features that permit a large number of samples to be processed efficiently. The TLC unit includes a solvent tank, a holder for TLC plates, and a variety of tool chambers for storing TLC plates, solvent, and pipettes. After processing in the TLC unit, a TLC plate is positioned in a collapsible illumination box, where the box and a CCD camera are optically aligned for optimal pixel resolution of the CCD images of the TLC plate. The TLC system includes an improved development chamber for chemical development of TLC plates that prevents solvent overflow.

Haas, Jeffrey S. (San Ramon, CA); Kelly, Fredrick R. (Modesto, CA); Bushman, John F. (Oakley, CA); Wiefel, Michael H. (La Honda, CA); Jensen, Wayne A. (Livermore, CA); Klunder, Gregory L. (Oakland, CA)

2002-01-01

457

Explosive Transient Camera (ETC) Program  

NASA Technical Reports Server (NTRS)

Since the inception of the ETC program, a wide range of new technologies was developed to support this astronomical instrument. The prototype unit was installed at ETC Site 1. The first partially automated observations were made and some major renovations were later added to the ETC hardware. The ETC was outfitted with new thermoelectrically-cooled CCD cameras and a sophisticated vacuum manifold, which, together, made the ETC a much more reliable unit than the prototype. The ETC instrumentation and building were placed under full computer control, allowing the ETC to operate as an automated, autonomous instrument with virtually no human intervention necessary. The first fully-automated operation of the ETC was performed, during which the ETC monitored the error region of the repeating soft gamma-ray burster SGR 1806-21.

Ricker, George

1991-01-01

458

Wide dynamic range video camera  

NASA Technical Reports Server (NTRS)

A television camera apparatus is disclosed in which bright objects are attenuated to fit within the dynamic range of the system, while dim objects are not. The apparatus receives linearly polarized light from an object scene, the light being passed by a beam splitter and focused on the output plane of a liquid crystal light valve. The light valve is oriented such that, with no excitation from the cathode ray tube, all light is rotated 90 deg and focused on the input plane of the video sensor. The light is then converted to an electrical signal, which is amplified and used to excite the CRT. The resulting image is collected and focused by a lens onto the light valve which rotates the polarization vector of the light to an extent proportional to the light intensity from the CRT. The overall effect is to selectively attenuate the image pattern focused on the sensor.

Craig, G. D. (inventor)

1985-01-01

459

Gesture recognition on smart cameras  

NASA Astrophysics Data System (ADS)

Gesture recognition is a feature in human-machine interaction that allows more natural interaction without the use of complex devices. For this reason, several methods of gesture recognition have been developed in recent years. However, most real time methods are designed to operate on a Personal Computer with high computing resources and memory. In this paper, we analyze relevant methods found in the literature in order to investigate the ability of smart cameras to execute gesture recognition algorithms. We elaborate two hand gesture recognition pipelines. The first method is based on invariant moments extraction and the second on finger tips detection. The hand detection method used for both pipelines is based on skin color segmentation. The results obtained show that the un-optimized versions of the invariant moments method and the finger tips detection method can reach 10 fps on an embedded processor and use about 200 kB of memory.
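
As a rough illustration of the first pipeline (skin-colour segmentation followed by invariant-moment extraction), the sketch below uses OpenCV's Hu moments on a YCrCb skin mask. The threshold values are commonly used illustrative numbers, not the ones chosen in the paper, and the classification stage is omitted.

```python
import cv2
import numpy as np

def hand_hu_moments(bgr_frame):
    """Skin segmentation in YCrCb followed by Hu invariant-moment extraction.
    The skin thresholds below are common illustrative values, not the paper's."""
    ycrcb = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2YCrCb)
    mask = cv2.inRange(ycrcb, (0, 133, 77), (255, 173, 127))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    hu = cv2.HuMoments(cv2.moments(mask)).ravel()
    # Log-scaling keeps the seven moments in a comparable numeric range.
    return -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)
```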

Dziri, Aziz; Chevobbe, Stephane; Darouich, Mehdi

2013-02-01

460

Autofocus system for space cameras  

NASA Astrophysics Data System (ADS)

A novel autofocusing technique using a self-imaging approach is developed for a space camera with a large aperture and long focal length. Its optical system consists of an emitting system, a receiving system, and two pentagonal prisms. The emitting system has three parts: a light source, a beamsplitter, and an aiming slit. The receiving system includes a field lens, two separate lenses, and a CCD. Using two small pentaprisms instead of a large planar mirror makes the autofocus system simple and compact. A defocus signal is acquired by measuring the change in position of the slit self-image on the CCD, and the algorithm that calculates this movement is analyzed to achieve subpixel position estimation. The experiments show a +/-10-µm autofocusing precision over a +/-5-mm defocus range. The system demonstrates high resolution, easy adjustment, and high reliability, and the method can meet the requirements of various accurate focus measurement systems.
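
The abstract does not spell out the sub-pixel algorithm; a common choice for a slit image is an intensity-weighted centroid, sketched below in Python as an illustration. The function names, the centroid estimator, and the calibration parameters are assumptions, not the authors' method.

    import numpy as np

    def slit_position_subpixel(profile):
        # Intensity-weighted centroid of a 1-D slit-image profile
        # (an illustrative stand-in for the paper's sub-pixel estimator).
        profile = np.asarray(profile, dtype=float)
        profile = profile - profile.min()        # remove the background pedestal
        x = np.arange(profile.size)
        return float(np.sum(x * profile) / np.sum(profile))

    def defocus_from_shift(pos_ref, pos_meas, pixel_pitch_um, shift_per_defocus):
        # Convert the slit-image shift (pixels) into a defocus estimate (micrometres).
        # shift_per_defocus is a calibration factor: image shift per unit defocus.
        shift_um = (pos_meas - pos_ref) * pixel_pitch_um
        return shift_um / shift_per_defocus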

Li, Zhaohui; Wu, Keyong

2005-05-01

461

Experimental Investigation of the Cutting Front Angle during Remote Fusion Cutting  

NASA Astrophysics Data System (ADS)

Remote fusion cutting represents a promising tool in the field of laser material processing. To further promote the practical application of this process, it is necessary to gain additional knowledge of the process dynamics. In this contribution an experimental investigation of the cutting front angle is presented. The angle is calculated by evaluating high-speed camera recordings of the process. The analysis reveals the varying influence of the different process parameters on the cutting front angle. Furthermore, a process model based on the results is introduced and discussed. Finally, the relevant findings are summarised and an outlook on further investigations is given.

Schober, A.; Musiol, J.; Daub, R.; Feil, J.; Zaeh, M. F.

462

Theodolite with CCD Camera for Safe Measurement of Laser-Beam Pointing  

NASA Technical Reports Server (NTRS)

The simple addition of a charge-coupled-device (CCD) camera to a theodolite makes it safe to measure the pointing direction of a laser beam. The present state of the art requires this to be a custom addition because theodolites are manufactured without CCD cameras as standard or even optional equipment. A theodolite is an alignment telescope equipped with mechanisms to measure azimuth and elevation angles to the sub-arcsecond level. When measuring the angular pointing direction of a Class II laser with a theodolite, one could place a calculated amount of neutral density (ND) filters in front of the theodolite's telescope. One could then safely view and measure the laser's boresight through the theodolite's telescope without great risk to one's eyes. This method, workable for a Class II visible-wavelength laser, should not even be attempted for a Class IV laser and is not applicable to an infrared (IR) laser: if one chooses insufficient attenuation or forgets to use the filters, then looking at the laser beam through the theodolite could cause instant blindness. The CCD camera is already commercially available; it is a small, inexpensive, black-and-white CCD circuit-board-level camera. An interface adaptor was designed and fabricated to mount the camera onto the eyepiece of the specific theodolite's viewing telescope. Other equipment needed for operation of the camera includes power supplies, cables, and a black-and-white television monitor. The picture displayed on the monitor is equivalent to what one would see when looking directly through the theodolite. An additional advantage of a cheap black-and-white CCD camera is that it is sensitive to infrared as well as visible light; hence, one can use the camera coupled to a theodolite to measure the pointing of an infrared as well as a visible laser.
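
The "calculated amount" of ND filtering mentioned above follows directly from the definition of optical density. A small sketch, assuming an illustrative safe viewing level (the actual limit depends on wavelength, exposure time, and the applicable laser-safety standard):

    import math

    def required_optical_density(beam_power_mw, safe_power_mw=1.0):
        # An ND filter of optical density OD transmits 10**(-OD) of the light,
        # so OD >= log10(P_beam / P_safe) brings the beam below the safe level.
        # safe_power_mw = 1.0 is an illustrative assumption, not a quoted limit.
        return math.log10(beam_power_mw / safe_power_mw)

    # Example: a 500 mW beam would need roughly OD 2.7 of combined filtering.
    print(round(required_optical_density(500.0), 2))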

Crooke, Julie A.

2003-01-01

463

HRSC: High resolution stereo camera  

USGS Publications Warehouse

The High Resolution Stereo Camera (HRSC) on Mars Express has delivered a wealth of image data, amounting to over 2.5 TB from the start of the mapping phase in January 2004 to September 2008. In that time, more than a third of Mars was covered at a resolution of 10-20 m/pixel in stereo and colour. After five years in orbit, HRSC is still in excellent shape, and it could continue to operate for many more years. HRSC has proven its ability to close the gap between the low-resolution Viking image data and the high-resolution Mars Orbiter Camera images, leading to a global picture of the geological evolution of Mars that is now much clearer than ever before. Derived highest-resolution terrain model data have closed major gaps and provided an unprecedented insight into the shape of the surface, which is paramount not only for surface analysis and geological interpretation, but also for combination with and analysis of data from other instruments, as well as in planning for future missions. This chapter presents the scientific output from data analysis and high-level data processing, complemented by a summary of how the experiment is conducted by the HRSC team members working in geoscience, atmospheric science, photogrammetry and spectrophotometry. Many of these contributions have been or will be published in peer-reviewed journals and special issues. They form a cross-section of the scientific output, either by summarising the new geoscientific picture of Mars provided by HRSC or by detailing some of the topics of data analysis concerning photogrammetry, cartography and spectral data analysis.

Neukum, G.; Jaumann, R.; Basilevsky, A.T.; Dumke, A.; Van Gasselt, S.; Giese, B.; Hauber, E.; Head, J. W., III; Heipke, C.; Hoekzema, N.; Hoffmann, H.; Greeley, R.; Gwinner, K.; Kirk, R.; Markiewicz, W.; McCord, T.B.; Michael, G.; Muller, J.-P.; Murray, J.B.; Oberst, J.; Pinet, P.; Pischel, R.; Roatsch, T.; Scholten, F.; Willner, K.

2009-01-01

464

Camera handoff with adaptive resource management for multi-camera multi-object tracking  

E-print Network

Keywords: multi-object tracking, resource management, surveillance system. Camera handoff is a crucial step to obtain a continuously tracked and consistently labeled trajectory of the object of interest in multi-camera surveillance [...]; as the scale of a surveillance system grows, it becomes increasingly difficult for a single camera to accomplish object tracking [...]

Abidi, Mongi A.

465

NARROW-K-BAND OBSERVATIONS OF THE GJ 1214 SYSTEM  

SciTech Connect

GJ 1214 is a nearby M dwarf star that hosts a transiting super-Earth-size planet, making this system an excellent target for atmospheric studies. Most studies find that the transmission spectrum of GJ 1214b is flat, which favors either a high mean molecular weight or cloudy/hazy hydrogen (H) rich atmosphere model. Photometry at short wavelengths (<0.7 μm) and in the K band can discriminate the most between these different atmosphere models for GJ 1214b, but current observations do not have sufficiently high precision. We present photometry of seven transits of GJ 1214b through a narrow K-band (2.141 μm) filter with the Wide Field Camera on the 3.8 m United Kingdom Infrared Telescope. Our photometric precision is typically 1.7 × 10⁻³ (for a single transit), comparable with other ground-based observations of GJ 1214b. We measure a planet-star radius ratio of 0.1158 ± 0.0013, which, along with other studies, also supports a flat transmission spectrum for GJ 1214b. Since this does not exclude a scenario where GJ 1214b has an H-rich envelope with heavy elements that are sequestered below a cloud/haze layer, we compare K-band observations with models of H₂ collision-induced absorption in an atmosphere for a range of temperatures. While we find no evidence for deviation from a flat spectrum (slope s = 0.0016 ± 0.0038), an H₂-dominated upper atmosphere (<60 mbar) cannot be excluded. More precise observations at <0.7 μm and in the K band, as well as a uniform analysis of all published data, would be useful for establishing more robust limits on atmosphere models for GJ 1214b.
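
For orientation, the quoted radius ratio fixes the transit depth through depth = (Rp/Rs)^2. The short Python check below simply squares the quoted value and propagates its uncertainty to first order; it is not part of the authors' analysis.

    # Transit depth implied by the quoted radius ratio k = Rp/Rs.
    k, dk = 0.1158, 0.0013
    depth = k**2                       # ~0.0134, i.e. about a 1.3% dip
    ddepth = 2 * k * dk                # first-order uncertainty propagation
    print(f"depth = {depth:.4f} +/- {ddepth:.4f}")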

Colón, Knicole D.; Gaidos, Eric, E-mail: colonk@hawaii.edu [Department of Geology and Geophysics, University of Hawaii at Manoa, Honolulu, HI 96822 (United States)

2013-10-10

466

8.G Find the Angle  

NSDL National Science Digital Library

This is a task from the Illustrative Mathematics website that is one part of a complete illustration of the standard to which it is aligned. Each task has at least one solution and some commentary that addresses important aspects of the task and its potential use. Here are the first few lines of the commentary for this task: In triangle $\Delta ABC$, point $M$ is the point of intersection of the bisectors of angles $\angle BAC$, $\angle ABC$, and $\angle ACB$. The measure o...
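
The task statement is truncated above, but for the configuration it sets up ($M$ is the incenter of $\Delta ABC$, the intersection of the three angle bisectors) the standard relation that such tasks typically ask students to derive is, assuming that is indeed the question:

$\angle BMC = 180^\circ - \tfrac{1}{2}\angle ABC - \tfrac{1}{2}\angle ACB = 180^\circ - \tfrac{1}{2}\left(180^\circ - \angle BAC\right) = 90^\circ + \tfrac{1}{2}\angle BAC$

Here the angles of triangle $BMC$ at $B$ and $C$ are half of $\angle ABC$ and $\angle ACB$ because $BM$ and $CM$ are bisectors, and the angle sum of the triangle gives the rest.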

467

Wrist Camera Orientation for Effective Telerobotic Orbital Replaceable Unit (ORU) Changeout  

NASA Technical Reports Server (NTRS)

The Hydraulic Manipulator Testbed (HMTB) is the kinematic replica of the Flight Telerobotic Servicer (FTS). One use of the HMTB is to evaluate advanced control techniques for accomplishing robotic maintenance tasks on board the Space Station. Most maintenance tasks involve direct manipulation of the robot by a human operator, for whom high-quality visual feedback is important for precise control. An experiment was conducted in the Systems Integration Branch at the Langley Research Center to compare several configurations of the manipulator wrist camera for providing visual feedback during an Orbital Replaceable Unit changeout task. Several variables were considered, such as wrist camera angle, camera focal length, target location, and lighting. Each study participant performed the maintenance task using eight combinations of the variables based on a Latin square design. The results of this experiment and conclusions based on the collected data are presented.

Jones, Sharon Monica; Aldridge, Hal A.; Vazquez, Sixto L.

1997-01-01

468

Be Foil "Filter Knee Imaging" NSTX Plasma with Fast Soft X-ray Camera  

SciTech Connect

A fast soft x-ray (SXR) pinhole camera has been implemented on the National Spherical Torus Experiment (NSTX). This paper presents observations and describes the Be foil Filter Knee Imaging (FKI) technique for reconstructions of a m/n=1/1 mode on NSTX. The SXR camera has a wide-angle (28°) field of view of the plasma. The camera images nearly the entire diameter of the plasma and a comparable region in the vertical direction. SXR photons pass through a beryllium foil and are imaged by a pinhole onto a P47 scintillator deposited on a fiber optic faceplate. An electrostatic image intensifier demagnifies the visible image by 6:1 to match it to the size of the charge-coupled device (CCD) chip. A pair of lenses couples the image to the CCD chip.

B.C. Stratton; S. von Goeler; D. Stutman; K. Tritz; L.E. Zakharov

2005-08-08

469

Upgrading a CCD camera for astronomical use  

E-print Network

Existing charge-coupled device (CCD) video cameras have been modified to be used for astronomical imaging on telescopes in order to improve imaging times over those of photography. An astronomical CCD camera at the Texas A&M Observatory would...

Lamecker, James Frank

1993-01-01

470

Making a room-sized camera obscura  

NASA Astrophysics Data System (ADS)

We describe how to convert a room into a camera obscura as a project for introductory geometrical optics. The view for our camera obscura is a busy street scene set against a beautiful mountain skyline. We include a short video with project instructions, ray diagrams and delightful moving images of cars driving on the road outside.
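
The size of the projected scene follows the usual pinhole (similar-triangle) scaling. A quick Python sketch with illustrative numbers (not taken from the article) shows how large the image of a distant object appears on the opposite wall.

    def image_height(object_height_m, object_distance_m, hole_to_wall_m):
        # Similar triangles through the pinhole:
        # h_image / h_object = d_wall / d_object; the projected image is inverted.
        return object_height_m * hole_to_wall_m / object_distance_m

    # Illustrative numbers: a 10 m building 100 m away, projected across a 4 m room,
    # appears about 0.4 m tall (upside down) on the far wall.
    print(image_height(10.0, 100.0, 4.0))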

Flynt, Halima; Ruiz, Michael J.

2015-01-01

471

Mobile Robot Geometry Initialization from Single Camera  

E-print Network

Daniel Pizarro, Manuel Mazo, Enrique [...]. [...] to achieve robot localization has been widely proposed in the area of Intelligent Spaces. Recently, an online approach that simultaneously obtains the robot's pose and its 3D structure using a single external camera has [...]

Paris-Sud XI, Université de